
FIRMWARE UPDATE! AP2 Local road driving...and holy crap

AP2 should improve in an S-curve fashion, with the first stage being the hardest. It looks like they could be getting close to the faster part of the S-curve, and that will probably be when they turn on the other cameras.
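For what it's worth, the S-curve picture is just a logistic function. A quick sketch (my own illustration, nothing to do with Tesla's actual progress data) of why progress per unit effort is small at both ends and largest in the middle:

```python
import math

def logistic(t, k=1.0, t0=0.0):
    """Standard logistic: slow start, rapid middle, slow saturation."""
    return 1.0 / (1.0 + math.exp(-k * (t - t0)))

# Progress gained per unit time at different points along the curve.
# The gain peaks near the midpoint t0 and is small far from it.
gains = [logistic(t + 1) - logistic(t) for t in (-4, -1, 0, 3)]
```

The debate above is really about where on this curve AP2 sits: near the slow start, or approaching the steep middle.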

I doubt it. Autonomous driving development work is a slog. Slogs are not on an S curve.

If there were going to be accelerated development, it would have shown up already, with camera-based cars working very well at building precise 3D maps. That technology appears to be a lot harder than Musk expected. Also, Mobileye is the company with the most experience at mapping with cameras only.

That Tesla appears to be using only one camera suggests to me that they are hanging on by their fingertips rather than being on the brink of a breakout. It is far harder to build a 360° model of the environment around the car from eight cameras than to handle a single forward-looking, cone-shaped view.

Both Tesla and Uber probably get too much credit for being leaders in autonomous driving. Mercurial CEOs make news but usually don't run long-term development projects well.
 
I'm in the tech user-interface field and have been looking at AI-based UIs like talking machines (Siri, Echo, and their copies) and self-driving cars. These AIs' trained pattern recognition can deliver a good demo under good conditions, but when conditions are more ambiguous, the AI has to give up. For human users to trust a UI, it has to work under nearly all conditions.

Test drove a service loaner with AP2 over the last couple of days. It gets lost on lane markers unless they are clearly visible (not the case in the real world on local roads). It also tries to stay in the center of the lane on gradual turns, which feels like turning late to human drivers, since we tend to cut into corners.

Detecting and determining the driving lanes under all conditions will be quite a challenge if self-driving-car developers ever hope to get the human out of the car.
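To make the point concrete, here's a minimal sketch (my own toy illustration, not anyone's actual lane-keeping code) of why degraded markings force a handoff: fit a straight lane-center line to noisy marker detections, and refuse to steer when the fit is too uncertain.

```python
def fit_lane_center(points, max_residual=0.5):
    """points: list of (y_ahead, x_offset) lane-marker detections.
    Returns (slope, intercept) of the line x = a*y + b, or None when
    the fit is too noisy to trust (i.e., hand control back to the human)."""
    n = len(points)
    if n < 3:
        return None  # not enough markers visible to fit a line
    sy = sum(y for y, _ in points)
    sx = sum(x for _, x in points)
    syy = sum(y * y for y, _ in points)
    syx = sum(y * x for y, x in points)
    denom = n * syy - sy * sy
    if denom == 0:
        return None  # all detections at the same distance; degenerate fit
    a = (n * syx - sy * sx) / denom
    b = (sx - a * sy) / n
    # Mean absolute residual as a crude confidence measure.
    err = sum(abs(x - (a * y + b)) for y, x in points) / n
    return (a, b) if err <= max_residual else None
```

Clean markings give a usable fit; the scattered detections you get from faded paint on local roads push the residual over the threshold, and the only honest answer is "driver, take over."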

The blunt pattern matching on GPUs (with an end-to-end NN) is a toddler's basic skill, the kind of learning ancient humans and apes already had. We have to appreciate Mobileye's (MBLY's) technology, which not only matches patterns (detecting objects) but also makes sense of them (knowing what they are and what they imply). All of that contributes to the smoothness of AP1 and the 40% accident reduction.
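The distinction being drawn is detection versus interpretation. A toy sketch (hypothetical labels and behaviors, not Mobileye's actual stack): stage one produces labeled detections, and stage two decides what each one implies for driving.

```python
# Hypothetical label -> behavior rules for illustration only.
RULES = {
    "pedestrian": "yield",
    "stop_sign": "stop",
    "lane_marker": "track",
    "parked_car": "keep_clear",
}

def interpret(detections):
    """detections: list of (label, confidence) from a pattern matcher.
    Pattern matching alone stops at the labels; interpretation maps
    each confident detection to a driving behavior."""
    plan = []
    for label, conf in detections:
        if conf < 0.5:
            continue  # low-confidence detection: ignore rather than guess
        plan.append((label, RULES.get(label, "monitor")))
    return plan
```

A system that only detects stops at the first column of that output; knowing that a pedestrian means "yield" rather than just "object, class 7" is the part the post is crediting Mobileye with.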

 
Something of interest, I took delivery of an x90d yesterday with EAP+FSDC.

I was autosteering up to a T-junction, and the car started to slow down as I approached it. Not sure if it was because of the red light, or the barrier in front (as there is no road to continue on).

Will try and test it again and will record if I can
 
This has been discussed in other threads. One can't be sure, but I'm fairly certain it's the barrier. I had a stop sign I suspected it was reading, then realized 20 ft later there were a bunch of trees, and it was probably responding to those...
 
I'm enjoying my Tesla steering itself into curbs, toward parked cars, across lane markings, etc.

I know I'm not the person you're asking, but just in case!

But, I confess, it is AWESOME at going straight down a freeway... unless it sees a ghost and hits the brakes. But hey, Elon's smile... sigh... I'm going to draw a picture of him on my new TeslaPaint app. I'll look at it and gaze into his eyes when I get frightened. I'm sure that'll make it better.

Are you long TSLA?