
Tesla.com - "Transitioning to Tesla Vision"

Thank you for posting this Mike. Great info!

I found the following lines particularly interesting (Emphasis added). Just a guess here, but it sounds like Tesla is very much depending on real time network processing to implement some important "stuff".

"Please note that if you opt out from the collection of telematics log data or any other data from your Tesla vehicle, we will not be able to notify you of issues applicable to your vehicle in real time. This may result in your vehicle suffering from reduced functionality, serious damage, or inoperability, and it may also disable many features of your vehicle..."
 
So when an event occurs like a "take over now" alert, AEB, an override of nav while on a city street, ABS use, etc., maybe send 30 seconds of camera/telemetry back for further analysis.
30 seconds of data from 8 cameras is about 150 MB of data. It's not small or easy to send in real time.

Given AP today has no idea how to navigate many things, and isn't even all that great at changing lanes on the highway, I probably override the AP every couple of miles or so on average, not because it messed up, but because it's incapable. So if I drive 50 miles in a day, it needs to upload 1.5 GB of video, for no reason. Over LTE, that's about $3 at bulk rates.
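To put rough numbers on that (a back-of-envelope sketch; the per-camera bitrate and LTE price are my assumptions, not published figures):

```python
# Back-of-envelope check of the figures above. The bitrate and LTE price
# are assumptions, not anything Tesla has published.
cameras = 8
bitrate_mbps = 5.0      # assumed ~5 Mbit/s of compressed video per camera
clip_seconds = 30

clip_mb = cameras * bitrate_mbps * clip_seconds / 8
print(f"one event: {clip_mb:.0f} MB")          # -> 150 MB, as above

clips_per_day = 10                             # ~1.5 GB/day implies ~10 uploads
daily_gb = clips_per_day * clip_mb / 1000
price_per_gb = 2.0                             # assumed bulk LTE rate (~$2/GB)
print(f"daily: {daily_gb:.1f} GB, ~${daily_gb * price_per_gb:.0f} over LTE")
```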

Recording when you AEB is kinda pointless too- that's a case where the car did what it was supposed to. The interesting cases are where the system didn't do what it was supposed to- like stop for a red light, a pedestrian, or a cone, or where it didn't do the right thing when the lanes split. You need a way to trigger on this. Disengagements are a reasonable proxy for this, but 95% of them are for other reasons.

This is likely why Tesla doesn't currently do what you suggest, which is to upload a video every time someone disengages. What they have said they do today is train classifiers to see things like stop signs, and then upload cases where a stop sign appears somewhere unexpected, or partially obscured, because they're much more focused on expanding the capabilities of the system than on improving the performance of what is already there.
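For illustration, the difference between the two trigger designs might look like this (a hedged sketch; all names are hypothetical, not Tesla's code):

```python
# Sketch of the trigger-design point above: disengagements are a noisy proxy
# (most have nothing to do with a system failure), so the useful trigger is a
# classifier looking for specific interesting scenes. Hypothetical names only.
def should_upload(event) -> bool:
    if event.kind == "disengagement":
        # ~95% of these happen for unrelated reasons; uploading all of them
        # would mostly waste bandwidth.
        return False
    if event.kind == "detector_hit":
        # e.g. an "unexpected or partially obscured stop sign" classifier
        return event.score > 0.9
    return False
```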
 
Just a guess here, but it sounds like Tesla is very much depending on real time network processing to implement some important "stuff".
"Real time" is loosely defined here. Tesla reports warnings on 12V battery issues within seconds to minutes, but it does not run Autopilot driving as real-time processing on remote servers. Yes, the car communicates back and forth, mostly over WiFi while parked. People have measured gigabytes of data transferred at night when a big upgrade was installed, but most measurements of home router transfers are in the tens of megabytes. (References are in this forum somewhere.)

Basically, the car works best with full Premium Connectivity.
 
Please understand I am merely playing devil's advocate here for the sake of meaningful debate. I have spent most of my career designing and implementing embedded real-time control systems for various platforms. Most of these "phone home" from time to time, some much more than others. I get the latency issue.

But the fact is I have yet to see any authentic technical documentation or other proof as to how Tesla's on board systems utilize their network connectivity, yet I am reading some pretty strong assertions about how they must work.

Are these facts, or are these educated guesses?

Other than the official statements mikes_fsd posted above, which I understand is more of a legal disclaimer than a technical description, do we really know anything for sure?
 
Other than the official statements mikes_fsd posted above, which I understand is more of a legal disclaimer than a technical description, do we really know anything for sure?
I've heard Elon talk about it before, saying that it will work with no connection at all. You could be in the middle of nowhere with no signal at all and the car will have to drive itself.
 
Other than the official statements mikes_fsd posted above, which I understand is more of a legal disclaimer than a technical description, do we really know anything for sure?
We have examples of Karpathy saying they request data from the fleet for Autopilot/FSD development. [emphasis mine]
(11:50) "We actually train these kinds of detectors offline, so we can train a small detector that detects a stop sign occluded by trees. What we do with that detector is beam it down to the fleet, and we ask the fleet: please apply this detector on top of everything else you're doing, and if this detector scores high, then please send us an image. The fleet responds with a somewhat noisy set, but it boosts the amount of examples we have of occluded stop signs; maybe 10% of them are actual occluded stop signs that we get from that stream. And this requires no firmware upgrade; this is completely dynamic and can be done by the team extremely quickly. This is the bread and butter of how we actually get any of these tasks to work: just accumulating these large data sets in the full tail of that distribution. So we have tens of thousands of occluded stop signs, and the fleet can send us as many as it takes. We have the biggest data set for "except right turn" stop signs, and I'm basically certain of that."
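As a hedged sketch, the workflow Karpathy describes might look like this (entirely hypothetical names; push a small detector to the fleet, collect the frames it fires on, accept that the result is noisy):

```python
# Hypothetical sketch of the fleet "trigger campaign" described above.
# Only the small detector is pushed over the air; no firmware update involved.
def run_campaign(fleet, detector, threshold=0.9):
    """Ask every car to run `detector` and send back high-scoring frames."""
    hits = []
    for car in fleet:                            # in reality this runs on-car
        for frame in car.recent_frames():
            if detector(frame) >= threshold:     # "if this detector scores high..."
                hits.append(frame)               # "...please send us an image"
    return hits  # noisy set: per the talk, maybe ~10% are true positives
```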
 
The notion that Tesla, with a lousy LTE connection (if even that, depending on the country), would communicate in real time for driving features / control systems is beyond funny... reminds me of the privacy concerns in Germany that Tesla records videos and sends them "back home". Yeah... totally going to work with Germany's spotty high-speed 4G mobile network ;)
 
But the fact is I have yet to see any authentic technical documentation or other proof as to how Tesla's on board systems utilize their network connectivity, yet I am reading some pretty strong assertions about how they must work.
And you don't have any authentic technical documentation saying it DOES need a connection. And guess what? Most companies don't put all the negatives in their manuals (Don't worry, your ABS works even if the seat heaters are broken. Don't worry, your headlights don't need washer fluid to work). You're asking the world to prove a negative when you don't have any data showing a positive at all.

You can go prove this yourself though, like lots of us have via natural experiments. Go drive the car in a place with no cell coverage. AP works fine. The map tiles do not. The map routing keeps working (those are local). Local voice commands work ("seat heater on") but asking where the closest Krispy Kreme is does not.

You can even do this somewhere with LTE- just hook to a WiFi hotspot that has no connection to the internet. See that music doesn't work, and map tiles don't load. Now go use AP. Works fine, no warnings/errors/limits.
 
@rjpjnk - As far as any of the neural nets used by Tesla, I can assure you there is no way they could rely on network connectivity for any sort of "real-time" feedback. Others have already pointed out the issues around latency and network availability and how dangerous it would be for any control system to rely on a network like this. But setting all of that aside, neural nets typically cannot be updated incrementally. The weights associated with a network of this kind run to several tens of megabytes, if not hundreds. Now, perhaps they might freeze their backbone and only retrain the various "heads" that do different tasks, but even then, the weights for all the heads need updating and are not trivially small.
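As a size sanity check, here's what an off-the-shelf backbone weighs in at (a generic stand-in only; Tesla's actual networks are different and almost certainly larger):

```python
# Weight-size sanity check using a generic backbone as a stand-in.
import torchvision.models as models

net = models.resnet50(weights=None)   # placeholder; not Tesla's architecture
n_params = sum(p.numel() for p in net.parameters())
print(f"{n_params / 1e6:.1f}M params ≈ {n_params * 4 / 1e6:.0f} MB as float32")
# -> ~25.6M params ≈ ~102 MB, i.e. "tens to hundreds of MB" per update

# The "freeze the backbone, retrain the heads" idea is a one-liner, but the
# retrained head weights still have to be shipped to every car:
for p in net.parameters():
    p.requires_grad = False
```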

The computing capabilities on a Tesla are insufficient for on-the-fly model (re)training and updates based on input imagery. Heck, most consumer-grade gaming GPUs probably won't suffice to train Tesla's networks; I'm sure they rely on large clusters of expensive GPUs like the V100/A100. Even with instantaneous communications and infinite bandwidth, I don't think Tesla wants to be in the business of maintaining hundreds of thousands of instances of their neural nets in the cloud, each being trained on-the-fly with frames from customers' cars so that "personalized" weights can be updated, communicated back, and reloaded on the car.

Karpathy has also talked extensively about all the performance metrics and infrastructure they have developed to validate any new models they train, to ensure they are improving and not regressing on any front. It would be madness to let models on cars walk away from their starting point with any sort of on-the-fly updates without re-running all validation metrics to ensure the quality of your algorithms. Running these regression tests is extremely compute-heavy, as you are typically running on hundreds of thousands of video frames (if not more) and making sure all the component systems still play well together.
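The gating idea is simple even if the compute behind it isn't; as a hedged sketch (the metric names are invented):

```python
# Hypothetical regression gate, illustrating the point above: a candidate
# model must not regress on any tracked metric before it can ship.
BASELINE = {"stop_sign_recall": 0.97, "lane_iou": 0.91, "pedestrian_recall": 0.995}

def passes_gate(candidate: dict, tolerance: float = 0.0) -> bool:
    """Reject the candidate if any metric falls below the current baseline."""
    return all(candidate[m] >= BASELINE[m] - tolerance for m in BASELINE)

print(passes_gate({"stop_sign_recall": 0.98,
                   "lane_iou": 0.92,
                   "pedestrian_recall": 0.994}))
# -> False: pedestrian recall regressed, even though everything else improved
```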

I think Tesla's current approach, running new alpha/beta versions in "shadow" mode in the background and feeding some of the results back to Tesla when bandwidth is available, is the smartest way to manage things given the constraints in play. It lets them continuously refine and improve their algorithms and ship updates quickly, since everything is streamlined and well integrated. Tesla has built a lot of impressive infrastructure to enable this, and that is certainly a big competitive advantage over any other car manufacturer today.
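In code-shaped terms, shadow mode is roughly this (a sketch under my own naming, not Tesla's implementation):

```python
# Sketch of "shadow mode" as described above: the candidate model runs in
# parallel but never controls the car; disagreements are logged and uploaded
# later when bandwidth allows. Hypothetical names throughout.
def shadow_step(frame, active_model, shadow_model, disagreement_log):
    action = active_model(frame)      # this output actually drives the car
    proposal = shadow_model(frame)    # candidate runs silently alongside
    if proposal != action:
        disagreement_log.append((frame, action, proposal))  # feedback to Tesla
    return action                     # the shadow output is never acted on
```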
 
Is this a Chinese-made example, driving in China?
I checked with Google Translate with auto language detection and it says Japanese, not Chinese, which is consistent with what @powertoold said.
Do cars in China also lack radar?
No. Currently only North America has the new radarless Tesla features. The rest of the world does not have them.
 
Just chiming in with my latest AP experience. I'm running the 2021.4.18.2, and activated Navigate on Autopilot for the freeway portion of my commute, as is usual. This is the first time in months (since my HW2 days, basically) that my car was lane hunting on a straight freeway. The steering corrections were noticeable but not so abrupt to where I needed to immediately take over. It did, at one point, move uncomfortably close to a big rig in the next lane over, and AP finally freaked out and asked me to take the wheel.

I don't think the new/tweaked vision software is 100% at fault. I received the update this past Monday and had done my morning commute with this version without issue until today. I was commuting earlier in the morning than normal, so the roads had fewer cars, but the sun was in my face, so I'm sure the car's cameras were partly blinded too. You could even see the Navigate on Autopilot planned path (blue line) dancing around a few pixels in either direction (it normally tracks straight).
 
...Yes, lead car can help marginally - firm up the lane detection. But not traffic lights (mostly). Even with lane detection - it doesn't help much at junctions. Overall, difficult decision to continue driving if the overall confidence is reduced...

Tonight I was on Navigate on Autopilot in my 2017 Model X, approaching a car in front that had no rear lights at all. It was at a section with no street lights, so it was really dark and I would have missed it if I hadn't been paying attention. So I experimented and turned my headlights off. Autosteer alarmed and went offline immediately, but TACC still worked fine. On the instrument cluster, the lead-car icon was still there constantly, but the lane lines were gone. I couldn't actually see the lane lines on the road because the headlights were off, but I could steer and follow the car in front fine while TACC kept a safe distance automatically.

So my conclusion is: for my car, when vision is impaired, Autosteer can stop working but TACC can keep working fine. I think losing 1 out of 2 functions is better than losing both.
 
With 2021.4.18.2 I've been able to drive on Autopilot in my Model Y without touching the steering wheel for 3-4 minutes. It seems like the internal camera is watching me: when I started looking around, the screen started flashing the blue signal to touch the wheel. It was interesting.

I had set the internal camera to active a month or so ago. I think it might impact how soon FSD gets distributed to me.

The car drives well on rural roads, but still twitches a bit at intersections where the paint lines disappear.
 
Well, I just did a 190 mile round trip (mostly freeway, which is what this report is about) from Orange County into San Diego Wine Country.

Our 2020 MY is on 2021.4.18.3, and OMG, Autopilot and FSD were noticeably improved, especially lane changes. These are much more snappy and sure-footed (and safe, IMO). Centering in the lane was pretty much the rule, though in an area with chopped-up concrete and missing lane markers, it played the TAKE OVER sound and was confused as to where the lane was. (This human was similarly confused: that 1/4 mile strip of road is a hot mess of paint/Botts' Dots, with at least two inches of concrete missing, heading west directly into the sun.) Fortunately, the damaged area was relatively short and it (and I) found the lane boundaries once again.

Very noticeable changes. I'm assuming this is running vision +/- radar because it behaves so differently than the last version a couple of months ago on this same route.
 
Very noticeable changes. I'm assuming this is running vision +/- radar because it behaves so differently than the last version a couple of months ago on this same route.
You are correct. The speed determination using vision has improved.

Take a look at this, which shows the return values from both radar and vision on .18.3.