
V8.0 Autopilot

Elon's blog post...

Upgrading Autopilot: Seeing the World in Radar

While there are dozens of small refinements with Version 8 of our software, described in the addendum below, the most significant upgrade to Autopilot will be the use of more advanced signal processing to create a picture of the world using the onboard radar. The radar was added to all Tesla vehicles in October 2014 as part of the Autopilot hardware suite, but was only meant to be a supplementary sensor to the primary camera and image processing system.

After careful consideration, we now believe it can be used as a primary control sensor without requiring the camera to confirm visual image recognition. This is a non-trivial and counter-intuitive problem, because of how strange the world looks in radar. Photons of that wavelength travel easily through fog, dust, rain and snow, but anything metallic looks like a mirror. The radar can see people, but they appear partially translucent. Something made of wood or painted plastic, though opaque to a person, is almost as transparent as glass to radar.

On the other hand, any metal surface with a dish shape is not only reflective, but also amplifies the reflected signal to many times its actual size. A discarded soda can on the road, with its concave bottom facing towards you, can appear to be a large and dangerous obstacle, but you would definitely not want to slam on the brakes to avoid it.
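
To put a rough number on that amplification: treating the can's base as a flat plate at normal incidence, the standard radar cross-section formula gives a startling result. The can dimensions and the 76-77 GHz automotive radar band below are my assumptions for illustration, not figures from the post.

    import math

    # Back-of-the-envelope: why a soda can's concave base looks huge to radar.
    # Assumed numbers: a ~66 mm diameter can base and a 76.5 GHz radar.
    # Flat-plate RCS at normal incidence: sigma = 4 * pi * A^2 / lambda^2.
    # A concave base acts somewhat like a retroreflector, so it keeps a strong
    # return over a range of angles rather than only when perfectly aligned.
    c = 3.0e8                        # speed of light, m/s
    wavelength = c / 76.5e9          # ~3.9 mm
    area = math.pi * 0.033 ** 2      # physical base area, ~0.0034 m^2
    rcs = 4 * math.pi * area ** 2 / wavelength ** 2

    print(f"physical area:       {area * 1e4:.1f} cm^2")
    print(f"radar cross-section: {rcs:.1f} m^2")        # ~10 m^2
    print(f"amplification:       {rcs / area:,.0f}x")   # a few thousand times

Roughly ten square metres of apparent cross-section from a few square centimetres of aluminium, which is why the false-alarm problem described next is the hard part.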

Therefore, the big problem in using radar to stop the car is avoiding false alarms. Slamming on the brakes is critical if you are about to hit something large and solid, but not if you are merely about to run over a soda can. Having lots of unnecessary braking events would at best be very annoying and at worst cause injury.

The first part of solving that problem is having a more detailed point cloud. Software 8.0 unlocks access to six times as many radar objects with the same hardware, with a lot more information per object.

The second part consists of assembling those radar snapshots, which take place every tenth of a second, into a 3D “picture” of the world. It is hard to tell from a single frame whether an object is moving or stationary or to distinguish spurious reflections. By comparing several contiguous frames against vehicle velocity and expected path, the car can tell if something is real and assess the probability of collision.
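
A toy version of that frame-to-frame check might look like the sketch below. Everything here, structure, field names and tolerances alike, is invented for illustration; Tesla has not published the actual algorithm. The idea is that a real, stationary object must close on the car at exactly the car's own speed, consistently, across several consecutive 0.1-second frames.

    from dataclasses import dataclass

    FRAME_DT = 0.1  # radar snapshots arrive ten times a second, per the post

    @dataclass
    class Detection:
        range_m: float      # distance to the return, metres
        range_rate: float   # closing speed, m/s (negative = approaching)

    def is_real_stationary(track: list[Detection], ego_speed: float,
                           tol: float = 0.5) -> bool:
        """Classify a tracked radar return as a real, stationary object.

        A stationary object should close at exactly our own speed, and its
        measured range should shrink in step with that rate frame to frame.
        Spurious reflections tend to jump around and fail both checks.
        'tol' is an invented tolerance, not a Tesla parameter.
        """
        if len(track) < 3:          # need several contiguous frames
            return False
        for prev, cur in zip(track, track[1:]):
            # Doppler check: apparent closing speed matches ego speed.
            if abs(-cur.range_rate - ego_speed) > tol:
                return False
            # Consistency check: range change agrees with the measured rate.
            expected = prev.range_m + prev.range_rate * FRAME_DT
            if abs(cur.range_m - expected) > tol:
                return False
        return True

    # Driving at 30 m/s toward a stationary object first seen at 100 m:
    track = [Detection(100.0 - 3.0 * i, -30.0) for i in range(5)]
    print(is_real_stationary(track, ego_speed=30.0))  # True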

The third part is a lot more difficult. When the car is approaching an overhead highway road sign positioned on a rise in the road or a bridge where the road dips underneath, this often looks like a collision course. The navigation data and height accuracy of the GPS are not enough to know whether the car will pass under the object or not. By the time the car is close and the road pitch changes, it is too late to brake.

This is where fleet learning comes in handy. Initially, the vehicle fleet will take no action except to note the position of road signs, bridges and other stationary objects, mapping the world according to radar. The car computer will then silently compare when it would have braked to the driver action and upload that to the Tesla database. If several cars drive safely past a given radar object, whether Autopilot is turned on or off, then that object is added to the geocoded whitelist.
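
The whitelist maps naturally onto a geocoded counter. Here is a toy version; the tile size, the pass threshold and the whole structure are my guesses, since the post only describes the behaviour:

    from collections import defaultdict

    TILE = 0.0001     # ~11 m of latitude per tile; an invented quantisation
    SAFE_PASSES = 5   # invented threshold for "several cars drove safely past"

    safe_passes: dict[tuple[int, int], int] = defaultdict(int)
    whitelist: set[tuple[int, int]] = set()

    def tile_key(lat: float, lon: float) -> tuple[int, int]:
        """Quantise a GPS fix so nearby radar returns share one entry."""
        return (round(lat / TILE), round(lon / TILE))

    def report_safe_pass(lat: float, lon: float) -> None:
        """Fleet upload: the car would have braked here, but the driver
        (or Autopilot) sailed through without incident."""
        key = tile_key(lat, lon)
        safe_passes[key] += 1
        if safe_passes[key] >= SAFE_PASSES:
            whitelist.add(key)   # object (sign, bridge) is safely overhead

    def suppress_braking(lat: float, lon: float) -> bool:
        return tile_key(lat, lon) in whitelist

    for _ in range(5):                           # several cars pass the same sign
        report_safe_pass(-33.8688, 151.2093)
    print(suppress_braking(-33.8688, 151.2093))  # True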

When the data shows that false braking events would be rare, the car will begin mild braking using radar, even if the camera doesn’t notice the object ahead. As the system confidence level rises, the braking force will gradually increase to full strength when it is approximately 99.99% certain of a collision. This may not always prevent a collision entirely, but the impact speed will be dramatically reduced to the point where there are unlikely to be serious injuries to the vehicle occupants.
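
The 99.99% figure is from the post; the onset point for mild braking and the shape of the ramp between them are not stated, so the sketch below simply assumes a linear ramp from an arbitrary starting confidence:

    def braking_fraction(p_collision: float,
                         p_mild: float = 0.99,     # assumed onset of mild braking
                         p_full: float = 0.9999):  # full braking, per the post
        """Map collision confidence to braking force on a 0..1 scale."""
        if p_collision < p_mild:
            return 0.0
        if p_collision >= p_full:
            return 1.0
        return (p_collision - p_mild) / (p_full - p_mild)  # linear ramp (assumed)

    for p in (0.95, 0.99, 0.995, 0.9999):
        print(f"confidence {p:>7}: {braking_fraction(p):>4.0%} braking")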

The net effect of this, combined with the fact that radar sees through most visual obscuration, is that the car should almost always hit the brakes correctly even if a UFO were to land on the freeway in zero visibility conditions.

Taking this one step further, a Tesla will also be able to bounce the radar signal under a vehicle in front – using the radar pulse signature and photon time of flight to distinguish the signal – and still brake even when trailing a car that is opaque to both vision and radar. The car in front might hit the UFO in dense fog, but the Tesla will not.
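
The time-of-flight arithmetic that makes the two echoes separable is simple: a radar pulse travels out and back, so range is c*t/2, and the echo that bounced under the lead car to the vehicle beyond simply arrives later. The distances below are invented for illustration:

    C = 3.0e8  # speed of light, m/s

    def range_from_tof(round_trip_s: float) -> float:
        """Radar range from photon time of flight: out and back, so c*t/2."""
        return C * round_trip_s / 2

    # Invented geometry: lead car at 20 m, a second car 15 m beyond it.
    # (The under-car bounce path is treated as straight-line for simplicity.)
    direct = 2 * 20.0 / C             # round trip to the car in front
    bounced = 2 * (20.0 + 15.0) / C   # round trip to the car beyond it

    print(f"direct echo:  {direct * 1e9:6.1f} ns -> {range_from_tof(direct):.0f} m")
    print(f"bounced echo: {bounced * 1e9:6.1f} ns -> {range_from_tof(bounced):.0f} m")

A 15 m gap is a 100 ns difference in arrival time, easily resolvable, and the pulse signature lets the receiver tell its own double-bounce return apart from other cars' radar emissions.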

Additional Autopilot Release Notes

  • TACC braking max ramp rate increased and latency reduced by a factor of five
  • Now controls for two cars ahead using radar echo, improving cut-out response and reaction time to otherwise-invisible heavy braking events
  • Will take highway exit if indicator on (8.0) or if nav system active (8.1). Available in the United States initially
  • Car offsets in lane when overtaking a slower vehicle driving close to its lane edge
  • Interface alerts are much more prominent, including flashing white border on instrument panel
  • Improved cut-in detection using blinker on vehicle ahead
  • Reduced likelihood of overtaking in right lane in Europe
  • Improved auto lane change availability
  • Car will not allow reengagement of Autosteer until parked if user ignores repeated warnings
  • Automatic braking will now amplify user braking in emergencies
  • In manual mode, alerts driver if about to leave the road and no torque on steering wheel has been detected since Autosteer was deactivated
  • With further data gathering, car will activate Autosteer to avoid collision when probability ~100%
  • Curve speed adaptation now uses fleet-learned roadway curvature
  • Approximately 200 small enhancements that aren’t worth a bullet point
 
I wonder what function(s) remain for the front camera?
Reading speed restriction signs, I guess, or is this to be governed by GPS and the 'fleet' knowledge (over time) as well?
Surely it would use the camera for this function in the transition at least; there is a lot of data already accumulated.

It would be great to know about any of the 200 small enhancements as well
 

I think it's still doing the Autosteer heavy lifting - finding and following lane lines. Also automatic high beams. I'm sure it's still tied in to the automatic emergency braking, even though the radar is doing a lot of that.
 
Agreed. The camera is the only practical way to "see" the lines on the road and, along with GPS, the speed signs/limits, with software arbitrating between what the camera "sees" and what the GPS "thinks" ought to be seen at a given location. Radar is more flexible at "seeing" the 3D world around the car (so far as the scope of the beam allows) and making us aware of vehicles left, right and two doors up :)
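
That arbitration could be as simple as preferring a confident camera read (it can see temporary roadwork signs that no map knows about) and falling back to the GPS/map value otherwise. Entirely my guess at the logic, as a sketch:

    from typing import Optional

    def arbitrate_speed_limit(camera_kph: Optional[int], camera_conf: float,
                              map_kph: Optional[int],
                              min_conf: float = 0.8) -> Optional[int]:
        """Pick between a camera sign-read and the GPS/map expectation.

        Invented logic: trust a confident camera read, since it can see
        temporary signs; otherwise fall back to the fleet/map value.
        'min_conf' is an arbitrary threshold, not a known Tesla parameter.
        """
        if camera_kph is not None and camera_conf >= min_conf:
            return camera_kph
        return map_kph

    print(arbitrate_speed_limit(40, 0.95, 80))   # 40: confident read of a roadworks sign
    print(arbitrate_speed_limit(None, 0.0, 80))  # 80: no sign seen, use the map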
 
It's a reflection of the type of guy I am that I'm looking forward to v8. I just like new toys.
Because last Friday, I "drove" my P85DL from Sydney's North Shore to Bowral (lunch at Biota) and back.
I put "drove" in inverted commas because I basically put the bloody thing in Autopilot on the Hills Motorway and took it out of Autopilot at the Bowral exit.
Manual nudge for lane changes aside, I did nothing. Absolutely fantastic, and a relaxing trip to boot.
I didn't set the speed too slow (speed factor 1.1, if you know EVtripplanner...) - it just went beautifully. So whatever v8 holds, it CANNOT be a massive breakthrough for this kind of drive - maybe for more nasty / rare / dangerous situations, but really, this was sooo good. 290 km of easy, swift, quiet transport, 4 passengers, 50 kWh. Hamilton soundtrack extremely loud.
It is helpful to be thankful for what one has already.
 
Hamilton rocks :cool:

Actually Elon's post is fascinating in the software-only modifications that would seem to provide huge improvements - a more detailed point cloud, expanded fleet learning, 3D radar image sequencing, logic improvements through expected-path and fleet-response comparisons, and bouncing radar under a leading vehicle - all mind-boggling, especially without any additional hardware requirements.

Just one thing - everyone knows UFOs employ super-advanced stealth technologies o_O
 
I am particularly looking forward to the modification for when a car or truck crowds your lane. I got caught out one evening when going past a truck: the truck was very close to the lane line and my car spooked and braked, which nearly resulted in me becoming a BMW mascot. Got lots of flashing lights and the finger from the BMW driver, who thought I had brake-checked him.
The new version should move to one side of the lane to avoid this situation rather than brake; at least that's my interpretation.
Still the BMW shouldn't have been following so close anyway.
 
Exactly. They follow too close, beep when you've indicated you're going to move into that lane, and then back off (or sometimes speed up). If the lane is (ever) clear, they just zip in front; it's plain inconsiderate when the driver behind ought to be able to see the conditions (if they're looking) nearly as well as you can.

This update, by the sounds of it, must have some interesting screen display updates for the driver -- six times as much radar target data. Should be useful to display some of the gross output of that :)
 

Yes, the lane-crowding situation is a real biggie for me too. It's my single greatest irritation with the current system, and the one where I most often need to take over. It made me realise how naturally we, as human drivers, move over to the far edge of our lane when crowded by a truck or a wayward car in the adjoining lane.
 
Got an update pushed to the car by the service centre today, but not installed whilst it was there. Jokingly asked if it was "the big one" and was told unfortunately no. I suggested that it was probably 3 months away, but they seemed confident it would be sooner than that. They also advised it will be 8.1 when we get it, as 8.0 is the test version rather than the public release.

The update today was to 7.1 2.36.17 (logged on ev-fw)
 
On Sunday we noticed our AP now shows the car in front of the car in front, displays three lanes wide on freeways plus cars in the adjacent lanes, and picks up cars 150 m ahead - like super forward vision. I'm guessing we're in the beta phase; anyone else notice this?

Hasn't version 7.x always done these things? Also, you wouldn't be on a beta version unless you'd agreed to beta Ts & Cs and, if you did that, you wouldn't be able to discuss the beta... ;)
 
V8.0 is (apparently) distinguishable when you are driving around a curve: the cars on either side follow the curve at whatever angle they are w.r.t. your car, rather than the display just showing "target to the left, target to the right" in parallel all the time. Probably more sophisticated than that once we see it in a variety of circumstances, though.