Welcome to Tesla Motors Club

Firmware 8.0

For example: the camera has built-in hardware for deciding what is a car, what is a speed limit sign, etc. The radar does not. Where in the current setup is the 3D point-cloud going to be turned from a mathematical model in the radar hardware into "this is a coke can"?

Good question... as far as I understand, the MCU is not powerful enough. Maybe they are using the camera hardware to do that processing? Or does the radar unit come with its own signal processor capable of generating that 3D model plus the necessary interpretation of the data? Maybe @Ingineer can shed some light.
 
Exactly. 8.0 is starting to look a lot like hubris & ego.

Autopilot started life as a 3rd party system that Tesla refined into a great experience.

Instead of chasing down a rabbit hole trying to rebuild the house after everyone has moved in, the AP team should be working on AP2, fine-tuning a new solution with a new supplier, new sensors, etc.

For example: the camera has built-in hardware for deciding what is a car, what is a speed limit sign, etc. The radar does not. Where in the current setup is the 3D point-cloud going to be turned from a mathematical model in the radar hardware into "this is a coke can"?

Yes, it seems like they are improving for now while already focusing somewhat on the long-term game. Cameras are limited to their field of vision, whereas companies are testing radar/lidar for truly autonomous mapping of the environment around a car: 360 degrees in some cases, and seeing past the car in front, farther than a camera can. The dependability of radar/lidar in rain, snow, and bright sunshine is key too; every morning on my way to work, AP isn't really available to me as I head E/NE because the sun is too bright for it.

I guess let's see if there is some improvement as stated in 8.0+.
 
Exactly. 8.0 is starting to look a lot like hubris & ego.

Autopilot started life as a 3rd party system that Tesla refined into a great experience.

Instead of chasing down a rabbit hole trying to rebuild the house after everyone has moved in, the AP team should be working on AP2, fine-tuning a new solution with a new supplier, new sensors, etc.

For example: the camera has built-in hardware for deciding what is a car, what is a speed limit sign, etc. The radar does not. Where in the current setup is the 3D point-cloud going to be turned from a mathematical model in the radar hardware into "this is a coke can"?
From accounts I read upstream, Elon explicitly used the coke can as an example of shortcomings of conventional radar use, and explained how comparing multiple radar exposures allows them to identify a coke can. No idea where the processing is happening.
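The multi-exposure idea can be sketched in a few lines. To be clear, this is purely my own toy illustration, not Tesla's actual algorithm; the function name, thresholds, units, and radar-cross-section (RCS) values are all made up:

```python
import statistics

def classify_return(ranges, rcs_values, dt=0.1, rcs_threshold=1.0):
    """Toy classifier: compare successive radar exposures to decide whether
    a return looks like a real obstacle or small ground clutter (a coke can).
    All thresholds and units are illustrative."""
    # Closing speed estimated from consecutive range samples taken dt apart.
    closing_speeds = [(ranges[i] - ranges[i + 1]) / dt
                      for i in range(len(ranges) - 1)]
    mean_rcs = statistics.mean(rcs_values)       # average reflection strength
    speed_jitter = statistics.pstdev(closing_speeds)
    # A can gives weak, erratic returns; a stopped car gives a strong,
    # steady return that closes smoothly frame over frame.
    if mean_rcs < rcs_threshold or speed_jitter > 2.0:
        return "clutter"
    return "obstacle"
```

The point is just that a single exposure can't separate the two cases, while a short history of exposures can.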
 
From accounts I read upstream, Elon explicitly used the coke can as an example of shortcomings of conventional radar use, and explained how comparing multiple radar exposures allows them to identify a coke can. No idea where the processing is happening.

Yep. The camera already deals with situations like that, for example, the side-view mirror of a car in front reflecting the sun straight into the camera.

Thinking through it some more, the only way to do this kind of processing will be in the EyeQ3 chip. Perhaps they are going to try to emulate a second (low resolution) visual feed to the EyeQ3 from the radar, using the point cloud to generate an artificial image.

It would explain the 8.0 "fleet learning" approach, which would be needed for hazard awareness as well as whitelisting. I imagine this would require some heavy duty (but throw away) investment in EyeQ3 development though.
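If they really are feeding the radar point cloud into the EyeQ3 as a synthetic image, the projection step might look something like this. This is guesswork on my part; the resolution, ranges, and function name are invented, not EyeQ3 specifics:

```python
def pointcloud_to_image(points, width=64, height=32,
                        max_range=100.0, max_lateral=10.0):
    """Toy sketch: project radar points (x=lateral m, y=forward m, intensity)
    into a low-resolution grayscale frame a vision chip could consume.
    Dimensions are illustrative."""
    img = [[0] * width for _ in range(height)]
    for x, y, intensity in points:
        if 0 <= y < max_range and abs(x) < max_lateral:
            col = int((x + max_lateral) / (2 * max_lateral) * (width - 1))
            row = int(y / max_range * (height - 1))
            # Keep the strongest return per cell, clamped to 8 bits.
            img[row][col] = max(img[row][col], min(255, int(intensity)))
    return img
```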
 
Yeah, it kinda sounds like Tesla is committed to the current approach for a good long time. If they were planning to push out a triple camera version next month that would solve all the issues and extend the capabilities greatly, would they really be investing large amounts of resources into building this geotagged whitelist for radar navigation, which will presumably take months to put together even with all of us contributing?

I'm sure we'll see a second generation AP hardware set at some point, but after seeing the depth of Tesla's commitment here I'm thinking they may skip the triple camera version and go straight to a 360 camera and radar set that will eventually be able to handle full autonomy - sometime next year or even early in 2018, and with no upgrade for current cars. (Full Autonomy won't be available for at least a couple years after that I suspect.)
Another possible reason for Tesla’s current v8 tack on AP 1.0 is that they were priority-focused on solving the problem that caused the Florida accident, and the target platform had to be the two-year-old installed base of AP 1.0 cars. We’ve heard Elon say he believes you can do a lot with software and what they ended up doing for v8 is pretty clever, e.g., reversing the role of the camera with radar enhanced with new SP algorithms to create a point cloud (likely unprecedented) and GPS-whitelisting of radar-detected objects to avoid false positives. No doubt that AP 2.0 is under development, but this impressive release of v8 lessens the urgency, giving them more runway to better think through how they should design and implement the next-gen AP 2.0 hardware platform.
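The whitelisting half of that could be as simple as a fleet-populated counter keyed on coarse location. Again, this is a hypothetical sketch of the concept, not Tesla's implementation; the class, grid scheme, and threshold are mine:

```python
class RadarWhitelist:
    """Toy sketch of GPS-whitelisting: locations where the fleet has
    repeatedly driven under a strong radar return without incident get
    whitelisted, suppressing false-positive braking there."""

    def __init__(self, min_passes=10):
        self.passes = {}          # grid cell -> count of safe passes
        self.min_passes = min_passes

    @staticmethod
    def _cell(lat, lon, precision=4):
        # Coarse grid key; a real system would use a proper geohash.
        return (round(lat, precision), round(lon, precision))

    def record_safe_pass(self, lat, lon):
        key = self._cell(lat, lon)
        self.passes[key] = self.passes.get(key, 0) + 1

    def is_whitelisted(self, lat, lon):
        return self.passes.get(self._cell(lat, lon), 0) >= self.min_passes
```

Each car that passes under a radar-visible overhead structure without incident would report the location; once enough independent passes accumulate, braking on that return gets suppressed for everyone.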
 
That's an opinion that isn't warranted until there is some evidence it doesn't perform. I don't believe it is hubris, because no other company has either the experience (miles logged) at this level of auto application, or the database.

I dunno. Tesla has a system that is nearing EOL because they no longer have a relationship with the vendor. They have done such a good job with it, that it is currently considered the best of breed.

However, given the choice between (a) some mild tweaks before a major hardware change; or (b) redesigning the current system outside of the original vendor's design parameters, they seem to have decided to go for (b), despite the added commercial risk and cost of this approach over (a).

Why would they do that?
 
Made me wonder:

a) Will the processing speed of the "early AP cars" be able to keep up with all the additional signal processing in this updated firmware? That's a lot of additional processing for the existing computer to handle, I would think.

b) Will there be another "learning curve" for the AP system now that it has to "whitelist" objects? I remember when AP first came out and the car wanted to exit every off ramp, and we kept correcting the system by taking over steering. It learned and now works very well. Will we go through this again?

c) Why no mention of any other non-AP features of v8.0/8.1? Like audio or nav enhancements?
 
The development of AP 1.1 (can we just call it that?) started long before the accident in Florida. Improving performance of the AP1 sensor suite helps Tesla build AP2.

What are you calling AP1.1? The firmware 8 approach to using AP1 sensors? The triple cam expansion of an otherwise AP1 system?

The only place I've ever seen AP1.1 proposed was when I started using it to describe the current Model X system, which has physical and architectural differences from the Model S but as far as we know has the same types and locations of sensors and similar capabilities.
 
I bet the X and S will get AP 2.0 from the Model 3 project. It does not make much sense (and it would waste resources) to make incremental hardware changes to the current system when you're going to dump it all in 12 months' time.

EDIT: I think it's great that Tesla is still improving their system instead of investing all their resources in Model 3. Tesla could have just claimed that they rely on Mobileye and are limited to its capabilities, but they found a way to improve it beyond its design capabilities and raised the bar once again.
 
Yesterday's announcement was a combination of safety improvements and new functionality.

The safety improvements are important, to provide assurances to drivers and the media that AP is safe to be enabled. Without those safety improvements, Tesla could face negative publicity on the AP 1.0 hardware - and risk being asked to disable the hardware, if it posed an accident risk to Tesla drivers and other vehicles on the road.

AP 1.0 will always have limitations. AP 2.0 will likely have more and improved sensors along with "supercomputer" processing power - but until Tesla is ready to put that into manufacturing, we shouldn't expect Tesla to say much, if anything, about AP 2.0 - because once they start talking about it, they risk undercutting sales of the AP 1.0 cars.

Yesterday's announcement should be great news for AP 1.0 owners. What they have today will soon be safer and do more, without having to do any hardware upgrades.

What isn't clear is why that announcement had to be done yesterday - on a Sunday, in the middle of opening NFL weekend, and on 9/11. Couldn't that announcement have been made today, with the same impact?
 
Thinking through it some more, the only way to do this kind of processing will be in the EyeQ3 chip. Perhaps they are going to try to emulate a second (low resolution) visual feed to the EyeQ3 from the radar, using the point cloud to generate an artificial image.

That's my guess as well, but that would raise the question of how strategic this would be, given the breakup with Mobileye.
 
What isn't clear is why that announcement had to be done yesterday - on a Sunday, in the middle of opening NFL weekend, and on 9/11. Couldn't that announcement have been made today, with the same impact?

Let's see if we hear from NHTSA today about their investigation - I wouldn't be surprised. The AP update yesterday would set the stage for Tesla's answer.
 
I dunno. Tesla has a system that is nearing EOL because they no longer have a relationship with the vendor. They have done such a good job with it, that it is currently considered the best of breed.

However, given the choice between (a) some mild tweaks before a major hardware change; or (b) redesigning the current system outside of the original vendor's design parameters, they seem to have decided to go for (b), despite the added commercial risk and cost of this approach over (a).

Why would they do that?
Because they had to, due to regulators looking into AP after the fatal crash. It's likely they tried "mild tweaks" but they didn't solve the problem.
 
Let's see if we hear from NHTSA today about their investigation - I wouldn't be surprised. The AP update yesterday would set the stage for Tesla's answer.

Because they had to, due to regulators looking into AP after the fatal crash. It's likely they tried "mild tweaks" but they didn't solve the problem.

I doubt the announcement or the improvements are related much to the NHTSA investigation.

I think it's a mistake to assume that Tesla's actions are reactive. Bosch and Tesla have been working on radar improvements for a long time, and we are likely to see more of it in the Model 3 implementation. This is why Mobileye did not see much benefit in working with Tesla any further.
 
Elon’s blog today announced some fairly significant advances that have changed my overall perception. His mentioning that they elevated the radar sensor beyond the camera to serve in primary detection, enhanced with robust new signal processing, suggests that the AP 1.0 platform has actually been underutilized, and maybe there’s still more they can do with it. It further raises the question of why people think AP 2.0 is right around the corner when Tesla is making such major advances on the current AP 1.0 platform. I’ve read half a dozen post-blog accounts today and there’s still no mention of imminent hardware changes.

In a different blog, tweet, or interview (I've lost track at this point), Elon mentioned working on enhanced autopilot hardware that was "months" away. Because of Elon time, I would imagine that is over a year away (from then). I'm not surprised they are working on "the next thing"; it takes time to properly vet and test these things. In that process, they likely wrote some new code for handling the information, which led to thinking about smoothing the data over time, which could then be back-ported to the current hardware.

Now this is just theorizing on my part, but it totally could be how we got to the new Autopilot software in 8.0.
 
Maybe I'm missing something, but all the items mentioned are AP related.
Is there any info yet about the 200 improvement points that aren't in the blog post?

No, nothing more than that it will be the biggest update they've ever done.

Because of the "interest" in AP, Elon pulled the AP-related changes forward separately and gave clarity on those.

The rest will come later with the update. We'll have to be patient a little longer.

And this: "Approximately 200 small enhancements that aren't worth a bullet point" still refers to Autopilot.
 
I doubt the announcement or the improvements are related much to the NHTSA investigation.

I think it's a mistake to assume that Tesla's actions are reactive. Bosch and Tesla have been working on radar improvements for a long time, and we are likely to see more of it in the Model 3 implementation. This is why Mobileye did not see much benefit in working with Tesla any further.
You should read Elon's blog again. It's all about detecting moving objects and overhead objects, plus keeping the driver's hands on the wheel and their focus on what's going on. All of those are things Elon described at the time as factors in the crash. If you don't think the NHTSA is all over this, I have a bridge to sell you in Brooklyn.