Autonomous Car Progress

If Mobileye is willing to sell map data with a standardized API, similar to what TomTom and Mapbox are already doing with Tesla, without requiring Teslas to have Mobileye hardware installed, I don't see why Tesla couldn't buy such data from them, since they would just be another map vendor. Are they offering such a service?

Sorta. Mobileye is offering a product called "Cloud Based Driver Assist" which uses a single front camera for basic lane keeping and cruise control but accesses Mobileye's maps in the cloud to improve performance. I think VW is doing this with Travel Assist. So Tesla could maybe do basic AP with Mobileye's maps. I don't know if Mobileye would be willing to just sell their maps for Tesla to use with FSD. For anything more complicated, Mobileye would likely require that Tesla use their hardware, which I doubt Tesla would do.
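Conceptually, "one camera plus a cloud map prior" is just fusing a local lane estimate with lane geometry pulled from map tiles. Here is a minimal Python sketch of that idea; every function and field name here is hypothetical, since Mobileye's actual API isn't public:

```python
# Hypothetical sketch of camera lane keeping fused with a cloud map prior.
# None of these names come from Mobileye's real API; they are illustrative only.
from dataclasses import dataclass

@dataclass
class LaneEstimate:
    center_offset_m: float   # lateral offset from lane center, meters
    confidence: float        # 0..1

def camera_lane_estimate(frame) -> LaneEstimate:
    """Stand-in for the onboard front-camera lane detector."""
    return LaneEstimate(center_offset_m=0.12, confidence=0.6)

def cloud_map_prior(gps_position) -> LaneEstimate:
    """Stand-in for a lane-geometry lookup against a cloud map tile."""
    return LaneEstimate(center_offset_m=0.05, confidence=0.9)

def fused_offset(camera: LaneEstimate, prior: LaneEstimate) -> float:
    """Confidence-weighted blend: the map prior matters most when the camera is unsure."""
    total = camera.confidence + prior.confidence
    return (camera.center_offset_m * camera.confidence +
            prior.center_offset_m * prior.confidence) / total

if __name__ == "__main__":
    cam = camera_lane_estimate(frame=None)
    prior = cloud_map_prior(gps_position=(37.77, -122.42))
    print(f"steer toward offset {fused_offset(cam, prior):+.3f} m")
```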
 
U.S. auto safety regulators will extend a deadline for public input on General Motors and Ford Motor petitions seeking to deploy a limited number of self-driving vehicles without human controls like steering wheels and brake pedals.
The National Highway Traffic Safety Administration (NHTSA) on Thursday granted a 30-day extension of the public comment period on the automakers' requests after cities like San Francisco and Oakland, California, state transportation agencies, the National Association of City Transportation Officials and others sought more time to analyze the exemption requests.
NHTSA has authority to grant petitions to allow a limited number of vehicles to operate on U.S. roads without required human controls.
Both automakers want to deploy up to 2,500 vehicles annually, the maximum allowed under the law, for ride sharing and delivery services. Neither seeks approval to sell self-driving vehicles to consumers.

 
But presumably the idea as posted above is they gather this data using the whole fleet (not just the L4 vehicles) and incorporate that data into REM map tiles, right?
Yes. The maps are one and the same.
If this is not the case, then the L4 solution would require generating a different map of sorts on top of REM and would scale a lot slower than the REM map coverage suggests (which, as linked by others, appears to cover almost all major roads in the US already).
The maps are one and the same. Nevertheless, HD maps aren't a current barrier to scaling, so it's a moot point.
If the year were, say, 2030, then yeah, not having enough HD map coverage would be a barrier to scaling.
 
At CVPR 2022, Cruise gave the keynote speech and they shared their roadmap for expansion in SF. As of this May, Cruise is now covering ~70% of SF at night with dozens of cars. Cruise plans to expand to 100% of SF, 24/7, with hundreds of cars "soon".

Here is the full keynote. It is very informative about Cruise's progress and their future research:

 
I’m not sure if there should be a separate thread for this neural network and FSD architecture stuff; I did not see one, so I bumped this thread (I did see the material posted elsewhere with Ashok at CVPR; that video is also linked within this tweet):


Anyway, please feel free to revive/discuss in another, more appropriate thread if one already exists.
 
I tend to agree with you. AP would probably handle it if the car was on a highway, but NOA or FSD beta would likely bail out. Curious how they all have different rain tolerances.

My guess is that the systems have different reliability tolerances. For example, NOA needs to be able to do auto lane changes to follow the route. Tesla likely does not want NOA to do auto lane changes if it is not confident, as that could be a serious safety hazard. And Tesla uses the side cameras for lane changes, which can easily get covered by rain drops. So Tesla probably sets the tolerance threshold higher for NOA. Same with FSD Beta, which has to be able to do turns at intersections: you need high confidence. But with regular AP, it's just lane keeping and adaptive cruise control. There are 3 front cameras (?), so they are likely able to see the lane well enough. So the tolerance threshold probably does not need to be as high, especially since the driver should have their hands on the wheel and be paying attention.
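To make the threshold idea above concrete, here is a toy sketch of per-feature confidence gating; the feature names and numbers are made up for illustration, not anything from Tesla's software:

```python
# Toy illustration of per-feature confidence gating in rain.
# The threshold values are invented for illustration, not Tesla's.
FEATURE_THRESHOLDS = {
    "basic_ap_lane_keep": 0.50,   # 3 front cameras, attentive driver -> lower bar
    "noa_lane_change":    0.85,   # relies on rain-prone side cameras -> higher bar
    "fsd_intersection":   0.90,   # unprotected turns -> highest bar
}

def allowed_features(perception_confidence: float) -> list[str]:
    """Return which features stay enabled at a given perception confidence."""
    return [name for name, threshold in FEATURE_THRESHOLDS.items()
            if perception_confidence >= threshold]

if __name__ == "__main__":
    for conf in (0.95, 0.7, 0.4):   # clear, light rain, heavy rain (made-up values)
        print(conf, "->", allowed_features(conf) or ["manual driving only"])
```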
 
I find it curious that FSD Beta puts up a message in the rain stating that it "may be degraded". It might be helpful for Tesla to clue us in on what exactly is degraded, though I've never really noticed a difference.
 
I believe it stops doing NOA, so no lane changes or merges, but it still does basic AP: lane keeping and traffic-aware cruise control.
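For what that fallback might look like structurally, here is a minimal hypothetical sketch; the mode names and capability flags are assumptions, not Tesla's actual implementation:

```python
# Hypothetical sketch of the degraded-mode fallback described above:
# when perception is degraded, drop from NOA to basic Autopilot
# (lane keeping + traffic-aware cruise control) and disable lane changes.
def active_capabilities(requested_mode: str, perception_degraded: bool) -> dict:
    capabilities = {
        "noa":      {"lane_keep": True,  "tacc": True,  "lane_change": True,  "take_exits": True},
        "basic_ap": {"lane_keep": True,  "tacc": True,  "lane_change": False, "take_exits": False},
        "manual":   {"lane_keep": False, "tacc": False, "lane_change": False, "take_exits": False},
    }
    if requested_mode == "noa" and perception_degraded:
        return capabilities["basic_ap"]   # graceful fallback rather than a hard cancel
    return capabilities[requested_mode]

print(active_capabilities("noa", perception_degraded=True))
```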
 
Does the Supervision system above do auto lane changes in these conditions?
 
No, it would not. I've been in rain like that many times, and FSD cancels on me.
Likely because it's a beta and they are being cautious. See the AI Day presentation where Ashok showed *sugar* falling out of vehicles.

I have been driving on Autopilot in rain so heavy and dark that I couldn't see well and felt unsafe, but Autopilot could see and drive much better than I could. I trust that it can see better than me in the dark: