
Autonomous Car Progress

Maybe I'm wording it poorly, but my point is more that the SuperVision L2 solution can function solely on the maps they are making right now: they can plop a car with that system basically anywhere and have it function fine.

I don't believe this is the case for their L4 solution, which is still restricted to very heavily geofenced areas and needs a certain amount of driving time in each area before it works well (at least that has been the case for the cars they have demoed so far).

So even though both of them will use these maps (just as they also use data from basic navigation maps), the L4 solution doesn't work as reliably on that map alone.

Due to their "true redundancy" approach, Mobileye has to validate two different stacks: the vision stack and the radar-lidar stack. They are testing the vision stack and deploying SuperVision as L2. They are testing the radar-lidar stack. And they are testing the full stack (vision + radar-lidar) for L4. So L4 does require additional validation, since they need to validate the radar-lidar stack. Additionally, Mobileye intends to deploy driverless robotaxis for ride-hailing, which will naturally require testing before a public launch. The geofenced testing you are referring to is the testing for their robotaxi ride-hailing. But Mobileye also plans L4 for personal cars, and Shashua said at CES that the L4 on personal cars will not be geofenced. Put simply, L4 will require additional validation of the radar-lidar stack, but L4 and L2 use the same maps.

Put another way, if there were no difference, the SuperVision demo should have been able to be done as L4 with zero interventions, if the REM maps were all that was required.

There are differences. L4 has radar & lidar sensors that L2 does not have. So, L4 does require additional validation. But L4 does not require additional mapping. The REM maps are the same, so yes, in theory, Mobileye should be able to do the SuperVision demo as L4 by simply adding the radar-lidar stack. The key is that they have to finish validating the vision stack and the radar-lidar stack.

I guess we'll see. If the L4 solution really needs no additional data beyond the same maps the L2 one is using, then Mobileye should be able to launch an L4 solution with essentially no geofencing anywhere the map covers, which according to Mobileye should be practically every public road in the US (beyond any legal barriers like individual state laws). That has not been the case so far for any of the L4 solutions out there; they test in very heavily geofenced areas.

Yes, Mobileye believes that their AV maps are a scalable path to L4 everywhere and give them a big advantage over companies like Waymo that are very geofenced. Mobileye plans to deploy L4 on personal cars that works basically everywhere. Of course, Mobileye still needs to validate the vision stack and the radar-lidar stack, and they need to validate that the L4 system is safe before they can deploy it. So having scalable maps for L4 does not automatically mean they can deploy L4 everywhere right away. But they believe that using the same scalable maps as L2 will help them scale L4 faster.

Also, since the L2 stack is a subset of the L4 stack, all the cars with SuperVision are basically testing part of L4 already, so the large L2 fleet should also help them validate L4 faster. With the way Mobileye does "true redundancy", L4 is just a more reliable L2, so L4 should be better anywhere L2 is good enough, assuming the radar-lidar stack is properly validated. Shashua has said that the vision stack will have an MTBF "way above 1000 hours" by the end of this year, so they are working on making the vision stack more reliable. Mobileye still has work to do before they can just combine both stacks and deploy L4 everywhere.
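The "true redundancy" MTBF argument can be made concrete with a simple independence calculation (my illustration; the numbers are made up, not Mobileye's figures): if the vision and radar-lidar stacks fail independently, their per-hour failure probabilities multiply, so the combined MTBF is roughly the product of the individual MTBFs in hours.

```python
# Hedged sketch: combined MTBF of two independently failing sensing stacks.
# Illustrative numbers only -- not Mobileye's actual reliability figures.

def combined_mtbf(mtbf_a_hours: float, mtbf_b_hours: float) -> float:
    """If two stacks fail independently, the system only fails when both
    fail at once, so per-hour failure probabilities multiply:
    p_sys = p_a * p_b, hence MTBF_sys = 1 / (p_a * p_b)."""
    p_a = 1.0 / mtbf_a_hours
    p_b = 1.0 / mtbf_b_hours
    return 1.0 / (p_a * p_b)

# Example: a vision stack at 1,000 hours MTBF plus a radar-lidar stack
# at 10,000 hours MTBF would, under the independence assumption, give
# roughly 10,000,000 hours for the combined system.
print(f"{combined_mtbf(1_000, 10_000):.0f}")  # prints 10000000
```

The independence assumption is the whole point of building the two stacks from different sensor modalities; correlated failure modes (e.g. conditions that degrade both stacks at once) would undermine it.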
 
Germans call cars like the BMW iX a "Stadtpanzer" (city tank). Would you want an armored car running autonomously through a busy street? Other road users ought to be protected from AVs first. The industry might learn from Nuro's delivery bot and Waymo's initial robotaxi, both vehicles with limited outer dimensions. Below: Nuro.

[Image: Nuro delivery bot]
 

Germans call cars like the BMW iX a "Stadtpanzer" (city tank). Would you want an armored car running autonomously through a busy street? Other road users ought to be protected from AVs first. The industry might learn from Nuro's delivery bot and Waymo's initial robotaxi, both vehicles with limited outer dimensions. Below: Nuro.


Man, you are still on this "AVs need to be small little pods" thing? I get that a small pod crashing will cause less damage than a bigger AV crashing but if you have reliable perception/prediction/planning, your AV will crash less in the first place. So ultimately, solving perception, prediction and planning is the key to safety, not making AVs super small. Plus, bigger AVs will protect their occupants better. I am not saying that AVs have to be large vehicles, but I see nothing wrong with normal four door sedans like a Ford Escort as an AV. And small pods like the Waymo Firefly only seat 2 people. They won't be able to carry a whole family that wants to go somewhere. The Nuro pod is great for deliveries but they are too small to carry people. You need AVs that can seat 4-5 people for when larger groups want to rideshare.
 
Yes, as long as autonomous driving has not proven itself fail-safe (which will take decades), the AVs themselves should pose as low a rolling risk as possible. Nothing wrong with that.

Going from A to B is always about physical displacement. The bigger the AV, the less margin for error and the less room to evade other road users.

Reading the sometimes ridiculously semantics-flavored discussions here on the forum, I get the feeling that people totally forget about that.

https://youtube.com/watch?v=RzNXnYqkV7w
 
  • A fatal crash involving four vehicles on a German highway did not involve a self-driving car, BMW said on Tuesday, refuting a police statement that had questioned whether the driver had been actively steering the vehicle at the time
  • police said involved an autonomous test vehicle
  • BMW confirmed that the crash had involved one of its models but said the car in question had no self-driving capabilities
  • The vehicle is equipped with Level 2 driver assistance systems, in which case "the driver always remains responsible", a spokesperson said in an emailed statement
  • Such systems can brake automatically, accelerate and, unlike Level 1 systems, take over steering, according to BMW's website
  • BMW added that the vehicle was required to be marked as a test car for data protection purposes, because it was recording footage.
So it comes down to bickering already. The BMW iX (reportedly) has Level 2 and can take over steering, and the driver may have been using this feature. BMW is saying it's not an autonomous car and not a "self-driving" car. That is true, but the initial question is: was the driver steering, or the car?

Yes, if these reports are true, the driver is responsible for ensuring safe steering, and Level 2 cars have no autonomous features. But BMW is trying to hide behind "hey, we didn't say it was a self-driving/autonomous car". That is not the question; the question is the one in the police statement, which "questioned whether the driver had been actively steering the vehicle at the time".

Since this happened in Germany, which is seemingly more restrictive with car regulations, we should see some interesting discussion about Level 2 cars, one way or another.


"Electric Test Vehicle"

Is the BMW iX self-driving?
It is not self-driving... the iX has an SAE Level 2 driver-assist system. The lane-keeping assist works up to 124 mph and uses visible lane markings and vehicles in the lane ahead to provide steering assistance. It uses data from both front- and side-mounted cameras, the front-facing long-range radar, and four more radar sensors facing to the sides of the vehicle.

Turn on the Active Cruise function and the dashboard screen will show images of cars, trucks, and motorcycles recognized by the camera and radar sensors at the front of the vehicle. This confirms that the iX is seeing everything it should. Though you can’t drive it completely hands-free, it will allow for brief periods where you can take your hand off the wheel (though you would never do that, of course).
 
Motional has launched driverless robotaxis in Las Vegas. Similar to Waymo's "trusted rider program", the driverless rides are free for now to get feedback. Motional will apply for a permit to charge money for driverless rides. They are aiming to launch the commercial driverless service next year.

Motional, the Aptiv-Hyundai joint venture that’s working to commercialize autonomous driving technology, has launched its new all-electric IONIQ 5-based robotaxi for driverless ride-hail operations on the Lyft network in Las Vegas.
Similar to Motional’s autonomous ride-hailing service that the company launched with Via in February, riders will not be charged for autonomous Lyft rides — the companies are mutually focused on rider feedback, said a company spokesperson. The spokesperson also noted that Motional has a permit to conduct fully driverless testing anywhere in Nevada, and that Motional and Lyft will secure the appropriate permits to start conducting commercial rides in fully driverless vehicles ahead of the launch in 2023.

 
More PR confusion on the BMW iX crash that may have been a Level 2 steering issue (driver or car).

BMW is keen to say that their Level 2 cars are not autonomous. Yet the cars are driving around with a WWW.BMW.COM/AUTONOMOUSDRIVING sticker. However, this does not mean they are autonomous. Gee, that type of argument sounds familiar. I wonder why people are confused.

Also, as several people pointed out on social media, the “autonomous driving” label does not necessarily mean that the car was autonomous. Instead, it’s a legal requirement to tell people the car has active surveillance equipment on board.
 
This confirms that the iX is seeing everything it should. Though you can’t drive it completely hands-free, it will allow for brief periods where you can take your hand off the wheel (though you would never do that, of course).
To further confuse things, the stock iX system is not hands-free, but BMW has introduced a hands-free system on the i7. It's possible this iX "test vehicle" had the i7 system, or even an i7 system with relaxed constraints that let it operate on more road types.
 
Beat me to it. You've gotta be careful with these L2 systems, which reinforces why Waymo skipped directly to developing L4.

German L2 systems veering into oncoming traffic is fairly common. I remember the initial reviews of AP versus the state-of-the-art Mercedes system at the time, and the Mercedes system would happily go into oncoming traffic when the road curved. I think it has to do with restricted steering input and likely a less sophisticated lane-keeping algorithm (which has some ping-pong). I would be a little surprised, however, if a fairly new car like the iX were still like that.
 
I perceive the discussion as a bit weird and off-topic. The question is not what kind of driver assist system the BMW had. The question is, quite simply, whether the steering force that caused the collision came from the human or from the machine.

Of course, as usual, the media articles do not answer this crucial question. Instead they discuss irrelevant car design considerations.

I drive a Tesla and use Autopilot a lot. I'm not sure whether I could always prevent an accident if it suddenly steered into oncoming traffic. It has never done this, though, not even under rather adverse conditions where I already held the steering wheel firmly out of fear that the car's cameras might not see as well as I do. Autopilot has always stayed in its lane for 60,000 km so far, sometimes at night and in rain. Under very bad conditions it sometimes gives up, but even then it safely hands the task back to me rather than steering improperly. Reading the BMW accident story, I get the idea that Tesla's Autopilot might be better than BMW's.
 
German L2 systems veering into oncoming traffic is fairly common. I remember the initial reviews of AP versus the state-of-the-art Mercedes system at the time, and the Mercedes system would happily go into oncoming traffic when the road curved. I think it has to do with restricted steering input and likely a less sophisticated lane-keeping algorithm (which has some ping-pong). I would be a little surprised, however, if a fairly new car like the iX were still like that.
I seem to have read about other ADAS systems that would disengage in a tight turn. I would find it slightly annoying if my car ran off the road or into oncoming traffic at the apex of a curve. Perhaps European regulators don't see that as a problem.
 
Doesn't the UN-ECE regulation ask for a 0.3 g limit on lateral acceleration in curves? It would be hilarious if the carmakers took that as an instruction to increase the steered curve radius as soon as the lateral acceleration exceeded that limit. As long as it doesn't kill people, anyway.
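Whatever the exact regulatory figure, the geometry behind the complaint is simple: a cap on lateral acceleration implies a minimum curve radius the system can track at a given speed. A back-of-the-envelope sketch (the 0.3 g value is taken from the post above, not verified against the regulation text):

```python
# Hedged sketch: minimum curve radius a lane-keeping system can follow
# if its steering authority is capped at a given lateral acceleration.
# The 0.3 g cap is taken from the forum post, not from the regulation text.

G = 9.81  # m/s^2, standard gravity

def min_radius_m(speed_kmh: float, max_lat_g: float = 0.3) -> float:
    """Circular motion: a_lat = v^2 / r, so r_min = v^2 / a_lat_max."""
    v = speed_kmh / 3.6  # km/h -> m/s
    return v ** 2 / (max_lat_g * G)

# Any curve tighter than these radii would exceed the lateral-accel cap,
# forcing the system to either slow down or widen its line.
for speed in (60, 100, 130):
    print(f"{speed} km/h -> r_min ~ {min_radius_m(speed):.0f} m")
```

This is exactly the failure mode described upthread: if the controller "widens its line" instead of slowing down, it drifts out of the lane at the apex.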
 
Shashua compliments Tesla's FSD for their progress and for "doubling down on computer vision", while accentuating Mobileye's approach:


I think Shashua is being very diplomatic because he is subtly saying Mobileye is better while still being complimentary to Tesla. It is interesting to see Mobileye being friendly to Tesla on FSD, especially considering that they broke up a few years back and they are competitors. But I guess they share some common ground on vision-only.

But I like when competitors can be friendly because ultimately the consumer wins when there is competition and choice. The more good ADAS and good FSD out there from different brands, the better for everybody IMO.
 
Shashua compliments Tesla's FSD for their progress and for "doubling down on computer vision", while accentuating Mobileye's approach:


I think Shashua is being very diplomatic because he is subtly saying Mobileye is better while still being complimentary to Tesla. It is interesting to see Mobileye being friendly to Tesla on FSD, especially considering that they broke up a few years back and they are competitors. But I guess they share some common ground on vision-only.

But I like when competitors can be friendly because ultimately the consumer wins when there is competition and choice. The more good ADAS and good FSD out there from different brands, the better for everybody IMO.
Maybe he's thinking about a post-Elon partnership between Tesla and Mobileye. I see FSD as more of a placeholder these days; I can't see them ever getting it to work well at this rate. At some point they may have to concede defeat and install a functional system. That could be Mobileye's.

Or Tesla may solve it, but it can't hurt to have a collaboration between the companies. Post-Elon of course.

Hey, it's just a theory. 🤷‍♂️
 
I guess we'll see. If the L4 solution really needs no additional data beyond the same maps the L2 one is using, then Mobileye should be able to launch an L4 solution with essentially no geofencing anywhere the map covers, which according to Mobileye should be practically every public road in the US (beyond any legal barriers like individual state laws). That has not been the case so far for any of the L4 solutions out there; they test in very heavily geofenced areas.
There's a misconception that scaling maps is the limiting factor in deployed self-driving systems, and it's simply not true. It's about MTBF, so no, Mobileye's L4 system won't have the same MTBF everywhere. Road structure, rules, and the driving behavior of other agents (vehicles, pedestrians, etc.) all affect MTBF, even with a map.

Think about a 6-way/8-way intersection with unique road structures and rules. Just because you have a map doesn't mean your driving policy has been trained to navigate similar structures/rules and to handle the unique behaviors that can emerge from that road structure and the driving rules surrounding it. And it's not enough to handle it a couple of times; you have to do it to the point of satisfying your MTBF target (one failure per million miles, for example).

[Image: aerial view of a complex multi-way junction in Swindon]
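The validation-burden point above can be illustrated with a standard statistical rule of thumb (my example, not from the post): with zero failures observed in n miles, the ~95% upper confidence bound on the failure rate is about 3/n, so claiming a given miles-per-failure figure for a road-structure category takes roughly three times that many failure-free miles in that category.

```python
# Hedged sketch of the validation burden: even with a map, you need
# enough driven miles in each road-structure category to bound the
# failure rate there. Uses the "rule of three": 0 failures in n miles
# gives a ~95% upper confidence bound on the failure rate of ~3/n.

def miles_needed_for_mtbf(target_miles_per_failure: float) -> float:
    """Failure-free miles needed to support the target rate at ~95% confidence."""
    return 3.0 * target_miles_per_failure

# To support "1 failure per 1,000,000 miles" in a category like
# multi-way junctions, you'd need ~3,000,000 failure-free miles in that
# category alone -- miles driven on ordinary roads don't transfer.
print(miles_needed_for_mtbf(1_000_000))  # 3000000.0
```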
 
Maybe he's thinking about a post-Elon partnership between Tesla and Mobileye. I see FSD as more of a placeholder these days; I can't see them ever getting it to work well at this rate. At some point they may have to concede defeat and install a functional system. That could be Mobileye's.

Or Tesla may solve it, but it can't hurt to have a collaboration between the companies. Post-Elon of course.

Hey, it's just a theory. 🤷‍♂️

I wonder if Mobileye's maps and RSS would be compatible with Tesla Vision. I am guessing yes, but it would likely require some work. That could be one way the two companies could collaborate: Tesla keeps their computer vision but adds Mobileye's maps and RSS. Shashua seems complimentary of Tesla's computer vision, so Mobileye might be OK with that. I think Tesla would be happy with that arrangement, since it would allow them to keep all the work they've done with vision and get the added safety from Mobileye's maps and RSS without much additional work. In fact, Mobileye would likely insist that Tesla use their maps and RSS for safety reasons, to make sure the system met their standards. And Mobileye would be happy, since it would mean a new customer using their maps. Furthermore, Mobileye could leverage the entire Tesla fleet to expand their maps, which would be a big boost for Mobileye. It seems like that arrangement would suit both sides well.

Personally, I would love that arrangement because it would be the best of both worlds. I love my Model 3 and I don't have a problem with Tesla Vision per se. I just wish FSD beta were more reliable. I love Mobileye's maps and RSS for the added safety that they bring. So if I could have the benefits of Mobileye's maps and RSS on my Tesla to make FSD Beta even better, that would be fantastic.

I think Tesla would be wise to partner with Mobileye. But I agree that any partnership between Tesla and Mobileye would likely be post-Elon since I don't think Elon's ego would allow him to admit that Tesla needs Mobileye's help.
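For context on RSS: Mobileye's published Responsibility-Sensitive Safety model defines, among other rules, a minimum safe longitudinal following distance derived from worst-case assumptions about the lead car. A sketch of that longitudinal rule from the public paper (the parameter values here are my illustrative choices, not production settings):

```python
# Hedged sketch of the RSS longitudinal safe-distance rule
# (Shalev-Shwartz et al., "On a Formal Model of Safe and Scalable
# Self-driving Cars"). Parameter values are illustrative only.

def rss_safe_distance(v_rear: float, v_front: float,
                      rho: float = 1.0,     # response time, s
                      a_max: float = 3.0,   # rear car's max accel during rho, m/s^2
                      b_min: float = 4.0,   # rear car's guaranteed min braking, m/s^2
                      b_max: float = 8.0    # front car's max braking, m/s^2
                      ) -> float:
    """Worst case: the front car brakes at b_max while the rear car
    accelerates at a_max for rho seconds before braking at only b_min.
    The safe distance guarantees no collision even then."""
    v_rho = v_rear + rho * a_max  # rear speed after the response time
    d = (v_rear * rho
         + 0.5 * a_max * rho ** 2
         + v_rho ** 2 / (2 * b_min)
         - v_front ** 2 / (2 * b_max))
    return max(0.0, d)

# Both cars at 30 m/s (~108 km/h):
print(round(rss_safe_distance(30.0, 30.0), 1))
```

The appeal for an L2 system like FSD Beta is that the rule is a cheap, provable check layered on top of whatever the planner proposes, which is why it could in principle sit alongside Tesla's existing vision stack.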
 
I wonder if Mobileye's maps and RSS would be compatible with Tesla Vision. I am guessing yes, but it would likely require some work. That could be one way the two companies could collaborate: Tesla keeps their computer vision but adds Mobileye's maps and RSS. Shashua seems complimentary of Tesla's computer vision, so Mobileye might be OK with that. I think Tesla would be happy with that arrangement, since it would allow them to keep all the work they've done with vision and get the added safety from Mobileye's maps and RSS without much additional work. In fact, Mobileye would likely insist that Tesla use their maps and RSS for safety reasons, to make sure the system met their standards. And Mobileye would be happy, since it would mean a new customer using their maps. Furthermore, Mobileye could leverage the entire Tesla fleet to expand their maps, which would be a big boost for Mobileye. It seems like that arrangement would suit both sides well.

Personally, I would love that arrangement because it would be the best of both worlds. I love my Model 3 and I don't have a problem with Tesla Vision per se. I just wish FSD beta were more reliable. I love Mobileye's maps and RSS for the added safety that they bring. So if I could have the benefits of Mobileye's maps and RSS on my Tesla to make FSD Beta even better, that would be fantastic.

I think Tesla would be wise to partner with Mobileye. But I agree that any partnership between Tesla and Mobileye would likely be post-Elon since I don't think Elon's ego would allow him to admit that Tesla needs Mobileye's help.
If Mobileye is willing to sell map data with a standardized API, similar to what TomTom and Mapbox are already doing with Tesla, without requiring Tesla to install Mobileye hardware, I don't see why Tesla couldn't buy such data from them, since Mobileye would just be another map vendor. Are they offering such a service?

I agree any sort of full partnership is unlikely with Elon at the helm, especially one using Mobileye's hardware, but just buying map data without partnering seems like it's still possible.
 
There's a misconception that scaling maps is the limiting factor in deployed self-driving systems, and it's simply not true. It's about MTBF, so no, Mobileye's L4 system won't have the same MTBF everywhere. Road structure, rules, and the driving behavior of other agents (vehicles, pedestrians, etc.) all affect MTBF, even with a map.

Think about a 6-way/8-way intersection with unique road structures and rules. Just because you have a map doesn't mean your driving policy has been trained to navigate similar structures/rules and to handle the unique behaviors that can emerge from that road structure and the driving rules surrounding it. And it's not enough to handle it a couple of times; you have to do it to the point of satisfying your MTBF target (one failure per million miles, for example).
But presumably the idea, as posted above, is that they gather this data using the whole fleet (not just the L4 vehicles) and incorporate it into REM map tiles, right? If that is not the case, then the L4 solution would require generating a different map of sorts on top of REM and would scale much more slowly than the REM map coverage suggests (coverage which, as linked by others, appears to include almost all major roads in the US already). That goes back to my original point: this scaling for L2 does not necessarily mean L4 will scale just as fast.