Autonomous Car Progress

Let's be real.

If the cameras begin failing on any of these autonomous cars, they are going to try to pull over to the side of the road and stop. None of these cars are going to keep driving.
Pulling over may be the correct decision. If you have lidar/radar redundancy in an L3-L5 car, then it should be able to get off the highway to a safe location before stopping. If you have vision only and it fails, then the system may be unable to safely pull over.
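To make that fallback concrete, here's a minimal sketch of the decision logic in Python. The sensor flags, the policy, and the action names are all hypothetical, invented for illustration; no vendor's actual fallback code is public.

```python
# Hypothetical minimum-risk-maneuver picker. The sensor flags and the
# policy itself are illustrative assumptions, not any vendor's logic.
def fallback_action(camera_ok: bool, lidar_ok: bool, radar_ok: bool) -> str:
    if camera_ok and (lidar_ok or radar_ok):
        return "continue"           # full perception still available
    if lidar_ok and radar_ok:
        # Cameras are down, but the ranging sensors can still guide
        # the car off the highway to a safe stopping location.
        return "exit_to_safe_stop"
    # Vision-only and it failed: the safest remaining option may be
    # an in-lane stop with hazards on.
    return "stop_in_lane"

print(fallback_action(camera_ok=False, lidar_ok=True, radar_ok=True))
# -> exit_to_safe_stop
```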
 
Another consumer LIDAR is coming: this time in the 2022 Mercedes-Benz EQS, with L3 driving and L2 parking!

More info is needed, because why would parking be so difficult that it's only an L2?

Pricing is not disclosed, but it will likely be above $100,000 for a 107.8 kWh battery pack and a 478-mile range.

Last year Mercedes also announced the 2021 L3 Drive Pilot for traffic jams, but I am not sure what happened to it.
https://youtu.be/AyOW4kazy54?t=923
 
Chris Urmson says that Google's technology from 2010 was better than Tesla's is today.
Chris Urmson of Aurora (formerly Google) said:
Musk has said Tesla owners will eventually be able to use their vehicles as robo-taxis during downtime. Urmson is skeptical. “It’s just not going to happen,” he says. “It’s technically very impressive what they’ve done, but we were doing better in 2010.”
LOL, Chris sending a barb Elon's way. I've heard Chris talk over many years and I've chatted with members of his team. I think he has his billions at stake in that comment. I do agree with Chris that Tesla robotaxis aren't happening in the next 5 years in any wide fashion. But Tesla has more going for it than that. L3 will be exciting. Even L2 with full driving capability that needs supervision is very exciting.
 
But Tesla has more going for it than that. L3 will be exciting. Even L2 with full driving capability that needs supervision is very exciting.

I agree. That's what excites me the most about Tesla's "FSD".

I think a reason people mention robotaxis is because Elon keeps talking about robotaxis. Elon is the one who keeps making grand claims of Tesla having 1M L5 robotaxis, so people respond to that.
 
Highway driving does not have traffic lights.

But you are missing the point: the radar/lidar system offers redundancy for all the driving tasks other than traffic lights.

No driving is _solely_ on restricted highways. In any practical sense of working towards autonomous vehicles, it's not redundant.

By having a redundant system for all the other driving tasks, it still improves the overall reliability of the FSD. For example, the radar/lidar offers redundancy for making unprotected left turns, for detecting pedestrians and other vehicles, etc. So the radar/lidar increases the reliability of those tasks, and thus the reliability of the whole system.

Basically, adding radar/lidar is not about doing all the driving without cameras; it can handle a lot of driving tasks, and by offering redundancy for those tasks, it improves the overall reliability!
Redundancy is being able to have a failure without an unacceptable outcome.
If they _need_ both systems to produce an acceptable vehicle, then their system isn't redundant.

Equip a taxi with their autonomy system. Put it in a city. Now remove the forward camera.

What are you left with?
 
Pulling over may be the correct decision. If you have lidar/radar redundancy in an L3-L5 car, then it should be able to get off the highway to a safe location before stopping. If you have vision only and it fails, then the system may be unable to safely pull over.

If that is the reason people think you need backup sensors, then fine. But someone is going to come up with a much cheaper backup plan, like mounting a cell phone with a camera on the back and allowing a remote operator to see and control the car to get it off the road.

This obsession with redundancy right now is silly. When there are actual capable cars on the road encountering these problems, people are going to come up with ways to solve the problem cheaply.

If you don't need lidar/radar for the fundamental task of driving the vehicle (a big if), they don't have that much value.
 
No driving is _solely_ on restricted highways. In any practical sense of working towards autonomous vehicles, it's not redundant.


Redundancy is being able to have a failure without an unacceptable outcome.
If they _need_ both systems to produce an acceptable vehicle, then their system isn't redundant.

Equip a taxi with their autonomy system. Put it in a city. Now remove the forward camera.

What are you left with?

You are arguing over the word "redundant".

The bottom line is that the sum is more reliable than the individual parts. In other words, having 2 distinct systems (one radar/lidar and one camera) is more reliable overall than just camera only or just radar/lidar only.
 
In other words, having 2 distinct systems (one radar/lidar and one camera) is more reliable overall than just camera only or just radar/lidar only.
Redundancy is generally considered to be protection against a hardware failure of that device. You can just slap two of the same thing on and have redundancy. But you still have common-mode failures (fog for cameras, snow on the bumper for radar) that multiple identical devices don't solve.

This is all standard stuff for analyzing safety-critical systems. The FAA requires a Common Mode Analysis (CMA) for this very reason. The common mode can even just be a shared power supply: who cares how good your cameras and radar are if they are hooked to power that fails every 1000 hours?

Dissimilarity or independence adds the idea that a system is robust against single external failures that would otherwise drop it below the required level of safety.
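As a toy illustration of why duplication alone doesn't help (every rate here is invented): duplicating a camera multiplies the independent failure terms together, but the common-mode terms pass straight through.

```python
# Toy common-mode arithmetic; all the rates below are invented.
p_unit   = 1e-4  # independent failure of one camera, per hour
p_fog    = 1e-3  # fog blinding *all* cameras at once, per hour
p_supply = 1e-3  # shared power supply dropping out, per hour

# Two identical cameras: the independent term multiplies,
# but the common modes don't (summing rare terms to first order).
p_dual = p_unit * p_unit + p_fog + p_supply
print(f"{p_dual:.2e}")  # ~2.00e-03, dominated entirely by the common modes
```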
 
...If they _need_ both systems to produce an acceptable vehicle, then their system isn't redundant...

The Boeing 737 Max crashes happened because those airlines didn't pay extra for a redundant system to verify whether the angle-of-attack reading was correct or erroneous. Without that extra payment for the redundancy check, the erroneous reading was automatically accepted by the machine as correct, and it nosed the plane down and killed everyone on board.

If they had paid extra money, they would have gotten an angle-of-attack disagreement indicator light, so the pilots would have known there was a reading error. And if they had also paid for the extra exterior sensor on the other side of the plane, both the machine and the pilots would have known there was a disagreement between the two sensors.

So in the case above, redundancy is an error-checking routine/protocol, and a redundant system requires many parts as well as money (regulators try to make safety standard rather than an extra-cost option, so the laws can also be counted as part of the redundant system).
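That kind of cross-check is simple to express. Here's a hedged sketch of a two-sensor disagreement monitor; the function and the 5-degree threshold are made up for illustration and are not Boeing's actual logic.

```python
# Hypothetical two-vane disagreement check; the threshold is invented
# and this is not Boeing's actual MCAS/AoA implementation.
def aoa_disagree(left_deg: float, right_deg: float, limit_deg: float = 5.0) -> bool:
    """Return True when the two angle-of-attack vanes disagree too much."""
    return abs(left_deg - right_deg) > limit_deg

# One faulty vane against one sane vane gets flagged instead of being
# silently accepted as the correct reading:
if aoa_disagree(74.5, 4.0):
    print("AOA DISAGREE: inhibit automation, alert pilots")
```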

I guess the "redundant" definition depends on who defines it. Bosch defines it as:

"Everything in view at all times
Cameras, lidar and radar sensors are the eyes of the assistance systems. With their 360-degree viewing capabilities, they supply important information about the surroundings. By backing each other up, the vehicle still has everything in view even if the single sensors fail."

That's despite the fact that LIDAR cannot read traffic lights, the camera cannot see in darkness, and the radar ignores stationary objects.

It's just that when Mobileye defines it, some people don't allow such a definition.
 
The Boeing 737 Max crashes happened because those airlines didn't pay extra for a redundant system
This is a very simplified version of what happened. Boeing didn't tell regulators or customers that the Max used this sensor for a new function. Previous 737s only used AoA as a display to the pilot, and most airlines don't fly AoA as a primary instrument. Suddenly it was part of the flight control logic. You can't fault airlines for not buying redundancy when they weren't told that it was flight-critical.

Boeing paid $2.5B in fines for lying to and misleading the FAA during the 737 Max certification.
 
You are arguing over the word "redundant".

The bottom line is that the sum is more reliable than the individual parts. In other words, having 2 distinct systems (one radar/lidar and one camera) is more reliable overall than just camera only or just radar/lidar only.

It would be radar/lidar/forward camera vs. camera only.

And multiple systems are not necessarily more reliable, just potentially better. Once you get to "won't crash", everything on top of that is gravy.
 
And multiple systems are not necessarily more reliable, just potentially better. Once you get to "won't crash", everything on top of that is gravy.

I think that would be my priority in autonomous vehicles: how to make sure the machine doesn't crash and kill.

Waymo demonstrates that well, but it's only comfortable under strict geofencing within the 50 square miles in Chandler, Arizona.

Thus, in general, I doubt that the industry has reached that 99% crash-free goal just yet.
 
I think that would be my priority in autonomous vehicles: how to make sure the machine doesn't crash and kill.
This is trivial. The vehicle that never moves is perfectly safe. It's also perfectly useless.
All safety engineering is trading off safety for utility. You can't pick only one priority or you end up with a useless system. Safety and utility also don't stop at your own vehicle: you need to make sure you don't interfere with or endanger other drivers with your actions. We could build a vehicle that goes everywhere at 5 mph, forcing everyone else to avoid it, but that is not a safe use of public roads.

Lots of people hate driving near the Waymo vehicles, enough that some have attacked them.

This is a place where picking the right metrics is important. Injuries per mile is good. Injuries per hour or per trip are bad, because you can game the hours by driving slowly or the trips by only doing 1/4-mile trips.
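A quick illustration, with invented numbers, of how the per-hour metric gets gamed while the per-mile metric stays honest:

```python
# Toy numbers: same injuries and same miles, only the speed differs.
injuries = 2
miles    = 1_000_000

for avg_speed_mph in (40, 10):
    hours = miles / avg_speed_mph
    per_million_miles  = injuries / miles * 1e6
    per_thousand_hours = injuries / hours * 1e3
    print(f"{avg_speed_mph:>2} mph: {per_million_miles:.1f}/M miles, "
          f"{per_thousand_hours:.2f}/k hours")
# 40 mph: 2.0/M miles, 0.08/k hours
# 10 mph: 2.0/M miles, 0.02/k hours  <- looks 4x "safer" just by crawling
```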
 
Lots of people hate driving near the Waymo vehicles, enough that some have attacked them.

That's not really a relevant metric though. Tesla cars get attacked too. There are always going to be jerks that attack vehicles they don't like.

I think that would be my priority in autonomous vehicles: how to make sure the machine doesn't crash and kill.

Waymo demonstrates that well, but it's only comfortable under strict geofencing within the 50 square miles in Chandler, Arizona.

Thus, in general, I doubt that the industry has reached that 99% crash-free goal just yet.

Waymo cars are able to drive autonomously outside the 50 sq mi; it's just that Waymo is not sure if the safety is good enough to actually remove the driver yet. In the 100 sq mi area in Phoenix, they released 6M miles of data that show pretty good safety, and they released millions of miles of disengagement data in CA that show a really good disengagement rate. But would the cars be safe enough in different conditions, like a different city, bad weather, highway speeds, etc.? So yeah, I think proving safety is the big challenge for Waymo.
 
That's not really a relevant metric though. Tesla cars get attacked too. There are always going to be jerks that attack vehicles they don't like.
People are mad at the way Waymo vehicles are actually driving on the road: slow driving, stopping in intersections, and just general weirdness. People don't just dislike them for no reason. My point was that you can't just go out in the world and blindly claim "look at how safe I am because I am so conservative!" while you get rear-ended for slamming on the brakes randomly and blame the other drivers, or just block traffic.

Angry Residents, Abrupt Stops: Waymo Vehicles Are Still Causing Problems in Arizona
 
People are mad at the way Waymo vehicles are actually driving on the road: slow driving, stopping in intersections, and just general weirdness. People don't just dislike them for no reason. My point was that you can't just go out in the world and blindly claim "look at how safe I am because I am so conservative!" while you get rear-ended for slamming on the brakes randomly and blame the other drivers, or just block traffic.

Angry Residents, Abrupt Stops: Waymo Vehicles Are Still Causing Problems in Arizona

That article is a bit clickbait IMO. Yes, there were some incidents of Waymo being "weird", but they were spread over a 15-month period. It only mentions one incident where the Waymo abruptly stopped for apparently no reason. Many of the incidents were actually the fault of the other driver, not the Waymo. It mentions only 4 times that people were angry at a Waymo, and many of the incidents happened months ago, so they're not super recent. So making it sound like a lot of residents are angry because Waymo is causing a ton of problems now is a bit of an exaggeration IMO.
 
Ugh, any flaw pointed out in self-driving is seen as an attack on all self-driving forever and needs to be defended.
My whole point was that the only rule cannot be "don't crash." It needs to be "don't crash while actually getting a person to their destination in a reasonable time while also not causing other drivers to crash." People get reasonably angry when a self-driving car prioritizes itself over everything else, and we've seen it a few times already in the real world.
This stuff is complex and can rarely be boiled down to "don't crash".
 
Ugh, any flaw pointed out in self-driving is seen as an attack on all self-driving forever and needs to be defended.
My whole point was that the only rule cannot be "don't crash." It needs to be "don't crash while actually getting a person to their destination in a reasonable time while also not causing other drivers to crash." People get reasonably angry when a self-driving car prioritizes itself over everything else, and we've seen it a few times already in the real world.
This stuff is complex and can rarely be boiled down to "don't crash".

Oh I agree 100%. "don't crash" is only one requirement. Autonomous cars need to follow traffic laws, get to the destination in a reasonable amount of time, avoid crashes and avoid causing others to crash as much as possible, respect other drivers, and also drive in a comfortable, natural way for the passengers. It's one big reason why autonomous driving is so complex. It's not just about navigating from A to B without crashing.
 
The Boeing 737 Max crashes happened because those airlines didn't pay extra for a redundant system to verify whether the angle-of-attack reading was correct or erroneous. Without that extra payment for the redundancy check, the erroneous reading was automatically accepted by the machine as correct, and it nosed the plane down and killed everyone on board.

If they had paid extra money, they would have gotten an angle-of-attack disagreement indicator light, so the pilots would have known there was a reading error. And if they had also paid for the extra exterior sensor on the other side of the plane, both the machine and the pilots would have known there was a disagreement between the two sensors.

So in the case above, redundancy is an error-checking routine/protocol, and a redundant system requires many parts as well as money (regulators try to make safety standard rather than an extra-cost option, so the laws can also be counted as part of the redundant system).
I guess the "redundant" definition depends on who defines it. Bosch defines it as:

"Everything in view at all times
Cameras, lidar and radar sensors are the eyes of the assistance systems. With their 360-degree viewing capabilities, they supply important information about the surroundings. By backing each other up, the vehicle still has everything in view even if the single sensors fail."

That's despite the fact that LIDAR cannot read traffic lights, the camera cannot see in darkness, and the radar ignores stationary objects.

It's just that when Mobileye defines it, some people don't allow such a definition.

The Bosch page is a bit clearer. It gives two examples of redundancy, and what it comes down to generically is:
- in areas where both technologies are sufficient, they back each other up to provide redundancy
- in areas where only one technology is sufficient, it is doubled up (as in the MCAS system).

However, the page is still rather weaselly. It only uses the word "redundancy" when talking about braking and steering. For the sensors, it just says that an object is always seen, which doesn't imply redundancy. To be redundant, the object would need to be seen and handled by a sufficient system.
 
People are mad at the way Waymo vehicles are actually driving on the road: slow driving, stopping in intersections, and just general weirdness. People don't just dislike them for no reason. My point was that you can't just go out in the world and blindly claim "look at how safe I am because I am so conservative!" while you get rear-ended for slamming on the brakes randomly and blame the other drivers, or just block traffic.

Angry Residents, Abrupt Stops: Waymo Vehicles Are Still Causing Problems in Arizona
That article is a bit clickbait IMO. Yes, there were some incidents of Waymo being "weird", but they were spread over a 15-month period. It only mentions one incident where the Waymo abruptly stopped for apparently no reason. Many of the incidents were actually the fault of the other driver, not the Waymo. It mentions only 4 times that people were angry at a Waymo, and many of the incidents happened months ago, so they're not super recent. So making it sound like a lot of residents are angry because Waymo is causing a ton of problems now is a bit of an exaggeration IMO.
I sense that Level 4 crashes are going to be much more punishing than Level 2 crashes. They will likely be investigated more deeply and carry very high civil costs.

Level 4 crashes are "clearly" the fault of the wicked company, while Level 2 crashes are going to be very strongly defended against by the companies, who are going to say "the driver is to blame, not us", even if the truth might be that the Level 2 systems are terrible, poorly monitored, and dangerously implemented.

I bet that crashes at Level 2 are going to come down the same way as sudden unintended acceleration claims: the driver is going to say that the car did something bad, and the company is going to say it didn't (and that you should have taken over sooner) and not release the video or telemetry. I would like to hope I am wrong.

(Maybe people should record their drives using their own video setups for some personal security, like in the FSD Beta videos.)
 