Autonomous Car Progress

...Even if the truth might be that the Level 2 systems are terrible, poorly monitored and dangerously implemented...

I don't get what you are saying.

With the first fatal Autopilot accident, in Florida in 2016, Tesla admitted that its camera mistook the white semi-truck for the bright sky.

They didn't say their camera was so superior that it could recognize the giant semi-truck and successfully deploy the brakes.

Then the exact same scenario happened again in 2019, also in Florida (the car collided with and went under another semi-truck).

Again, Tesla didn't say that its system was so superior that it could detect a big semi-truck right in front and successfully deploy the brakes this time either.

I would say that's a terrible combination of hardware/software systems that could not avoid a collision.

But because the system is so terrible, Tesla reminded the DMV that its FSD is L2 and will remain L2 in its final version.

Tesla didn't tell the DMV that its system is and will be so good that it can avoid crashes and be rated L3 or above.

So, by the definition of L2, because it's terrible, it needs a competent human driver to drive the L2 car.

If it's not terrible, then it should be promoted to L3 and above!

So far, all concluded court cases concerning Tesla L2 crashes, from low-speed garage damage to high-speed crashes, including Sudden Unintended Acceleration, have been decided in Tesla's favor: yes, the system is terrible, and that's why the driver has to drive the L2 car.


...not release the video or telemetry...

NHTSA did look through Tesla's logs when investigating 232 Sudden Unintended Acceleration complaints, so I don't get the conspiracy theory that Tesla's logs are withheld.

Anyone with extra money ($1,200) can purchase the cable to retrieve their own Tesla Event Data Recorder (EDR) data. You only need money, not permission, to get your own car's log.



I sense that Level 4 crashes are going to be much more punishing than Level 2 crashes. They will likely be investigated more deeply and carry very high civil costs.

The above article about residents angry at L4 Waymo cars that caused accidents seems to work out in Waymo's favor, as there's no lawsuit against the company.

So it looks like L4 Waymo has dodged potential lawsuits while Tesla has been getting lots of L2 crash lawsuits.
 
  • Like
Reactions: hgmichna
Thanks for the detailed reply. Now you see why I said "I hope I am wrong". Seems I was wrong. I'm glad to hear that Tesla cooperates well with the NHTSA in accident investigations.

I still feel that this statement holds: "I bet that crashes at Level 2 are going to come down the same way as Sudden Unintended Acceleration claims. The driver is going to say that the car did something bad, the company is going to say it didn't (and you should have taken over sooner)." Though that's probably what both parties are expected to say initially. We'll just have to see if/when there are NHTSA/NTSB investigations into future L2/L4 crashes how the outcomes are sorted out. Tesla won the SUA cases, and some Autopilot crashes have been found partly Tesla's fault. Uber was found partly at fault for the Tempe crash, shared with the driver, the victim, and the state of Arizona.

Some early blame-shifting from Elon: “When there is a serious accident it is almost always, in fact maybe always, the case that it is an experienced user, and the issue is more one of complacency,” Musk said. “They just get too used to it. That tends to be more of an issue. It’s not a lack of understanding of what Autopilot can do. It’s [drivers] thinking they know more about Autopilot than they do.” Ultimately Tesla did accept part blame for the Autopilot deaths, but the issues of driver complacency, drivers' technical confusion, and Autopilot's flaws do still remain.

Yes, there will probably be damages awarded no matter what level the car is driving at; it depends on who is found at fault.
 
It's very clear to me that Mobileye is a sort of Nikola-esque company/approach right now. There are many logical disconnects for critical thinkers. They're not making meaningful company agreements. There's a lot of marketing jargon that has little underlying engineering sense.
MobilEye is an older company than Tesla. Their technique is in millions of cars.
 
MobilEye is an older company than Tesla. Their technique is in millions of cars.

I understand that part. Just because one aspect sounds good doesn't mean the approach makes sense. Did you see my list of nonsensical aspects of Mobileye's approach?

Did you know that Mobileye is limited in the data they can gather from their fleet (they even say themselves that they don't collect images or video)? Also limited in the software updates they can push? How do they even update their "technique"?
 
  • Like
Reactions: 82bert and DanCar
The above article about residents angry at L4 Waymo cars that caused accidents seems to work out in Waymo's favor, as there's no lawsuit against the company.
Most accidents don't end up in lawsuits. Thankfully in these cases it was only property damage. It was clearly Waymo's fault. They paid to have the other car repaired, like any normal traffic incident. I don't think this gives us any insight into what will happen the first time an L4 vehicle kills a kid who ran out into the street in a way that was basically unavoidable and the two parties disagree on fault or liability.

Also, Waymo has specifically not gone after people who have attacked their vehicles, as they don't want any more news on the topic. The articles specifically say this. So right now, Waymo is very willing to settle with anyone they damage and is not even going after people who are causing damage to them.

Anyone with extra money ($1,200) can purchase the cable to retrieve their own Tesla Event Data Recorder (EDR) data. You only need money, not permission, to get your own car's log.
Note that this is probably a lot less data than you expect. This is the standard EDR that is in most cars: "vehicle dynamics" like acceleration and seatbelt state. Tesla does not give you access to see the video the system records, Autopilot data, or other data not required by federal law. You also can't do anything useful with the data without Tesla knowing you did it and sending it to them (you have to use their online tool to create a report).
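To make concrete how little that is, here's a rough sketch of what an EDR-style pre-crash record amounts to. The field names and values below are hypothetical illustrations, not Tesla's actual EDR format:

```python
from dataclasses import dataclass

@dataclass
class EdrSample:
    """One pre-crash sample of EDR-style "vehicle dynamics" data.
    Hypothetical fields for illustration, not Tesla's real format."""
    t_seconds_before_impact: float
    speed_mph: float
    accelerator_pedal_pct: float
    brake_applied: bool
    driver_seatbelt_buckled: bool

# A toy pre-crash buffer: a few samples leading up to impact.
precrash = [
    EdrSample(-5.0, 68.0, 22.0, False, True),
    EdrSample(-2.5, 69.0, 25.0, False, True),
    EdrSample(-0.5, 70.0, 0.0, False, True),
]

# Note what is absent: no camera frames and no Autopilot internal state --
# exactly the data Tesla does not expose through the EDR cable.
for sample in precrash:
    print(sample)
```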

NHTSA did look through Tesla's logs when investigating 232 Sudden Unintended Acceleration complaints, so I don't get the conspiracy theory that Tesla's logs are withheld.
And then in another case, they actively withheld them:

It's all standard Tesla stuff: sometimes they are great, sometimes they are a brick wall. It doesn't matter if you are NHTSA or a normal consumer. It is interesting that Tesla provided tons of logs when those logs clearly exonerate them, and then seems to fight the release of stuff that doesn't look great for them.

But to get to your main point: the issue I see here is that Tesla calls their system "FULL SELF DRIVING." They have videos that open with "THE DRIVER IS ONLY HERE FOR LEGAL REASONS," and they are constantly talking about robotaxis. Then, when in court or in front of a regulator, they say "of course it's terrible, but it's OK, it's L2!" How is an average consumer supposed to know that against the mass of marketing effort trying to suggest that it's more?
 
  • Helpful
Reactions: Dan D.
...And then in another case, they actively withheld them:...

I think that article was misread. Tesla never withheld any car logs. As a matter of fact, the NTSB threatened to ban Tesla from the investigation because Tesla defied the NTSB and went ahead and released the info to the public:

"In each of our investigations involving a Tesla vehicle, Tesla has been extremely cooperative on assisting with the vehicle data," the National Transportation Safety Board (NTSB) said on Sunday. "However, the NTSB is unhappy with the release of investigative information by Tesla."

So the message here is: Tesla needs to shut its mouth and not inform the public of urgent info that needed to be corrected. The NTSB preferred to keep all parties mute until the info comes out in a report years later.

Again, "cooperation" with NTSB here is the opposite of what we think: Tesla needs to agree with NTSB to keep info secret and should not share it with the public timely.

On the other hand, other articles have reported that:

"The NTSB's primary evidence is log data that is stored to an SD Card inside every Model X. This data records second-by-second changes in the vehicle's steering-wheel position, velocity, and other variables. NTSB examined log data from a month of Huang's morning commutes and found two days—February 27 and March 19—when the vehicle drifted toward the lane divider that would kill him days later. In each case, logs show Huang applying torque to the steering wheel and guiding the vehicle back into its proper lane."

It clearly proves that the car log was not withheld: the NTSB examined not just the few seconds around the accident but A MONTH of logs, and found 2 days in that month with the same kind of steering behavior.
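As a rough illustration of the kind of analysis the NTSB describes (scanning a month of second-by-second SD-card logs for drift toward the lane divider followed by corrective steering), here's a minimal sketch. The field names and thresholds are made up for illustration; they are not from the NTSB docket:

```python
def find_drift_corrections(log, drift_threshold_m=0.5, torque_threshold_nm=1.0):
    """Scan second-by-second log rows for moments where the car drifted
    toward the lane divider and the driver then applied corrective torque.
    Row schema is hypothetical, for illustration only."""
    days = set()
    for prev, cur in zip(log, log[1:]):
        drifting_outward = (cur["lane_offset_m"] > drift_threshold_m
                            and cur["lane_offset_m"] > prev["lane_offset_m"])
        driver_corrected = cur["driver_torque_nm"] > torque_threshold_nm
        if drifting_outward and driver_corrected:
            days.add(cur["day"])
    return sorted(days)

# Toy month of commute data: two mornings show drift plus correction.
log = [
    {"day": "02-26", "lane_offset_m": 0.1, "driver_torque_nm": 0.0},
    {"day": "02-27", "lane_offset_m": 0.3, "driver_torque_nm": 0.0},
    {"day": "02-27", "lane_offset_m": 0.7, "driver_torque_nm": 1.6},
    {"day": "03-19", "lane_offset_m": 0.4, "driver_torque_nm": 0.2},
    {"day": "03-19", "lane_offset_m": 0.8, "driver_torque_nm": 1.8},
]
print(find_drift_corrections(log))  # ['02-27', '03-19']
```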

Back to the article that says "Tesla refuses to cooperate": It does not say Tesla withheld any car logs.

It seems to complain that Tesla failed to respond to NTSB safety recommendations (in-car driver monitoring camera, geofencing) within 90 days:

"Typically, automakers respond to safety recommendations within 90 days. Tesla, however, has said it has updated Autopilot to issue more frequent warnings to drivers who aren't paying attention to the road. Autopilot has caused confusion and apparently continues to do so based on its name. You see, Autopilot is not a fully autonomous driving system. It's semi-autonomous and even Tesla says drivers need to keep their hands on the wheel for any potential scenario.
But the NHTSB is interested in other factors as a result of an engaged Autopilot system, among them distracted driving and highway infrastructure. In Huang's case, not only did his Model X slam into a highway barrier but his hands were not on the steering wheel just six seconds prior to the crash, a fact recently uncovered by investigators. There was also no evidence of braking or evasive action. Recovered data logs also show he was playing a game on his iPhone.

Days before his death, Huang reported that his Model X steered away from the highway on more than one occasion and he was forced to intervene. Knowing this, why was he playing a game on his smartphone at that specific troublesome stretch of highway? It's impossible to know the answer, but the NHTSB still seems determined to issue new safety regulations regarding Autopilot - with or without Tesla's input."
 
....some Autopilot crashes have been found partly Tesla's fault....
The findings from the NTSB investigations of both Florida cases do not directly blame Tesla for the 2016 crash; for the 2019 one, they mildly and indirectly blame Tesla's incompetence in producing an excellent human driver who knows how to follow Tesla's manual.

In the 2016 Williston, Florida, crash, it squarely blamed the drivers for inattention, lack of reaction, and overreliance on the technology.

https://data.ntsb.gov/Docket/Document/docBLOB?ID=40457788&FileExtension=.PDF&FileName=NTSB - Adopted Board Report HAR1702-Master.PDF

3.2 Probable Cause

The National Transportation Safety Board determines that the probable cause of the Williston, Florida, crash was:

1) the truck driver’s failure to yield the right of way to the car,
2) combined with the car driver’s inattention due to overreliance on vehicle automation, which resulted in the car driver’s lack of reaction to the presence of the truck.
3) Contributing to the car driver’s overreliance on the vehicle automation was its operational design, which permitted his prolonged disengagement from the driving task and his use of the automation in ways inconsistent with guidance and warnings from the manufacturer.


In the 2019 Delray Beach, Florida, crash, it also blamed the drivers, but it reworded the finding: instead of just the driver's overreliance on the technology, it cited Tesla's failure to curb that overreliance and NHTSA's failure to make sure Tesla did so.


https://data.ntsb.gov/Docket/Document/docBLOB?ID=9288075&FileExtension=pdf&FileName=HAB2001-Rel.pdf

Probable Cause
The National Transportation Safety Board determines that the probable cause of the Delray Beach, Florida, crash was

1) the truck driver’s failure to yield the right of way to the car,

2) combined with the car driver’s inattention due to overreliance on automation, which resulted in his failure to react to the presence of the truck.

3) Contributing to the crash was the operational design of Tesla’s partial automation system, which permitted disengagement by the driver, and the company’s failure to limit the use of the system to the conditions for which it was designed.

4) Further contributing to the crash was the failure of the National Highway Traffic Safety Administration to develop a method of verifying manufacturers’ incorporation of acceptable system safeguards for vehicles with Level 2 automation capabilities that limit the use of automated vehicle control systems to the conditions for which they were designed.


....Ultimately Tesla did accept part blame for the Autopilot deaths...

I've seen no such admission from Tesla. Tesla feels sorry for the loss of lives, but it has never admitted that its design didn't function properly.
 
  • Like
Reactions: DanCar
I've seen no such admission from Tesla. Tesla feels sorry for the loss of lives, but it has never admitted that its design didn't function properly.
Seems like an admission of part blame to me.


What we know is that the vehicle was on a divided highway with Autopilot engaged when a tractor trailer drove across the highway perpendicular to the Model S. Neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied
 
...Some early blame-shifting from Elon: “When there is a serious accident it is almost always, in fact maybe always, the case that it is an experienced user, and the issue is more one of complacency,” Musk said...
In both of the fatal cases with more details so far, the drivers were highly experienced users well aware of the limitations. In the first case, the driver posted videos of his use before. In the second case, the driver was aware of and previously reported the car veering into the exact barrier he eventually crashed into. While in the first case Tesla did make their nags more strict, I'm not convinced it would necessarily have deterred that first driver (as there are defeat devices he could have used).

In both cases, I think the only thing that really would have helped is a driver monitoring system that is harder to defeat (like infrared cameras others are using).

There was another case in 2019 where AP was engaged 10 seconds prior, but not as much is known about the driver's proficiency with the system. One of the NTSB's issues with Tesla's system is that it doesn't disable the system in areas it wasn't designed for (there is no strict geofence), but I think users would revolt if Tesla imposed those limitations due to the actions of a few.
 
Seems like an admission of part blame to me.
...Neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied...

Stating the facts of the event is not accepting blame.

The NTSB explains why that fact happened in footnote #8 of page 4:

"Object classification algorithms in the Tesla and peer vehicles with AEB technologies are designed to avoid false positive brake activations. The Florida crash involved a target image (side of a tractor trailer) that would not be a “true” target in the EyeQ3 vision system dataset and the tractor trailer was not moving in the same longitudinal direction as the Tesla, which is the vehicle kinematic scenario the radar system is designed to detect."

It explained that the accident happened because it's not what the system was designed for.

The system is designed for vehicles moving in the same direction as the Tesla. When they are, the camera can recognize the back of the vehicle.

On page 3, the report explained that the AEB technologies were tested for rear-end collisions but not designed for side collisions:

"The report also identified several crash modes that were not validated by the project, including straight crossing path (SCP)3 and left turn across path (LTAP) collisions:"

The 2 semi-trucks performed an LTAP which the system was not designed for.

So, the system worked as designed. LTAP collisions were not within the scope of that design, so bad things can happen when the scenario doesn't comply with it (the other vehicle needs to travel in the same direction as the Tesla and show its rear, not its side).
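To make the NTSB footnote concrete, here's a minimal sketch of the kind of target gating it describes: a crossing target (LTAP/SCP) whose image is not a "true" rear-of-vehicle target and whose motion is not longitudinal gets filtered out before AEB can fire. This is my own simplified illustration of the described behavior, not Tesla's or Mobileye's actual code:

```python
def aeb_should_brake(target):
    """Decide whether a simplified AEB would brake for a target.
    The target schema is hypothetical, for illustration only."""
    # Vision gate: only rears (or fronts) of vehicles are "true" targets,
    # to avoid false-positive brake activations.
    if not target["classified_as_vehicle_rear"]:
        return False
    # Radar gate: only targets moving roughly longitudinally with us.
    if abs(target["relative_heading_deg"]) > 30.0:
        return False
    return target["closing"]

# A crossing semi-trailer (side view, ~90 degrees to our path) fails the
# vision gate (and would fail the heading gate too), so no automatic
# braking -- the scenario in both Florida crashes.
crossing_trailer = {"classified_as_vehicle_rear": False,
                    "relative_heading_deg": 90.0,
                    "closing": True}
print(aeb_should_brake(crossing_trailer))  # False
```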
 
I would say that's a terrible combination of hardware/software systems that could not avoid a collision.

But because the system is so terrible, Tesla reminded the DMV that its FSD is L2 and will remain L2 in its final version.

Tesla didn't tell the DMV that its system is and will be so good that it can avoid crashes and be rated L3 or above.

So, by the definition of L2, because it's terrible, it needs a competent human driver to drive the L2 car.
I realize that it's your words and not Tesla's that their L2 system is terrible, but they can see that as well as anyone.

If that were me and I had released a terrible system onto the market, Beta or otherwise, I would not sleep unless I had assured myself I had done all I could to make the driver aware and to ensure driver alertness (split-second monitoring, not one alert every 8 or 30 seconds). Actually, I am fairly certain I would not release that system at all.
 
So, the system worked as designed.
[...]
The Florida crash involved a target image (side of a tractor trailer) that would not be a “true” target in the EyeQ3 vision system dataset and the tractor trailer was not moving in the same longitudinal direction as the Tesla
[...]
It explained that the accident happened because it's not what the system was designed for.
[...]
On page 3, the report explained that the AEB technologies were tested for rear-end collisions but not designed for side collisions:

The 2 semi-trucks performed an LTAP which the system was not designed for.

So, the system worked as designed. [...]
Oh come on. If the system is designed in such a way that it does not see a huge truck across the road, then it should NOT exist.
 
You are overreacting!

There are limitations in this imperfect world. As long as users are in compliance with the designers, we are all good.
https://www.tesla.com/en_CA/blog/tragic-loss
Neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied
The user was in agreement with the designer in not noticing the huge truck across their path.

The user did not use the system on the type of road the designers intended, but the system let him anyway.

The user did not heed the warnings and was not controlling the car for two minutes. The system let him anyway.

In a sense the car crashed itself, as its design was to stop eventually in case of driver incapacitation, which in a very loose sense he was. However the time for all the warnings to take effect was orders of magnitude longer than sensible.

People still use Autopilot on twisting country roads, thus are not compliant with Tesla. Just so they are aware, when they crash, Tesla and the NHTSA will blame them. The NTSB will not blame anyone but will recommend changes.

I bet you could take a poll of every Tesla driver using Autopilot and everyone else too about whether the car is designed to react to an entire semi-truck & trailer across the highway and obtain almost unanimous agreement that it would react. The designers say otherwise.

People are going to try and fool eye-monitoring too. Those crashes will be blamed on them too.

This is not a good plan; "the future will be wonderful once we work through all the pesky deaths first". I don't see it that way.
 
  • Like
Reactions: daktari
... the drivers were highly experienced users well aware of the limitations. In the first case, the driver posted videos of his use before. ...
Joshua Brown exaggerated the capabilities of Autopilot. He was not well aware of the limitations. Contrary to popular belief, the white truck against a bright sky had nothing to do with the problem. Mobileye only trained for the backs and fronts of vehicles, so the system was not designed or trained for cross traffic. Had Joshua Brown known this, he might be alive today.
 
The user was in agreement with the designer in not noticing the huge truck across their path.

No. The user was violating the terms of use and not in agreement with the design. The NTSB said on page 11:

"Drivers should read all instructions and warnings provided in owner’s manuals for ADAS technologies and be aware of system limitations".

Many users do not agree with manufacturers' system limitations and think the system can respond to a huge semi-truck in front.

Many users still insist that if the semi-truck is big, the system should deploy brakes and avoid the collision.

As the NTSB explained above, yes, but only if that big semi-truck is in compliance with the design that has been tested and validated:

1) expose its rear to the system and not its side
2) travel in the same direction as the Tesla.

That is not too much to ask!

The user did not use the system on the type of road the designers intended, but the system let him anyway.
True. In an L2 system, the driver drives the car. The L2 machine is not supposed to argue with the human's decisions the way HAL 9000 does in the movie "2001: A Space Odyssey".

If the L2 system wants to overrule humans' judgment, then it should be classified higher, like L5.


The user did not heed the warnings and was not controlling the car for two minutes. The system let him anyway.
Same as above. Autopilot has not graduated to the level of the robot HAL 9000.
In a sense the car crashed itself, as its design was to stop eventually in case of driver incapacitation, which in a very loose sense he was. However the time for all the warnings to take effect was orders of magnitude longer than sensible.
True. That's the function of L2. The human is responsible. L2 will let the car crash itself if there's no competent human driver in control.

People still use Autopilot on twisting country roads, thus are not compliant with Tesla.
True.
Just so they are aware, when they crash, Tesla and the NHTSA will blame them.
True. Humans are in control and to be blamed when the system is less than L3.
The NTSB will not blame anyone but will recommend changes.
It did blame the two drivers (the Tesla driver and the truck driver) but not the manufacturers who wrote the manual and the terms of service.

It does recommend that manufacturers make changes to crack down on bad drivers who violate the terms of use.

It also wants Apple to disable the iPhone while driving, because people die while operating the car and the iPhone at the same time. That raises ethical questions: what about passengers who don't drive but have an iPhone? Turning manufacturers into a police force to crack down on those who don't read Tesla's manual can be an ethical issue.

I bet you could take a poll of every Tesla driver using Autopilot and everyone else too about whether the car is designed to react to an entire semi-truck & trailer across the highway and obtain almost unanimous agreement that it would react. The designers say otherwise.
It's human nature. Consumers buy something and think they are in control of how the product SHOULD work.

However, legally, it's the manufacturers who dictate how the products should work and should NOT work.

When consumers don't know how it should NOT work and get into trouble (such as death), they will be blamed for violating the terms of use.

People are going to try and fool eye-monitoring too. Those crashes will be blamed on them too.
Sure. Humans are smart and can defeat a nanny system.
This is not a good plan; "the future will be wonderful once we work through all the pesky deaths first". I don't see it that way.
It is still a good plan. People will die whether they use an imperfect Autopilot/FSD or not.

The difference is: those who do use Autopilot/FSD have a lower chance of dying:

 
  • Like
Reactions: Microterf
The difference is: those who do use Autopilot/FSD have a lower chance of dying:

LOL. We argue that AP should only be used in easy-to-drive situations: on the highway, with no complex things like trucks crossing. Then we say OMG, AP IS SAFER THAN NORMAL DRIVING! Duh. Because AP can only be used in low-risk situations. I'm actually surprised it's only 2x safer than overall driving, given the percentage of accidents that happen on city streets. It may actually be telling us it's worse than a human driver on a highway, given that rate. Also, their statistic is "accidents," not "fatalities." AP accidents could be more severe than human accidents but occur at a lower rate. Finally, do the statistics count accidents where AP was not "engaged" at the point of impact because the driver had applied the brakes or turned the wheel to try to avoid the accident, like an attentive driver should? We don't know, because Tesla doesn't tell us how they came up with these numbers.
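To see why the comparison is slippery, here's the back-of-the-envelope arithmetic with made-up round numbers (not Tesla's actual figures):

```python
# Hypothetical round numbers, for illustration only.
ap_miles_per_accident = 1_000_000       # Autopilot engaged (mostly highway)
overall_miles_per_accident = 500_000    # all driving, all road types

# The naive comparison the safety report invites:
print(ap_miles_per_accident / overall_miles_per_accident)  # 2.0 -> "2x safer"

# But highway miles are much safer per mile than city miles to begin with.
# If highway-only human driving were, say, 3x safer than the overall average,
# the apples-to-apples comparison flips:
highway_human_miles_per_accident = 3 * overall_miles_per_accident
print(ap_miles_per_accident / highway_human_miles_per_accident)  # ~0.67 -> worse
```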

People still use Autopilot on twisting country roads, thus are not compliant with Tesla.
The fact Tesla allows this is nuts. They know what road you are on and already do different things based on it (like enforcing speed limits). Back when AP2 came out, it blocked you from engaging on anything but controlled-access highways for almost a year, so they already have the code. There really is no excuse for Tesla allowing you to engage AP on roads where you shouldn't engage it. This means Tesla knows people do it and lets them do it because it has value for Tesla.
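The gating being asked for isn't exotic; map data already carries road class, and the car already uses it for speed limits. Here's a minimal sketch of that kind of engagement check, assuming a hypothetical map lookup:

```python
ALLOWED_ROAD_CLASSES = {"motorway", "motorway_link"}  # controlled-access only

def can_engage_autopilot(road_class: str, speed_limit_mph: int) -> bool:
    """Hypothetical engagement gate based on map-reported road class.
    Per the post above, early AP2 behaved roughly like this for a while."""
    return road_class in ALLOWED_ROAD_CLASSES and speed_limit_mph >= 45

print(can_engage_autopilot("motorway", 65))      # True
print(can_engage_autopilot("unclassified", 40))  # False -- twisting country road
```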
 
In safety teaching, i.e. the "Swiss cheese model" of safety, Tesla's L2 UI approach is a big hole in the cheese. It allows people's errors and misuse to result in a consequence. The reason is that the user interface is designed like a Level 4 UI, not an L2 one: "Pling plong, now the car has control, you do not."

A blended UI, however, where the car and driver control the car together, mitigates this a bit by keeping the driver in charge all the time. The car never really takes control away; it just assists.
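For what "blended" could mean mechanically, here's a minimal sketch: the driver's steering input is mixed with the assistance torque every control cycle, so the driver never stops being an input. The weighting is a made-up illustration, not any shipping system's design:

```python
def blended_steering(driver_torque_nm: float,
                     assist_torque_nm: float,
                     driver_weight: float = 0.6) -> float:
    """Blend driver and assistance steering torque every control cycle.
    Unlike a binary hand-off ("pling plong, the car has control"),
    the driver's input always carries weight. Weights are illustrative."""
    return (driver_weight * driver_torque_nm
            + (1.0 - driver_weight) * assist_torque_nm)

# A driver tugging the wheel away from a lane divider always has an effect:
print(blended_steering(driver_torque_nm=2.0, assist_torque_nm=-0.5))  # 1.0
```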

They need to change the UI; we can't continue to sacrifice people to make progress on autonomy.
 
  • Like
Reactions: diplomat33