
U.S. opens formal safety probe for Autopilot - 2021 Aug 16

We already know that the radar-equipped TACC did not deal with static objects at high speed.

Please provide data to back up the assertion that the same failure occurs with the vision-only version.
From what I understand, this is more of a modeling problem than a sensor problem. Radar detects static objects just fine, but the systems are designed to largely ignore them because otherwise they create all kinds of false positives that would lead to the vehicles inappropriately slamming on the brakes, and that is massively dangerous by itself. This is the case with all adaptive cruise control systems, and certainly for Tesla dating back to 2016. If there were an easy solution, you'd think it would have been rolled out five years ago, or at a minimum back in 2018 when these accidents were first documented.
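To make the trade-off concrete, here is a toy sketch (my own illustration in Python, not Tesla's code): by Doppler alone, a stopped car in your lane closes at exactly your own speed, and so does every overpass, bridge girder, and roadside sign, so a filter that drops stationary returns to kill false positives drops the stopped car along with the clutter.

```python
# Illustrative sketch only -- not Tesla's implementation. Shows why an
# adaptive cruise controller that filters out stationary radar returns
# (to suppress overpass/bridge false positives) also discards a stopped
# vehicle in the lane.
from dataclasses import dataclass

@dataclass
class RadarReturn:
    range_m: float        # distance to the target
    closing_speed: float  # m/s, positive = target approaching us

def track_targets(returns, ego_speed, tol=1.0):
    """Keep only targets that are themselves moving.

    A stationary object closes at exactly ego_speed, so by Doppler
    alone it is indistinguishable from an overpass or bridge girder
    and gets thrown away with the clutter.
    """
    return [r for r in returns if abs(r.closing_speed - ego_speed) > tol]

ego_speed = 28.0  # m/s, about 100 km/h
returns = [
    RadarReturn(range_m=90.0,  closing_speed=28.0),  # stopped fire truck
    RadarReturn(range_m=120.0, closing_speed=28.0),  # overpass
    RadarReturn(range_m=60.0,  closing_speed=3.0),   # slower lead car
]

print(track_targets(returns, ego_speed))  # only the moving lead car survives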

If we want something to back up vision not being an immediate solution to this, I think we can use the letter Tesla's Associate General Counsel Eric Williams sent to the California DMV prior to the release of FSD Beta. You can find it on page 16 of the following link:


As recently as nine months ago, this is what Eric was saying:

For context, and as we’ve previously discussed, City Streets continues to firmly root the vehicle in SAE Level 2 capability and does not make it autonomous under the DMV’s definition. City Streets’ capabilities with respect to the object and event detection and response (OEDR) sub-task are limited, as there are circumstances and events to which the system is not capable of recognizing or responding. These include static objects and road debris, emergency vehicles, construction zones, large uncontrolled intersections with multiple incoming ways, occlusions, adverse weather, complicated or adversarial vehicles in the driving path, unmapped roads. As a result, the driver maintains responsibility for this part of the dynamic driving task (DDT). In addition, the driver must supervise the system, monitoring both the driving environment and the functioning of City Streets, and he is responsible for responding to inappropriate actions taken by the system. The feature is not designed such that a driver can rely on an alert to draw his attention to a situation requiring response. There are scenarios or situations where an intervention from the driver is required but the system will not alert the driver. In the case of City Streets (and all other existing FSD features), because the vehicle is not capable of performing the entire DDT, a human driver must participate, as evidenced in part through torque-based steering wheel monitoring, or else the system will deactivate.


And on page 26, Eric straight up says they don't expect any significant change in OEDR, or anything that would shift responsibility for the dynamic driving task to the system:

2. Please describe and provide any relevant documentation reflecting Tesla’s intended functionality for the final release of FSD City Streets to the general public. Specifically, which of the limitations associated with the OEDR described in the letter dated November 20, 2020 will continue to be part of the final release of FSD City Streets to the general public?

While the current pilot version of City Streets is still in a validation and review stage, we expect the functionality to remain largely unchanged in a future, full release to the customer fleet. We are analyzing the data obtained in the pilot and using it to refine the feature’s operation and customer experience. We will continue to make refinements as necessary, and only after we are fully satisfied with performance, integrity, and safety will we release the feature to the customer fleet. That said, we do not expect significant enhancements in OEDR or other changes to the feature that would shift the responsibility for the entire DDT to the system. As such, a final release of City Streets will continue to be an SAE Level 2, advanced driver-assistance feature. Please note that Tesla’s development of true autonomous features (SAE Levels 3+) will follow our iterative process (development, validation, early release, etc.) and any such features will not be released to the general public until we have fully validated them and received any required regulatory permits or approvals.
 
Radar-based AP will always have the problem of tending to ignore stationary vehicles. It will also continue to exhibit phantom braking due to overpasses and metal-frame bridges. The moving radar has to ignore targets closing at the same speed the car is traveling (i.e., stationary objects) or get hopelessly swamped by clutter.

Pure vision AP will in principle solve the stationary vehicle problem. Right now, it’s only being demonstrated on HW3 vehicles. My question is, how far back in time will Tesla’s computers support a vision-only EAP (not FSD)? Will it run on AP 2 vehicles?
 
One question I've had about Teslas for years now- Does Automatic Emergency Braking (AEB) work at all?

I've seen far too many of these crashes over the years, and if your AEB can't stop for a firetruck then it's not worth a damn. I really like the car, but I drive assuming that it has the stupidest and lamest software possible, because that's what I see with these accidents. It's all well and good to emphasize that it's still the driver's responsibility, but I think it's also worth pointing out that these safety systems just don't work.
 
One question I've had about Teslas for years now- Does Automatic Emergency Braking (AEB) work at all?

I've seen far too many of these crashes over the years, and if your AEB can't stop for a firetruck then it's not worth a damn. I really like the car, but I drive assuming that it has the stupidest and lamest software possible, because that's what I see with these accidents. It's all well and good to emphasize that it's still the driver's responsibility, but I think it's also worth pointing out that these safety systems just don't work.

Partial lane blockages, like what a fire truck typically creates, have been a limitation of most AEB systems for a long time. In fact, one of the interesting things is that on the same night, in the same area, there were two similar incidents: Drivers charged after state police cruisers struck in two separate incidents

The first crash occurred around 10:05 p.m. on Route 24 northbound near Exit 17 in West Bridgewater. State police said a trooper from the Middleborough barracks had pulled over a 2007 Lexus to issue a warning and was outside his cruiser when the SUV was struck from behind by a Tesla Model 3.

In a second, similar incident, a trooper from the Charlton barracks was responding to a crash along the Massachusetts Turnpike in Warren at 11:04 p.m. when his cruiser and another cruiser were hit by a rented 2019 Subaru Outback.

So the Subaru AEB system "failed" about an hour after the Tesla system did, in a similar situation.
 
The problem is a driver who pays no attention to the traffic ahead.

The problem is a car that pays no attention to the traffic ahead.

Some of you continue to blame the human for the "autonomous" vehicle's shortcomings. This seems to be acceptable because "Tesla said so", but it's a really poor tactic. In what other scenarios would this be considered okay?

Currently FSD is like a lifeguard with narcolepsy: it might just be asleep at the most inopportune time. I'd rather have no lifeguard at all; then at least I know it's my responsibility to keep the kids safe.
 
The problem is a car that pays no attention to the traffic ahead.

Some of you continue to blame the human for the "autonomous" vehicle's shortcomings. This seems to be acceptable because "Tesla said so", but it's a really poor tactic. In what other scenarios would this be considered okay?

Currently FSD is like a lifeguard with narcolepsy: it might just be asleep at the most inopportune time. I'd rather have no lifeguard at all; then at least I know it's my responsibility to keep the kids safe.
These are not autonomous vehicles. Autopilot is a level 2 driver assist technology. You, the driver, are still the captain.

With that said, there is room for improvement when it comes to stopped vehicles. No argument there. But the buck stops with the driver.
 
Why is everyone assuming an investigation is bad for Tesla? The government has a job to do: find out whether Autopilot causes more accidents. The data shows it doesn't, but the government wants to be sure. I don't think they will find Tesla at fault, but if they come up with some recommendations Tesla can enact to make the transition to autonomous driving safer, Tesla and its customers should be happy to gain that insight.
 
These are not autonomous vehicles. Autopilot is a level 2 driver assist technology. You, the driver, are still the captain.

With that said, there is room for improvement when it comes to stopped vehicles. No argument there. But the buck stops with the driver.

I see... I've conflated the two. It seems to me that Autopilot and "Full Self-Driving" are pretty much the same thing. Thanks for clarifying.
 
Why is everyone assuming an investigation is bad for Tesla? The government has a job to do: find out whether Autopilot causes more accidents. The data shows it doesn't, but the government wants to be sure. I don't think they will find Tesla at fault, but if they come up with some recommendations Tesla can enact to make the transition to autonomous driving safer, Tesla and its customers should be happy to gain that insight.
Yes, this time they will take any recommendations seriously, for sure.


Feb 25, 2020
Tesla ignored safety recommendations made by the National Transportation Safety Board (NTSB) about its Autopilot driver assistance system, the board’s chairman Robert Sumwalt said on Tuesday.
...
“One manufacturer ignored us. And that manufacturer is Tesla,” Sumwalt said on Tuesday during the beginning of a hearing about a fatal 2018 crash that involved Autopilot.

“We ask that recommendation recipients respond to us within 90 days. That’s all we ask. Give us a response within 90 days. Tell us what you intend to do,” Sumwalt said. “But it’s been 881 days since these recommendations were sent to Tesla, and we’ve heard nothing.”

Tesla did not respond to a request for comment.
 
Partial lane blockages, like what a fire truck typically creates, have been a limitation of most AEB systems for a long time. In fact, one of the interesting things is that on the same night, in the same area, there were two similar incidents: Drivers charged after state police cruisers struck in two separate incidents

The Subaru case is interesting from the standpoint that maybe nobody does this properly, but if that's the case, then they need to just stop lying about the capabilities. Lying about the capabilities makes the more naive people believe the marketing.

Crashes like these aren't even close to being partial lane blockages. The truck is fully in the lane, and the Tesla hits it full on. To me, this looks like a system that doesn't work at all.

[Crash photos]

Walter Huang's crash into a concrete road divider:

[Crash photo]


 
One question I've had about Teslas for years now- Does Automatic Emergency Braking (AEB) work at all?

I've seen far too many of these crashes over the years, and if your AEB can't stop for a firetruck then it's not worth a damn. I really like the car, but I drive assuming that it has the stupidest and lamest software possible, because that's what I see with these accidents. It's all well and good to emphasize that it's still the driver's responsibility, but I think it's also worth pointing out that these safety systems just don't work.
In Europe, yes, but it is limited like all other AEB systems. The AEB testing is at the end of the videos:


 
The problem is a car that pays no attention to the traffic ahead.

Some of you continue to blame the human for the "autonomous" vehicle's shortcomings. This seems to be acceptable because "Tesla said so", but it's a really poor tactic. In what other scenarios would this be considered okay?

Currently FSD is like a lifeguard with narcolepsy: it might just be asleep at the most inopportune time. I'd rather have no lifeguard at all; then at least I know it's my responsibility to keep the kids safe.
Sorry, I disagree. The sensors in today's cars don't have the range to stop safely for a stationary object at high speed. At highway speed, 100 km/h, you travel at 28 m/s, and sensor range is only around 100 m. The best one can hope for is that the system brakes before the driver reacts, to reduce the velocity at impact. The higher the speed, the less chance there is to stop. Look at Euro NCAP, which tests these scenarios.

The issue, though, is that some drivers don't understand this and believe the system is "a quantum leap", "sublime", and will "blow their mind" because the guru said so. They hold one hand lightly on the wheel while leaning back and relaxing.

Some misuse statistics and claim the system reduces errors and crashes, but fail to mention that the system also introduces new errors and crashes.

Some TSLA promoters also believe their car has x-ray vision or psychic abilities and post it on Twitter. (TeslaOwnerSiliconValley)

The misleading hype and ambiguous marketing, combined with driver naivety and brand enthusiasm, are the issues to address.
 
Pure vision AP will in principle solve the stationary vehicle problem. Right now, it’s only being demonstrated on HW3 vehicles. My question is, how far back in time will Tesla’s computers support a vision-only EAP (not FSD)? Will it run on AP 2 vehicles?
Sorry, I don't believe that vision alone will solve this problem. When speed is too high, time to impact will be shorter than the time needed for intervention, because sensor range is too short.

You are already travelling at 28 m/s at 100 km/h. Stopping distance on dry roads is 50+ meters, almost 2 seconds of travel at that speed. With a vision range of 100 meters, the system has only about a second to identify the object, classify it as an obstacle, and initiate a full emergency brake. Someone in another thread calculated how many pixels the front narrow camera can see at 100 m. It was not too many...

At 150 km/h, the upper limit of AP today, you're a pancake already... they need to improve sensor range. Rough numbers below bear this out.
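To put rough figures on this (simple kinematics in Python, with assumed values: 0.8 g braking, 0.5 s system latency, and a hypothetical 35-degree, 1280-pixel-wide narrow camera; none of these are published Tesla specs):

```python
# Back-of-the-envelope stopping math for the 100 m vision-range argument.
# Deceleration, latency, and camera parameters are assumed round numbers.
import math

G = 9.81  # m/s^2

def stopping_budget(speed_kmh, sensor_range_m, decel_g=0.8, latency_s=0.5):
    """Return speed (m/s), braking distance, and margin left for detection."""
    v = speed_kmh / 3.6                     # speed in m/s
    braking = v**2 / (2 * decel_g * G)      # distance to stop once braking starts
    latency = v * latency_s                 # distance covered before brakes bite
    return v, braking, sensor_range_m - braking - latency

for kmh in (100, 150):
    v, braking, margin = stopping_budget(kmh, sensor_range_m=100)
    print(f"{kmh} km/h: {v:.0f} m/s, {braking:.0f} m to brake, {margin:+.0f} m margin")
    # 100 km/h: ~49 m to brake, about +37 m (~1.3 s) of margin
    # 150 km/h: ~111 m to brake, margin already negative

def pixels_across(width_m, dist_m, fov_deg=35.0, sensor_px=1280):
    """Approximate pixel width of an object, for the hypothetical camera."""
    ang = math.degrees(2 * math.atan(width_m / (2 * dist_m)))
    return ang / fov_deg * sensor_px

print(f"{pixels_across(1.8, 100):.0f} px across a 1.8 m car at 100 m")  # ~38 px
```

Under these assumptions there is barely a second of margin at 100 km/h, and at 150 km/h the braking distance alone already exceeds the 100 m sensing range.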
 
One question I've had about Teslas for years now- Does Automatic Emergency Braking (AEB) work at all?

I've seen far too many of these crashes over the years, and if your AEB can't stop for a firetruck then it's not worth a damn. I really like the car, but I drive assuming that it has the stupidest and lamest software possible, because that's what I see with these accidents. It's all well and good to emphasize that it's still the driver's responsibility, but I think it's also worth pointing out that these safety systems just don't work.
AEB is not designed to prevent accidents. When an accident is unavoidable, it brakes to reduce the impact. That is all. AEB has nothing to do with this discussion; the discussion is about TACC/AP.
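A quick back-of-the-envelope illustration of that mitigation role (0.9 g panic braking is my assumed figure, not any manufacturer's spec): even a late brake application takes a lot of speed out of the impact.

```python
# If braking starts only d meters before a stationary obstacle, how fast
# are we still going at impact?  v_impact^2 = v0^2 - 2*a*d  (assumed 0.9 g).
import math

def impact_speed_kmh(speed_kmh, brake_dist_m, decel_g=0.9):
    v0 = speed_kmh / 3.6                          # initial speed, m/s
    v2 = v0**2 - 2 * decel_g * 9.81 * brake_dist_m
    return math.sqrt(max(v2, 0.0)) * 3.6          # impact speed, km/h

# Braking starts 25 m out at 100 km/h: the crash still happens, but slower.
print(f"{impact_speed_kmh(100, 25):.0f} km/h at impact")  # ~65 km/h
```

Kinetic energy scales with the square of speed, so cutting 100 km/h to roughly 65 km/h removes more than half of the crash energy even though the crash still happens.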
 