Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

Tesla FSD/Autopilot under attack

Lots to unpack in this thread, from copyright law to the proper role of the NTSB/NHTSA.

But really, the core concept most people on this forum seem to disagree over, IMO, is the legal vs. human aspect.

Some are arguing that what Tesla is doing is legally right: they didn't promise anything; when they say "Autopilot," it does not imply the car is really "auto"; it is "automatic" until it is not; and so on. So legally they fit within the limits, and people raising issues about it are ill-intentioned.

Others are arguing that this stance is in denial of how human beings operate, i.e. they are not robots. In particular, when your car, extensively advertised as such, handles a lot of situations on its own for miles/km/hours and then behaves in an abruptly absurd way, sure, you have to be in control at all times, that's what the legal agreement says. But by the very definition of a system that reduces your cognitive load, you drop into a less alert mode, and you might be ill-prepared to react. Again, are you supposed to do that, legally? No, you are not. But does that legalistic stance even recognize how the average human brain operates? There are tons of studies in this space; it has been extensively analyzed with airline pilots, and the risk of zoning out is real.

Would it be best if we could sustain sub-millisecond reaction times over long, monotonous stretches, like a robot? Yes, for sure, and that's the very reason we need robots: because we are terrible at it! And so, since Tesla has been very actively playing in that grey zone for a while, hoping for humans to behave like robots, while its system still can't stop for a fire truck stopped in the middle of a lane, a number of people wonder whether that's really OK.

As for the media attention they are getting, well, yeah, they are getting a HUGE amount of FREE media attention. Some positive, some negative. I'm surprised anybody would be surprised…
 
Others are arguing that this stance is in denial of how human beings operate

This is what I'm arguing, but there are two parts to how human beings operate.

One part is unconscious distraction that you covered pretty well.

The other is conscious distraction through abusing the systems: using driver-assist features to justify a decision to drive under the influence, or to text while driving. This is where I think most of the accidents are currently occurring. But I think it would be a mistake for any safety regulatory body to band-aid this by requiring driver monitoring.

I am curious to see how different countries solve these problems with L2.

Europe seems to be taking the approach of limiting the capabilities of L2 along with requiring driver monitoring (not yet, but it's coming). This combination, plus having better drivers overall, might be a good stopgap solution. But it won't allow for the continuous improvement of FSD that Tesla is banking on.

The US will likely even allow FSD Beta, as they don't seem to be doing anything to stop it.

Not sure if any country will adopt a position of placing any liability on an L2 vehicle for an accident that occurred during its usage. This is what I'm advocating for.
 
I agree that Tesla is an easy target, especially as it does not fund the politicians and journalists.

That said, the L1-L5 taxonomy is an artificial dinosaur and should be abandoned altogether, IMHO. There is really no L3 or L4, and there can't be. Either the driver is watching the system or s/he is not. If the operator is not required to take over immediately, then it is an L5 system, because the time required to respond to a critical situation cannot be codified into law by politicians. It may take only a second or two for a situation to escalate beyond the capabilities of ANY non-L5 system, but a non-attentive driver will NOT respond adequately within 1 or 2 seconds. This means that ANY L3 or L4 system is inherently flawed.

I agree that L1-L5 is flawed.
I agree that there really is no L3, due to either extreme limitations or extreme danger.
I completely disagree on L4, as L4 is really where autonomous vehicles are at. L4 doesn't require a driver to take over immediately, and it's not even a timed takeover like L3. It has to pull safely off the road before a sleeping passenger can be asked to take over.
L5 is just some fairy tale, and relies on the fairy tale of general AI.

So what we really have is binary: either the human is responsible, or the car is responsible during L4 activation within a geofenced area/set of conditions.

So basically L1, and L4.

L2 tries to do something really dumb: taking away driving responsibility from the driver yet still assigning the individual 100% of the blame if an accident happens.
 
Absolutely nothing in Reuters' article (as usual) regarding the likelihood that using AP has already prevented thousands of accidents, along with many deaths and serious injuries. This would only come out in time, statistically. People see news reports of Teslas cruising down the highway with their drivers fast asleep at the wheel and hysterically throw their hands up in horror when *nothing dramatic is happening*! Imagine how many more cases happen (again, completely undramatically) when there isn't anyone around to record it on their phone, and the Tesla either eventually stops or the driver finally wakes up and takes back control. Tens of thousands of people fall asleep at the wheel every day in the world and most end very badly accounting for a good proportion of 'KSI' (killed and/or seriously injured) statistics, yet this is *never* mentioned.
 
So, now when Tesla updates the software to detect the emergency vehicles at night, slow down, and alert the drivers, what is the chance of Reuters publishing an article on the fast Tesla response and asking for legislative action to push other OEMs to include the emergency vehicle detection measures in their software? This isn't even ha ha.

 
So, now when Tesla updates the software to detect the emergency vehicles at night, slow down, and alert the drivers, what is the chance of Reuters publishing an article on the fast Tesla response and asking for legislative action to push other OEMs to include the emergency vehicle detection measures in their software? This isn't even ha ha.

The NHTSA has already sent data requests to the other OEMs deploying similar systems, it's clear they'll be doing a comparison of capabilities and identifying shortcomings across all of them. And the media has already reported on those requests, but I'm guessing the NHTSA doesn't have documented crashes into emergency vehicles from other OEMs using L2 systems. You can be sure the NHTSA would be acting on that data if they had it, they issue recalls to legacy companies all the time.

These emergency light detection capabilities also have not been tested yet, and they come with the disclaimer

“Never depend on Autopilot features to determine the presence of emergency vehicles. Model 3/Model Y may not detect lights from emergency vehicles in all situations. Keep your eyes on your driving path and always be prepared to take immediate action.”
 
The NHTSA has already sent data requests to the other OEMs deploying similar systems, it's clear they'll be doing a comparison of capabilities and identifying shortcomings across all of them. And the media has already reported on those requests, but I'm guessing the NHTSA doesn't have documented crashes into emergency vehicles from other OEMs using L2 systems. You can be sure the NHTSA would be acting on that data if they had it, they issue recalls to legacy companies all the time.
I wonder if the other OEMs even have the ability to collect information regarding the state of the systems of a car involved in a crash.
I also wonder where the articles are with titles like "A Ford vehicle was involved in a fatal crash. It is not yet known whether the vehicle was being operated by Ford's autonomous driving system."
 
Absolutely nothing in Reuters' article (as usual) regarding the likelihood that using AP has already prevented thousands of accidents, along with many deaths and serious injuries.
Tesla publishes their own safety report of cars on and off AP. They claim cars on AP are about 2X as safe as cars off AP.
The issue is that AP can mostly only be used on the highway, so they are comparing cars on the highway to cars on surface streets.
Highways are statistically 3X as safe as surface streets. So there is a reasonable argument that AP actually reduces safety if it's only 2X as safe in the very areas where it can be used. Tesla refuses to publish data comparing the safety of vehicles using AP vs. the safety of cars that are not using it but could be. So Tesla is either very bad at statistics, or is hiding something.
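As a back-of-the-envelope check, the ratio argument can be sketched in a few lines of Python. The 2X and 3X figures are the illustrative numbers from this post, not Tesla's actual published rates:

```python
# Normalize the surface-street crash rate to 1.0 crash per unit of distance.
# All ratios below are this post's illustrative figures, not measured data.
surface_street_rate = 1.0
highway_rate = surface_street_rate / 3.0  # highways ~3X safer than surface streets
ap_rate = surface_street_rate / 2.0       # the claim: AP miles ~2X safer overall

# AP is used almost exclusively on highways, so the fair baseline is the
# highway rate, not the all-roads rate.
print(f"AP rate:      {ap_rate:.2f}")      # 0.50
print(f"Highway rate: {highway_rate:.2f}") # 0.33
print(ap_rate > highway_rate)              # True: by these ratios, AP looks worse
```

In other words, "2X safer than all driving" can still be worse than the baseline for the roads where the feature actually runs.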

We have the data. It isn't good. It isn't Reuters' job to make up the idea that AP saves lives when there is no data to support it.

Tens of thousands of people fall asleep at the wheel every day in the world and most end very badly accounting for a good proportion of 'KSI' (killed and/or seriously injured) statistics, yet this is *never* mentioned.
Source? About 3,700 people die every day in road accidents around the world, roughly 100 of those in the USA. Most of these are not due to people falling asleep. And if 20,000 people fall asleep at the wheel every day, falling asleep appears not to be that dangerous: even if every single death were due to falling asleep, the fatality rate would only be about 18% (3,700 / 20,000).
You're making up stuff just like you want Reuters to do.
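That upper-bound arithmetic can be checked directly. The 3,700 and 20,000 figures are this post's rough estimates, not verified statistics:

```python
# Rough figures quoted in this post, not verified statistics.
daily_road_deaths_worldwide = 3700
daily_drivers_falling_asleep = 20000  # the claim being questioned

# Even if EVERY road death were caused by falling asleep at the wheel,
# the implied fatality rate per sleep episode would only be:
upper_bound = daily_road_deaths_worldwide / daily_drivers_falling_asleep
print(f"{upper_bound:.1%}")  # 18.5%
```

Since most deaths are *not* caused by falling asleep, the real rate would be far below this bound, which is the point being made against the "most end very badly" claim.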
 
...People see news reports of Teslas cruising down the highway with their drivers fast asleep at the wheel and hysterically throw their hands up in horror when *nothing dramatic is happening*!...

Driving a Tesla without paying attention to the automation's performance, so as to intervene in time, means the driver is defeating Tesla's design: all current Autopilot/FSD/FSD Beta versions require a driver to be in charge of the drive even while the automation system is active.

Thus, driving drunk or asleep, or doing pranks: although "nothing happened" and those involved were saved and are not dead, it still exposes them and others to potential harm.

Just like the St. Louis lawyer couple who waved guns at protesters in front of their house: no one died, nothing happened, the couple was saved from being mugged, raped... and they were pardoned by the Governor, but it was still a violation of how to use guns, and their law licenses are being jeopardized.
 
Thanks for Sandy Munro's video clip.

He praised how great and amazing Tesla's "chip" is, but he failed to explain why the current FSD Beta still cannot avoid collisions.

He thinks the Government should do what they do best, such as Afghanistan, and not monkey around with what they are not trained in and do not understand, such as investigating Tesla's advancements. I think the job of the NTSB/NHTSA is to protect the public right here in the US, so I do think there's a role for them in investigating battery fires and automated driving technology.

The US is still quite new to EV battery fires and complains that they take so much water and so much time, with so many re-ignitions. In the meantime, the EU has been able to solve that problem with a giant mobile bathtub/water pool into which an EV can be placed.


Photo credit: Luxemburger Wort
 
See thread title... make internal bet with self on not only what the content of the discussion will be, but also which forum members will actively participate in said discussion.

Check back on the thread after 24 hours; confirm that both the content and the participants matched expectations.

Note that I am not saying anything is wrong, bad, etc... just that it was fairly predictable what this thread would look like, and who would participate in it, based on the title.
 
I wonder if the other OEMs even have the ability to collect information regarding the state of the systems of a car involved in a crash.
I also wonder where the articles are with titles like "A Ford vehicle was involved in a fatal crash. It is not yet known whether the vehicle was being operated by Ford's autonomous driving system."
The NHTSA established and recently amended a standing order regarding reporting requirements around these systems, so they'll be ensuring manufacturers are meeting their statutory obligations. You can find the order here


I don't think many of the legacy OEMs face the same level of risk, thanks to the limited capabilities and operating domains of their systems, much more conservative designs, fewer vehicles deployed with the tech enabled, or a combination of all of those. But outside of the Tesla world, Nio's ADAS crashes have grabbed headlines here even when they happen in China.

Pushing boundaries like this kind of opens you up to more scrutiny, and the legacy brands in general are definitely being more conservative. For example, Ford was hoping to hit just 100,000 BlueCruise-enabled vehicles on the road by next year. BlueCruise is geofenced and won't even execute a moderate curve for you; I think it can specifically be used only on mapped highways, where it will give you advance warning of such a curve.
 
I wonder if the other OEMs even have the ability to collect information regarding the state of the systems of a car involved in a crash.

I would think they would want to, for legal reasons.


I also wonder where the articles are with titles like "A Ford vehicle was involved in a fatal crash. It is not yet known whether the vehicle was being operated by Ford's autonomous driving system."

Ford doesn't market their system as full self-driving. Tesla has a problem. Fans of Tesla should acknowledge it before it becomes that one company that was sued out of business.
 
I wonder if the other OEMs even have the ability to collect information regarding the state of the systems of a car involved in a crash.
Event Data Recorders are a thing. In fact, they are in almost all vehicles sold today. Tesla is not some special unicorn of data collection.

U.S. will not seek to require event data recorders in cars, trucks

NHTSA said in a statement it was withdrawing the proposal because nearly 100 percent of manufacturers voluntarily equip vehicles with the devices.
In 2006, NHTSA adopted regulations requiring EDRs to collect data if they were installed in vehicles and steps to ensure the survivability of the data in a crash. Automakers are free to collect additional data if they choose.
 
Event Data Recorders are a thing. In fact, they are in almost all vehicles sold today. Tesla is not some special unicorn of data collection.

U.S. will not seek to require event data recorders in cars, trucks
Not every accident is reported back to and investigated by the manufacturer. I don't think it is unfair to assume that most recorded events never make it onto the OEM's books. Some OEMs have emergency-reporting capabilities in their vehicles, and it would be interesting to know IF those are capable of beaming all the config parameters back to the OEM in case of an accident.
 
and it would be interesting to know IF those are capable of beaming all the config parameters back to the OEM in case of an accident.
Onstar has existed since 1996. It's on most GM vehicles, and GM makes 3X as many vehicles a year as Tesla.

OnStar predicts injury severity and can use this in real time to inform first responders without any driver intervention. Seems pretty likely that they can beam whatever they want about a crash, just like Tesla.

American College of Emergency Physicians

Just one example. The new Chevy Corvette has full OTA firmware updatability as well. All this stuff is coming to all cars pretty soon.
 
Onstar has existed since 1996. It's on most GM vehicles, and GM makes 3X as many vehicles a year as Tesla.

OnStar predicts injury severity and can use this in real time to inform first responders without any driver intervention. Seems pretty likely that they can beam whatever they want about a crash, just like Tesla.

American College of Emergency Physicians

Just one example. The new Chevy Corvette has full OTA firmware updatability as well. All this stuff is coming to all cars pretty soon.
Yes, it would be interesting to know if GM gets ANY information from OnStar system.

Regarding the OTAs, I don't think they will work for complete SW upgrades (new SW versions) because of the multiplicity of ECU systems running software from various suppliers. Simply put, it is unsafe to run such an update on a Porsche, VW, or Ford without verification at a dealership. Small subsystem updates will happen over the air, but major updates will still require dealership visits until the OEMs adopt Tesla's model of keeping control of all (or most) of their subsystem development and production.
 
Yes, it would be interesting to know if GM gets ANY information from OnStar system.
I literally gave you a link that says GM gets crash-severity data the instant a crash occurs, and you ask if they get ANY information?

I don't think they will work for complete SW upgrades
Already happening on Corvettes.

until the OEMs will adopt Tesla's model of keeping control of all (most) of their subsystem development and production.
You still think Tesla is a magic unicorn of product development. They do not make most of their subsystems. The iBooster brake booster is made by Bosch; the TPMS and radar are from Continental; etc. They don't make their own steering racks or ABS pumps. This is a good thing: Tesla cannot be expert at all things.

They update them the same way any car does; it just happens that the programming computer is built into the car. Tesla has had OTA since 2012 with the first Model S. Do you think that car was all internal Tesla subsystem development and production?

There's no reason other manufacturers can't do OTA, they just haven't wanted to in the past.
 