Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

Should Tesla restrict Auto-Steer to only Freeways via software?

Should Tesla push a software update to restrict Auto-steer to Freeways?

  • Yes... it's still too risky if there's the potential for cross traffic.

  • Maybe... I don't feel strongly either way...

  • No... let drivers decide what level of risk they're willing to take.


Results are only viewable after voting.

nwdiver

My understanding of Mobileye is that it's currently not capable of identifying cross-traffic hazards. Not sure if it could be a software fix or if the hardware simply isn't there.

I firmly believe that on Interstates, where there is little or no cross traffic, driving with Autopilot enabled is likely far safer than driving manually. My fear with state highways and other side roads is that using Autopilot as it currently exists might actually be more hazardous than driving manually, due to increased complacency with regard to cross traffic.

There's nowhere near enough data to form any statistical opinions, but IMO it's crucial that self-driving cars quickly prove themselves not just safer but FAR safer than manual driving. If Autopilot isn't ready for cross traffic, then perhaps steps should be taken to reduce its exposure to those scenarios.
 
If it can't handle cross traffic, it should be freeway-only; that's pretty simple to me. Also, I don't think it should work at excessive speeds: if you want to drive way over the speed limit, you should be in full control. It could also sound an alarm if the driver is not paying attention.

I believe you cannot ever set Autopilot to anything above 90. Given that the highest posted speed limit in the US is 85, that seems reasonable.
 
I believe you cannot ever set Autopilot to anything above 90. Given that the highest posted speed limit in the US is 85, that seems reasonable.

This is correct.
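For illustration, the cap described above amounts to a simple clamp. The 90 mph figure comes from this thread; the function and constant names below are hypothetical, not Tesla's actual firmware logic:

```python
# Hypothetical sketch of the Autopilot set-speed ceiling discussed above.
# The 90 mph cap is from this thread; everything else is illustrative.

AUTOPILOT_MAX_SET_SPEED_MPH = 90

def clamp_autopilot_set_speed(requested_mph: float) -> float:
    """Return the set speed actually accepted, capped at the hard ceiling."""
    return min(requested_mph, AUTOPILOT_MAX_SET_SPEED_MPH)

print(clamp_autopilot_set_speed(95))  # capped to 90
print(clamp_autopilot_set_speed(70))  # accepted as-is
```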

But at any rate, stuff like being sliced in half by a semi trailer, driving over a cliff, or having a truck ahead drop a large object on the road can be unsafe for an inattentive driver at any speed, and is just as likely on a freeway.
 
The fundamental assumption I see posed in the OP's question is that reduced risk is inherently more valuable than freedom of choice. Sorry - I'm a philosopher by education (businessman and investor by trade - philosophers don't generally get to a point where they can buy Teslas).

Now - why is that? What if, for example, it was determined that self driving cars are in fact 25% LESS safe than human drivers - should they be banned? If so - why? Is there not some balancing point where the utility gained by millions of people getting to relax during their drive is equal to or outweighs the increased risk of death? We make this kind of calculated trade-off all the time in many other decisions.

Why do people seem to hold the bedrock belief that self driving cars should not be allowed unless they are not just equal to humans, but in fact far, far better?

Why do we have the instinct that for some reason a death caused by the error of a robot is worse than a death caused by the error of a human pilot?
 
Can we have a corresponding poll that asks the question whether car companies should restrict cars to certain speeds on certain roads? It's essentially the same question, just for a different feature common in all cars instead of just Tesla's.

Speed-related car accidents account for about 10,000 deaths per year in the US. Even though cars are capable of exceeding the speed limit, and this capability is frequently touted as a selling point, roadways don't safely support cars going that fast, and the use of the feature is left to the good judgement of the driver. Although most people don't speed irresponsibly, a good portion of the driving public does, to the danger of everyone around them. This feature could easily be software-limited.

Similarly with Autopilot and other autonomous driving technology, it's a feature that won't prevent all accidents and has its limitations that should be used with the good judgement of the driver.

The knee-jerk reaction to the Autopilot accident has been to discuss restricting the technology in some way, as if AP caused the accident. It's a reactionary response. The car was doing what it was supposed to do and encountered a scenario that exceeded its limitations, at which point the backup system, the driver, was expected to react. It may very well be that the accident was unavoidable, and that even if the driver had responded, or had been in control of the car without AP, he would have crashed. But jumping to the conclusion that this one fatality is evidence that autonomous driving is unsafe doesn't take into account the number of times AP has avoided an equally fatal accident. You can't measure something that didn't happen, and disabling the feature may actually be a step backward in safety overall.

Another good question to ask would be whether trucks should be required to have side rails on the beds, as in Europe, in order to prevent cars from driving under them. The reason this accident was fatal was that the truck lacked such a guard rail. Removing AP or upgrading it may help Tesla and its customers, but it won't prevent this type of accident from occurring to either Teslas or other cars. A guard rail under the trailer won't prevent the accidents either, but it could mitigate the fatalities that occur when one does.
 
My understanding of Mobileye is that it's currently not capable of identifying cross-traffic hazards. Not sure if it could be a software fix or if the hardware simply isn't there.
Currently Tesla does not have the hardware needed for detecting objects approaching from the side of the car. Since the hardware isn't present, the software isn't either.
 
In general, TACC enabled is safer for the driver than not enabled.

Lateral detection, per MobilEye, won't be available until 2018.

I use DriverAssist/DA (what most others call Autopilot or AP) at every possible opportunity. Partly because the exceptions, and the knowledge derived from them, are shared with the fleet, and partly because TACC has improved to the point that I'd rather have it than not. That said, I watch it like a hawk, and of course due to the nag our hands have to be on the wheel anyway (thanks, Belgian back seat driver). It could cause an accident every day if I let it.

Restricting it to freeways only would just delay improvement from fleet experience.

Per Elon, traffic light and stop sign reaction (not to be confused with recognition) is due this year. Ain't no traffic lights on the freeway, generally speaking.

People probably aren't supposed to use basic cruise control on city streets either. But basic cruise control doesn't learn. DA/AP does and will.

People will generally govern themselves per their own level of risk tolerance. A lot of people don't like automatic transmissions because they "want to be in control of the car." These people wouldn't use DA/AP even if it were free and came with a courtesy reacharound.

Other people love being beta testers. I say let the people choose. The only reason DA/AP is in the news is because it's new. Nobody asks if cars in an accident on the freeway had basic cruise control engaged. At least not at first.
 
Why do people seem to hold the bedrock belief that self driving cars should not be allowed unless they are not just equal to humans, but in fact far, far better?

Something else to chew on...

The fatality/miles driven rate varies drastically between states. It is 1.65 per 100M miles driven in South Carolina, but just 0.57 per 100M miles in Massachusetts. That means it is almost 3 times more deadly to drive amongst South Carolinians. Should we ban the Palmetto State? :)

* Fatality Facts
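The "almost 3 times" figure can be checked with a quick calculation from the rates quoted above:

```python
# Fatality rates per 100M vehicle miles driven, as quoted in the post above.
rate_sc = 1.65  # South Carolina
rate_ma = 0.57  # Massachusetts

ratio = rate_sc / rate_ma
print(f"Driving in SC is {ratio:.2f}x deadlier per mile than in MA")  # ~2.89x
```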
 
Can we have a corresponding poll that asks the question whether car companies should restrict cars to certain speeds on certain roads? It's essentially the same question, just for a different feature common in all cars instead of just Tesla's.

Speed-related car accidents account for about 10,000 deaths per year in the US. Even though cars are capable of exceeding the speed limit, and this capability is frequently touted as a selling point, roadways don't safely support cars going that fast, and the use of the feature is left to the good judgement of the driver. Although most people don't speed irresponsibly, a good portion of the driving public does, to the danger of everyone around them. This feature could easily be software-limited.

Similarly with Autopilot and other autonomous driving technology, it's a feature that won't prevent all accidents and has its limitations that should be used with the good judgement of the driver.

The knee-jerk reaction to the Autopilot accident has been to discuss restricting the technology in some way, as if AP caused the accident. It's a reactionary response. The car was doing what it was supposed to do and encountered a scenario that exceeded its limitations, at which point the backup system, the driver, was expected to react. It may very well be that the accident was unavoidable, and that even if the driver had responded, or had been in control of the car without AP, he would have crashed. But jumping to the conclusion that this one fatality is evidence that autonomous driving is unsafe doesn't take into account the number of times AP has avoided an equally fatal accident. You can't measure something that didn't happen, and disabling the feature may actually be a step backward in safety overall.

Another good question to ask would be whether trucks should be required to have side rails on the beds, as in Europe, in order to prevent cars from driving under them. The reason this accident was fatal was that the truck lacked such a guard rail. Removing AP or upgrading it may help Tesla and its customers, but it won't prevent this type of accident from occurring to either Teslas or other cars. A guard rail under the trailer won't prevent the accidents either, but it could mitigate the fatalities that occur when one does.
Just wanted to say...cool handle. Great movie!!!
 
What part of the word NO don't you understand? Although this was most likely operator error, technical malfunction must be ruled out. Drivers are expected to be attentive when enabling and using Tesla's advanced tech package features; to do otherwise is to invite catastrophe.
 
Speed limits are based on a number of factors. The highest posted in the US is 85, but the highest posted on a highway in Florida is 65, at which point 85 mph is excessive.

There are plenty of places where 70 MPH is the posted speed limit (PSL), and in general, people drive considerably faster than the PSL in Florida.

Looking at a map, two busy state routes, 41 and 27, pass through Williston, FL, where the accident was. I don't know which road it was on, but I'd bet one of those. The guy probably had no idea that he was even passing through a town and thought he was still in the middle of nowhere. Listening to the audio of a movie is probably pretty dumb in general; it's very distracting.

The truck driver also probably aggressively cut through traffic -- I have seen trucks drive very aggressively many times in Florida.
 
There are plenty of places where 70 MPH is the posted speed limit (PSL), and in general, people drive considerably faster than the PSL in Florida.

Looking at a map, two busy state routes, 41 and 27, pass through Williston, FL, where the accident was. I don't know which road it was on, but I'd bet one of those. The guy probably had no idea that he was even passing through a town and thought he was still in the middle of nowhere. Listening to the audio of a movie is probably pretty dumb in general; it's very distracting.

The truck driver also probably aggressively cut through traffic -- I have seen trucks drive very aggressively many times in Florida.
The problem with driving over the speed limit in that area is that it's on a slight hill, approaching an intersection. This is why I would assume the car was never in the truck's line of sight until it was too late. To say the truck driver probably made an aggressive cut through traffic is unreasonable, since there is nothing to support that.
 
Can we have a corresponding poll that asks the question whether car companies should restrict cars to certain speeds on certain roads? It's essentially the same question, just for a different feature common in all cars instead of just Tesla's.

It's not the same question for 2 reasons.

1 - There are laws governing speed. There are no 'Auto pilot on/off' laws or signs.

2 - Speeding might reduce your reaction time, but it won't potentially lull you into taking your attention off the road.