Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

Another tragic fatality with a semi in Florida. This time a Model 3

Status
Not open for further replies.
Well, you “solve” the problem by taking the product off the street until it is ready. Tesla doesn’t do that; they let us test it with our lives and the lives of those in other cars.
I would say if Autopilot reduces the number of accidents/injuries/fatalities then you should not take it off the market. Now, I am very suspicious of Tesla's simplistic statistics on this and I would love it if third party researchers got their hands on the data and did an independent analysis.
AP does not remove the responsibility for driving from the human.
No automaker has solved bad driving.
This is true but the NHTSA's mandate is to make the roads safe. If an automaker releases a feature that makes the roads less safe they have the authority to force changes.
 
This
Tesla driver gets license suspended after drunkenly falling asleep on Autopilot

And this
Tesla on Autopilot drove 7 miles with sleeping drunk driver, police say
And there are plenty more examples. These should not be possible, but Tesla doesn’t implement better controls, relying instead on a blurb in the owner’s manual stating that the driver must be in control.
If this is really your position, then we will have to ban cars, bikes, knives, pencils, etc. Your position essentially translates into: the only way you will stop 'stupid' people from hurting themselves is to ban everything.

Please tell me how many people killed themselves this year *without* AP by ignoring the road.
 
Well, you “solve” the problem by taking the product off the street until it is ready. Tesla doesn’t do that; they let us test it with our lives and the lives of those in other cars.


It's ready right now.

For use where there isn't cross-traffic like this to begin with, and with an attentive driver monitoring conditions and ready to take control at any time.

Which the owner's manual makes pretty clear.

If you wanna keep blaming AP for user error, knock yourself out, but it continues to be a completely crap argument based on what Tesla states the product actually is/does and where it should be used.
 
This
Tesla driver gets license suspended after drunkenly falling asleep on Autopilot

And this
Tesla on Autopilot drove 7 miles with sleeping drunk driver, police say
And there are plenty more examples. These should not be possible, but Tesla doesn’t implement better controls, relying instead on a blurb in the owner’s manual stating that the driver must be in control.

If their hand was on the wheel or in the spokes, then the system operated as designed.
Even if not, the issue is with the attention verification system. If it had been any other car, the headlines would have been 'head on collision' or 'car in ditch', possibly with multiple injuries or fatalities.
 
If this is really your position, then we will have to ban cars, bikes, knives, pencils, etc. Your position essentially translates into: the only way you will stop 'stupid' people from hurting themselves is to ban everything.

Please tell me how many people killed themselves this year *without* AP by ignoring the road.
Nobody said ban. What I said is that better controls need to be implemented. Having a hand on the wheel with my eyes closed does not work; the vehicle needs a better monitoring system. Other vehicles do it, even vehicles that don't have a self-driving (or assisted-driving, whatever you want to call it) system at all. Sure, stupid is as stupid does, but I don't have to hand the guy the scissors before he runs off with them. It's not an on-or-off situation; it could be better. That's all I am saying. That, and I don't like that Tesla is content with the end user performing all the testing on public roads. If the system worked, it wouldn't inadvertently drive into trucks or steer into center dividers. These are the small issues that need to be worked out before the general public is ready to use it. The system gives too much of a false sense of security. Maybe not to those of us who realize the limitations, but we are the minority.
 
If their hand was on the wheel or in the spokes, then the system operated as designed.
Even if not, the issue is with the attention verification system. If it had been any other car, the headlines would have been 'head on collision' or 'car in ditch', possibly with multiple injuries or fatalities.
You're right, and this perfectly illustrates my point about how the general public views Autopilot: Tesla calls it Autopilot and Full Self-Driving, and it is neither.
 
This
Tesla driver gets license suspended after drunkenly falling asleep on Autopilot

And this
Tesla on Autopilot drove 7 miles with sleeping drunk driver, police say
And there are plenty more examples. These should not be possible, but Tesla doesn’t implement better controls, relying instead on a blurb in the owner’s manual stating that the driver must be in control.
Would it have been better for him to pass out without AP running while driving?? I'd argue that AP may well have saved lives in these cases. It IMPROVED safety. Your examples aren't holding a lot of water here.
 
Would it have been better for him to pass out without AP running while driving?? I'd argue that AP may well have saved lives in these cases. It IMPROVED safety. Your examples aren't holding a lot of water here.
Oh yeah, thank goodness the drunk put on his Autopilot. We should all be grateful. What would have been even better is if the system had recognized that the driver wasn't being attentive because his eyes were closed, given a warning to alert him, and, when that didn't work, pulled over, shut off, and put on the hazards.
 
It's ready right now.

For use where there isn't cross-traffic like this to begin with, and with an attentive driver monitoring conditions and ready to take control at any time.

Which the owner's manual makes pretty clear.

If you wanna keep blaming AP for user error, knock yourself out, but it continues to be a completely crap argument based on what Tesla states the product actually is/does and where it should be used.
An owner's manual which most people don't read. How many times have I seen someone on this forum told to go read the owner's manual when they asked a simple question? That doesn't make it right, but it shouldn't be the fallback answer for such an important aspect of the system. Tesla could also be more proactive about monitoring whether a driver is being attentive. Nissan just released details of their self-driving system, which uses a camera to monitor driver attentiveness. Now, in my opinion the rest of their system is behind Tesla's, but this is a good idea that you'd think a company with such a reported focus on safety would use, instead of the stupid torque-sensing system that I can fool with an orange.
 
Wait a second there. You are talking about one of my areas of great expertise over many decades. RADAR (all caps, by the way) does see stationary objects very well if the object is made of a material that reflects the pulse; in other words, most things (but not all) on the roadway. The problem is how the RADAR data is processed. Hence software.

Exactly. RADAR has exactly the same problem that LIDAR does in terms of removing ground clutter. Is that a bridge that you're about to go under or a trailer that you're about to go under? Does the road slope down just ahead or does the road turn right before a sheer cliff? Is that a bump in the road or a body? And so on.

The only way to be sure is with vision, and maybe not even then. :)

The problem is that RADAR has low resolution; if it didn't, there would be no reason to invent LIDAR.

That doesn't really help much. I mean yes, it can help a little bit, but ultimately the purpose of either RADAR or LIDAR is to identify additional areas of concern for the vision system to look into, so resolution isn't particularly important unless you're trying to use it to avoid running over small animals in the road or something. For the most part, interesting obstacles are not small, and small obstacles are not interesting. :D
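To make the processing problem concrete: automotive radars report range and a Doppler closing speed, and a common simplification is to reject any return whose closing speed matches the ego speed, because such returns are stationary in the world frame (overpasses, signs, ground clutter). A car stopped dead at a light produces exactly the same signature. The sketch below is purely illustrative; all names and thresholds are invented, and this is not a description of Tesla's actual pipeline.

```python
from dataclasses import dataclass

@dataclass
class RadarReturn:
    range_m: float       # distance to the reflector, metres
    closing_mps: float   # closing speed from Doppler, m/s

def filter_clutter(returns, ego_speed_mps, tol_mps=0.5):
    # Naive clutter rejection: drop any return whose closing speed
    # matches the ego speed, i.e. anything stationary in the world
    # frame. This removes overpasses and road signs, but it also
    # removes a car stopped dead at a light.
    return [r for r in returns
            if abs(r.closing_mps - ego_speed_mps) > tol_mps]

ego = 25.0  # roughly 55 mph, in m/s
returns = [
    RadarReturn(range_m=120.0, closing_mps=25.0),  # overpass: stationary, rejected
    RadarReturn(range_m=80.0, closing_mps=25.0),   # stopped car: also rejected
    RadarReturn(range_m=60.0, closing_mps=5.0),    # slower moving car: kept
]
kept = filter_clutter(returns, ego)
```

With only range and Doppler to go on, the filter cannot tell the stopped car from the overpass; that disambiguation has to come from somewhere else, such as vision.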
 
Exactly. RADAR has exactly the same problem that LIDAR does in terms of removing ground clutter. Is that a bridge that you're about to go under or a trailer that you're about to go under? Does the road slope down just ahead or does the road turn right before a sheer cliff? Is that a bump in the road or a body? And so on.

The only way to be sure is with vision, and maybe not even then. :)



That doesn't really help much. I mean yes, it can help a little bit, but ultimately the purpose of either RADAR or LIDAR is to identify additional areas of concern for the vision system to look into, so resolution isn't particularly important unless you're trying to use it to avoid running over small animals in the road or something. For the most part, interesting obstacles are not small, and small obstacles are not interesting. :D
I would be curious to see an overpass that would cause a false positive with LIDAR, or a semi trailer that would be invisible to it.
 
A good viable solution would be a camera pointed at the driver, with software constantly analyzing where he's looking. No eyes on the road? Send alarms and slow down. Other manufacturers are already using this; why is it so hard to implement? Is it because of the silly expectation of privacy, or some real reason?

The real reason is to save money, a decision taken by Musk at the design stage, overruling his engineering team, who wanted to implement a proper IR face-tracking system - Tesla rejected more advanced driver monitoring features on its cars

Even in the M3, the internal camera is pretty unsuited to the task of driver-attentiveness monitoring: it is off-axis, pointing straight down the car's centre-line, almost certainly not infra-red, and it will have a fisheye lens for general surveillance of RoboTaxi passengers.
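The escalation logic posters keep asking for is simple to state. Here is a hypothetical sketch of a gaze-based attention monitor of the kind other manufacturers reportedly ship: warn, then alarm, then pull over. The function name and every threshold are invented for illustration and do not reflect any production system.

```python
def attention_action(eyes_off_road_s: float) -> str:
    # Escalate as continuous eyes-off-road time grows.
    # Thresholds are illustrative guesses, not from any real system.
    if eyes_off_road_s < 2.0:
        return "ok"
    if eyes_off_road_s < 5.0:
        return "visual_warning"
    if eyes_off_road_s < 8.0:
        return "audible_alarm"
    # Last resort: slow down, pull over, hazards on.
    return "pull_over_with_hazards"
```

The hard part is not this state machine but reliably measuring eyes-off-road time in the first place, which is where camera placement and IR illumination matter.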
 
The real reason is to save money, a decision taken by Musk at the design stage, overruling his engineering team, who wanted to implement a proper IR face-tracking system - Tesla rejected more advanced driver monitoring features on its cars

Even in the M3, the internal camera is pretty unsuited to the task of driver-attentiveness monitoring: it is off-axis, pointing straight down the car's centre-line, almost certainly not infra-red, and it will have a fisheye lens for general surveillance of RoboTaxi passengers.

There are going to be flaws and weaknesses in every system we come up with to deal with bad drivers. How good is the IR face-tracking system? I doubt it's 100% accurate or free of flaws. People can put on sunglasses and close their eyes for 10 seconds, and I doubt the IR face tracking can tell you are sleeping within the first second. Stupid drivers will find ways to cheat the system no matter how much safety Tesla tries to put into their cars. Look at how many people have died from not wearing, or misusing, their seatbelts. I have friends who ride in the back seat of my Tesla, and I have to remind them to put on their seatbelts every time. Next thing I know, I look back and see one guy with the seatbelt behind his back to bypass the warning, because he doesn't like being strapped in.
 
There are going to be flaws and weaknesses in every system we come up with to deal with bad drivers. How good is the IR face-tracking system? I doubt it's 100% accurate or free of flaws. People can put on sunglasses and close their eyes for 10 seconds, and I doubt the IR face tracking can tell you are sleeping within the first second. Stupid drivers will find ways to cheat the system no matter how much safety Tesla tries to put into their cars. Look at how many people have died from not wearing, or misusing, their seatbelts. I have friends who ride in the back seat of my Tesla, and I have to remind them to put on their seatbelts every time. Next thing I know, I look back and see one guy with the seatbelt behind his back to bypass the warning, because he doesn't like being strapped in.

Sure, but if it were 95% effective compared to the current solution at let's say 5%, it should cut out about 90% of the AP crashes people get themselves into through inattention.
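As a sanity check on that arithmetic: if crashes caused by inattention scale with the fraction of attention lapses the monitor fails to catch, going from 5% effective to 95% effective cuts those crashes by roughly 95%, so "about 90%" is, if anything, conservative. The linear-scaling assumption is a simplification; the function below just works through the numbers.

```python
def crash_reduction(eff_new: float, eff_old: float) -> float:
    # Assume inattention crashes are proportional to the monitor's
    # miss rate (the fraction of lapses it fails to catch).
    miss_old = 1.0 - eff_old   # 0.95 with a 5%-effective monitor
    miss_new = 1.0 - eff_new   # 0.05 with a 95%-effective monitor
    return 1.0 - miss_new / miss_old

reduction = crash_reduction(0.95, 0.05)  # about 0.947
```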
 
Sure, but if it were 95% effective compared to the current solution at let's say 5%, it should cut out about 90% of the AP crashes people get themselves into through inattention.


See diminishing returns.

There's been what, like 3 fatalities, ever, on AP? (and all but 1 of those on roads AP is explicitly not intended to be used on)

I know "every life is precious" and all, but if it costs 100 million dollars for a company to take the "works better even when driver is a complete idiot" level from 99.99% to 99.999 that's probably not a worthwhile investment of funds compared to spending it on "works better when driver is NOT being a complete idiot and adding 9 to that number instead.
 
Sure, but if it were 95% effective compared to the current solution at let's say 5%, it should cut out about 90% of the AP crashes people get themselves into through inattention.
What makes you think the current solution is only 5%?
Even if it is, the pending software release appears to dramatically increase the functionality. I'm sure I recall Tesla saying that the current software was not optimized for the new NN hardware, so the results will sometimes be worse than with the old hardware.
There have been two, or maybe three, Tesla Autopilot deaths involving submarining under trailers, compared to over 200 such deaths per year in total (which never seems to get mentioned). It's not at all certain that those two or three deaths wouldn't have occurred without Autopilot. The other data point we don't know is how many of these types of accidents were prevented by Autopilot. Maybe none, maybe twenty; there is just no way to tell.
 
Remember the case where you approach a stop light with a vehicle already stopped at the line. The radar never sees it moving, and the Tesla would just plow into it.

Nope. I am not sure I ever saw anything like that in a Tesla.

Teslas will still plow into cars stopped at a stop light today in some instances while on Autopilot, especially if there is any sort of bend in the road approaching the light.

Yep, can confirm.
Not sure how Tesla will fix this - maybe the FSD computer can respond more quickly

Here's a good example from 55 mph, with a cut-out to a stationary vehicle at lights


What makes you think the current solution is only 5%?

Because it can easily be fooled by fruit or a small water bottle jammed in the spokes, or by the driver holding one hand on the wheel while attending 100% to email on their phone with the other.

i.e. the engineering assumption that "torque detected on wheel = driver's attention on road ahead" is fundamentally unsound.

Even if it is, the pending software release appears to dramatically increase the functionality.

I sure hope so, but I suspect the improvements will only be seen on HW3/FSD!
 
Because it can easily be fooled by fruit or a small water bottle jammed in the spokes, or by the driver holding one hand on the wheel while attending 100% to email on their phone with the other.

i.e. the engineering assumption that "torque detected on wheel = driver's attention on road ahead" is fundamentally unsound.
That is a silly argument. There are always going to be idiots who subvert the system because "they know better". People, particularly back-seat passengers, buckle the seatbelt and then sit down on top of it because they don't want to use it, but no one blames seatbelts for that kind of behaviour.
 