Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

Short-Term TSLA Price Movements - 2016

Status
Not open for further replies.
I agree largely with both your statements regarding AEB. I doubt the Tesla would have been able to stop completely in time given the rate of closure.

Does anyone with AP recall these:

1) What is the shortest distance at which AEB activates in front of the car?

2) It's ALWAYS on, right?

3) With Tesla's TACC, the "follow distance" is settable, but I don't think AEB is adjustable.

Clearly the radar just saw the empty space under the trailer (~4 ft of clearance) and ignored the 'overhead' area, as Elon indicated. I wonder what minimum height or angle above the radar is enough to trigger AEB.

It seems the car needs to account for going uphill when there is a bridge or overhead sign ahead. The radar scan height is therefore apparently less than 4 ft off the ground.

Just curious how far ahead it looks.


Thanks
The big issue is that the beam spreads. For a given max detection height you have to limit either beam spread or detection distance. A simple view is that if the beam is 4' wide at 200' then it will be 8' wide at 400' in front of the car. At some point you have to limit any return from a greater distance. Similarly, stationary objects, including the road surface, generate a lot of returns. You can figure the Doppler shift for stationary objects and remove that clutter. The issue becomes "what about a wall?" or perhaps "what about a stopped vehicle?" so you have to still allow exceptionally strong stationary returns from a given distance i.e. say allow a strong stationary return at 700'. This is what makes stationary cars at distance or truck trailers where most of the signal goes under the truck so tough. As I've mentioned before scanning radar or LIDAR makes handling these issues much easier.
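The geometry and clutter-filtering argument above can be sketched numerically. This is a rough illustration only, not Tesla's actual signal processing: the half-angle is chosen solely to reproduce the 4 ft at 200 ft example, and the Doppler filter is a toy version of stationary-clutter rejection.

```python
import math

def beam_width_ft(range_ft, half_angle_deg):
    # A conical beam's width grows linearly with range:
    # double the range, double the width.
    return 2.0 * range_ft * math.tan(math.radians(half_angle_deg))

# Half-angle chosen only so the beam is ~4 ft wide at 200 ft,
# matching the example above -- not a published Tesla spec.
HALF_ANGLE_DEG = math.degrees(math.atan(2.0 / 200.0))

def is_stationary_clutter(closing_speed_mph, own_speed_mph, tol_mph=2.0):
    # A stationary object (road surface, bridge, sign) closes at exactly
    # the car's own speed, so returns whose Doppler-derived closing speed
    # matches own speed can be rejected as clutter -- which is also why a
    # stopped car looks just like a bridge to this filter.
    return abs(closing_speed_mph - own_speed_mph) <= tol_mph

print(round(beam_width_ft(200, HALF_ANGLE_DEG), 1))  # 4.0
print(round(beam_width_ft(400, HALF_ANGLE_DEG), 1))  # 8.0
print(is_stationary_clutter(60.0, 60.0))             # True  (overhead sign)
print(is_stationary_clutter(15.0, 60.0))             # False (slower vehicle)
```

The last two lines show the dilemma the post describes: the filter that removes the road surface also removes a stopped vehicle, so some exceptionally strong stationary returns still have to be allowed through.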
 
I wouldn't be surprised to see an increase in nagware. Tesla has been great about having less than most. You'll go through 10 pages of legal stuff before driving your car and every minute have the car asking if you are really sure you know what you are doing and if you still promise to be a safe driver. Yuck!!!

Yes - more nagware. Then we can have 15 new threads here on how to defeat the new nagware.
 
  • Funny
Reactions: GoTslaGo
I think that is a bad example.

That accident was prevented by automatic braking, a feature available on many cars, including some Teslas that don't have Autopilot. The car's ability to "drive itself" had no impact on the outcome of that video, as a large percentage of cars that don't have any type of AP would have acted the same.

Also, I don't think the speed was high enough for it to be near fatal, and it seems likely a driver would have been able to avoid, or at least mitigate, the damage from that accident easily.

I don't think you would say these words if you were more familiar with the near-accident in the video. The Tesla driver was on an Uber drive in the Seattle area, was momentarily distracted, and the Tesla braked heavily from more than 40 mph to prevent a potentially-serious accident.

You point to this video as an example of autobraking, but autobraking was being used at that moment along with lane-keeping, as the Tesla was in fact in Autopilot mode. Are you aware that Tesla's Autopilot has some latitude to shift left or right within its lane, but lane-keeping is not allowed to depart the lane unless the driver puts on the blinker and the Tesla determines that a lane change can be done safely? Are you aware that stopping was the only solution allowed to the Tesla Autopilot in this case? Since the Tesla driver was not aware in time of the vehicle ahead that went broadside to it, the speed-control portion of the Autopilot system did indeed prevent a potentially fatal accident.

Would the autobraking function of some manufacturers' vehicles have performed less capably than the Tesla's? Almost certainly yes, because of both the braking performance of the Tesla (extremely adept skid protection under braking) and the speed with which it detects that a threat exists.

You say that the autopilot had no bearing on the ability of the Tesla in the video to stop, but that is incorrect too. When the Tesla's lane-keeping feature of Autopilot is active, the Tesla assumes that the driver will not deviate from the lane, and the Tesla is more prepared to stop than if lane-keeping is off and the driver is just using Traffic-Aware Cruise Control. I point to the minor accident in which a Tesla with Autopilot off but cruise control on bumped a vehicle that was extending partway into the lane. TACC was following the car ahead, which successfully entered the adjoining lane and went around the stopped vehicle. That successful circumnavigation of the obstacle influenced the Tesla's timing on when to hit the brakes. With Autopilot on, the decision to brake would almost certainly have been made farther out, and would likely have prevented the minor accident, because the rule of "you must remain in this lane" would have removed any ambiguity about the ability to pass left or right of the obstacle. My point is that operating in Autopilot mode does indeed affect when the decision is made to brake sharply and to treat an obstacle ahead as one which must be stopped for rather than navigated around.
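A rough kinematics sketch shows why the timing of that braking decision matters so much. The 0.9 g deceleration is an assumed hard-braking figure for a modern car on dry pavement, not a measured Tesla number:

```python
FT_PER_S_PER_MPH = 1.46667  # 1 mph = 1.46667 ft/s

def stopping_distance_ft(speed_mph, decel_g=0.9):
    # d = v^2 / (2a): braking distance grows with the square of speed.
    v = speed_mph * FT_PER_S_PER_MPH
    a = decel_g * 32.174  # ft/s^2
    return v * v / (2.0 * a)

def delayed_decision_ft(speed_mph, delay_s):
    # Extra distance covered before the brakes come on if the
    # "must stop" decision arrives delay_s seconds later.
    return speed_mph * FT_PER_S_PER_MPH * delay_s

print(round(stopping_distance_ft(40)))      # ~59 ft from 40 mph
print(round(delayed_decision_ft(40, 0.5)))  # ~29 ft lost to a 0.5 s delay
```

Half a second of hesitation while the system waits to see whether the obstacle can be passed costs roughly half the total stopping distance at 40 mph, which is the point being made above about braking "further out."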
 
  • Informative
Reactions: MitchJi and EinSV
I don't own a Tesla yet (M3 reservation holder) but have experienced Autopilot for a few miles. What I see happening here is that the driver potentially didn't see the trailer during the few seconds in which he should have acted; acting then would have resulted in no accident, just a honk at the trailer.

My question for the group is whether the current Autopilot system makes users pay less attention to the road, especially on highways, by letting them assume that the Tesla handles most scenarios. Is it possible that drivers start trusting the system so much that they feel safer on highways and that constant attention is no longer required? If this is possible even for a small number of users, I would recommend Tesla add an alert every few seconds, or require some acknowledgement from the user that they are paying attention.

For any other car, the exact same issue happens all the time. So I don't believe Autopilot is the reason for this accident, but Tesla may need to add bells and whistles to keep user attention on the road if Autopilot contributes to users not paying attention.

That is exactly the problem with Autopilot! It gives the driver so much confidence in the car that he/she loses his/her focus on the road more easily. Why wouldn't you respond to that text that you just got? You know that the car will keep driving!

I don't think that more alerts are the way to go; they will only annoy the driver and other passengers, and people will try to silence them. I think the best option is a system that can check whether you're looking at the road ahead or not: if you're looking at things other than the road for more than about ten seconds, the car will notify you, and you will have to touch the wheel or something so that it knows you're focused and ready to take over if necessary.
 
The big issue is that the beam spreads. For a given max detection height you have to limit either beam spread or detection distance. A simple view is that if the beam is 4' wide at 200' then it will be 8' wide at 400' in front of the car. At some point you have to limit any return from a greater distance. Similarly, stationary objects, including the road surface, generate a lot of returns. You can figure the Doppler shift for stationary objects and remove that clutter. The issue becomes "what about a wall?" or perhaps "what about a stopped vehicle?" so you have to still allow exceptionally strong stationary returns from a given distance i.e. say allow a strong stationary return at 700'. This is what makes stationary cars at distance or truck trailers where most of the signal goes under the truck so tough. As I've mentioned before scanning radar or LIDAR makes handling these issues much easier.

Yep. I understand beam spread. Good points. This is why I'm curious what max distance is used by Tesla's automatic braking. Unfortunately, it doesn't seem to be published.

Also, I noticed that the Tesla FIRMWARE notes indicate that Automatic Emergency Braking (AEB) isn't active above 85 MPH.
 
  • Informative
Reactions: neroden
Yep. I understand beam spread. Good points. This is why I'm curious what max distance is used by Tesla's automatic braking. Unfortunately, it doesn't seem to be published.

Also, I noticed that the Tesla FIRMWARE notes indicate that Automatic Emergency Braking (AEB) isn't active above 85 MPH.

From the owner's manual - forward collision warning supposedly "sees" 525' ahead:

[Attachment: screenshot of the owner's manual, forward collision warning section]
 
That is exactly the problem with Autopilot! It gives the driver so much confidence in the car that he/she loses his/her focus on the road more easily. Why wouldn't you respond to that text that you just got? You know that the car will keep driving!

I don't think that more alerts are the way to go; they will only annoy the driver and other passengers, and people will try to silence them. I think the best option is a system that can check whether you're looking at the road ahead or not: if you're looking at things other than the road for more than about ten seconds, the car will notify you, and you will have to touch the wheel or something so that it knows you're focused and ready to take over if necessary.

That's the whole advantage of AP. By freeing the driver from the robot stuff (lane keeping and distance keeping) the driver is free to oversee the road conditions and scan for unusual problems. In this case (no disrespect for the departed intended) if the driver was misusing it by watching a video (alleged) then it isn't the fault of the A/P.
 
  • Like
Reactions: GoTslaGo and hobbes
Automatic Emergency Braking (AEB) refers to automatic braking to prevent or reduce the severity of an accident, and it works even when a Tesla is not using Autopilot. Note that while AEB may be disabled above 85 mph, this does not mean that Traffic-Aware Cruise Control (TACC) and Autopilot with TACC abandon efforts to protect the driver above 85 mph. It only means that this particular safety system, which also operates outside of Autopilot and TACC, does not work when the speed is above 85 mph.
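The distinction above can be summarized as a simple speed gate. This is an illustrative sketch only, not Tesla's code; the 85 mph figure comes from the firmware notes quoted earlier in the thread:

```python
AEB_MAX_MPH = 85.0  # threshold from the firmware notes quoted above

def aeb_active(speed_mph):
    # Illustrative gate: AEB (which also runs outside Autopilot/TACC)
    # is reported inactive above 85 mph. Autopilot/TACC braking is a
    # separate system and is not switched off by this check.
    return speed_mph <= AEB_MAX_MPH

print(aeb_active(70.0))  # True
print(aeb_active(90.0))  # False
```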
 
  • Informative
  • Like
Reactions: neroden and FredTMC
That's the whole advantage of AP. By freeing the driver from the robot stuff (lane keeping and distance keeping) the driver is free to oversee the road conditions and scan for unusual problems. In this case (no disrespect for the departed intended) if the driver was misusing it by watching a video (alleged) then it isn't the fault of the A/P.

AustinEV, well said! You make the same point I make regularly at this sticky autopilot thread:
A flight instructor teaches Tesla Autopilot
 
From the owner's manual - forward collision warning supposedly "sees" 525' ahead:

[Attachment: screenshot of the owner's manual, forward collision warning section]

Thanks for posting! Seems that FCW looks out up to 525'.

A subsequent page goes on to list a number of warnings, including impairment due to "bright sunlight". See the attached screenshot.

Also, it seems that Automatic Emergency Braking (AEB) operates independently from FCW.
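Given the 525 ft figure from the manual, some back-of-the-envelope arithmetic (illustrative only) shows how much warning time that detection range buys against a stationary obstacle, where the closing speed equals the car's own speed:

```python
def warning_time_s(closing_speed_mph, range_ft=525.0):
    # Seconds of warning the quoted 525 ft FCW range provides at a
    # given closing speed (1 mph = 1.46667 ft/s).
    return range_ft / (closing_speed_mph * 1.46667)

print(round(warning_time_s(65), 1))  # 5.5 s at 65 mph
print(round(warning_time_s(85), 1))  # 4.2 s at 85 mph
```

Even at the 85 mph AEB cutoff, 525 ft corresponds to roughly four seconds against a stopped obstacle, assuming detection actually occurs at the full quoted range.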
 

Attachments

  • image.png (screenshot of owner's manual warnings)
That's the whole advantage of AP. By freeing the driver from the robot stuff (lane keeping and distance keeping) the driver is free to oversee the road conditions and scan for unusual problems. In this case (no disrespect for the departed intended) if the driver was misusing it by watching a video (alleged) then it isn't the fault of the A/P.

I do understand that that's the point of AP, but don't you think you will become less alert when you've driven thousands and thousands of miles without having to interfere a single time? Yes, AP frees the driver from all the repetitive tasks so the driver can look ahead and have a better overview of the whole situation, but it's hard to keep that attention after you've been in the car for hundreds of hours without having to interfere once!

And no, that still doesn't mean you should watch a movie in the car, but even answering a text, looking up what the weather will be like when you arrive, or looking at someone during a conversation can cause the same fatal result. The problem is that in a 'normal car' you have to look out the window and be attentive, because otherwise you will drive off the highway in seconds, but AP is so good that 99.999% of the time you can look somewhere else and still be safe. This doesn't mean I'm not a fan of AP (I'm a huge fan!), and AP has already proven to make your car safer, but this is certainly an issue that Tesla and other automakers need to address in the near future as they build systems that drive almost(!) autonomously but still require the attention of the driver.
 
Last edited:
I do understand that that's the point of AP, but don't you think you will become less alert when you've driven thousands and thousands of miles without having to interfere a single time? Yes, AP frees the driver from all the repetitive tasks so the driver can look ahead and have a better overview of the whole situation, but it's hard to keep that attention after you've been in the car for hundreds of hours without having to interfere once!

And no, that still doesn't mean you should watch a movie in the car, but even answering a text, looking up what the weather will be like when you arrive, or looking at someone while having a conversation can cause the same fatal result. The problem is that in a 'normal car' you have to look out the window and be attentive, because otherwise you will drive off the highway in seconds, but AP is so good that 99.999% of the time you can look somewhere else and still be safe. This doesn't mean I'm not a fan of AP (I'm a huge fan!), and AP has already proven to make your car safer, but this is certainly an issue that Tesla and other automakers need to address in the near future as they build systems that drive almost(!) autonomously but still require the attention of the driver.

VanE, you bring up a point that is one of my major focuses in another thread. The current version of Tesla autopilot is certainly not foolproof. Poor highway markings, the sun's glare, and construction zones are just three of the situations that can render the current version of autopilot unreliable. Someone who treats the current beta software and hardware version 1 as being fully-autonomous driving is indeed looking for trouble because of all the imperfections that still exist, and the poor fellow who perished using autopilot discovered a new (but not the only) imperfection of the current system. What I urge other Tesla drivers to do is to allow the reduction in energy needed to maintain speed and stay in your lane to allow you to become better-connected with what is happening around you, not less-connected. You really do have more time to check your mirrors now and to scan for trouble well ahead of you than you ever did. So, the choice is up to the individual driver: do you use the reduced workload of cruise control and lane-keeping to become more connected with your surroundings or less-connected? I have been strongly advocating the former.

You should add that link to your signature IMO.

Mitch, I can only have one link in my signature, but I've done as well as I can to run with your good idea.
 
I wouldn't be surprised to see an increase in nagware. Tesla has been great about having less than most. You'll go through 10 pages of legal stuff before driving your car and every minute have the car asking if you are really sure you know what you are doing and if you still promise to be a safe driver. Yuck!!!

Nagware would certainly be appropriate if the car's speed is significantly above the safe speed for the conditions of travel, e.g., faster than other traffic, or when approaching a cross street without traffic controls.
 
Here's the full text of a statement released by his family yesterday

Statement of Joshua Brown's Family - Canton Man Killed in Truck/Tesla Crash


CANTON, Ohio, July 01, 2016 (GLOBE NEWSWIRE) -- On May 7, 2016, Joshua Brown (40) of Canton, Ohio, was killed in a motor vehicle crash in Williston, Florida, caused by a semi tractor-trailer which crossed a divided highway and caused the fatal collision with Josh's Tesla. Josh was a master Explosive Ordnance Disposal (EOD) technician in the US Navy, an exceptional citizen, and a successful entrepreneur. He was a proud member of the Navy's elite Naval Special Warfare Development Group (NSWDG). Most importantly, he was a loving son and brother.

There has been considerable media speculation since Tesla provided the National Highway Transportation Safety Administration (NHTSA) with data indicating that the Tesla autopilot system was activated at the time of the crash. While the public's fascination with this new technology is understandable, the grief which Josh's family continues to endure is personal and private. Accordingly, the Brown family requests that all communications on this matter be directed to their attorneys Jack Landskroner and Paul Grieco of Landskroner Grieco Merriman.

The investigation into the cause of this crash is ongoing. In honor of Josh's life and passion for technological advancement, the Brown family is committed to cooperating in these efforts and hopes that information learned from this tragedy will trigger further innovation which enhances the safety of everyone on the roadways.



Contacts:
Landskroner Grieco Merriman, LLC
1360 W. 9th Street, Suite 200
Cleveland, Ohio 44113
Cleveland OH Personal Injury Lawyer : Landskroner Grieco Merriman
 