
Model X Crash on US-101 (Mountain View, CA)

...
Are you really saying that in heavy rain/sudden fog you're safer when you have autosteer activated (i.e., steering for you) than when you do the driving yourself?

Some situations I've encountered where I realized Autopilot is much better than I am:

1) Very bad lane markings due to construction: I looked at the markings and was confused (because of old scars/marks), but Autopilot was not, as it followed the new ones.

2) Medium to light rain at night: Because of water reflections and angles, I couldn't clearly see some lane markings, but Autopilot has been doing just fine.

I haven't experienced dense fog, but in light fog I still use Autopilot, though I do slow down so I can see what's ahead.

Without Autopilot, I used to have to suddenly slow down my car to figure out what I was seeing.
 
I also just love how other folks hold up lidar and other modalities from other automakers as the untested standard... they have no meaningful assets on the road, and when they try, they wreck... Tesla has thousands of cars on the road using Autopilot right now as I type... GM, on the other hand, will be showing you their Super Cruise on eligible highways once the dealership opens in the morning.

GM Super Cruise is not similar in concept to Autopilot. When you can lose billions of dollars by having a gas tank that rocket engines can ignite, you think from a different perspective. Sure, NBC had to drill more holes in the tank, and it took three tries, but those are the rules of the game. The first press demo of Super Cruise was in 2012; they finally released it this year. Even NBC can't blow it up, apparently.

Because of how it operates, it cannot hit gore points. I think we will find it also lacks the ability to hit stopped vehicles at high speed, or a vehicle crossing the road. Its sensors operate differently from those of other companies; it even adds long-range thermal imaging into the mix.

Super Cruise can only go on 'blue lines', which are pre-mapped centers of traffic lanes. With no pre-defined blue line, it cannot operate. It does not calculate where the road is; it knows where the road is, and the sensors check to see whether it's safe to travel.
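To make that "blue line" idea concrete, here is a minimal sketch (my own illustration in Python, not GM code; the names, corridor width, and structure are all assumptions) of engagement gated on a pre-mapped centerline rather than on lane lines detected in real time:

```python
# Illustrative sketch only -- not GM's implementation. It models the "blue
# line" concept described above: no pre-mapped lane centerline, no engagement,
# no matter what the cameras and radar think they see.

from dataclasses import dataclass
from typing import Optional

@dataclass
class MappedLane:
    """A pre-surveyed lane centerline ('blue line') with a validity corridor."""
    centerline: list          # list of (x, y) points in meters
    corridor_m: float         # how far off-center still counts as "on the map"

def distance_to_centerline(position, lane):
    """Nearest distance from the car to any point on the mapped centerline."""
    px, py = position
    return min(((px - x) ** 2 + (py - y) ** 2) ** 0.5 for x, y in lane.centerline)

def may_engage(position, mapped_lane: Optional[MappedLane], sensors_clear: bool) -> bool:
    # No pre-defined blue line -> the system simply refuses to operate.
    if mapped_lane is None:
        return False
    # The map says where the road is; the sensors only confirm it is safe to travel.
    on_map = distance_to_centerline(position, mapped_lane) <= mapped_lane.corridor_m
    return on_map and sensors_clear
```

The point of the sketch is just the ordering: the map decides whether the system may run at all, and the sensors only gate on safety.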

But it will never be able to do this:


Mechanically, it could, but it won't allow it since it's a safety system.

It will be interesting to see why they have been putting V2V (car-to-car data stream) hardware into their cars; Cadillac has not fully explained what the goal is yet.

Of course, the main event will be seeing whether GM actually releases a car with no steering wheel for sale next year.
 
What I said was that I am safer, whether or not it is activated. I have more data available to me about cars in the vicinity.

Ok... More data is great... But that occurs without activating autosteer. Wouldn't it be safer to keep seeing the extra information, but not turn over steering to the car by activating autosteer?

Some situations I've encountered where I realized Autopilot is much better than I am:

1) Very bad lane markings due to construction: I looked at the markings and was confused (because of old scars/marks), but Autopilot was not, as it followed the new ones.

2) Medium to light rain at night: Because of water reflections and angles, I couldn't clearly see some lane markings, but Autopilot has been doing just fine.

* * *

Without Autopilot, I used to have to suddenly slow down my car to figure out what I was seeing.

These both seem like situations where you would be better off slowing down, rather than relying on autosteer to let you stay at high speed. I don't see how you can act as a check on AP's decisions if the car is going faster than you would feel comfortable driving.
 
Some situations I've encountered where I realized Autopilot is much better than I am:

1) Very bad lane markings due to construction: I looked at the markings and was confused (because of old scars/marks), but Autopilot was not, as it followed the new ones.

2) Medium to light rain at night: Because of water reflections and angles, I couldn't clearly see some lane markings, but Autopilot has been doing just fine.

I haven't experienced dense fog, but in light fog I still use Autopilot, though I do slow down so I can see what's ahead.

Without Autopilot, I used to have to suddenly slow down my car to figure out what I was seeing.

This is the scariest thing I've read on this forum to date. You need to slow down.
 
This was my first thought: the driver involved in the crash could have made a bad decision that day.

He might have decided to pass all the cars in front of him by using the carpool lane on his left, then tried merging back to his right, but was not able to merge safely in time and hit the separation wall.

I have seen this scenario too many times, almost every day during my commute.

Roadshow: Why not pylons to prevent last-minute lane changes? – The Mercury News


Could be; however, Tesla stated that Autopilot was on during the crash and that the driver never intervened. After reviewing all the pictures of the crash and the location, and taking into account Tesla's statement based on the data recovered from the car, the driver was likely not paying attention and Autopilot was following the solid left lane line that led the vehicle straight into the divider.
 
  • Informative
Reactions: NeverFollow
My experience (owning one of each Model X) has been that AP2 tends to be left-biased (or "left-handed", if you will) in the same location that AP1 tends to be right-biased (or "right-handed") when a single lane opens up into two lanes.

What do I mean by this? Let's look at Exit 23 off US-17 North in Campbell, California. If you engage Autopilot on the exit lane (before reaching the exit sign shown in Google Street View), then use the stalk to reduce maximum speed to 45 MPH (before the exit sign) and finally let Autopilot pick a lane when one lane opens up into two lanes, you'll find that AP2 picks the left-hand lane while AP1 picks the right-hand lane. (I don't recommend trying this, but if you do, disengage Autopilot well before the right-hand turn.)

Why does AP2 behave differently than AP1 here? I don't know, but I think it is interesting in light of the accident that started this thread, and the video from Chicago above per an article in Electrek today: Tesla owner almost crashes on video trying to recreate fatal Autopilot accident.

Note that there are many differences between this road and both the interchange in Mountain View, California and the interchange in Chicago, not the least of which is that it's not an interchange: there is no gore area involved (just one lane widening into two lanes), there is yellow paint on the left lane marker, there are no adjacent lanes connected by a paved surface, and I'm reducing speed manually as it drives (so as not to go unusually slow in the exit lane).

Also, I'm certain you can find locations where AP2 is right-biased, so it's not always left-biased. In fact, if you take Exit 10 off I-280 North in Cupertino, California onto Wolfe Road with Autopilot engaged (again, reducing maximum speed using the stalk to around 45 MPH once on the exit ramp), AP2 will prefer the right lane instead of the left lane when the right-hand lane opens up into two lanes. (Here again, the road geometry is different with a dashed line on the left lane marker, a solid line on the right lane marker, and additional lanes to the left but not the right.)

Anyway, I find it useful to know that this behavior difference exists since I drive vehicles with both AP1 and AP2 frequently, so I really can't assume which lane Autopilot will take in these situations. (And this behavior could change in the future with a software update anyway.)

Here's another reason to be extra cautious with the new "wide-lane" support in AP2 on 2018.10.4: Lane correction when the vehicle is literally about to split two lanes at highway speeds is extremely quick. It felt more like an avoidance maneuver—as if it wanted to complete the lane change before the lane divider actually started on the pavement. I had both hands on the steering wheel (as usual) when driving, and it was still surprising how rapidly Autosteer moved into the right lane.

This happened at the lane split on I-280 South headed for the I-880 North/US-17 South exit, just past the Winchester Blvd exit (images are copyright Google from Google Street View; they were not taken by me, and they are not from today):

i-280-Wincheser-exit.png


The Google Street View vehicle is one lane to the left—the lane to its right is the one that splits into two lanes. With 2018.10.4, the wide-lane support kept my Model X centered in the ever-widening "lane" until maybe 30-50 feet (very rough estimate) before the new lane divider started, at which point Autosteer moved rapidly into the right-hand lane (I don't know why the right lane was chosen vs. the left lane in this case):

I-280-Lane-Split.png


I don't have a dash cam, and I don't care to have the publicity that might come with posting a video, but I want folks to be aware of this behavior. I'm considering reporting this to Tesla since it happens so quickly—it's basically the opposite of a smooth lane change.
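If anyone wants a feel for why the correction lands so late and so fast, here's a toy sketch (my own guess at the general shape of such logic, not Tesla's code; the 3.7 m lane width and 5.5 m split threshold are made-up numbers) showing how "center in the wide lane until it clearly becomes two lanes" produces exactly this kind of last-moment jump:

```python
# Toy illustration of the behavior described above -- not Tesla's algorithm.
# The controller keeps centering in the perceived lane until the "lane" is
# wide enough to be treated as two, then snaps its target to one side, so
# the correction necessarily arrives late and feels abrupt.

LANE_WIDTH_M = 3.7          # typical US freeway lane width (assumption)
SPLIT_THRESHOLD_M = 5.5     # assumed width at which one lane is treated as two

def steering_target(left_edge_m: float, right_edge_m: float) -> float:
    """Lateral target, in meters from the left edge, given the perceived lane edges."""
    width = right_edge_m - left_edge_m
    if width <= SPLIT_THRESHOLD_M:
        # Wide-lane handling: keep centering between the diverging edges.
        return left_edge_m + width / 2.0
    # Past the threshold: commit to the center of one of the two new lanes
    # (the right-hand one here, matching what my car did).
    return right_edge_m - LANE_WIDTH_M / 2.0

# As the lane widens from 3.7 m to 7.4 m, the target drifts toward the middle
# of the gap, then jumps toward the right lane center in a single step:
for width in (3.7, 4.5, 5.4, 5.6, 7.4):
    print(f"width {width:.1f} m -> target {steering_target(0.0, width):.2f} m")
```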
 
  • Informative
  • Helpful
Reactions: Matias and TaoJones
Agree, but Autopilot is a bad name; it should be called driver-assist mode or something, but of course the former sells better. It shouldn't be called Autopilot till it's really there.
The media loves to throw the term Autopilot around for clickbait. I just looked up today's headline news stories paired with Tesla Model X Crash Autopilot. They include Activated, Engaged, On, and In. Ignorance abounds, with some reports better than others at explaining the technology's limitations and driver responsibility.

I don't agree that Autopilot is a bad name. I assume, like many others, you're putting the term in the context of aviation. Okay, it's a more-than-century-old technology that is still being refined. Today you can find autopilots ranging from ones that can only do simple tasks, such as keeping the wings level, right up to sophisticated aircraft autoland systems. Yet there are still no pilotless commercial aircraft. In fact, even with all that marvelous technology, you won't find any airline that permits its pilots to fly distracted. Perhaps they shouldn't be calling it Autopilot "till it's really there" either?
 
I'm considering reporting this to Tesla since it happens so quickly—it's basically the opposite of a smooth lane change.

I have encountered this behavior in the opposite type of split, where the lanes merge. Basically, as soon as the dotted line stopped, AP darted over to center the car in the new lane, much to the dismay of the lazy merger on my tail, who probably got surprised and thought I was an a-hole cutting him off. It happened very quickly, and I am not sure that, if the fellow had been in my blind spot, I wouldn't have hit him.
 
  • Informative
Reactions: ddkilzer and Matias
The media loves to throw the term Autopilot around for clickbait. I just looked up today's headline news stories paired with Tesla Model X Crash Autopilot. They include Activated, Engaged, On, and In. Ignorance abounds, with some reports better than others at explaining the technology's limitations and driver responsibility.

I don't agree that Autopilot is a bad name. I assume, like many others, you're putting the term in the context of aviation. Okay, it's a more-than-century-old technology that is still being refined. Today you can find autopilots ranging from ones that can only do simple tasks, such as keeping the wings level, right up to sophisticated aircraft autoland systems. Yet there are still no pilotless commercial aircraft. In fact, even with all that marvelous technology, you won't find any airline that permits its pilots to fly distracted. Perhaps they shouldn't be calling it Autopilot "till it's really there" either?

As a shareholder, and coincidentally as an owner of multiple marine (nautical) autopilot systems, I would be very happy if Tesla’s “Autopilot” was renamed “Driver Assist”.

At least until “Autopilot” does what those who are unfamiliar with nautical and aviation-centric Autopilot systems think it does.
 
  • Like
Reactions: bhzmark and e-FTW
Ok... More data is great... But that occurs without activating autosteer. Wouldn't it be safer to keep seeing the extra information, but not turn over steering to the car by activating autosteer?



These both seem like situations where you would be better off slowing down, rather than relying on autosteer to let you stay at high speed. I don't see how you can act as a check on AP's decisions if the car is going faster than you would feel comfortable driving.
I do slow down. I never said I was going faster than I would be comfortable driving. I said that I received more information. Please, please stop changing what I've said to try to fit your narrative.

It's clear from how you're stating things that you haven't used Autopilot. I don't 'turn over steering'. That makes it sound like I don't have full control, and that's just not the case. While Autopilot is activated, I can still steer; any action I take deactivates it immediately. It doesn't set the speed, I do. If I want to slow down from my initial setting, I slow it down. Or speed it up. Or I lightly tap the brake. Whatever. Whatever action I take, even if AP is activated, is the action taken. AP does not overrule the driver. The driver is always in control.
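If it helps to picture the priority I'm describing, here's a rough sketch (purely illustrative, not Tesla firmware; the torque threshold is a number I made up, since Tesla doesn't publish it): any driver input wins immediately.

```python
# Illustrative sketch of the override behavior described above -- not Tesla
# firmware. A firm steering input, a brake tap, or a stalk press drops
# Autosteer out immediately; the driver's action is always the one taken.

OVERRIDE_TORQUE_NM = 2.5   # assumed threshold; the real value is not public

def autosteer_stays_engaged(currently_engaged: bool,
                            driver_torque_nm: float,
                            brake_pressed: bool,
                            stalk_disengage: bool) -> bool:
    """Return whether Autosteer remains engaged after this control cycle."""
    if not currently_engaged:
        return False
    # Any driver action takes priority and disengages the system.
    if brake_pressed or stalk_disengage or abs(driver_torque_nm) >= OVERRIDE_TORQUE_NM:
        return False
    return True
```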

Here's a new scenario that just occurred for me yesterday: I was driving home last night from Portland, about an hour's drive into the Columbia River Gorge. I had medium-heavy rain; the windshield wipers could easily keep up. AP was activated, but I had knocked it down to around 50 mph. Visibility wasn't the worst I've seen, but it wasn't perfect either. Anyhoo, a car passed me on my left and must have hit a huge pocket of pooled water. It was like someone had just dropped huge buckets of water on my windshield, and I instantly went from fair visibility to zero visibility for just a few seconds. Honestly, it freaked me out a little bit, BUT Autopilot didn't hiccup. (I kind of patted the car right after that and will deny that maybe I mumbled 'good job', because I don't talk to cars. :))
 
Visibility wasn't the worst I've seen, but it wasn't perfect either. Anyhoo, a car passed me on my left and must have hit a huge pocket of pooled water. It was like someone had just dropped huge buckets of water on my windshield, and I instantly went from fair visibility to zero visibility for just a few seconds. Honestly, it freaked me out a little bit, BUT Autopilot didn't hiccup.

There are situations where technology sees more than the naked eye. This is one of those scenarios.

The arguments that "I drive better than any Tesla EVER can, 24x7x365. Yup, don't need it, don't want it" are silly.

I know this argument because I made it before I was a Tesla owner. Over 20 years of driving now, no accidents, before and after AP.
 
  • Like
Reactions: e-FTW and bonnie
I was planning to do more calls etc to make my 1 hr commute productive
Sure, calls are still allowed. But that "etc" is concerning. You cannot do anything other than take and make calls, or use a voice-activated assistant, while driving. Driver assistance systems change nothing about this state of affairs.
I understand the technology
The above contradicts this statement.

Edit: voice assistants are productive too!
 
According to the IIHS, teenage drivers account for 13% of traffic fatalities and drivers over 70 account for 11%. As of 2016, the IIHS reports that 27% of traffic fatalities involve a driver with a blood alcohol level over the legal limit. Even if you were to assume that zero teenagers and zero elderly people drive Teslas, and that no Tesla drivers are ever drunk, it wouldn't be enough to change the prediction that AS is a large net benefit.
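For anyone who wants the arithmetic spelled out, here it is using only the IIHS percentages above, and deliberately assuming zero overlap between the three groups, which overstates how much you could exclude:

```python
# Back-of-the-envelope check using only the shares quoted above: teen drivers
# ~13% of traffic fatalities, drivers over 70 ~11%, over-limit BAC ~27%.
# The real overlap between the groups is unknown, so treating them as disjoint
# gives the largest possible exclusion.

teen, elderly, drunk = 0.13, 0.11, 0.27

max_excluded = teen + elderly + drunk    # upper bound; any overlap lowers this
remaining = 1.0 - max_excluded

print(f"At most {max_excluded:.0%} of fatalities excluded")  # At most 51%
print(f"At least {remaining:.0%} remain")                    # At least 49%
# Even under this most generous assumption, roughly half of the fatality
# baseline still applies to sober, middle-aged drivers, so excluding those
# groups does not by itself overturn a large predicted benefit.
```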

Incidentally, I can count teens and elderly people who drive Teslas among those I've personally met, and I've seen a number of articles regarding accidents and arrests of people driving Teslas while drunk, so I'm quite skeptical of the claim that they are a negligible component of Tesla drivers.

I understand that Volvo had a good year last year and was able to report no driver fatalities for the XC90. However, the numbers I've been using include deaths of people other than the driver. A quick search has no difficulty turning up reports of XC90-involved fatalities:

Alcohol-involved crash kills 1 near Palmer and leaves 1 seriously injured
Hillsborough Crash Claims Motorcyclist
Volvo driver charged in death of Charlotte woman, 71, killed in 4-car crash

This is not to disparage the XC90 or its drivers. We're all human. But I take exception to the impression you give that they are almost never involved in fatal accidents. Additionally, the XC90 is available with an ADAS system with lane-keeping features not unlike Tesla's AS. I'm quite confident that their system is imperfect and that Volvo does not represent it otherwise.

Wait... so you are saying if a Volvo gets in an accident with a Kia, then the death is on the Volvo? That is not how it works.
BMW 5 series
Audi Q7
Audi A6
Lexus 350
Benz M class
These vehicles had absolutely no occupant deaths from 2012-2015 (Insurance Institute for Highway Safety).
Did those vehicles have autopilot systems 5 years ago?
So how are they just as safe as a Tesla without the special super duper autopilot system? There has to be a reason.

"Both Model S and X owners had an average age of 53 years old. Model X owners showed a significant bump in household annual income versus Model S owners, ticking in at an average of $503,000 and $267,000 respectively. Despite the fact that income levels of both Model S and X owners place them near the top 1% of household incomes in the United States, 94% of current owners claim that this is the most expensive vehicle they have ever purchased." -Teslarati

The deadliest cars on the road are very small vehicles... or vehicles like the Mustang and Charger. These vehicles are cheap and tend to be driven by young people.
I live in a pretty high-AGI area, and I haven't seen any teens driving around in $100k SUVs. High schoolers must get higher allowances in your area.
 
Really? He can't do anything else? Not even dictate his latest novel? What about scheduling meetings via a voice assistant? What about listening to and responding to TMC posts via his voice assistant?
Sigh, this thread...
You are correct, I made an absolute statement. Rookie mistake.

I was extrapolating the "etc" and assumed it meant doing email and texting. My bad for assuming that's what "productive" meant. I'm so old-school. :)
Fixed.
 
I do slow down. I never said I was going faster than I would be comfortable driving.

That part was in response to a different poster. There were two quotes... One from you (with my response) and then one from someone else (with my response).

The driver is always in control.

No... AP is in control. Once a driver turns AP on, it's not like AP asks the driver's permission before each steering adjustment. It steers. The driver has the opportunity (and indeed the responsibility) to override its mistakes, and the driver can turn off AP whenever they like. But while AP is activated, AP makes steering adjustments, which cannot always be predicted, and only once it starts an adjustment does the driver have a real opportunity to figure out what it is doing and to intervene. The driver may be responsible, but AP is in immediate control. Realistically, the driver is left in an oversight role.

It's not that I don't understand how AP functions. It is that you and I use the word control differently. To me, if a computer can, unilaterally, make active steering decisions that cannot be predicted by the human driver, and the human is reduced to the role of interceding, then the computer is in control.
 
No... AP is in control.
You have no idea what you're talking about. I control the wheel while Autosteer is on. If it doesn't follow what I direct, it is immediately disabled. It is only in control if I am not controlling the wheel.

Give it a try some time so you don't embarrass yourself with such foolishness.