
Does Tesla have a self-driving data advantage?

Agree that the edge case of a white big rig cutting across the highway would be difficult to anticipate, but certainly Tesla is aware of drivers becoming so comfortable with Autopilot that they ignore the fine print and stop paying attention to the road.

I must be special because I can manage not to pay attention to the road even when I don't use autopilot.

If only the Tesla had a feature that could be improved for all of us, whether we use AP or not.
 
I don't want this thread to turn into a discussion of how simulation mode should work, but I wonder why they can't gather data on the cases where AEB thinks it should have kicked in versus where the driver manually slammed the brakes to avoid a mishap.
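
Something like the sketch below is what I have in mind - a shadow-mode comparison where the car logs the moments when the model's decision and the driver's action disagree. To be clear, every signal name and threshold here is invented for illustration; none of this is a real Tesla API.

```python
# Hypothetical shadow-mode check: compare what AEB *would* have done
# against what the driver actually did. Every signal name below is
# invented for illustration; none of this is a real Tesla API.

HARD_BRAKE_DECEL_G = 0.6  # assumed threshold for "driver slammed the brakes"

def flag_disagreement(frame):
    """Return True when the shadow AEB decision and the driver's action
    disagree, i.e. a moment worth uploading for review."""
    aeb_would_brake = frame["aeb_shadow_decision"]  # model output, never actuated
    driver_braked = frame["brake_decel_g"] >= HARD_BRAKE_DECEL_G
    # AEB fires but the driver didn't react -> possible false positive;
    # driver slams the brakes but AEB stays quiet -> possible missed detection.
    return aeb_would_brake != driver_braked

# Example: scan a recorded drive and collect the timestamps worth uploading.
log = [
    {"t": 12.4, "aeb_shadow_decision": True, "brake_decel_g": 0.1},
    {"t": 31.9, "aeb_shadow_decision": False, "brake_decel_g": 0.8},
    {"t": 55.0, "aeb_shadow_decision": True, "brake_decel_g": 0.9},
]
print([f["t"] for f in log if flag_disagreement(f)])  # -> [12.4, 31.9]
```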

The simulation is really just feeding it video data along with sensor data. So it's pretty much entirely a question of how many different situations are recorded, and whether they're recorded in a similar fashion to this kind of dataset.

The KITTI Vision Benchmark Suite

I really wish Tesla had dash cam style recording where, if you ever ran into an "interesting" situation, you could push a button to flag it to be uploaded to Tesla. Then when you got home the video would be uploaded to Tesla over WiFi.

Note - You can play around with the above dataset fairly easily if you're running Ubuntu with a fairly modern GPU, and don't mind the hassle of installing NVIDIA DIGITS onto it.
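
For anyone curious, here's a minimal sketch of walking the KITTI object-detection split in Python, assuming the usual directory layout after unzipping (the root path is just an example; adjust it to wherever you unpacked the data):

```python
# Minimal sketch for poking at the KITTI object-detection split.
# Assumes the usual layout after unzipping: training/image_2/*.png,
# training/velodyne/*.bin, training/label_2/*.txt.
import glob
import os

import cv2          # pip install opencv-python
import numpy as np

KITTI_ROOT = "/data/kitti/training"  # example path; adjust to your setup

for img_path in sorted(glob.glob(os.path.join(KITTI_ROOT, "image_2", "*.png"))):
    frame_id = os.path.splitext(os.path.basename(img_path))[0]

    image = cv2.imread(img_path)  # HxWx3 BGR camera frame

    # Velodyne scans are flat float32 arrays of (x, y, z, reflectance).
    lidar = np.fromfile(
        os.path.join(KITTI_ROOT, "velodyne", frame_id + ".bin"),
        dtype=np.float32).reshape(-1, 4)

    # Labels: one object per line; the first token is the class name.
    with open(os.path.join(KITTI_ROOT, "label_2", frame_id + ".txt")) as f:
        classes = [line.split()[0] for line in f]

    print(frame_id, image.shape, lidar.shape, classes)
```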
 
Agree that the edge case of a white big rig cutting across the highway would be difficult to anticipate, but certainly Tesla is aware of drivers becoming so comfortable with Autopilot that they ignore the fine print and stop paying attention to the road.
In fact I don't think that particular case is hard to predict, but Tesla has been clear that AP is not for use on roads with cross traffic, and that particular incident occurred on a highway with cross traffic. The driver appeared to have failed to exercise reasonable care whether he had been using AP or not, and sadly paid for it with his life.
 
Tesla should have anticipated this edge case based on the crazy antics that have been posted on YouTube over the last half year.

Crazy antics with lane keeping cars predate Tesla Autopilot. In fact the one with the Infiniti is what inspired someone from this forum to replicate it.

The only difference is the Tesla didn't require using a coke can.

The only one that really made me cringe was the one made by Elon's wife.
 
In fact I don't think that particular case is hard to predict, but Tesla has been clear that AP is not for use on roads with cross traffic, and that particular incident occurred on a highway with cross traffic. The driver appeared to have failed to exercise reasonable care whether he had been using AP or not, and sadly paid for it with his life.

I think it's really hard to say whether the message has been clear enough. Yes, it's all there in the fine print, in the manual, in all official Tesla communications. But is it a sort of *wink* *wink* "yeah, you don't really have to keep your hands on the wheel"? Is it like the iTunes agreement where no one actually reads the fine print? I think if one had to take a DMV-style written exam prior to using Autopilot, then it would be easier to place the blame on the driver. But at this point the only record of "informed consent" for enabling Autopilot is a button buried deep in the menu system.
 
I think it's really hard to say whether the message has been clear enough.
I agree that Tesla could make more of an effort to educate owners, especially new owners, about the proper uses and limitations of AP. My opinion is that every new owner who buys the AP option should spend at least 30 minutes driving with a properly trained Tesla employee so the owner could practice using AP in real world conditions with someone sitting next to them to explain how it works.

If Tesla had done that with every new AP user, I think there would have been fewer AP-related incidents and stupid behavior. But the number of incidents would still be far more than zero. And the recent sad fatality may still have occurred.
 
Okay, now I'm paranoid - can somebody with a personal connection to the moderators please get hold of them and have this thread title changed?

I'm imagining Forbes or Fortune or some other publication doing a story on this thread: "Tesla owners celebrate death of beta testers as a way to increase Elon Musk's power."

I reported my own post a couple hours ago but haven't heard anything.
 
I understand the spirit behind the OP, but Joshua's crash highlighted the shortcomings of the hardware, not some missing piece of code that the "machine" hadn't yet learned. I'd put money on Tesla already testing a newer car with the triple-cameras on a dummy tractor trailer scenario, exactly like Joshua's. I don't believe this would be done outdoors though, for risk of media scrutiny.
 
No. I do not agree. This was not the best possible outcome for anyone, including Tesla. Even the short sellers can't really claim to have benefited. Tesla is already aware of the shortcomings of its software, except now it has to deal with a death that has naysayers screaming about it at the tops of their lungs to any and all who will listen.

Forgetting about your dear old dad for a sec, the title and the premise leave a bad taste in my mouth regardless.
 
I think it's really hard to say whether the message has been clear enough. Yes, it's all there in the fine print, in the manual, in all official Tesla communications. But is it a sort of *wink* *wink* "yeah, you don't really have to keep your hands on the wheel"? Is it like the iTunes agreement where no one actually reads the fine print? I think if one had to take a DMV-style written exam prior to using Autopilot, then it would be easier to place the blame on the driver. But at this point the only record of "informed consent" for enabling Autopilot is a button buried deep in the menu system.

Yes, some people need to have their hands held 24/7 and be babysat. Some people require a red neon sign to be imprinted on their retinas to get the message. Some people don't like to be held responsible for their actions (including not reading the fine print/dismissing it). And then there are the other people who prefer to be treated as mature, conscientious adults accepting of their role and what goes with it.

We can make room for the naive adult, one who doesn't know they should read the fine print, or doesn't know the media lies like a rug/exaggerates for attention/gets it wrong because they are lazy, etc.

The problem with all that is that Mr. Brown was neither naive about AP functionality, nor, it appears, a person wanting or needing more hand-holding. Instead this is all just an unfortunate accident with enough blame between both drivers that there's no need to go looking for more. And as a result of this unfortunate accident, Tesla will improve AP in yet another area based on their publicly known focus on safety. (I hope I don't need to give examples of that, but I can if you require that evidence.)
 
Yes, some people need to have their hands held 24/7 and be babysat. Some people require a red neon sign to be imprinted on their retinas to get the message. Some people don't like to be held responsible for their actions (including not reading the fine print/dismissing it). And then there are the other people who prefer to be treated as mature, conscientious adults accepting of their role and what goes with it.

As Tesla moves towards the mass market, they're going to need to spend more time optimizing for and communicating with the least common denominator. One naive adult is an outlier. But where do you draw the line? 10 naive adults? 100 naive adults? What percentage is acceptable given the current limitations and messaging of the Autopilot system? For example, Ikea has offered a free retrofit kit for their dresser/drawers for years, and all their instruction manuals are very clear that the dresser/drawers need to be fixed to the wall. Yet kids are still dying, and Ikea was forced to discontinue the entire line and recall every single unit. Sure, it's easy to blame the parents for not following instructions, but clearly Ikea is taking much or most of the blame now.
 
... So in the real world Tesla is ahead of Google when it comes to implementing some level of "self driving" and has already logged many more real world driving miles.

This, in my mind, is an important distinction, which, in conjunction with the OP's point (concerning the path other manufacturers should optimally take), cannot be overstated.

One need look no further than other manufacturers' implementations of, for example, Autosteer (lane keeping) to get a good idea of what substandard looks like.

I would rather the bar be set *very* high for the players in this sandbox. Let them all canoodle as they are in the R&D stage. But you could pull most of what's out there today (and presumably tomorrow) from most manufacturers, and we would not be worse off for it.

Never mind that the other manufacturers will have to end up copying Tesla's OTA approach to updates, and that will be a neat trick given the scale differences.

Finally, if Sir Richard Branson's prognostication comes to pass (mostly EVs in 14 years), and personally I think he's smoking whatever it is that billionaires smoke in place of crack, then the other manufacturers are even MORE behind, not just in features, safety, and updateability, but in infrastructure.

Yeah... the gap is not closing. Yet.
 
Separately, I believe the trucker and his previously-cited company of one driver and one truck will be found primarily responsible for the accident - a left turn into oncoming traffic? Puh-lease. The Autopilot focus is misplaced. For all the handwringing about past speeding, you can't even drive past 85 with Autosteer engaged. The speed limit in more than one state (not Florida) is 80, ffs. Had it been a Ford Fiesta using cruise control, this probably wouldn't have made even the regional news.

I defer to the investigators. But that's my $0.02.
 
As I understand it, Joshua Brown's accident wasn't some type of edge case waiting to be discovered. It has been known for years that Mobileye's current tech doesn't activate for laterally crossing vehicles. It is designed to prevent or reduce the severity of rear end collisions.

Tesla has said that Autopilot should be used on limited access or divided freeways because they don't (usually) have cross traffic.
 
As I understand it, Joshua Brown's accident wasn't some type of edge case waiting to be discovered. It has been known for years that Mobileye's current tech doesn't activate for laterally crossing vehicles. It is designed to prevent or reduce the severity of rear end collisions.

Tesla has said that Autopilot should be used on limited access or divided freeways because they don't (usually) have cross traffic.

We don't know, and won't know until the investigation is complete, whether there was sufficient time in which to react after the truck pulled out into oncoming traffic.

There are many stretches of those divided state routes in Florida that are flat and straight.

So if the car had only basic cruise control (not even TACC), there may not have been sufficient time to react. What AP can and cannot manage at this juncture may be largely immaterial. Distracted driving doesn't come into play unless there was time to be distracted.

We just don't have enough information yet. The only reason this accident is in the news at all is because there happens to have been an AP Tesla involved.

More to the premise of the thread, let's say it's 3 years hence and AP2.0 manages lateral intrusions just fine. I can only hope that the bar is set *very* high so that the quality of the driver experience is as uniform as possible. Today, it's anything but, and that, from a safety standpoint, is a significant problem with a higher than acceptable chance to get worse.
 
Fatality was best possible outcome for Tesla - now they have a moat around their data set - agree?
Certainly an interesting "take" on the situation.

The competition will have to run their autopilots in simulation mode. Even now - almost 2 years after Autopilot launched - the competition's lane assist systems are not learning anything once in their customers' fleets. Anyone who wants to replicate what Tesla has done will have to start from scratch.

Musk has always been a risk taker. If the government decides this was a bad approach and does not allow this beta test to continue, it will have been an expensive mistake. Removing the feature and refunding the $2.5K option could cost Tesla $60 million (which works out to roughly 24,000 AP-equipped cars at $2,500 each). Other manufacturers such as GM would no longer seem so stupidly cautious.
 
except in pure simulation mode

You haven't given any argument for why this is somehow worse.

I really wish Tesla had dash cam style recording where, if you ever ran into an "interesting" situation, you could push a button to flag it to be uploaded to Tesla. Then when you got home the video would be uploaded to Tesla over WiFi.

This is the sort of innovative feedback mechanism we need to be thinking about!
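
To make it concrete, the mechanism could be as simple as the sketch below - a rolling buffer of clips, a flag button, and an upload hook that fires once the car is on WiFi. All of the names and numbers are made up to illustrate the idea:

```python
# Hypothetical sketch of the "flag it for Tesla" idea: keep a rolling
# buffer of recent clips, mark the current one when the driver hits a
# button, and upload the marked clips once the car is on WiFi.
# Clip length, buffer size, and the upload hook are all invented.
from collections import deque

BUFFER_CLIPS = 30  # keep roughly the last 30 one-minute clips

recent_clips = deque(maxlen=BUFFER_CLIPS)  # oldest clips fall off automatically
flagged = []

def record_clip(path):
    recent_clips.append(path)

def driver_pressed_flag_button():
    if recent_clips:
        flagged.append(recent_clips[-1])  # mark the clip being recorded now

def on_wifi_connected(upload):
    while flagged:
        upload(flagged.pop(0))  # send each flagged clip home, then forget it

# Example run:
record_clip("clip_0001.mp4")
record_clip("clip_0002.mp4")
driver_pressed_flag_button()
on_wifi_connected(lambda p: print("uploading", p))  # uploading clip_0002.mp4
```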

Thank you kindly.