
Tesla Autopilot HW3

I'll bet that would be a relatively easy retrofit, especially if it is just used for surveillance (and doesn't have to feed into anything for real-time processing).

I have zero interest in letting my MS drive itself around and pick up passengers for a small profit. Some days I cringe when my daughter gets in the car with sticky fingers and who knows what stuck to her shoes! If the redundancy additions between AP2.0 and AP2.5 are for regulatory approval for "commercial" use, I'm happy to skip it. Owning this car is just as much for the pleasure of driving it as it is for the cool tech. Get me to the state of the latest FSD video, where the car can do pretty much all of the driving with me paying less attention, and I'll be happy.
 
I have zero interest in letting my MS drive itself around and pick up passengers for a small profit. Some days I cringe when my daughter gets in the car with sticky fingers and who knows what stuck to her shoes! If the redundancy additions between AP2.0 and AP2.5 are for regulatory approval for "commercial" use, I'm happy to skip it. Owning this car is just as much for the pleasure of driving it as it is for the cool tech. Get me to the state of the latest FSD video, where the car can do pretty much all of the driving with me paying less attention, and I'll be happy.

Yeah. I do want the car to be able to drive itself for me with no attention paid on road trips and in traffic jams, but there's no way I'd let it pick up strangers by itself for a few dollars at a time.

It may well be great business for Tesla, with cars built for it, but I doubt many S and X owners will rent their personal cars this way.

(Of course, there are a surprising number of them on Turo, so maybe some folks think differently.)
 
This is probably an easy prediction to make, but I think the jump from AP2 to AP3, once we get the new AP3 software, is going to be quite significant. It is going to feel like a brand-new, next-generation Autopilot. AP2 not seeing stopped vehicles, veering towards dividers, etc. will all become distant memories.
 
I'll bet that would be a relatively easy retrofit, especially if it is just used for surveillance (and doesn't have to feed into anything for real-time processing).

Unless Tesla removes it, the selfie-cam connector is on the HW3/FSD computer board (on Model S/X AP2.5 boards the connector itself is removed, though the footprint is still there). After that it would merely be a matter of installing a retrofit camera on the windshield and routing a cable from the camera to behind the glove box. I don't really see them doing this for free, but I wouldn't consider it impossible that they would design and sell such a retrofit, if the selfie-cam matters for the Tesla Network and the Tesla Network really materializes during the lifetime of AP2/2.5 Model S/Xs.

After that it would work just as well as on any other car. I see it as much less likely that they'd hook it up to anything other than the real thing.
 
Do you care to specify which of those fatalities was caused by the behavior of AP2, as opposed to the failure of the driver to take responsibility for the vehicle as required? I look forward to you assigning blame correctly. Please use a method that's transparent and well-described so that we can check your work.
 
Do you care to specify which of those fatalities was caused by the behavior of AP2, as opposed to the failure of the driver to take responsibility for the vehicle as required? I look forward to you assigning blame correctly. Please use a method that's transparent and well-described so that we can check your work.

How do you fit through a door with balls that big?
 

As of the time of writing, that page lists three Tesla crashes, only one of which occurred after the release of AP2.

That one crash is the Mountain View crash, discussed for a few thousand posts over here:

Model X Crash on US-101 (Mountain View, CA)

Here's the preliminary NTSB report, which is now more than ten months old, but AFAICT they've provided no update yet:

https://www.ntsb.gov/news/press-releases/Pages/nr20180607.aspx

From the discussion, it sounds like AP did steer the car into a barrier, possibly aided by bad markings, in full view of the driver for several seconds and while accelerating hard, with no driver input whatsoever.

A Prius hit the barrier a few days before at similar speed, and that driver walked away because the crash attenuator collapsed as intended. If the attenuator had been reset as designed, or other protection had been added in its place, it seems unlikely that this would have been a fatal accident.
 
Do you care to specify which of those fatalities was caused by the behavior of AP2, as opposed to the failure of the driver to take responsibility for the vehicle as required? I look forward to you assigning blame correctly. Please use a method that's transparent and well-described so that we can check your work.

Would the person have died if AP2 didn't exist and the person didn't slam into the barrier?
 
As of the time of writing, that page lists three Tesla crashes, only one of which occurred after the release of AP2.

That one crash is the Mountain View crash, discussed for a few thousand posts over here:

Model X Crash on US-101 (Mountain View, CA)

Here's the preliminary NTSB report, which is now more than ten months old, but AFAICT they've provided no update yet:

https://www.ntsb.gov/news/press-releases/Pages/nr20180607.aspx

From the discussion, it sounds like AP did steer the car into a barrier, possibly aided by bad markings, in full view of the driver for several seconds and while accelerating hard, with no driver input whatsoever.

A Prius hit the barrier a few days before at similar speed, and that driver walked away because the crash attenuator collapsed as intended. If the attenuator had been reset as designed, or other protection had been added in its place, it seems unlikely that this would have been a fatal accident.

@Bet TSLA said there were 0 accidents, which was in fact wrong and incredibly disrespectful to someone who died.
 
Feel free to discuss the nuances of causality and where, as an armchair observer, one would place most blame, in the respective threads.
Just because the Wikipedia list is titled the way it is doesn't mean it provides any indication of causality.
Personally, I'd lean toward driver responsibility whenever the driver has to supervise the system and that supervision, properly executed, would have broken the chain of events. Viewed that way, everything on the fatalities list is driver error.
 
Yes, responsibility with all Tesla vehicles certainly rests with the driver, because Tesla offers driver assistance systems, not autonomous driving systems. The question is -- what does Tesla market to consumers? What is it that they talk about and imply? And how much attention have they paid to safety -- real, practical safety, not just legal liability -- in their quest to prove to the world that Elon is the smartest guy on the planet and lidar sucks, and also to sell more cars and pump the stock price? If you care about real safety -- not just legal responsibility -- then you have to think about the effect of what you are doing. And I don't mean just your technology but everything you are doing, which includes the way you market and sell your cars, and the things you say on Twitter and during presentations and interviews on YouTube.

It also means you need to think about the effect of releasing your driver assistance technology to the public before it is really done. This includes thinking seriously about how people will actually use your product, and not just how you intend them to use it or how the lawyers describe its proper use.

Tesla is using its customers -- rather than trained and supervised test engineers -- to test and develop clearly unfinished systems. And the fanboys all hail them as visionary and nimble compared to those old Detroit dinosaurs.
 
Tesla is using its customers -- rather than trained and supervised test engineers -- to test and develop clearly unfinished systems. And the fanboys all hail them as visionary and nimble compared to those old Detroit dinosaurs.
I acknowledge the messaging issue.
However, I have yet to hear of a case where Autosteer performed worse than a car without it. Detroit (and most everyone else) is selling cars that will crash soon after you stop steering. Tesla's will at least try to keep you on the road.

Is a Tesla in AP any less safe than the multiple models/years of GM cars where the power steering motor would fail? Both require additional effort from the driver to maintain control, but at least AP disengages when you do.
 
Is a Tesla in AP any less safe than the multiple models/years of GM cars where the power steering motor would fail? Both require additional effort from the driver to maintain control, but at least AP disengages when you do.

TACC is great and definitely a safety feature, though it has recently regressed: now it thinks the highway speed limit drops to 25 mph almost every time I pass an exit and it slams on the brakes, which is unsafe. So let's say that in 2018.50.x TACC was a net win on safety, and in 2019.x so far it seems like a step back.

But Autosteer is a different matter. I think there is one and only one case in which it improves safety, and that is if the driver loses consciousness involuntarily, like a heart attack or something. In this case it has a good chance of bringing the car to a safe stop once it no longer gets steering wheel torque.
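To be concrete about what I mean by a "safe stop", here's a rough sketch of that kind of hands-off escalation logic in Python. The states and thresholds are invented for illustration and this is not Tesla's actual implementation; note that the only input is steering-wheel torque, which is exactly why it's such a weak attention monitor.

from enum import Enum, auto

class HandsOffState(Enum):
    ENGAGED = auto()          # Autosteer active, wheel torque seen recently
    VISUAL_NAG = auto()       # "apply light force to the wheel" message
    AUDIBLE_NAG = auto()      # escalating chimes
    CONTROLLED_STOP = auto()  # slow to a stop in lane, hazards on

# Thresholds (seconds without detected wheel torque) are made up for the example.
VISUAL_AFTER = 15.0
AUDIBLE_AFTER = 30.0
STOP_AFTER = 45.0

def escalation(seconds_without_torque: float) -> HandsOffState:
    """Pick the escalation stage from how long the wheel has gone untouched."""
    if seconds_without_torque >= STOP_AFTER:
        return HandsOffState.CONTROLLED_STOP
    if seconds_without_torque >= AUDIBLE_AFTER:
        return HandsOffState.AUDIBLE_NAG
    if seconds_without_torque >= VISUAL_AFTER:
        return HandsOffState.VISUAL_NAG
    return HandsOffState.ENGAGED

# A driver who passes out stops providing torque, so the system eventually
# walks itself down to a controlled stop.
for t in (5, 20, 35, 50):
    print(t, escalation(t).name)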

In every other case, I believe it reduces safety, even when the driver is alert and paying attention; I use it a lot less now than I used to because I have come to this conclusion. Because it gives people a false sense of security, it makes them more likely to quit paying attention, to check that text message that just came in, or, worse, to get in the car and drive when they shouldn't because they're tired or maybe a bit tipsy. This makes crashes more likely, not less.

Until they are at least L3 (or very, very reliable L2), lane keeping systems are unsafe IMO unless they include a very robust driver attention monitoring system. The milder steering assist and lane departure warning systems are fine.

If nothing else, activating Autosteer and especially NoA seems to make TACC jumpier and more prone to phantom braking, which is why I don't use Autosteer much these days. Tesla, please give me rock-solid, smooth TACC without phantom braking before you waste time chasing FSD.
 
Would the person have died if AP2 didn't exist and the person didn't slam into the barrier?
Would the Prius have crashed if AP hadn't been on? Oh, right, no AP. The driver of the X had reported that interchange as causing AP problems something like 7 times. If I had a spot with a known issue, I would be hyper-alert. I like AP, but I'm not going to get on my cell phone or watch a movie, etc., while it is controlling the car. Do you think people haven't crashed on adaptive cruise control? AP is just adaptive cruise control with lane keep assist. OK, NoA is different, but I think you get the idea.
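For anyone wondering what "adaptive cruise control with lane keep assist" boils down to conceptually, here's a toy sketch in Python: two independent controllers, one for speed/gap and one for steering. The gains and numbers are invented and this is not Tesla's stack, just the general idea.

def cruise_accel(set_speed, own_speed, gap, desired_gap):
    """Adaptive cruise: hold the set speed, but back off when the gap closes."""
    speed_term = 0.5 * (set_speed - own_speed)
    gap_term = 0.3 * min(gap - desired_gap, 0.0)        # only acts when too close
    return max(-3.0, min(speed_term + gap_term, 2.0))   # clamp to gentle limits (m/s^2)

def lane_keep_steer(lateral_offset, heading_error):
    """Lane keep: steer back toward the lane center."""
    steer = -0.8 * lateral_offset - 1.5 * heading_error
    return max(-0.3, min(steer, 0.3))                   # clamp steering angle (rad)

# One control step: the two commands are computed independently and sent to
# the accelerator/brake and the steering rack.
accel = cruise_accel(set_speed=29.0, own_speed=31.0, gap=25.0, desired_gap=40.0)
steer = lane_keep_steer(lateral_offset=0.4, heading_error=0.02)
print(round(accel, 2), round(steer, 2))

Roughly speaking, NoA just layers navigation decisions (lane changes, exits) on top of those two loops, which is why it inherits their quirks.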