Welcome to Tesla Motors Club

Auto-Pilot - I give up, it's downright lethal

I have to say I'm surprised by the number of apologies made for Tesla's 'suboptimal' implementation of TACC, which many cheaper cars get right, especially cars that don't market AI or other advanced software driving-assistance capabilities.
Yes but I think the thread has moved from its title which is about Autopilot functionality ... and that is for a restricted environment so if it doesn't work properly outside that environment it is fair to give some leeway. When just talking about TACC I completely agree with you ... it really should work on the range of roads that anyone would reasonably use cruise control, yet frequently it doesn't. I would like the option of dumb cruise and a speed limiter too!
 
The title of this discussion attracts anti-Tesla propagandists. It is also obviously wrong. No actually lethal case is mentioned. We don't even know whether a Tesla reacts differently when there is a tailgate collision risk, i.e. another car following closely behind.

To me the title is even more obviously wrong, as I use TACC and Autopilot wherever possible, which in my case means almost always, and I have had no situation that was, or could have been, lethal.

I do not use TACC in streets where cars are parked in my lane. I also do not use Autopilot on severely zig-zagging roads and on narrow roads without centerline.

Yes, there is the occasional phantom braking, because a Tesla always brakes when it perceives a collision risk. Since a computer's perception differs from a human's, phantom braking is fundamentally unavoidable, and not only in Teslas. Conversely, humans also cause phantom braking when they see something that eventually turns out not to warrant braking. Thanks to brains that have evolved over millions of years, we can recognize most things correctly, but not always all of them.

This discussion is also pointless, because we are discussing the old software that we already know will be replaced soon.

The discussion about wishing for an old-style cruise control is particularly pointless, because it is abundantly clear that Tesla will never install any cruise control that would drive full-speed into an obstacle. Get used to the fact already that we will always have at least a few phantom braking events.

The best everybody can do is stay alert while on Autopilot and learn what TACC and Autopilot can and cannot do. For me, phantom braking events are rare, because I anticipate and prevent or reduce them by switching off TACC in unsuitable situations, or by using the accelerator when I am sure the braking is indeed phantom.
 
The title of this discussion attracts anti-Tesla propagandists. It is also obviously wrong. No actually lethal case is mentioned. We don't even know whether a Tesla reacts differently when there is a tailgate collision risk, i.e. another car following closely behind.

To me the title is even more obviously wrong, as I use TACC and Autopilot wherever possible, which in my case means almost always, and I have had no situation that was, or could have been, lethal.

Using a similar rhetorical flourish to the thread title, these are the two most German paragraphs I have ever read :D

The rest of it was some of the worst apologist guff I have read in years.
 
Yeah right, Elon! He clearly thought the software at the time was capable of Level 5 autonomy. He was completely wrong. Now he’s telling us v9 is the answer to everything. Given his track record I, for one, find he has no credibility whatsoever.

And how many versions of the light bulb did Edison make to achieve the breakthrough? How long did the Wright brothers sustain powered flight the first time they succeeded?

We seem to live in an age where people no longer understand the hard work required for breakthroughs.

Elon isn't the only one to underestimate the complexity of AI development. IBM, Waymo, even DeepMind have all failed to meet their public targets. But does that mean they should all give up and stick with the status quo?

The true test of life isn't about celebrating success, it's about how you deal with failure.
 
Count me as another one who gave up on using AutoPilot a long time ago as too unreliable with scary phantom braking. I do occasionally try it on an empty motorway, but it hasn’t really improved in the two years I’ve had the car. It’s interesting watching the reviews of the new Mercedes EQS as I know, having owned a Mercedes, their radar cruise control works very well. As others have said, other manufacturers have similar systems that work well, so there’s no excuse for Tesla not even getting the basics right.

As well as announcing more details of EQS models and prices, Mercedes have also been demonstrating their Level 3 autonomous driving system, Drive Pilot. It looks like it uses a combination of lidar, cameras and very accurate GPS, has some sort of driver-engagement camera inside, and can even spot emergency vehicles behind using its rear camera. What isn't so great is that it only works up to 60kph (about 37mph), so it's only really useful in traffic, and it doesn't work at all in the rain. In fact it has multiple rain sensors, including one in the wheel well, to prevent engagement when it's wet. No prices yet, as it isn't available until the end of the year, but it will be interesting to see where Mercedes price it in comparison to Autopilot.
 
And how many versions of the light bulb did Edison make to achieve the breakthrough? How long did the Wright brothers sustain powered flight the first time they succeeded?

We seem to live in an age where people no longer understand the hard work required for breakthroughs.

Elon isn't the only one to underestimate the complexity of AI development. IBM, Waymo, even DeepMind have all failed to meet their public targets. But does that mean they should all give up and stick with the status quo?

The true test of life isn't about celebrating success, it's about how you deal with failure.

I don't think anyone is debating whether it's harder or easier than they thought, the amount of effort required to make it work, or the benefit it will bring when they succeed. It's that other companies have, for a long time, had systems that read speed limits and active cruise control that is far more reliable than Tesla's current offering, and any other company trying to break the technology ceiling with AI, including those you mention, doesn't sell it to the public long before it's sufficiently robust.

The true test of life isn't about celebrating success, it's about how you deal with failure without exposing the public to unnecessary danger.
 
And how many versions of the light bulb did Edison make to achieve the breakthrough? How long did the Wright brothers sustain powered flight the first time they succeeded?

We seem to live in an age where people no longer understand the hard work required for breakthroughs.

Elon isn't the only one to underestimate the complexity of AI development. IBM, Waymo, even DeepMind have all failed to meet their public targets. But does that mean they should all give up and stick with the status quo?

The true test of life isn't about celebrating success, it's about how you deal with failure.
Fodder for the devil's advocate:
Edison, like Elon, added other people's ideas into his trials to develop a product. Edison was blindly convinced his was the best solution and, as a master of promotion with grandiose promises, would publicly electrocute animals, even an elephant, to make his point that DC current was safer than AC, despite that being untrue. I don't recall him selling light bulbs before they worked. Nor did the Wright brothers use the public as crash test dummies. 😈😈

Remember, Edison was Tesla's rival :D
 
And how many versions of the light bulb did Edison make to achieve the breakthrough? How long did the Wright brothers sustain powered flight the first time they succeeded?

We seem to live in an age where people no longer understand the hard work required for breakthroughs.

Elon isn't the only one to underestimate the complexity of AI development. IBM, Waymo, even DeepMind have all failed to meet their public targets. But does that mean they should all give up and stick with the status quo?

The true test of life isn't about celebrating success, it's about how you deal with failure.

There are, as you say, several companies trying to develop autonomous driving, and I'm sure all of them are struggling to get to grips with it. Sure, they've missed their public targets. But no one mouths off quite like the Musk, making preposterous, ridiculous predictions that have no hope of fulfilment. Keeping quiet is probably best; then you don't look like a complete tool when you get it so disastrously wrong.
 
Tesla will never install any cruise control that would drive full-speed into an obstacle.
Nor will any other manufacturer's adaptive cruise control. The difference is that other vehicles brake when there is an actual object in the way (as detected by radar), not when the vehicle 'thinks' there may be something there in the future, or is distracted/confused by shadows.
 
The title of this discussion attracts anti-Tesla propagandists. It is also obviously wrong.
This discussion attracts people who have experienced severe, often dangerous phantom braking. It is not wrong.
To me the title is even more obviously wrong, as I use TACC and Autopilot wherever possible, which in my case means almost always, and I have had no situation that was, or could have been, lethal.

I have on several occasions. Emergency braking on the outside lane of a motorway, bringing the car down from 70mph to 40mph before I can intervene is downright dangerous. I really don’t know how I haven’t been rear ended.
Yes, there is the occasional phantom braking, because a Tesla always brakes when it perceives a collision risk. Since the perception of a computer differs from the perception of a human, phantom braking is fundamentally unavoidable not only in Teslas.

Not true. I’ve driven thousands of miles in my A6 using adaptive cruise control and never once experienced phantom braking. I’ve had loads of instances of phantom braking using TACC on identical roads and under similar conditions.
This discussion is also pointless

Why contribute to it then?
 
.. it is abundantly clear that Tesla will never install any cruise control that would drive full-speed into an obstacle.

It's abundantly clear they do:

[Image: Tesla Model S crash, South Jordan]
 
Just thought I’d put this out here. I know I am in the US, but I’ve had my Model 3 LR since 5/21, and Autopilot was an absolute nightmare for the first 700 miles. I’m at 1600 now, and drove a few hundred over the weekend and was blown away at the improvement that I am seeing now daily. Back roads of Georgia and Interstate 10 with absolutely no phantom braking or left-turn-lane tendencies! So at least here the system is definitely improving.
 
Just thought I’d put this out here. I know I am in the US, but I’ve had my Model 3 LR since 5/21, and Autopilot was an absolute nightmare for the first 700 miles. I’m at 1600 now, and drove a few hundred over the weekend and was blown away at the improvement that I am seeing now daily. Back roads of Georgia and Interstate 10 with absolutely no phantom braking or left-turn-lane tendencies! So at least here the system is definitely improving.
That's probably based on the 'AI' training being US-focused, and your roads are substantially different to ours, i.e. a lot wider. Our A roads (one down from a Motorway / Interstate) seem to be the equivalent of a US dirt track (exaggerating a little ;) )

Roads in US suburbs fascinate me. Cars can park either side (not that they need to with the large driveways) and 2 way traffic can still flow. A blind horse could navigate that without hitting anything. In the UK typically if cars park either side - the road is blocked. Thankfully most people understand that so people don't park directly opposite each other turning a residential street into a slalom course.

Then we have 'B' roads (I think in the US you call them footpaths), often without enough room for two vehicles to pass each other without praying, bar the occasional pull-off area. If you are really lucky, you find a lorry coming down a road that is nearly as wide as the car, so you then have the joys of reversing for half a mile to find a muddy entry into a farmer's field to back into so the lorry can get past.

If FSD can deal with this lot, then I'll be impressed. Hell, just stopping slamming on the brakes for no reason and being nearly as good as other manufacturers' Level 1 autopilots would be a good start.
 
Just thought I’d put this out here. I know I am in the US, but I’ve had my Model 3 LR since 5/21, and Autopilot was an absolute nightmare for the first 700 miles. I’m at 1600 now, and drove a few hundred over the weekend and was blown away at the improvement that I am seeing now daily. Back roads of Georgia and Interstate 10 with absolutely no phantom braking or left-turn-lane tendencies! So at least here the system is definitely improving.
I think you need to take a longer-term view. Factors such as weather, the height of the sun (which changes each day, and therefore shadows) and just general variability all play a part. You may be right, let's hope so, but changes in performance come after software updates; the cars don't learn and improve as you drive, at least not in the car. The car may capture info that gets fed back, mixed with thousands of other data points, and a new model is included in the next software update.
 
I think you need to take a longer-term view. Factors such as weather, the height of the sun (which changes each day, and therefore shadows) and just general variability all play a part. You may be right, let's hope so, but changes in performance come after software updates; the cars don't learn and improve as you drive, at least not in the car. The car may capture info that gets fed back, mixed with thousands of other data points, and a new model is included in the next software update.
Nope, disagree. What I’m seeing is AI learning. Nothing to do with weather or sun angles; I’ve driven through rain storms and the improvements are definitely noticeable regardless of external circumstances. :)
 
Nope, disagree. What I’m seeing is AI learning. Nothing to do with weather or sun angles; I’ve driven through rain storms and the improvements are definitely noticeable regardless of external circumstances. :)
Disagree all you want; it's well documented that Tesla don't tune the in-car models on the fly. Learning is fleet learning, aggregating thousands of drivers' experiences; that's one of the reasons why they're building one of the biggest computers in the world to process it. With respect, you've had the car for 6 weeks; I've been driving one for 6 years. Maybe it's possible it's you that's learning how the system works, rather than the car learning where you drive.
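A minimal sketch of the fleet-learning pattern described above (hypothetical, not Tesla's actual pipeline): each car's model is frozen between updates and the car only logs data; a central trainer aggregates logs from the whole fleet and ships one new model back to every car in the next software update.

```python
# Hypothetical fleet-learning sketch. Individual cars never tune their
# own model on the fly; they only record events. A central server pools
# the whole fleet's logs, "trains" a new model, and pushes it out.

class Car:
    def __init__(self, model_version):
        self.model_version = model_version   # frozen between updates
        self.event_log = []

    def drive(self, event):
        # The in-car model is NOT updated here; the car just logs data.
        self.event_log.append(event)

    def install_update(self, new_version):
        # Improvement only arrives via a software update.
        self.model_version = new_version

def aggregate_and_train(fleet):
    # Server-side: pool every car's logs and produce a new model version.
    all_events = [e for car in fleet for e in car.event_log]
    return f"model-v{len(all_events)}"  # stand-in for a real training run

fleet = [Car("model-v0") for _ in range(3)]
fleet[0].drive("phantom brake on A36")
fleet[1].drive("clean motorway run")

new_model = aggregate_and_train(fleet)
for car in fleet:
    car.install_update(new_model)

# Every car now runs the same new model, regardless of what it
# individually experienced.
print(all(car.model_version == new_model for car in fleet))  # True
```

The point of the sketch is the direction of data flow: experience goes up to the aggregator, and a single retrained model comes back down to everyone, which is why one driver's car cannot "learn" their roads between updates.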
 
Tesla FSD Beta 9 is the closest thing to autonomous driving around at present; the fact it actually runs on hardware that is in every Tesla made since 2017 is nothing short of amazing. In comparison, the Mobileye demo cars literally have a boot full of computers.
Not sure you understand the difference here.

M.E. is doing data collection, and they write it to local disk arrays. There's no reason a test car would upload THAT kind of bulk data over a wireless connection. All companies have data centers in their trunks when they are doing R&D. It's actually amazing to see all the switches (GigE and 10GbE), Ethernet cables, disk storage units (SSDs), etc.

You can't compare that to what a customer buys and drives. By the time all the sensor and compute modules are put into actual drivable, buyable cars, they've size-reduced things and fitted them into spare spaces where there's room to hide them, and then you don't see the 'data center in the trunk' anymore.
 
That's probably based on the 'AI' training being US focused and your roads are substantially different to ours i.e. a lot wider. Our A Roads (one down from a Motorway / Interstate) seems to be the equivalent of a US dirt track (exaggerating a little ;) )
I'm in the Bay Area, and one of the things that sold me on the Tesla was seeing SO many of them driving up and down the local roads. The Tesla density here is higher than the density of Starbucks stores (lol).

My Level 2 experience with my M3 has been quite good. The local highways (like the M roads in the UK) work pretty well with AP, and I engage AP almost all the time when I'm on that kind of road. But again, I trust it since 100x as many Teslas traverse this patch of pavement as anywhere else in the world, so to speak. That bit of ground has been seen by their cameras more times than God himself has seen it. Of course Tesla should know, by now, how our roads curve and so on.

In other areas of the world, where the cars are not constantly sending updates of updates of updates of that bit of road, the driving NN isn't going to be as refined.
 
First phantom braking episode today. A36, national limit, with clear white lines each side, no side turnings or overhanging trees. Approaching from the other direction was a large agricultural machine, staying neatly in its lane. So my car aggressively slowed; I knocked it out of AP and took over. It was not fun.
Disappointing: it seems that AP is only usable on a dual carriageway.
 
Yes but I think the thread has moved from its title which is about Autopilot functionality ... and that is for a restricted environment so if it doesn't work properly outside that environment it is fair to give some leeway. When just talking about TACC I completely agree with you ... it really should work on the range of roads that anyone would reasonably use cruise control, yet frequently it doesn't. I would like the option of dumb cruise and a speed limiter too!
I agree. A simple, dumb cruise control would be perfect.