BBC Autopilot video
There are some unknowns here. I have not been able to discover what AP following-distance setting the Thatcham people used. From the video I am estimating that it's a little under 2 seconds. If the Tesla had been following with a 3-second gap, the vehicle would probably have stopped before hitting the dummy vehicle. When using AP1, I generally use the longest distance setting, which allows me time to decide and act. My concern is generally not the vehicle in front but the one behind: if it is following too closely, AP may slow down too quickly for that driver to react.
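As a rough sanity check on the 2-versus-3-second point, here is a back-of-the-envelope sketch (the speed, system delay, and braking rate are all assumed figures for illustration, not measurements from the video):

```python
# Back-of-the-envelope check: can a car following at a given time gap
# stop before reaching a suddenly revealed stationary obstacle?
# All numbers below are assumptions for illustration, not measured values.

speed_mph = 40                  # assumed test speed (BBC quoted ~38 mph)
v = speed_mph * 0.44704         # convert to m/s

latency_s = 1.5                 # assumed detection + brake ramp-up delay
decel = 8.0                     # assumed braking deceleration, m/s^2 (~0.8 g)

stop_dist = v * latency_s + v**2 / (2 * decel)

for gap_s in (2.0, 3.0):
    gap_dist = v * gap_s        # distance to obstacle when it is revealed
    verdict = "stops in time" if stop_dist <= gap_dist else "impact"
    print(f"{gap_s:.0f} s gap = {gap_dist:5.1f} m, "
          f"needs {stop_dist:5.1f} m -> {verdict}")
```

With those assumed numbers, a 2-second gap comes up short and a 3-second gap leaves margin; change the delay or braking figures and the verdict shifts, which is exactly why the unknowns matter.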
 
Be careful what you ask for. This would be equivalent to saying that any other car on the road should limit the maximum cruise control setting to "safe speeds".

Not at all similar... Normal cruise control behaves in a totally predictable way. You plug in the speed you want, and the system maintains that exact speed.

By contrast, the distance setting on AP is a black box. Folks on this board can't even agree on whether it is a measure of time or of distance, nor can they agree on whether it is linear. Furthermore, if the setting is too short, especially at high speeds, it seems that Autosteer (AS) becomes even more unreliable (i.e., less able to accurately maintain lane) because it sees less lane line between the Tesla and the leading car.
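To make the time-versus-distance ambiguity concrete, here is a minimal sketch; both mappings are hypothetical (neither is documented Tesla behavior) and exist only to show how differently the two readings scale with speed:

```python
# If the follow setting encodes a time gap, the physical gap grows with speed;
# if it encodes a fixed distance, it does not. Both mappings below are invented
# purely to illustrate the difference; neither is Tesla's documented behavior.

SETTING = 3                               # an arbitrary follow setting

def gap_if_time_based(speed_ms, setting, secs_per_step=0.5):
    return speed_ms * setting * secs_per_step     # metres

def gap_if_distance_based(setting, metres_per_step=10.0):
    return setting * metres_per_step              # metres

for mph in (30, 50, 70):
    v = mph * 0.44704                             # m/s
    print(f"{mph} mph: time-based {gap_if_time_based(v, SETTING):5.1f} m, "
          f"distance-based {gap_if_distance_based(SETTING):5.1f} m")
```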

Tesla shouldn't allow follow distances that (as the car understands the road and speed) are too short to allow reliable AS and a reasonable opportunity for the driver to intervene in the stopped-car scenario.

If folks want to tailgate when they drive the car manually, that's one thing. But Tesla should stop people from instructing AP to take primary responsibility for steering and speed (with the operator in only a supervisory capacity) at a follow distance that amounts to tailgating, that AP can't reliably handle, or that doesn't allow the driver sufficient time to override.
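As a sketch of what such a guard might look like (the logic and thresholds are invented for illustration; nothing here reflects Tesla's actual firmware):

```python
# Hypothetical engagement guard: refuse a follow setting that leaves the
# supervising driver no realistic takeover window. All thresholds invented.

HUMAN_TAKEOVER_S = 1.5   # assumed time a supervising driver needs to react
SYSTEM_LATENCY_S = 0.5   # assumed sensing + actuation delay

def may_engage(requested_gap_s: float) -> bool:
    """Allow AP engagement only if the gap leaves an override window."""
    return requested_gap_s >= HUMAN_TAKEOVER_S + SYSTEM_LATENCY_S

print(may_engage(1.0))   # a 1 s gap -> False, too tight to supervise
print(may_engage(2.5))   # a 2.5 s gap -> True
```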
 
I agree with you @Economite that there are clear technological differences between the two.

The point I was trying to make is that you implied the Autopilot technology shouldn't allow the user to choose a setting that would be considered unsafe or particularly risky. You clarified that position in your reply: Tesla should stop people from tailgating with AP or using it at too close a following distance.

The comparison is that cruise control theoretically lets you set it to keep the car at 100 MPH if you so choose. To extend your argument, other car makers should limit cruise control to lower speeds, since no one should be cruising at speeds that lengthen stopping distance or limit the driver's ability to react in a timely fashion to a road hazard.

I make the point that it should not be the car manufacturer's responsibility to do this. Yes, I agree that they need to properly educate the user on what the system can and can't do and when it should or shouldn't be used, but the ultimate responsibility lies with the driver.
 
The biggest problem with Autopilot is the name and the marketing. "Autopilot" implies it WILL drive itself, and Tesla's marketing two years ago implied the car WILL drive itself. If the name were just "Tesla Enhanced Cruise Control", fewer people would be annoyed and attacking. Every person who test drives my car wants to try Autopilot, and they all like to see hands-free driving. The simple reality is that the Tesla system is nowhere close to autonomous autopilot function. Accidents aside, the phantom braking in AP 2.5 is unacceptable and dangerous. Mobileye could have had Tesla close, but until Tesla gets it better it should not use the name Autopilot; if that were clearly marketed, few would be attacking it regularly.

Tesla's Autopilot works just like an aircraft's autopilot. There's nothing wrong with the name. Autopilot doesn't equal autonomous.
 
There are some unknowns here. I have not been able to discover what AP following-distance setting the Thatcham people used. From the video I am estimating that it's a little under 2 seconds. If the Tesla had been following with a 3-second gap, the vehicle would probably have stopped before hitting the dummy vehicle. When using AP1, I generally use the longest distance setting, which allows me time to decide and act. My concern is generally not the vehicle in front but the one behind: if it is following too closely, AP may slow down too quickly for that driver to react.
Do we know how fast the car was traveling? Do we know what firmware it was using? I suspect the demo was not intended to portray how Tesla Autopilot will perform but to argue that driving-assist systems in general should not be allowed on the road. I think at the right following distance, at a reasonable speed for that distance, and with the latest firmware, the car would have stopped. Too many unknowns.
 
Tesla's Autopilot works just like an aircraft's autopilot. There's nothing wrong with the name. Autopilot doesn't equal autonomous.

When did an aircraft autopilot use a vision system and a neural net to make decisions on steering and speed? Or drive at 65 mph next to an Armco barrier?

They don't work just like each other at all. The objectives are similar, but the margins of error and the amount of instrumentation on a plane are much greater.
 
Absolutely

However, none of them are saying the hardware and the technology support full self-driving.
I said full self-driving.

Tesla have an option called Full Self-Driving.

It's not hard to understand.

What does that have to do with anything? I am not trying to be obnoxious but you do realize that hardware supporting FSD does not mean the car actually is FSD, right? There is hardware and software. Teslas have the hardware to do FSD (cameras, radar and ultrasonics) but lack the software right now. And without the software, Teslas can't do FSD yet. So the fact that Teslas have the hardware to support FSD is completely irrelevant to the test that Thatcham Research conducted.
 
The comparison is that cruise control theoretically lets you set it to keep the car at 100 MPH if you so choose. To extend your argument, other car makers should limit cruise control to lower speeds, since no one should be cruising at speeds that lengthen stopping distance or limit the driver's ability to react in a timely fashion to a road hazard.

I don't think your analogy holds at all. When a driver sets their cruise control at 80 MPH, they know exactly what behavior they will be getting; and the car has no way of knowing whether it is in a situation where sustained travel at 100 MPH is safe or not.

By contrast, when a driver says "use AP with follow distance 1 and max speed 80 MPH," he or she is basically telling the car "keep my vehicle centered in the lane and not rear-ending anybody, using the speed that will achieve this goal, but no higher than 80, and try to maintain a close follow distance." The car is in a much better position than the driver to decide whether it will be able to maintain safety at the chosen follow setting, since the car has a much better idea of what kind of view of lane lines (and of traffic in front of the lead car) it has under the actual traffic conditions. Also, "follow distance 1" is such an abstract command that the driver can't really understand what it means. Therefore, I think it would be totally appropriate for AP to refuse to operate with a set of settings that it does not believe are safe for the environment as it interprets it. If a driver wants to follow more closely than AP is willing to follow, he or she can still do so manually, but shouldn't be able to force AP to do it. There is no downside to the car refusing to maintain a very close follow distance.
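One hypothetical way to picture the lane-line point: the follow gap roughly bounds how much lane line the forward camera can see before the lead car occludes it. The preview-time figure below is an assumption, not a known Autosteer requirement:

```python
# Rough picture: the follow gap bounds how much lane line the forward camera
# sees before the lead car occludes it, while Autosteer arguably needs a
# preview proportional to speed. Both figures below are assumptions.

LOOKAHEAD_S = 1.5                 # assumed preview time Autosteer wants

def enough_lane_line(speed_ms: float, gap_s: float) -> bool:
    visible_m = speed_ms * gap_s              # lane line up to the lead car
    needed_m = speed_ms * LOOKAHEAD_S         # assumed preview requirement
    return visible_m >= needed_m

v = 65 * 0.44704                              # 65 mph in m/s
for gap_s in (1.0, 2.0, 3.0):
    print(f"gap {gap_s:.0f} s -> visible {v * gap_s:4.0f} m, "
          f"enough: {enough_lane_line(v, gap_s)}")
```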

By the way... At least in American cars (might be different in Europe where there are higher speed limits) I would have no problem with conventional cruise controls refusing to take a setting of 100 MPH. That's a speed that is never legal, and virtually always constitutes reckless driving (or some other multi-point violation).
 
What does that have to do with anything? I am not trying to be obnoxious but you do realize that hardware supporting FSD does not mean the car actually is FSD, right? There is hardware and software. Teslas have the hardware to do FSD (cameras, radar and ultrasonics) but lack the software right now. And without the software, Teslas can't do FSD yet. So the fact that Teslas have the hardware to support FSD is completely irrelevant to the test that Thatcham Research conducted.

Of course I know FSD hasn't been implemented, but Tesla are selling these cars as having all the hardware to enable it (we agree on that), and the tests illustrate that even the much simpler task of EAP can easily be defeated. With a Volvo with lane keeping, they will just point out that the system is a driver aid and was never intended to be anything approaching FSD. Tesla, on the other hand, need to address that flaw to live up to their promised capabilities, and tests like this raise material questions about whether they'll ever be able to. That's why it's a problem for Tesla and not for the others.

In summary, if you can't write software that can stop, in lane, when your way is blocked, a capability your hardware has to support, then your advanced features are doomed.
 
Since the Audi in front was able to steer into another lane and the Tesla followed at a distance, it should have been able to do the same thing.

Are you implying the Model S doesn't handle as well? Because if you are, then look up some handling tests of both cars. That's the only thing here linked to physics.
The Audi in front didn't 'steer'; it did an emergency manoeuvre. A human driving the Tesla would also have run into the blow-up car.
 
Of course I know FSD hasn't been implemented, but Tesla are selling these cars as having all the hardware to enable it (we agree on that), and the tests illustrate that even the much simpler task of EAP can easily be defeated. With a Volvo with lane keeping, they will just point out that the system is a driver aid and was never intended to be anything approaching FSD. Tesla, on the other hand, need to address that flaw to live up to their promised capabilities, and tests like this raise material questions about whether they'll ever be able to. That's why it's a problem for Tesla and not for the others.

In summary, if you can't write software that can stop, in lane, when your way is blocked, a capability your hardware has to support, then your advanced features are doomed.

It's a PR problem, probably. People may expect Tesla cars to behave differently because of the FSD hardware. I am just saying that the hardware should not create any FSD expectations, since it is not really about the hardware; it's about the software. When Tesla writes the self-driving software, if the cars still have these problems, that's very bad. But until Tesla writes the FSD software, the cars should not be judged on self-driving issues.
 
Seems the S-Class in the video above manages to use AEB to come to a complete stop from 100 km/h, avoiding a collision. That is the best I have seen so far. Tesla need to do the same tests.
The BBC video shows us that AEB works but did not stop the car from 38 mph.
What we need pretty soon is a systematic test of emergency braking systems similar to what Euro NCAP does.
The BBC test should also be done with other TACC systems to see how the competition performs.
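For a sense of what a full stop from 100 km/h demands, here is a quick calculation (the deceleration and delay figures are assumptions, not measurements from either video):

```python
# Braking distance needed for a full AEB stop, under assumed figures.
# decel ~0.9 g is near the limit of a good road car on dry asphalt.

def stop_distance_m(speed_kmh: float, decel=8.8, delay_s=0.3) -> float:
    """Distance covered during an assumed system delay plus full braking."""
    v = speed_kmh / 3.6                      # m/s
    return v * delay_s + v**2 / (2 * decel)

for kmh in (61, 100):                        # ~38 mph, and the S-Class speed
    print(f"{kmh} km/h -> ~{stop_distance_m(kmh):.0f} m to stop")
```

Under those assumptions a clean stop from 100 km/h needs roughly 50 m of clear road, which is why detection distance matters as much as braking hardware.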
 
The Audi in front didn't 'steer'; it did an emergency manoeuvre. A human driving the Tesla would also have run into the blow-up car.

So you mean the reaction time would be too long for a human driver. Could be true and I can’t prove you wrong here, but that’s not the point of the video.

What they wanted to show is that no currently available system can do an emergency maneuver. And that's important to know.

Sure you could argue that there are many more problems with today’s systems, but it’s harder to recreate the accidents where AP just fails and hits something.

Evasive steering is something every driver is able to do, but no Level 2 system can. That's at least one reason to always keep your eyes on the road.
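A simple kinematic sketch shows why that matters at speed (the lateral-acceleration and lane-offset figures are assumptions):

```python
# Compare distance to stop by braking vs distance to clear an obstacle by
# swerving one lane over. Simple kinematics with assumed figures.
import math

V = 100 / 3.6        # 100 km/h in m/s
BRAKE = 8.8          # assumed braking deceleration, m/s^2
LAT = 7.0            # assumed usable lateral acceleration, m/s^2
LANE = 3.5           # assumed lateral offset needed, m

brake_dist = V**2 / (2 * BRAKE)
# Time to move one lane width sideways at constant lateral acceleration,
# then the forward distance covered in that time.
swerve_time = math.sqrt(2 * LANE / LAT)
swerve_dist = V * swerve_time

print(f"braking needs ~{brake_dist:.0f} m, swerving needs ~{swerve_dist:.0f} m")
```

With those numbers the swerve clears the obstacle in well under the braking distance, and that is precisely the manoeuvre no current Level 2 system attempts.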