Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

An interesting EAP fail.

daniel

Active Member
May 7, 2009
4,803
3,616
Kihei, HI
Again: using the current EAP system to make any assumptions about how dependable a different solution, running different software on different hardware, will be is nonsensical.

And imagining that a system that has not yet been developed for hardware that does not exist will be commercially available in 12 to 18 months is equally nonsensical. EAP on HW 2.5 shows what can be done with current software on current hardware. To go beyond that they will need to actually build the next generation hardware and then start writing software for it. And the task we are discussing (driverless car) will require capabilities that nobody has yet demonstrated on any hardware.

It is nonsense to think that the next generation hardware will be able, overnight, to solve problems that even supercomputers cannot yet solve. Driverless cars are coming but are still at least a decade away, in part because of the need to solve cognitive problems that present in a million different forms.

Further, I think it is silly to assert that EAP on HW 2.5 was "not intended" to function on city streets just because Tesla's lawyers made them put that in the owner's manual as a defense against lawsuits by the families of deceased morons who took their eyes off the road while driving. EAP is not FSD, but if it was not "intended" (by Tesla and its engineers) to operate on city streets they would not have allowed it to be engaged or remain on when there's only one lane line. EAP is intended to operate anywhere it can distinguish the lane, and then the lawyers added language to protect them from lawsuits. At the intersection I described it fails to distinguish the lane.

By the way, there are no limited-access highways in Maui. If Tesla "intended" EAP to only function on limited-access highways, they would disable it for cars on Maui. Tesla intends for drivers to push the limits of EAP (with extreme caution: eyes on the road and hands on the wheel) so they can collect data useful in the development of the next step in the (long) road towards FSD.
 

efusco

Moderator - Model S & X forums
Mar 29, 2009
5,421
666
Nixa, Missouri, United States
It really doesn't.

Your argument was calling something a "fail" when it's not supposed to work there by design.





And never will with EAP.

Because EAP is not FSD, and is not intended to ever be.

EAP is for use on divided, limited-access highways. That's all it's ever intended to do.

So using it to judge how "far" they are from navigating situations EAP explicitly isn't intended for makes no sense.

It'd be like saying dumb cruise control on a base-model Tesla (no AP) can't handle slowing down in stop-and-go traffic, therefore TACC can't ever work right.

They're different feature sets and the limits of one tell you little to nothing about the other.


FSD (the one that works at intersections) isn't even going to be using the same computer, let alone the same software, as what your car is running right now.
I think I understand what you're trying to get at here, but here's the rub. No matter what EAP is supposed to do, no matter its limits, what it should NEVER do is create a more dangerous situation. It must be able to recognize situations outside of its 'skill set' and warn the driver to take over rather than perform a potentially dangerous action, as it has in the circumstances Daniel describes.
 

diplomat33

Well-Known Member
Aug 3, 2017
7,193
8,224
Terre Haute, IN USA
It's an indication to me that we are further from a driverless car than Elon thinks we are. There are so many things of this general sort, where it's not a matter of seeing where other cars are, or of seeing obstacles, pedestrians, etc., but a matter of comprehending what's expected in an unusual configuration. I think we'll have a driverless car in 10 to 15 years, and it will be HW5 or HW6, not HW3 that has the capacity to run the necessary software.

First of all, Autopilot is not designed to handle intersections yet. So it is perfectly "normal" that AP failed in this case, since AP has not yet been programmed to handle it. The intersection scenario you describe is something that will only be addressed by the "automatic city driving" feature that is yet to be released. So this is not evidence that FSD is still a long way off. I definitely disagree with your last point about FSD being 10-15 years away. Tesla just needs to teach AP how to handle that intersection and it will be able to do it. It won't take 10-15 years for Tesla to teach AP how to handle this and similar situations.
 

Knightshade

Well-Known Member
Jul 31, 2017
11,645
15,727
NC
I think I understand what you're trying to get at here, but here's the rub. No matter what EAP is supposed to do, no matter its limits, what it should NEVER do is create a more dangerous situation. It must be able to recognize situations outside of its 'skill set' and warn the driver to take over rather than perform a potentially dangerous action, as it has in the circumstances Daniel describes.

The only way it creates a "dangerous situation" is "when the driver uses a feature someplace it's explicitly incorrect to use it at all"


If you turn on dumb cruise control, then crash into the back of a slower car- is that the fault of the cruise control making it more dangerous? Or is that user error?

Because the AP thing going on here is the same deal.
 

Knightshade

Well-Known Member
Jul 31, 2017
11,645
15,727
NC
And imagining that a system that has not yet been developed for hardware that does not exist will be commercially available in 12 to 18 months is equally nonsensical.

Good news, then...NONE of that is actually true!

HW3 exists, and is shipping in current Teslas.

The software for it exists (and is on test cars and has been for months) and will be commercially available to the fleet by end of this year.


EAP on HW 2.5 shows what can be done with current software on current hardware. To go beyond that they will need to actually build the next generation hardware and then start writing software for it

Again- they already have. They started a couple years ago.


And the task we are discussing (driverless car) will require capabilities that nobody has yet demonstrated on any hardware.

err.... Tesla demonstrated that about a month ago, on the new hardware that is shipping on new cars today, running software that isn't yet publicly released. (Well, there WAS a driver in the car, but it handled intersections, stop signs, stop lights, cross-traffic, etc. without issue.)



It is nonsense to think that the next generation hardware will be able, overnight, to solve problems that even supercomputers cannot yet solve

Can you cite which "supercomputers" you think have tried and failed to run self-driving cars?

Further, I think it is silly to assert that EAP on HW 2.5 was "not intended" to function on city streets

Then you are again factually wrong.

Not only does the manual state you're wrong, and the description of EAP itself say you're wrong (from the start EAP was billed as "on ramp to off ramp" for highway driving), but the actual code running shows you are wrong.

EAP is not programmed to respond to city street situations like oncoming or cross traffic, or handle turns at intersections.

See also the NHTSA report on the guy who died on AP in 2016: he was using it somewhere Tesla explicitly says not to, and died as a result of driver error, while they found the car itself performed exactly as expected with no errors.


EAP is not FSD, but if it was not "intended" (by Tesla and its engineers) to operate on city streets they would not have allowed it to be engaged

The car will "allow" you to drive it off a cliff, that doesn't mean Tesla "intended" you to do that, does it?


At the intersection I described it fails to distinguish the lane.

Because it is not programmed or intended to handle intersections.

Not sure where you're getting lost on this fact.
 

Lloyd

Well-Known Member
Jan 12, 2011
6,268
2,070
San Luis Obispo, CA
How about getting more standardized road markings to help EAP reach a complete solution? I see problems only where the road or the markings are abnormal.
 

Knightshade

Well-Known Member
Jul 31, 2017
11,645
15,727
NC
How about getting more standardized road markings to help EAP reach a complete solution? I see problems only where the road or the markings are abnormal.


Given it's not even sold anymore, and won't be getting upgraded HW, it's very likely EAP (without FSD) isn't going to see much more improvement going forward.

Enhanced Summon is the last new feature promised to it, so I imagine it'll go into a maintenance mode at that point with little improvement or effort once they switch "on" the much larger and more advanced NN running on HW3 cars (which all 2.x cars with FSD will get upgraded to as well).
 

efusco

Moderator - Model S & X forums
Mar 29, 2009
5,421
666
Nixa, Missouri, United States
The only way it creates a "dangerous situation" is "when the driver uses a feature someplace it's explicitly incorrect to use it at all"


If you turn on dumb cruise control, then crash into the back of a slower car- is that the fault of the cruise control making it more dangerous? Or is that user error?

Because the AP thing going on here is the same deal.
Negative, Ghost Rider. EAP can, and should, know where it can and can't be used, and it should disable itself elsewhere. This isn't something that should be driver-dependent at all.
 

Knightshade

Well-Known Member
Jul 31, 2017
11,645
15,727
NC
Negative Ghost Rider, the EAP does, and should, know where it can and can't be used. It should disable. This isn't something that should be driver dependent at all.

How do you figure that?

Map data?

The speed limit issue makes it pretty clear that's not a reliable source.



Noticing oncoming traffic? That won't work because the system is explicitly not programmed to know what that is.

Watch the visualization when you have cars coming the other way next to you: it either shows nothing, or it shows cars facing the same way you are, somewhat randomly located, because it doesn't understand what's going on there.




Since the driver is explicitly responsible when using EAP, and a human should beat map data at knowing what kind of road they're on, I'm not sure how you conclude that leaving it to the driver to decide whether to turn it on where the manual says it's intended for use isn't the best way to handle it. What would be (reliably) better?
 

diplomat33

Well-Known Member
Aug 3, 2017
7,193
8,224
Terre Haute, IN USA
I think I understand what you're trying to get at here, but here's the rub. No matter what EAP is supposed to do, no matter its limits, what it should NEVER do is create a more dangerous situation. It must be able to recognize situations outside of its 'skill set' and warn the driver to take over rather than perform a potentially dangerous action, as it has in the circumstances Daniel describes.

Negative Ghost Rider, the EAP does, and should, know where it can and can't be used. It should disable. This isn't something that should be driver dependent at all.

No. Autopilot is an L2 driver assist, which by definition means that it is not required to know where it can or can't be used. With an L2 driver assist, it actually is the driver's responsibility to know that. What you are asking for is an L3+ self-driving system, which Autopilot is not.
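For readers unfamiliar with the levels being thrown around here, the distinction diplomat33 is drawing comes from the SAE driving-automation scale. The sketch below is a paraphrased, unofficial summary (the function name and table are invented for illustration; see SAE J3016 for the authoritative definitions), but it captures the key point: at L2 and below, the human is responsible for monitoring and for knowing where the feature may be used.

```python
# Rough, unofficial summary of the SAE J3016 driving-automation levels.
# Descriptions are paraphrased for illustration only.
SAE_LEVELS = {
    0: ("No automation", "driver monitors"),
    1: ("Driver assistance (steering OR speed)", "driver monitors"),
    2: ("Partial automation (steering AND speed)", "driver monitors"),
    3: ("Conditional automation", "system drives; driver must take over on request"),
    4: ("High automation (within a defined domain)", "system"),
    5: ("Full automation (everywhere)", "system"),
}

def must_driver_supervise(level: int) -> bool:
    """At L2 and below, the human is responsible for monitoring the
    environment and knowing where the feature may be engaged."""
    return level <= 2

print(must_driver_supervise(2))  # True  -- EAP/Autopilot territory
print(must_driver_supervise(4))  # False -- e.g. a geofenced robotaxi
```

This is why "it should know where it can't be used and disable itself" is really a request for L3+ behavior, not a bug in an L2 system.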
 

MentalNomad

Member
Dec 6, 2018
359
400
USA
It's an indication to me that we are further from a driverless car than Elon thinks we are. There are so many things of this general sort, where it's not a matter of seeing where other cars are, or of seeing obstacles, pedestrians, etc., but a matter of comprehending what's expected in an unusual configuration.

It may not be intuitive, but the opposite is true. These sorts of unique situations will never be identified and solved in a lab somewhere on the mainland. They will only be solved when there are fleets of hundreds of thousands of cars driving around everywhere encountering these situations, with humans taking over where the AI fails. This gives the system the data to train the neural nets to handle these unique situations.

This is, in fact, the way to get to full self-driving at the pace that Musk is suggesting.
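The fleet-learning loop described above can be sketched in a few lines. Everything here is hypothetical (the event structure, field names, and selection logic are invented for illustration); the point is only that driver takeovers flag exactly the situations the current system can't handle, making them the most valuable examples to label and feed back into training.

```python
from dataclasses import dataclass

@dataclass
class DriveEvent:
    location: str
    driver_took_over: bool  # a disengagement marks a case the system failed

def select_training_candidates(events):
    """Disengagements are the edge cases worth labeling and training on;
    uneventful highway miles add little new information."""
    return [e for e in events if e.driver_took_over]

# Toy fleet log: two routine highway drives and one takeover at an
# unusual intersection (like the one Daniel described).
fleet_log = [
    DriveEvent("divided highway", False),
    DriveEvent("unusual intersection, Kihei", True),
    DriveEvent("divided highway", False),
]

candidates = select_training_candidates(fleet_log)
print([e.location for e in candidates])  # ['unusual intersection, Kihei']
```

The real pipeline would be vastly more complex, but the selection principle — mine the fleet for failures, not successes — is the mechanism MentalNomad is pointing at.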
 

daniel

Active Member
May 7, 2009
4,803
3,616
Kihei, HI
I think I understand what you're trying to get at here, but here's the rub. No matter what EAP is supposed to do, no matter its limits, what it should NEVER do is create a more dangerous situation. It must be able to recognize situations outside of its 'skill set' and warn the driver to take over rather than perform a potentially dangerous action, as it has in the circumstances Daniel describes.

Exactly.

... I definitely disagree with your last point about FSD being 10-15 years away. Tesla just needs to teach AP how to handle that intersection and it will be able to do it. It won't take 10-15 years for Tesla to teach AP how to handle this and similar situations.

Trying to teach it to handle every different unique situation would take so long they'd never be able to keep up with new unusual situations. The software for a driverless car needs to be able to analyze situations it's never seen or been taught, and that's a level of AI that has never yet been achieved.

Right now they have cars that can drive certain specific routes without driver intervention. This is a far cry from being able to drive anywhere in any city and town.

I'm a believer. Just not a believer in Elon's time line.
 

diplomat33

Well-Known Member
Aug 3, 2017
7,193
8,224
Terre Haute, IN USA
Trying to teach it to handle every different unique situation would take so long they'd never be able to keep up with new unusual situations. The software for a driverless car needs to be able to analyze situations it's never seen or been taught, and that's a level of AI that has never yet been achieved.

Yes, an AI that can analyze situations it's never seen before is precisely what we need for a self-driving car, and it actually is being developed now. It's not perfect yet, but we are getting there. For example, Tesla did not teach Autopilot to recognize every single individual car that exists on the road. That would be ludicrous. Tesla fed the machine a ton of images of different cars to train it to recognize what a car looks like, so that when it sees a car, even one it has never seen before, it can correctly identify it with very high reliability. Karpathy talks about this in his presentations. When I said that Tesla just needs to teach Autopilot to recognize this particular intersection, I did not mean that Tesla should teach it every single situation. Rather, I meant that this intersection is an example of an edge case that needs to be fed into the machine to improve the neural net, so that it becomes capable of recognizing this and similar intersections in the future.
 

daniel

Active Member
May 7, 2009
4,803
3,616
Kihei, HI
And there are so many edge cases that you cannot simply list them. You need AI. And yes, they are working on AI. But do you really think they'll have an AI that's "I" enough, and reliable enough for a car, in a year to a year and a half? They've been working on AI for fifty years. It's coming. But it ain't 12 to 18 months away.

I love my Model 3 and EAP. When they really do have FSD I will buy that if I'm still alive and ambulatory. I expect that to be in a decade if I'm really lucky.
 

diplomat33

Well-Known Member
Aug 3, 2017
7,193
8,224
Terre Haute, IN USA
And there are so many edge cases that you cannot simply list them. You need AI. And yes, they are working on AI. But do you really think they'll have an AI that's "I" enough, and reliable enough for a car, in a year to a year and a half? They've been working on AI for fifty years. It's coming. But it ain't 12 to 18 months away.

I love my Model 3 and EAP. When they really do have FSD I will buy that if I'm still alive and ambulatory. I expect that to be in a decade if I'm really lucky.

I am just a bit more optimistic than you. I don't think Tesla will have L5 autonomy next year but I don't think it will take 10-15 years either. After all, Waymo almost has L4 in certain geofenced locations. And AI is developing at an exponential rate.
 

daniel

Active Member
May 7, 2009
4,803
3,616
Kihei, HI
I retract, and apologize for, my use of the word "fail." What I should have said was that this was one of the few spots I've ever driven where EAP remained engaged yet did not drive better than I do. Almost always, when EAP cannot handle a situation it disengages. And this is the first intersection I've encountered, out of probably thousands, where it made a mistake. I commented on this mistake precisely because it was so unusual, but also to point out that it's the fringe cases that make the last 1% as hard to achieve as the first 99%.

With regard to Waymo's "almost" L4 in geofenced locations: The problem is that to get from 99.9% to driverless is probably going to be harder than getting from "dumb" cruise control to L4. I appreciate your saying you don't expect L5 next year. I wonder if anyone here expects Tesla to sell a driverless car to the general public next year? I really hope they do, and if they do I'll pay for the upgrade even if they increase the price to $20K. I just don't expect it before I'm ready to buy a new car.
 

Knightshade

Well-Known Member
Jul 31, 2017
11,645
15,727
NC
I retract, and apologize for, my use of the word "fail." What I should have said was that this was one of the few spots I've ever driven where EAP remained engaged yet did not drive better than I do. Almost always, when EAP cannot handle a situation it disengages. And this is the first intersection I've encountered, out of probably thousands, where it made a mistake. I commented on this mistake precisely because it was so unusual, but also to point out that it's the fringe cases that make the last 1% as hard to achieve as the first 99%.

With regard to Waymo's "almost" L4 in geofenced locations: The problem is that to get from 99.9% to driverless is probably going to be harder than getting from "dumb" cruise control to L4. I appreciate your saying you don't expect L5 next year. I wonder if anyone here expects Tesla to sell a driverless car to the general public next year? I really hope they do, and if they do I'll pay for the upgrade even if they increase the price to $20K. I just don't expect it before I'm ready to buy a new car.


FWIW I certainly don't expect L5 (which is what you need for driverless) next year- or the year after for that matter.

I DO expect they'll be able (pending regulations) to offer at least L3 or L4 on the highway (places EAP is intended to work today) in that time frame, and honestly, if I got nothing further from FSD I'd be pretty happy, since that's 95% of my driving anyway.

And I'd expect it to probably be capable of L3 in the city (though I think regulations will be even tougher on getting that allowed)... I think L4 will be tougher here because safely aborting/pulling over with no human is a lot more complex and challenging in such environments than on divided highways.
 

daniel

Active Member
May 7, 2009
4,803
3,616
Kihei, HI
My understanding (I could be wrong) is that Tesla has promised that I can upgrade to the FSD package for $5,000. I would do that to get L3. I could see that maybe coming in a couple of years, but it would still have to be able to deal with unexpected situations, so I'm not buying it until it's actually available. Just taking your eyes off the road is a really big step since the driver is no longer in a position to take over immediately.
 

Saghost

Well-Known Member
Oct 9, 2013
8,217
7,007
Delaware
My understanding (I could be wrong) is that Tesla has promised that I can upgrade to the FSD package for $5,000. I would do that to get L3. I could see that maybe coming in a couple of years, but it would still have to be able to deal with unexpected situations, so I'm not buying it until it's actually available. Just taking your eyes off the road is a really big step since the driver is no longer in a position to take over immediately.

You can upgrade to FSD for $5k today. They didn't guarantee that you could upgrade for that price at any time in the future. They'll almost certainly continue offering the option for your car in the future, but it might be for less money - or a lot more.
 

About Us

Formed in 2006, Tesla Motors Club (TMC) was the first independent online Tesla community. Today it remains the largest and most dynamic community of Tesla enthusiasts. Learn more.
