
some examples of where level 5 autonomy might have difficulty

" Say you're driving down a two-way street and there's a lorry unloading a delivery in the opposite lane. The oncoming traffic needs to pull out into your lane to overtake.
What do you do?"

See this well-argued BBC article:
Would you bully a driverless car or show it respect? - BBC News

and I've thought of more below - but I'm sure hundreds of others exist - and the software to get there is unfathomably complicated... it's that last 1% of issues that makes level 5 almost impossible in my opinion - unless you're going to use a different road network, but then you can't have both.
This is why I think people will be foolish to pay for level 5 features now, when it might be a decade (if ever) before the software can do this and the legislation is passed to allow it.
Also - selling it in advance is just Tesla helping their margins - and selling undeliverable promises. Installing $500-1000 of hardware (don't believe for a second it's $8k of hardware) and getting people to pay $8,000. I imagine it's a $150 graphics card, some low-res digital cameras (it's not like you need 40 megapixels) and some sensors - and we know they're not expensive. The rest is software.


But I've thought of some others:
Temporary roadworks on a small road where a bloke in a high-vis is waving the car forward by hand or with a manual STOP / GO sign.
A country lane narrower than 2 cars where there's only occasional passing spots. Who gives way? Sometimes you have to reverse for 100m - but then you find another car's behind you.
A partially blocked road where the car needs to reverse out - e.g. a lorry delivering something.
A country road with a fallen tree blocking some of the road.
A flooded street (burst water main / low dip in a road after flooding etc).
Pulling over for a fire-engine / ambulance to go past. Just stopping might block the emergency services from passing - even some humans panic and stop in the wrong place making it harder for the emergency vehicle to get round.
Weaving cyclists / motorbikes triggering emergency stop situations.
Cats / dogs / lots of pheasants in the UK that can damage a car.

Oh - and something silly - but there are ways people could deliberately mess with automated cars to cause a crash:
Somebody holds up a pretend speed sign saying "Tesla vehicles only :)" with a legitimate-looking "70" mph sign underneath - even though it's a 30 zone. Would the Tesla use its map knowledge of the road or read the sign? The human knows it's a joke.
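Just to illustrate the kind of cross-check you'd hope for (a purely hypothetical sketch - the function, thresholds and road classes below are mine, not anything from Tesla's code): only accept a vision-detected speed limit if it's plausible for the road type and roughly agrees with the mapped limit.

```python
# Hypothetical sketch: sanity-checking a camera-read speed limit against map data.
# Function name, thresholds and road classes are illustrative, not Tesla's implementation.

def plausible_speed_limit(sign_mph: int, map_limit_mph: int, road_class: str) -> bool:
    """Accept a vision-detected speed limit only if it is consistent with
    what the map says for this stretch of road."""
    # Reject anything outside a sane range for the road type.
    sane_range = {"residential": (20, 40), "rural": (30, 60), "motorway": (50, 70)}
    lo, hi = sane_range.get(road_class, (20, 70))
    if not (lo <= sign_mph <= hi):
        return False
    # Reject large disagreements with the mapped limit (e.g. a "70" sign in a 30 zone).
    return abs(sign_mph - map_limit_mph) <= 10

# A prank "70" sign on a mapped 30 mph residential street gets ignored:
assert plausible_speed_limit(70, map_limit_mph=30, road_class="residential") is False
assert plausible_speed_limit(30, map_limit_mph=30, road_class="residential") is True
```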

One thing Elon is definitely right about (and was criticized by some for saying) is that it's impossible to ever get to zero fatalities.
There's always the totally unexpected:
2015 Shoreham Airshow crash - Wikipedia
 
Autonomous vehicles are intended to provide a net improvement in safety. This doesn't necessarily include handling every rare or far-fetched event that could happen. Moreover, there is a lot of industry-wide discussion about this topic. Suggestion: have a look at this article explaining US DOT's efforts to coordinate; also, go to www.dot.gov and search for the term "autonomous" - you'll find several other related links.
 
Yeah, there are going to be several lifetimes spent writing the test suite needed for these systems to be "certified" by the DOT and other national standards bodies. There are lots of edge cases whose behavior will need to be verified before regulators will sign off on it.

And that's without considering regional driving styles. As an example, if I drove around home how I was recently driving in Rome (and I was a "timid" driver there), I'd be arrested...
 
Autonomous vehicles are intended to provide a net improvement in safety.
This makes sense from a system-wide point of view. But as a system user, I only care about whether an autonomous vehicle is safer than driving myself.

I suppose the ideal situation is that I get to drive for myself, while everyone else has to ride around in their autonomous vehicle. :)
 
Yeah, there are going to be several lifetimes spent writing the test suite needed for these systems to be "certified" by the DOT and other national standards bodies. There are lots of edge cases whose behavior will need to be verified before regulators will sign off on it.

And that's without considering regional driving styles. As an example, if I drove around home how I was recently driving in Rome (and I was a "timid" driver there), I'd be arrested...

What is wrong with making self-driving cars pass the relevant local driving test? If it is good enough to certify a human ...
 
This makes sense from a system-wide point of view. But as a system user, I only care about whether an autonomous vehicle is safer than driving myself.

I suppose the ideal situation is that I get to drive for myself, while everyone else has to ride around in their autonomous vehicle. :)
LOL. Well, this might be safer for you if you are a better driver than the autonomous system in all cases, which means that you are never distracted, never go over the speed limit, never miss another vehicle in a blind spot, never drift out of your lane, etc., which would make you one of the world's safest drivers. That's great! ... are you a driving instructor or commercial driver? (I'm not meaning to malign your skills, which may indeed be excellent... but, just by way of additional info, I have read that something like 90% of all drivers believe that they are better-than-average drivers, which is mathematically impossible.)

I don't think anyone has the illusion that (at least with technology of the next 5 years) we're going to have autonomous vehicles that can handle every conceivable incident; the idea is that they should be significantly better than the average driver, and thus reduce the total number of crashes and especially fatalities. Even US DOT recognizes this, so they are not standing in the way of development of these systems. There are certainly unusual circumstances that arise on the road from time to time - but even human drivers can take the wrong action in certain cases.

If you look at the leading causes of crashes, DUI is at or near the top of the list. An autonomous system won't do that. Another leading cause of crashes is "aggressive driving" (speeding, cutting off other drivers, tailgating, etc.) - autonomous systems won't do that either.

So I appreciate your concern, I have some of the same concerns, but this is classic change management... we have to look towards the advantages of the new situation, while also managing risk as best we can. ;)
 
My guess is there will be situations when a level-4 autonomous car will raise its hands and say clearly to the driver, "I can't handle this."

But with every year that passes, those situations will become a lot less frequent, because the system learns from the driver what it should do, and everybody around it will have adapted to an autonomous vehicle.

We can't expect a perfect level 5 ever, especially considering some road markings leave even an experienced driver puzzled.
Everybody will have to adjust to a new reality. But that doesn't mean it is not coming... fast.
 
I'll give you a real-world experience; the single most memorable occurrence in my millions of miles of driving; I've been thinking about it every time I consider the world of autonomous vehicles.

This happened the last time we traveled through the Yukon in mid-winter. Nasty, wretched white-out conditions. Mile after mind-altering mile of crawling on at 10...9...8 mph, probing with the tires to determine where the road was.

==>Interlude #1. LOTS of places receive white-out conditions, places that get a LOT more traffic than does Yukon Territory. Obviously, the visual elements of autonomy are useless, but what about radar? Can it do much better than locate other vehicles, the occasional road sign, and those invisible trees that suggest that might not be the appropriate path? GPS? Is it precise enough to keep a vehicle where some program says a road ought to be? Augmented, perhaps, by prior breadcrumbs of earlier-passed smart vehicles?

Back to the story. While we were inching along at 10 mph, a full-sized truck's driver decided 11 mph would be all right. Very, VERY happy to let him pass us: he could determine where the road was, and all we would have to do was follow his tail lights. SOP in conditions like these. Just keep those tail lights not too close, not too far. So we did.

After a few zillion hours like this, that truck's brake lights come on. This is when you brake first, ask questions later - so we brake too. But before we could even fully formulate the complex question "Why?"....his lights vanish! One moment there, next moment utterly gone.

==>Interlude #2. What happened? If anyone cares, offer a plausible explanation or two, and write how you would respond. Here is a clue: I am still of this planet.

To be continued.
 
Tesla isn't trying to reach level 5 with these cars - per SAE, that's a car which doesn't have a steering wheel or brake pedal.

Some edge cases are easier than others - I don't see why it would be hard to program a response to the construction worker or the emergency vehicle.

One option for some cases might be for the car to imitate what the other cars are doing, like AP1 following a car when it loses all other references.
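Purely to illustrate that idea (a toy sketch, not AP1's actual control logic - the mode names and function below are invented), a fallback hierarchy might look something like this:

```python
# Illustrative only: a toy fallback hierarchy, not AP1's real behaviour.
from enum import Enum, auto

class Mode(Enum):
    LANE_KEEPING = auto()   # lane markings visible: follow them
    FOLLOW_LEAD = auto()    # markings lost: shadow the tracked car ahead
    DEGRADED_STOP = auto()  # nothing to follow: slow down and hand back control

def choose_mode(lanes_visible: bool, lead_vehicle_tracked: bool) -> Mode:
    if lanes_visible:
        return Mode.LANE_KEEPING
    if lead_vehicle_tracked:
        return Mode.FOLLOW_LEAD
    return Mode.DEGRADED_STOP

# Markings obscured but a car ahead is still tracked -> imitate it.
assert choose_mode(lanes_visible=False, lead_vehicle_tracked=True) is Mode.FOLLOW_LEAD
```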

I suspect Tesla will be at an advanced level 3 for a long time - the car basically driving itself but the driver needs to be present to address edge cases occasionally.
 
I'll give you a real-world experience; the single most memorable occurrence in my millions of miles of driving; I've been thinking about it every time I consider the world of autonomous vehicles.

This happened the last time we traveled through the Yukon in mid-winter. Nasty, wretched white-out conditions. Mile after mind-altering mile of crawling on at 10...9...8 mph, probing with the tires to determine where the road was.

==>Interlude #1. LOTS of places receive white-out conditions, places that get a LOT more traffic than does Yukon Territory. Obviously, the visual elements of autonomy are useless, but what about radar? Can it do much better than locate other vehicles, the occasional road sign, and those invisible trees that suggest that might not be the appropriate path? GPS? Is it precise enough to keep a vehicle where some program says a road ought to be? Augmented, perhaps, by prior breadcrumbs of earlier-passed smart vehicles?

Back to the story. While we were inching along at 10 mph, a full-sized truck's driver decided 11 mph would be all right. Very, VERY happy to let him pass us: he could determine where the road was, and all we would have to do was follow his tail lights. SOP in conditions like these. Just keep those tail lights not too close, not too far. So we did.

After a few zillion hours like this, that truck's brake lights come on. This is when you brake first, ask questions later - so we brake too. But before we could even fully formulate the complex question "Why?"....his lights vanish! One moment there, next moment utterly gone.

==>Interlude #2. What happened? If anyone cares, offer a plausible explanation or two, and write how you would respond. Here is a clue: I am still of this planet.

To be continued.

If Tesla develops their systems the way I think they will, the car should be much more confident of where the road is supposed to be than you or I would be.

As you can see in Remote S, the car knows the exact parking spot it is in, and I expect the GPS precision and map precision to continue to improve - to the point that the car could stay in the lane based purely on GPS input.

Tesla is also in the process of building the radar whitelist, which I suspect will develop into a radar navigation map: the car downloads a tile for the area showing what the radar should see, and based on where the objects on the tile are, it knows where it is.
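To make that idea concrete, here's a toy sketch of localizing against such a tile (entirely hypothetical - this is not Tesla's implementation, and the function names are mine): candidate positions are scored by how many live radar returns line up with mapped static objects, and the best-scoring candidate wins.

```python
# Hypothetical sketch of localizing against a "radar map tile": score candidate
# positions by how well live radar returns line up with mapped static objects.
# A toy illustration only - not Tesla's actual radar-whitelist implementation.
from math import hypot

def match_score(returns, tile, candidate_xy, gate=2.0):
    """Count radar returns that land within `gate` metres of a mapped object
    when the car is assumed to sit at candidate_xy (tile coordinates)."""
    cx, cy = candidate_xy
    score = 0
    for rx, ry in returns:                  # returns are car-relative (x, y) in metres
        gx, gy = cx + rx, cy + ry           # transform into tile coordinates
        if any(hypot(gx - ox, gy - oy) <= gate for ox, oy in tile):
            score += 1
    return score

def localize(returns, tile, candidates):
    """Pick the candidate position whose radar returns best match the tile."""
    return max(candidates, key=lambda c: match_score(returns, tile, c))

# Toy example: two mapped objects; the car is actually near (10, 0).
tile = [(15.0, 0.0), (20.0, 3.0)]
returns = [(5.0, 0.0), (10.0, 3.0)]
print(localize(returns, tile, candidates=[(0.0, 0.0), (10.0, 0.0)]))  # -> (10.0, 0.0)
```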

My guess is that the truck driver parked the truck and turned the lights off.
 
Almost everyone wants Level 5 today. Almost everyone agrees Level 5 is impossible. Should Tesla just throw in the towel? Of course not; some company needs to step up and get the ball rolling. But expectations need to be managed appropriately, as well as possible.
 
Here are some scenarios where I think automation may face big challenges.

1. In California, motorcyclists can ride between two cars, so they essentially make their own lane. We are supposed to watch for them, which is nearly impossible for a driver.
2. Here in California, we have a ton of cyclists riding for leisure. They illegally ride side by side when there is no bike lane, chatting with their friends. In some places it is quite safe if the shoulder is wide; in other places it is downright dangerous. Sometimes the pelotons come through, and they are extremely aggressive and even cross over into the opposing lane.
3. Try driving in India, where there really are no lanes.
4. In different parts of the world there are large animals on the roads... cows in India, buffalo in Yellowstone, sheep in Australia. In California we have deer that at times get frightened and jump in front of you.
5. When the Sierras storm, they really, really storm. Once, back when I was an avid skier, I avoided I-80 and took a side road up. The road was open, I had 4WD and it was a lonely road, but it was nearly impossible to see.

Google has been working on this for years, driving around mostly Mountain View, trying to get to that last 1%.

Autonomous driving doesn't interest me all that much unless I can drive while asleep, or at least while surfing the web. For me it's hard to pay attention if I am not at least looking at the road. I drive on mental autopilot most of the time.
 
Saghost wrote:
If Tesla develops their systems the way I think they will, the car should be much more confident of where the road is supposed to be than you or I would be.

That may be; I certainly don't know. What I can share with you is that experienced North Country drivers do a lot of winter driving by, in effect, the seat of our pants. We "feel" the road - where the crown is, what the slope of our... and the opposite... lane seems to be, subtle differences in the wear patterns of the pavement. And, obviously, the difference between what is the road surface and what is the edge, and the no-brainer "ga-dump ga-dump" of center-lane Botts' dots, of which there are approximately zero on the roads I'm discussing.
All of that is made the more subtle by snow/ice coverings. I'll be the first one to congratulate programmers who build those kinds of experiences into their algorithms.

Meanwhile, back in the blizzard -

So where were we? Hearts in our mouths, soft stuff in our pants, foot on the brake pedal and wondering what could have caused the truck's lights to disappear. And then - looming out of the blinding snow just a few feet from our pickup's hood -

a bison. Crossing the road, directly behind the truck. Others of its herd had, we later learned when both the truck and we stopped for a conversation, begun to cross not in front of but effectively "alongside" the leading truck; he saw those apparitions next to him and braked; as the bison entered the road, its massive bulk extinguished the truck's lights from our vision. And as those lights had been the only distinguishable object in our diminished world, when they went out so did everything.

Now, a bison is the very largest creature anyone, anywhere outside Africa, can encounter on the world's highways. And few hippos or elephants wander about in blizzards. At twice the mass of a moose, a bison is more than a match for any vehicle short of the largest Class 8s, and even those only under just the right circumstances. We were very, very lucky.

Back to autonomous driving - can today's non-visual hardware properly anticipate such a situation? Can it parse the back of a semi with a monster passing by it? And react appropriately? Remember: the truck braking was not the truck stopping. It was the driver gut-reacting ex-post to an event that was, for its driver, already finished - his foot was on the brake pedal only momentarily. Fortunately for us, that was enough for us also to react and slow ourselves for some unknown event.

The first word out of my mouth was holy and the second one wasn't.
 
For ride-sharing systems like Uber that operate in a fixed environment and at lower speeds, I see level 5 happening soon. Also, in those types of environments the car can call for human help and have someone at a control center take over.
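As a rough illustration of what that "call for help" handoff could look like (a hypothetical sketch - the vehicle and dispatcher objects and their methods are invented, not any vendor's actual API):

```python
# Hypothetical sketch of remote-operator escalation for a geofenced fleet.
# The vehicle/dispatcher objects and their methods are invented for illustration.
import time

def handle_unresolvable_scene(vehicle, dispatcher, timeout_s=30):
    """Come to a safe stop, then ask a remote operator for guidance."""
    vehicle.execute_minimal_risk_stop()        # e.g. pull to the kerb, hazards on
    request = dispatcher.request_remote_assist(vehicle.id, vehicle.scene_snapshot())
    deadline = time.time() + timeout_s
    while time.time() < deadline:
        guidance = dispatcher.poll(request)    # operator-approved path or waypoints
        if guidance is not None:
            return vehicle.follow(guidance)
        time.sleep(1)
    return vehicle.stay_stopped()              # no help in time: remain stationary
```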

We Tesla drivers can expect to see level 3 autonomy relatively quickly, where the car has total control in certain environments. For example, you get on the highway and it has total control, but when you get off the highway the driver takes over.