
Apple co-founder: 'I've really given up' on Level 5

Apple co-founder: 'I've really given up' on Level 5


Wozniak hoped Apple, which had been rumored over the years to be working on a self-driving car project, would be the one to build it.

But he has since tempered his expectations. There is simply too much unpredictability on roads, he said, for a self-driving car to manage. For now, he believes the burgeoning technology is better used to give drivers a safety net for certain situations.

"I stepped way back [on] this idea of Level 5. I've really given up," Wozniak said during the J.D. Power Auto Revolution conference in Las Vegas last week.

"I don't even know if that will happen in my lifetime."

Autonomous vehicles would fare better, he said, "if we were to modify roads and have certain sections that are well mapped and kept clean of refuse, and nothing unusual happens and there's no road work."
 
I really don't get the obsession with "Must handle every situation 100% of the time."

Not even humans can do that. Sometimes conditions are too poor and the car needs to pull over. If a self-driving car can follow traffic laws and avoid hitting other cars, pedestrians, bikes, etc., while doing it all more safely than a human, we're there.

Leave the damn steering wheel in the car and let the car safely give a message that it can't handle the situation and that the person needs to intervene (this is where Tesla is going to be for a long time, and I'm very glad I own this car).

If a tire comes flying off a car on the highway, again, even a human has a tough time with that.

The biggest issue is people need someone to blame. If I get in an accident everyone can just blame me and we're all happy.

If a self driving car gets in an accident, everyone loses their minds - even if the accident was unavoidable.
 
Apple co-founder: 'I've really given up' on Level 5

"I stepped way back [on] this idea of Level 5. I've really given up," Wozniak said during the J.D. Power Auto Revolution conference in Las Vegas last week.

I think he is being way too pessimistic. Frankly, it smacks of "the problem is harder than I thought it would be so I am just going to give up."
 
@DirtyT3sla I completely agree. I am tired of everyone expecting the car to be successful 100% of the time. As I saw in an article, once you push past 99.999%, each additional nine gets exponentially harder to achieve. Humans are FAR FROM PERFECT, yet we expect perfection to replace us. I am happy with a reduction in deaths and steady progress, that's it. But as you say, people want someONE to blame, and you can't blame an autonomous car.
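A rough sketch of why each added nine gets so much harder: every extra nine cuts the allowed failures by a factor of ten. The 100-million-mile scale below is just an assumed reference point for illustration.

```python
# Rough illustration: failures allowed at each reliability level,
# scaled to 100 million miles (an assumed reference distance).
MILES = 100_000_000

for nines in range(2, 8):
    reliability = 1 - 10 ** (-nines)             # e.g. 3 nines -> 99.9%
    allowed_failures = MILES * (1 - reliability)
    print(f"{nines} nines ({reliability:.5%}) -> {allowed_failures:>9,.0f} failures per {MILES:,} miles")
```

Each row is ten times harder to reach than the one before it, which is roughly what "exponentially harder" means here.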
 
There is simply too much unpredictability on roads, he said, for a self-driving car to manage. For now, he believes the burgeoning technology is better used to give drivers a safety net for certain situations.

He may be pessimistic, but you do have to wonder how the safety level of a fully autonomous car will compare to that of a human-driven car with a robust safety net. Imagine when all cars avoid 99% of run-off-road incidents, 99% of head-on collisions, and detect and intervene 99% of the time if the driver becomes incapacitated...etc...

That adds a couple of 9s to the human driver safety level...and the bar has potentially been raised for the autonomous vehicle (depending on what people decide is an acceptable error rate). At some point, it comes down to what is the maximum number of fatalities that is allowed...400 per year in the US...is that low enough?
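For a sense of scale on that 400-per-year figure, here's a rough sketch assuming a baseline of about 36,000 US road fatalities per year (an assumed ballpark, not a figure from this thread) and the avoidance rates mentioned above.

```python
# Back-of-the-envelope: what "adding a couple of 9s" does to annual fatalities.
# The 36,000 baseline is an assumed recent US figure, used only for illustration.
BASELINE_FATALITIES_PER_YEAR = 36_000

for avoidance in (0.90, 0.99, 0.999):
    remaining = BASELINE_FATALITIES_PER_YEAR * (1 - avoidance)
    print(f"avoid {avoidance:.1%} of fatal crashes -> ~{remaining:,.0f} deaths/year")
```

At 99% avoidance you land in the few-hundred-per-year range, which is roughly where the 400 number comes from.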

It'll be interesting to see how things evolve.
 
I think technically everyone here is agreeing with Wozniak that Level 5 is really really hard, and may not be achievable. The moment you are talking about adding steering wheel and human back-up, you are talking Level 4, not 5.

Perhaps we can compare it to trains and airplanes. On an airplane, the critical systems need a reliability of 1e-9, meaning a fatality or major injury can only happen once every 1,000,000,000 flights. So when a crash does happen, it is a major deal. The question is, can an autonomous car ever get to that level of safety? I have my doubts...
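To put that 1e-9 figure next to the usual driving statistic (roughly one fatality per 100 million vehicle miles, i.e. about 1e-8 per mile), here's a rough comparison. Both numbers are ballpark assumptions, and the units (per flight vs. per mile) don't line up exactly, so treat it as an order-of-magnitude sketch.

```python
# Order-of-magnitude comparison (all figures are ballpark assumptions).
aviation_catastrophic_rate = 1e-9                 # per flight, as quoted above
human_fatality_rate_per_mile = 1 / 100_000_000    # ~1 death per 100M vehicle miles

gap = human_fatality_rate_per_mile / aviation_catastrophic_rate
print(f"Today's human-driving fatality rate is ~{gap:.0f}x the quoted aviation target")
```

So even today's "pretty reliable" human driving sits an order of magnitude or more away from aviation-style numbers, before you account for the unit mismatch.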
 
I think he is being way too pessimistic. Frankly, it smacks of "the problem is harder than I thought it would be so I am just going to give up."

I agree. Steve Jobs was a 'magical thinker,' as is Musk.

The dialog cycle is:
Musk: "We will build this, and it will be great."
Subordinate: "No, it can't be done."
Musk: "We will build this, and it will be great."
Replacement Subordinate: "Yes, sir."
 
Woz has a good point. The advantage Tesla has is the networking between Tesla itself and a vast fleet of cars that aids the fleet-learning process. Perfection may be impossible to achieve, but as time passes the cars will improve, probably massively so. I wonder if regulators will limit self-driving cars to reserved lanes unless and until certain standards are met for driving in lanes with all the other cars. Clearly, the level of competence for FSD will vary between car models and manufacturers.
 
I think technically everyone here is agreeing with Wozniak that Level 5 is really really hard, and may not be achievable. The moment you are talking about adding steering wheel and human back-up, you are talking Level 4, not 5.

I agree that L5 is really really hard. I do not agree that it may not be achievable.

No. Adding a steering wheel and a backup driver does not downgrade a car from L5 to L4. An L4 car can also have no steering wheel and no backup driver.
 
@DirtyT3sla I completely agree. I am tired of everyone expecting the car to be successful 100% of the time. As I saw in an article, once you push past 99.999%, each additional nine gets exponentially harder to achieve. Humans are FAR FROM PERFECT, yet we expect perfection to replace us. I am happy with a reduction in deaths and steady progress, that's it. But as you say, people want someONE to blame, and you can't blame an autonomous car.

This is true. I recently read a fascinating book on the history of GPS (which was far more of a political battle within the Air Force than I ever imagined), which included its use in automated aircraft landing. There are a number of locations where GPS, in conjunction with ground-based beacons, enables the 100% automated landing of passenger airliners, thousands of times per year. The "battle for the 9's" required that the system be something like 99.999999% accurate, otherwise there would be a statistically unacceptable risk of an airliner crashing, considering how many landings occur annually.
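As a rough sanity check on why that many nines matter, here's a sketch assuming on the order of 10 million airliner landings per year, which is an illustrative round number rather than a figure from the book.

```python
# Expected autoland failures per year at different accuracy levels.
# LANDINGS_PER_YEAR is an assumed round number, purely for scale.
LANDINGS_PER_YEAR = 10_000_000

for accuracy in (0.99999, 0.9999999, 0.99999999):
    expected_failures = LANDINGS_PER_YEAR * (1 - accuracy)
    print(f"{accuracy:.8%} accurate -> ~{expected_failures:.2f} failed landings/year")
```

With millions of landings a year, even "five nines" still leaves a steady trickle of failures, which is why the target ends up so far to the right of the decimal point.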

I hope that legislators realize that fully autonomous driving can never be "perfect," just as automated airline landings aren't perfect, but that they'll approve its use when it's clear that autonomous driving is consistently safer than humans.
 
but you do have to wonder how the safety level of a fully autonomous car will compare to the safety of a car with a robust safety net, with a human driver. Imagine when all cars avoid 99% of run-off road incidents, 99% of head-on collisions, and detect and intervene 99% of the time if the driver becomes incapacitated...etc...

I am not sure I completely buy this idea that human drivers are so great that it would be better to just add driver-assist and active-safety systems rather than fully autonomous systems. Humans only have two cameras (eyes) that can't see 360 degrees at the same time. And some humans drive with bad eyesight. Human drivers also get tired, drive drunk, or get distracted. An autonomous car with 360-degree camera vision, 360-degree LIDAR, 360-degree radar, and a computer that never gets distracted or tired, can process images in a fraction of a second, and can react faster than any human certainly has the potential to be a far better driver than a human. So, I definitely see the appeal in having an autonomous system do all the driving and removing the human from the driving equation completely.
 
I am not sure I completely buy this idea that human drivers are so great that it would be better to just add driver-assist and active-safety systems rather than fully autonomous systems. Humans only have two cameras (eyes) that can't see 360 degrees at the same time. And some humans drive with bad eyesight. Human drivers also get tired, drive drunk, or get distracted. An autonomous car with 360-degree camera vision, 360-degree LIDAR, 360-degree radar, and a computer that never gets distracted or tired, can process images in a fraction of a second, and can react faster than any human certainly has the potential to be a far better driver than a human. So, I definitely see the appeal in having an autonomous system do all the driving and removing the human from the driving equation completely.

I definitely did not say anything about human drivers being great! Specifically, I implied they run off the road, cross the center line, and fall asleep.

The question is really one of where each system excels. Human failures are very often situations where a computer could easily handle the problem and avoid the accident, while computer failures are often dealt with spectacularly well by humans.

So it is like a match made in heaven. Maybe.

We’ll see.
 
I definitely did not say anything about human drivers being great!

Sorry. I guess I am still thinking about the NOVA show that quoted the stat of only 1 crash death per 100 million miles. And based on that stat, the video implied that human drivers are super reliable.
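That statistic translates into a surprisingly long string of nines if you express it per mile, which is a rough sketch of why it sounds so reliable:

```python
# Converting "1 crash death per 100 million miles" into a per-mile reliability figure.
deaths_per_mile = 1 / 100_000_000
reliability_per_mile = 1 - deaths_per_mile

print(f"Per-mile fatality risk:  {deaths_per_mile:.1e}")
print(f"Per-mile 'reliability':  {reliability_per_mile:.8%}")   # eight nines per mile
```

Of course a fatality is the rare worst-case outcome per mile, so the per-mile framing flatters human drivers; aggregated over a lifetime of driving, the same rate looks much less impressive.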

The question is really one of where each system excels. Human failures are very often situations where a computer could easily handle the problem and avoid the accident, while computer failures are often dealt with spectacularly well by humans.

So it is like a match made in heaven. Maybe.

We’ll see.

I could maybe see a "hybrid" system as an intermediate stopgap before we get to L5. But I still think that removing the human driver should be the ultimate goal.

Data does show that driver-assist systems that become too good cause the human driver to become complacent and less effective. So how, exactly, do you create a system that takes the best of human drivers and the best of autonomous systems? Perhaps that is why virtually every company working on autonomous driving is focusing on full autonomy rather than creating some driver-assist hybrid system.
 
Data does show that driver-assist systems that become too good cause the human driver to become complacent and less effective. So how, exactly, do you create a system that takes the best of human drivers and the best of autonomous systems? Perhaps that is why virtually every company working on autonomous driving is focusing on full autonomy rather than creating some driver-assist hybrid system.

Yes, the devil is in the details on that one.
However, I could see last-resort systems working pretty well and being very much in the background. In fact, I expect they will become mandatory over the next decade or so, though that depends on the progress of autonomy, I suppose. Just because Tesla does not choose to implement things this way (meaning, in the background, though in fact they do to some extent) does not mean every manufacturer has to use the same approach.

And then there is the question of whether a system that is really good (at the level I suggested) at these background tasks MIGHT be actually able to be autonomous...and how/whether that crossover happens. Not sure we know the answer yet. Maybe Waymo does.
 
And then there is the question of whether a system that is really good (at the level I suggested) at these background tasks MIGHT be actually able to be autonomous...and how/whether that crossover happens. Not sure we know the answer yet. Maybe Waymo does.

You raise a good point. At the point where your autonomous car can respond better than the human driver and save the driver from accidents, maybe it is pretty close to being able to just drive itself.

The Waymo CEO did give a presentation where he said that they believe very strongly in removing the human driver completely from the driving equation which is why they are so focused on L4 autonomy. He played a video from an experiment they did a while back where they put safety drivers in an advanced L2 system and noticed that the safety drivers were not paying attention to the road. So they discontinued the experiment and switched to developing L4 autonomy full-time.

 
You raise a good point. At the point where your autonomous car can respond better than the human driver and save the driver from accidents, maybe it is pretty close to being able to just drive itself.

The Waymo CEO did give a presentation where he said that they believe very strongly in removing the human driver completely from the driving equation which is why they are so focused on L4 autonomy. He played a video from an experiment they did a while back where they put safety drivers in an advanced L2 system and noticed that the safety drivers were not paying attention to the road. So they discontinued the experiment and switched to developing L4 autonomy full-time.

I heard about this. I completely agree, mostly from experience with Autopilot. I do let AP do all the driving on the highway. Where I think I may be different from others, hopefully, is that I look ahead to make sure there are no obstructions before 'letting go', though not for too long. I can see most people getting really comfortable with these technologies.

I do want to play devil's advocate and ask the question: if AP is 'safer' than a human, on a highway let's say, what is wrong with letting it take over? It won't be able to handle every situation, but neither can I, and it will probably react faster than me unless it is a stationary semi-truck trailer or fire engine. I am against thinking we need perfection from these systems; just make it as good as me so I don't have to drive, that's all.
 
The "battle for the 9's" required that the system be something like 99.999999% accurate, otherwise there would be a statistically unacceptable risk of an airliner crashing, considering how many landings occur annually.

As I recall, the number thrown around in the FAA's program office was "five nines." Based on that I'd accept five nines in an automobile autopilot, and based on what I experienced on my last road trip, we are far from that.
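A hedged sketch of what "five nines" could mean at fleet scale, if you (arbitrarily) attach it to each mile driven; the per-mile unit and the 3-trillion-mile figure are my assumptions, not anything from the FAA program.

```python
# What "five nines" (99.999%) per mile would mean across the US fleet.
# The 3 trillion annual vehicle-miles figure is an assumed ballpark.
US_VEHICLE_MILES_PER_YEAR = 3_000_000_000_000

failure_rate_per_mile = 1 - 0.99999        # five nines -> ~1e-5 failures per mile
expected_failures = US_VEHICLE_MILES_PER_YEAR * failure_rate_per_mile
print(f"~{expected_failures:,.0f} failure events/year at five nines per mile")
```

The unit you attach the nines to (per mile, per trip, per hour of operation) changes the answer by orders of magnitude, which is part of why these reliability comparisons get slippery.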

I used to have an AI section in my department, and I remember thinking at the time that it was BS. Fast forward 40 (?) years and I'm thinking, well, maybe I was wrong. They may actually have made some progress that I know nothing about. But driving the current "autopilot" in the Tesla, I think I was probably right, and that Woz is too.
 
I do want to play devil's advocate and ask the question: if AP is 'safer' than a human, on a highway let's say, what is wrong with letting it take over? It won't be able to handle every situation, but neither can I, and it will probably react faster than me unless it is a stationary semi-truck trailer or fire engine. I am against thinking we need perfection from these systems; just make it as good as me so I don't have to drive, that's all.

AP is currently a driver-assist system, so by definition it is not safer than a human alone. However, AP plus an attentive human driver is safer than a human driver alone. And, if you remain attentive and engaged at all times, it is OK to temporarily "let AP take over," but you must be prepared to intervene if need be. But that is where complacency comes in. You start to think it is OK to "let AP take over" because it seems to be handling that long stretch of boring highway driving just fine, but it causes you to become less attentive. Sadly, if you stop paying attention, you might not be able to intervene if AP fails. With a driver-assist system, you are still responsible for driving the vehicle even if AP appears to be driving for you. And that's the paradox of driver assist: the system appears to be driving the car when in reality the human driver is still responsible for driving the car.

I would rank safety as follows, from least safe to most safe:
driver assist (least safe) ---> human ---> driver assist + human ---> autonomous (most safe)