Elon Musk Says Tesla Is 'Very Close' To Level 5 Self-Driving Technology

I know this thread is about autonomy, but what we have now does 90% of my driving and I would not want to live without it. It is pretty great, IMO.

Review of Full Self Driving Feature of my New Tesla Model 3

by Henry Farkas


I actually didn't need to buy a car during the COVID-19 pandemic. My wife and I are in our 70s. Where could we go? But Elon Musk took a couple of thousand dollars off the price of the M3. I figured he'd raise the price once there was a cure or a vaccine. I had already test driven all the fully electric cars before the pandemic started, and Tesla's Full Self Driving (FSD) was better than the lane-keeping of the other electric cars. And Tesla gives free software updates. None of the other car companies do that.


I've had the Tesla for around eight weeks now. During this time, I've received three software updates. My previous car, a Chevy Volt that my wife and I leased and then bought at the end of the lease, got one software update at the beginning of the lease, when I complained to the dealer that cruise control often stopped working apropos of nothing while I was driving on the Interstate in good weather. I had to take the car into the service center for the update, and I had to wait an hour or so. The update didn't solve the problem, but it reduced the frequency of the issue.


Anyway, back to Tesla. I paid the extra seven thousand dollars for the FSD feature because my wife doesn't think I'm a good driver. So I figured I'd go with the artificial intelligence of the Tesla, which has had several billion miles of driving experience. That's more miles than I've had, even though I'm very, very old.


Sadly, artificial intelligence doesn't benefit from driving experience quite as much as biological intelligence does, at least human biological intelligence.


My criterion for excellent driving is driving that doesn't prompt my wife to comment negatively about a particular driving event during a car trip. I, personally, haven't achieved that rarefied level of driving excellence except on very short trips to the grocery store or to nearby restaurants to pick up takeout food. In my defense, her driving would, at times, elicit comments from me if it weren't for the fact that such comments might have a negative effect on my ability to get lucky.


But I have to say that so far, both of us are way better drivers than the Tesla is.


I hope the Tesla aficionados can restrain themselves from flaming me over the previous statement. I want the Tesla to be as good a driver as a person who makes a living as a chauffeur for rich people, who never gets sleepy, and whose attention never wanders.


Tesla's AI just isn't there yet. When my Tesla sees a red light or a stop sign ahead, it flashes a message on the screen saying that it plans to stop in 500 feet, and then it abruptly accelerates. It does stop in time, but why does it need to accelerate noticeably before it starts to slow down? This happens often. When there's time, an excellent chauffeur accelerates and decelerates so gently that passengers don't even notice that the speed of the vehicle is changing.


Possibly, not all people want such a sedate driving experience. If that's all it is, then the Tesla engineers can just put in a driving mode. They could call it Sedate Mode, or Chauffeur Mode, or just Make Your Spouse Happy Mode.


There are more issues with FSD than just abrupt changes in speed. When the Tesla is in the right lane on a limited-access highway and it's passing an exit, the painted lines start to widen. The right thing to do if you don't plan to take the exit is to keep going straight. That's not what Tesla's AI does. Instead, it starts to veer right in order to stay centered between the lines. It does this with an abrupt, noticeable movement. Then, when the line separating the exit lane from the travel lane appears, the AI abruptly switches back to the center of the travel lane. The car should see far enough ahead to know that the lines are widening because there's an exit. It should know that unless the turn signal is on, the driver doesn't intend to take the exit, and it shouldn't veer right to stay centered between the widening lines. Those movements don't merely upset the non-driving people in the car. They make the driver of the following car think, briefly, that the Tesla is about to exit without signaling. If the following driver speeds up, figuring that the Tesla is about to exit, there could be an accident when the Tesla abruptly swerves back into the travel lane.


Staying in the center of the travel lane seems to be a priority for the Tesla AI, but recently, it swerved my car toward the Jersey wall (those concrete walls they sometimes put on the side of roads) on the right side of the road. About a year ago, there was a fatality when a Tesla swerved into a Jersey wall. Apparently, the AI that drives a Tesla still doesn't always recognize that Jersey walls are vertical walls rather than horizontal parts of the travel lane.


Finally, there's the really big issue that sometimes the Tesla doesn't see an object stopped in the road. We know what the car sees and what it doesn't see: the objects it detects show up on the left side of the center screen as we're driving along. I've had the car less than two months, and this has happened to me twice that I've noticed.


In summary, I like the FSD feature, but it's still not a good enough driver to make my wife happy, and it's not safe for the human driver to take their attention off the road for even a few seconds. Judging by the way my FSD feature works, it's going to be a long time before I can send my Tesla out to work as a taxi when I won't be needing it myself.



Henry
 
Ah - but wouldn't you agree that even if there is no need (which is not yet proven), there is still an opportunity being missed here? We have, for the first time in history, an absolutely massive industry-wide investment focused on the goal of removing the human driver whilst increasing the safety, efficiency and speed of personal transportation.

Rather than hope each individual company somehow magically solves the entire problem by emulating how humans drive, there's a rather more obvious and interesting opportunity: imbuing the entire fleet with a degree of foresight that we (as humans) would never be able to achieve...

Imagine - you could do things like optimising traffic flow based on weather conditions, automatically re-routing everyone to account for major construction, even adjusting each individual car's future lane choice based on relevant information - like a breakdown that's occurred further up your route, or allowing unimpeded access for emergency vehicles. Smarter infrastructure and vehicle communication would cut travel time, maximise range and increase overall efficiency whilst being safer and more widely available to all vehicle types, makes and models. Seems like a perfect fit for the current investment climate...

It seems to me like this is the obvious piece of the puzzle that's missing, and that no one seems to be doing much about.

The debate here is about L5 and whether it can be achieved purely by visual means.
The interim steps should be required to be at least as safe as a human driver.

Your additional items are nice, but they only serve to distract from the core point, seemingly to downplay any Tesla progress just for the sake of conflict.
 
It doesn't really matter if Tesla can get to something anywhere near a fully self-driving car (doubtful)... being allowed to use it is a whole other ball game. As I always chime in on these threads: infrastructure changes and V2I and V2V systems will need to be in place before any kind of autonomous driving is viable in a meaningful way.

All this "I drive with two eyes therefore that's what autonomous cars should do" is very short sighted (both literally and figuratively speaking)
I agree with you in the context of, say, not having a driver in the car, or a car not having a steering wheel, that kind of deal. Otherwise, Tesla has an enormous data set to pull from, with which they can say: over x-xxx million miles of Navigate on Autopilot being engaged, there were x many accidents or needed disengagements. They will be able to repeat this for stop lights, stop signs, turns through intersections, navigating on city streets, etc. They could quite literally point to their factual data to show the decreased likelihood of an accident occurring while using their systems relative to the average driver. Do you think that, if Tesla requires a driver in the seat with their hands on the wheel, a regulator would say you cannot provide this functionality, even if it would demonstrably prevent car accidents and reduce the likelihood of deadly accidents?
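
To make the shape of that argument concrete, here's a minimal sketch of the rate comparison Tesla could put in front of a regulator. Every number below is an invented placeholder (the real x's above are unknown); only the arithmetic is the point:

```python
# Hypothetical accident-rate comparison of the sort described above.
# All figures are made up purely for illustration.

def per_million_miles(accidents: int, miles: int) -> float:
    """Accidents per million miles driven."""
    return accidents / (miles / 1_000_000)

ap_miles = 300_000_000             # miles with Navigate on Autopilot engaged (made up)
ap_accidents = 650                 # accidents logged during those miles (made up)

human_miles = 3_200_000_000_000    # total miles driven by average drivers (made up)
human_accidents = 6_700_000        # crashes over those miles (made up)

ap_rate = per_million_miles(ap_accidents, ap_miles)
human_rate = per_million_miles(human_accidents, human_miles)

print(f"Autopilot:      {ap_rate:.2f} accidents per million miles")
print(f"Average driver: {human_rate:.2f} accidents per million miles")
print(f"Relative risk:  {ap_rate / human_rate:.2f}x")
```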

On your second point, I have a hard time seeing this. Historically, any problem where the parameters can be defined, even loosely, can be "solved" by AI/DNN/ML. Machine learning has done this with every game it has been trained on, becoming superhuman by comparison. Arguing about the time frame until this is feasible, as it relates to getting in the driver's seat, entering a destination, and having your vehicle drive you there under your supervision, makes sense. Even arguing about the approach other companies are taking and who may get there first makes sense. But to discredit the statement that humans with two eyes, or one eye on a pivot, can do it, so there's no reason an automated complex system can't - that does not make sense. If we can do it, and we can define the parameters around "success", then it is not an intractable problem. Instead, it's a question of when.
 
Today, I had an experience with Autopilot reminiscent of a fatality I read about some time ago, where a white semi pulled out onto a road in front of a Tesla on AP, and AP didn't recognize it. The driver wasn't paying attention, and he died. In my own case, I had AP turned on while on a local road. A white unmarked semi pulled out of a parking lot in front of me, and my AP didn't notice it. The reason I'm writing about it is that I was paying attention, so I stopped in time. That little experience made me more aware of white semis without any commercial signage. I saw three more of them in the next 15 minutes of driving. They were all on the road going in the same direction as I was, and my AP did see them. I could tell from my screen. My theory is that the AP interpreted the white truck crossing the road at a 90-degree angle as a cloud instead of a truck.
 
But to discredit the statement that humans with two eyes, or one eye on a pivot, can do it, so there's no reason an automated complex system can't - that does not make sense. If we can do it, and we can define the parameters around "success", then it is not an intractable problem. Instead, it's a question of when.

That's not actually what I was discrediting. I'm sure that, in the future, an AI could replicate the way we (as humans) drive. Don't mind me too much. I'm just a bit grumpy... I personally feel that the last 10 years could have been spent on a different approach, and we'd likely be much farther along.

I guess it's a bit like a restaurant spending 20 years trying to build a replicant-human WaiterBot™ when they could have just spent a week building a conveyor belt next to the tables, and an app to order from.
 
So before this, you thought Tesla would release level 5 without supervision on the first go?

Well, Tesla was promising this back in 2016, saying it only needed regulatory approval:

[Screenshot: Tesla's 2016 Full Self-Driving order page]
 
Seems you like to jump to conclusions...
Given that every feature related to Autopilot and FSD has been released with very strict "supervision required" notes, it seems pretty obvious to most that this would continue until FSD reached its reliability numbers.

Hell, even Elon said so himself on the conference call.
Yes, Elon clarified in the conference call. That doesn't excuse his lies. Who is "most"? Most on this thread have "liked" the posts that condemn his earlier statement.
 
Well, Tesla was promising this back in 2016, saying it only needed regulatory approval:

[Screenshot: Tesla's 2016 Full Self-Driving order page]

You could post "2+2=4" and mspisars would give you a disagree vote.
@Soda Popinski just to clarify.
That 2016 screenshot from the Model X & S sales page is a black eye for Elon and Co., even though it is conveniently cropped (see further below).
But for all these Model 3 and Y owners (and everyone else bitching who purchased their cars since ~2018) to whine about their FSD not being "delivered" is kinda silly.
Especially on this forum. You (as in diplomat33) have been following the company for years. From the announcement of FSD in October 2016 to the first Model 3 deliveries was ~10 months (and more than that for volume deliveries)... before you could get a Model 3, Tesla was already more than 10 months behind on delivering the Autopilot replacement.

So, yes, I disagree that someone like @diplomat33 did not know what they were signing up for when they purchased the EAP or FSD packages from Tesla, that the timelines were not already behind at the time of their purchase, or that the inconvenient bolded disclaimers were not there...

Oh, and the full quote is below... note that I did not highlight the bolded part; that has been on the Tesla website since October 2016:
[Screenshot: the full October 2016 order-page text, bolded disclaimer included]


p.s. I will continue to disagree where appropriate for me ;)
 
@Soda Popinski just to clarify.
That 2016 screenshot from the Model X & S sales page is a black eye for Elon and Co., even though it is conveniently cropped (see further below).
But for all these Model 3 and Y owners (and everyone else bitching who purchased their cars since ~2018) to whine about their FSD not being "delivered" is kinda silly.
Especially on this forum. You (as in diplomat33) have been following the company for years. From the announcement of FSD in October 2016 to the first Model 3 deliveries was ~10 months (and more than that for volume deliveries)... before you could get a Model 3, Tesla was already more than 10 months behind on delivering the Autopilot replacement.

So, yes, I disagree that someone like @diplomat33 did not know what they were signing up for when they purchased the EAP or FSD packages from Tesla, that the timelines were not already behind at the time of their purchase, or that the inconvenient bolded disclaimers were not there...

I purchased my Model 3 in 2018. I used to be very optimistic about Tesla delivering FSD, but I see that I was wrong. That's why I am not expecting driverless L5 from my car any time soon.

Oh, and the full quote is below... note that I did not highlight the bolded part; that has been on the Tesla website since October 2016:
[Screenshot: the full October 2016 order-page text, bolded disclaimer included]

p.s. I will continue to disagree where appropriate for me ;)

Yes, the bolded part says it is just pending validation and regulatory approval.

And you disagree with facts just because the facts are inconvenient to your "Tesla is always right" view. I've posted simple articles with no opinion, just information, and you disagree.
 
Interestingly enough, if you go to the Autopilot page (screenshot from July 24th, 2020), you'll find that most of the text has not really changed... (especially the "all you will need to do is get in and tell your car where to go" part).
[Screenshot: Tesla's Autopilot page as of July 24, 2020]


Note: both the October 2016 and July 2020 screenshots state "full self-driving in almost all circumstances"