Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

Will it be Full Self Drive or with nags?


D.E.

When we get FSD, will we be able to let the car drive or will we have to keep hands on the wheel and watch everything it does ready to take over in an instant?

I had in mind I'd like to crawl into the back seat after a late night, say “take me home”, and snooze while the car takes me home much more safely than I could ever do it myself.

Now I'm starting to wonder if the whole "full self drive" thing is just going to be a souped-up Autopilot in endless beta: one that does operate the car on the road, but with the requirement that we watch it closely, provide constant input, or get nag screens every 25 seconds or so.

I'm not sure it makes sense to me to have both of us driving. Maybe I'll drive but it will watch me, making lots of suggestions. It'll be like my wife. If she comes too, there'll be three of us driving. No doubt she'll argue with the car; she already argues with Google Maps.
 
You will be able to summon your car across the country, and there will be no hands-on-wheel nags for the trip!

But before that happens, it's still beta and a driver is still needed, so the car will continue to do all the nags.
But before that happens, you'll need to buy a whole new car. No way, not in a hundred years, will you be able to summon AP2 cars across the country. There are many different ways to argue that, but the simplest is: unless Tesla retrofits all the cameras with wipers, a cross-country trip will dirty them all beyond usability. Personally I don't even think this is going to be the biggest problem, but it's the easiest to explain to people.
 

Agreed. And this will lead to yet another class action lawsuit from those who were naive enough to believe Tesla’s marketing claims that the current vehicles would be capable of FSD.
 
Not specific to Tesla, but I have read:

2021 for Level 4 (self-driving, with the driver assisting when needed, sort of like AP now)
2025 for Level 5 full self-driving (no assist)

However, the article I read stated that this would be implemented in small volumes and in mobility service fleets rather than personal vehicles.

There's so much competition in FSD, I think it's going to be here sooner rather than later.
 
I don't think even Tesla knows yet. The cross-country drive to pick you up isn't even hinted at in the wording of the FSD option, so I suspect it will be more a case of extended EAP features, requiring the driver to stay alert, certainly for the foreseeable future.
 
There are two levels of FSD activation.
Starting in August, we may start to see the first features using all 8 cameras in "driver assist" mode, requiring a driver holding the steering wheel.

To allow operation without a driver will require Tesla to complete validation testing (which will likely take years), get regulatory approval (which will likely take years), AND get insurance companies to figure out how they'll cover cars operating without a human driver.

Before all of that happens, what we're likely going to see is FSD operating under driver assist mode - working with little or no human intervention under most road conditions - and that could happen relatively soon, starting with limited access highways, which have the simplest "rules".
 
To allow operation without a driver will require Tesla to complete validation testing (which will likely take years), get regulatory approval (which will likely take years), AND get insurance companies to figure out how they'll cover cars operating without a human driver.
Validation implies the product development is complete and only bug fixes remain. Obtaining regulatory approval assumes you have something regulators can approve; they'll never approve a product when even its maker doesn't know how it will work. Tesla has been very glib in using such terminology in their FSD description.
 
Approval for FSD (by any manufacturer) will be an interesting process, because of the complexity of software and the enormous number of conditions under which FSD will be expected to operate safer than a human.

If the government (controlled by politicians) insists on proving the software will never have an accident or harm a passenger or anyone else, that may be impossible to ever prove. Politicians could be reluctant to approve the use of FSD as long as they believe there's a possibility an FSD vehicle could cause harm or death.

And even after FSD is approved for use, the software will continue to evolve - learning and improving.

Will also be interesting to see how liability is handled for FSD accidents - which will happen (like AP accidents today). While the owner of a vehicle may have some responsibility for accidents, even if they aren't in the vehicle, it's usually the driver who has primary responsibility. In the case of FSD, it will be software & hardware supplied by a manufacturer. And while the manufacturers want to avoid any liability for accidents - will that really happen? If an accident is caused by the inability of the software to properly detect and respond to circumstances - seems likely the FSD manufacturer will be held responsible, since they are effectively providing the driver.

Will be interesting times...

And by purchasing FSD now, even if FSD isn't approved for unmonitored use, using all 8 cameras for driver assist should improve the safety of driving a Tesla.
 
FSD will require that you hold the wheel and be ready to take over at any time until both Tesla and the government are confident that it's safe. That will take a while. Only then will you be able to smart-summon the car from across the country.

Tesla recently concluded that the current "nag interval" is necessary to make people pay attention. So it's implied that this nag interval will be present in FSD until smart summon is enabled. After smart summon is enabled, there will be no requirement for a driver, and thus no requirement to hold the wheel.
 
Approval for FSD (by any manufacturer) will be an interesting process, because of the complexity of software and the enormous number of conditions under which FSD will be expected to operate safer than a human.
I agree. It will be very interesting, however it won't be Tesla that will be pioneering unattended self-driving approvals. Google/Waymo is the most likely candidate to be the first, with the likes of Uber on their heels. You cannot approve faked marketing videos or Tweets.

Will also be interesting to see how liability is handled for FSD accidents - which will happen (like AP accidents today). While the owner of a vehicle may have some responsibility for accidents, even if they aren't in the vehicle, it's usually the driver who has primary responsibility. In the case of FSD, it will be software & hardware supplied by a manufacturer. And while the manufacturers want to avoid any liability for accidents - will that really happen? If an accident is caused by the inability of the software to properly detect and respond to circumstances - seems likely the FSD manufacturer will be held responsible, since they are effectively providing the driver.
This is where manufacturer owned fleets have a huge advantage. Same entity which manufactured it, owns and operates it. Besides liability this also has a great advantage when it comes to changing out sensors or other hardware changes which may be required as the fleet learns.

And by purchasing FSD now, even if FSD isn't approved for unmonitored use, using all 8 cameras for driver assist should improve the safety of driving a Tesla.
It is extremely unlikely Tesla would compromise EAP safety just to differentiate it from FSD. FSD may have a few more tricks up its sleeve, but not safety-related ones. Notice that AEB is available even in non-EAP cars. Also, once all 8 cameras are up, it would be extremely inefficient for Tesla to train 4-camera and 8-camera systems separately.

IMO FSD will follow a similar path to AP1 Summon. Elon promised AP1 "will find you anywhere on private property". What he delivered is "drives up to 20 feet in a straight line while you hold a button down and watch to make sure it doesn't hit anything". For AP2 FSD the promise is "it will eventually find you even if you are on the other side of the country". What actually gets delivered will likely compare to that promise the way AP1 Summon compares to its originally promised functionality. Elon dreams big.
 
Just wondering what some of you have been smoking because you obviously haven't followed Tesla's AP2 "progress" over the past 18 months. At the current pace, FSD should be ready in time for Elon's landing on Mars.
Maybe.

But also consider that software development in many cases is exponential, especially in a field like deep learning where a lot of infrastructure is missing. You do a lot of invisible work before you can even think of releasing anything.

Doing something without proper infrastructure in the software world is like building scaffolding: temporary, redundant work that has to be rewritten later anyway. AP2 from late 2016 until a few months ago was like that, just a modestly modified GoogLeNet.

It seems only now, as of 10.4, that we're starting to see incremental progress on Karpathy's platform, and only now does Elon say the basic infrastructure is complete.

Painting is linear work: if it took you 1.5 years to paint one building, it will probably take 15 years to paint all 10. In software development, it could take you 1.5 years to see 2% of it, and in 3 years it's mostly done.
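The linear-vs-exponential contrast in the painting analogy can be sketched with a toy model (illustrative numbers only, not a forecast): linear work finishes a fixed fraction per year, while S-curve (logistic) development looks almost flat at first and then completes rapidly. The `midpoint` and `steepness` values are arbitrary, picked to match the "2% at 1.5 years, mostly done by year 3" shape described above.

```python
import math

def linear_progress(t, total_years=15.0):
    # Fixed amount of work per year: one building out of ten every 1.5 years.
    return min(1.0, t / total_years)

def logistic_progress(t, midpoint=2.1, steepness=6.5):
    # S-curve: slow start, rapid middle, saturating finish.
    return 1.0 / (1.0 + math.exp(-steepness * (t - midpoint)))

print(round(linear_progress(1.5), 2))    # 0.1  -> one building painted
print(round(logistic_progress(1.5), 2))  # 0.02 -> looks stuck
print(round(linear_progress(3.0), 2))    # 0.2  -> two buildings painted
print(round(logistic_progress(3.0), 2))  # 1.0  -> mostly done
```

The point of the curve shape: judging an S-curve project by its first 10% of elapsed time tells you almost nothing about when it finishes.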
 
If Tesla says here's your FSD, it's ready to go, would you trust it enough to drive from the back seat? I value my life more than that.

I've always thought the following would happen:

1. FSD features are rolled out piecemeal. Safety with stationary objects, phantom braking, AEB, and blind spot detection are all improved with more sophisticated software, and probably operate even without FSD for AP 2.0 and above. Nags may be reduced, but it's still just driver assist with no FSD approval. This appears to be happening now with at least one FSD feature promised soon, but not the whole thing at once.
2. FSD is "complete" but still requires regulatory and insurance company approval. What will your insurance company charge for using FSD? Does Tesla have any liability? Will you have to buy FSD insurance from Tesla? Are you now exempt from distracted driving laws? Are you finding "corner cases" where it needs improvement? Does your state allow FSD? Do you now have enough experience with it to take your hands off the wheel and watch a movie on your phone?
3. FSD is approved and legal. It can take you home if you drink too much.
4. FSD is advanced enough to drive with no one in the car. Want to give that a try?

I did buy FSD, mainly with #1 in mind. Anything beyond that is a bonus. And no guess as to a timetable.
 
When we get FSD, will we be able to let the car drive or will we have to keep hands on the wheel and watch everything it does ready to take over in an instant?
Have you consulted the SAE table of levels at the Wayback Machine? Seems to me that "FSD" would need to be Level 4 or 5.
 
But also consider that software development in many cases is exponential. You do a lot of invisible work before you can even think of releasing anything.

Having been involved in software development since the 1980s, I understand what you're suggesting. Here's the problem: Tesla doesn't even have functional code that addresses all the hardware components yet, much less code that can actually do something. We've got one or two cameras working out of 8. We've got ultrasound that can't detect a vehicle in a blind spot. We've got Summon that "forgets" to open the garage door before backing up. The car can't spot a big-ass fire truck stopped in the road, much less read a stop sign. Solution: more nags to paper over not being able to spot the fire truck or lane barrier. Seriously, fix the damn problems rather than dumping responsibility back on the driver.

Sure, they probably have some (more) development code that hasn't been released, but it's just that, and probably full of bugs and shortcomings just like this AP2 code. Sorry to say that Tesla's software development has been more like the painter, except it's the guy with one arm. Meanwhile, employees and managers are jumping ship like rats after spending hundreds of hours coding crap like Santa and the reindeer on the dashboard. Cute, but...
 
Have you consulted the SAE table of levels at the Wayback Machine? Seems to me that "FSD" would need to be Level 4 or 5.

The "FSD" option on the Tesla order page is certainly describing Level 4/5. But if Tesla rolls out self-driving features piece by piece, then the "FSD" option we can buy may start at Level 3 in the beginning. It is also possible that Tesla will add the Level 3 self-driving features into EAP, at least for highway driving, and reserve the truly Level 4/5 features for people who purchased the "FSD" package. I don't think we know yet quite how Tesla will implement this.
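For readers comparing the posts above against the SAE levels being cited, here is a rough paraphrase of the six SAE J3016 levels (my own summaries, not the standard's exact wording), plus the distinction the thread keeps circling: which levels still require a human as fallback.

```python
# Rough paraphrase of the SAE J3016 driving-automation levels.
SAE_LEVELS = {
    0: "No automation: the human does all the driving.",
    1: "Driver assistance: steering OR speed is assisted (e.g. adaptive cruise).",
    2: "Partial automation: steering AND speed assisted; driver supervises constantly.",
    3: "Conditional automation: car drives in limited conditions; driver must take over on request.",
    4: "High automation: no driver needed within a defined domain (e.g. a geofenced area).",
    5: "Full automation: no driver needed anywhere a human could drive.",
}

def needs_human_fallback(level: int) -> bool:
    # Levels 0-3 still rely on a human as the fallback; Levels 4-5 do not.
    return level <= 3

# Today's Autopilot is Level 2; "snooze in the back seat" needs Level 4 or 5.
print(needs_human_fallback(2))  # True
print(needs_human_fallback(4))  # False
```

By this framing, a "nag interval" is exactly what distinguishes Level 2/3 operation from the Level 4/5 experience the original poster is hoping for.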