FSD - the tech vs the reality

Small nit-pick: RADAR is fitted to Tesla cars. Competitors (notably Google) use LIDAR and have suggested Tesla won't be able to reach FSD without it. Tesla disagrees...
This is why I keep on mentioning ToF camera technology. These are optical cameras that have digital sensors, with pixels, but each pixel can measure the time it takes for light (probably a laser) to be transmitted from the camera and reflected back. Every pixel records a distance so a 3D image can be formed in real time. High-end smartphones already have them to help cameras fake blurry backgrounds more precisely. There are also apps that let you use the camera to measure things out of reach in three dimensions. It's exciting stuff.
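
As a rough sketch of the principle (illustrative Python, not any vendor's actual API), the per-pixel conversion is just distance = speed of light x round-trip time / 2:

# Minimal sketch of time-of-flight depth: each pixel records the
# round-trip time of a light pulse, and distance is half that time
# multiplied by the speed of light. Illustrative only.
import numpy as np

C = 299_792_458.0  # speed of light in m/s

def tof_depth_map(round_trip_times_s):
    """Convert per-pixel round-trip times (seconds) to distances (metres)."""
    return C * np.asarray(round_trip_times_s) / 2.0

# A toy 2x2 "sensor": ~13 ns and ~20 ns round trips
times = [[13e-9, 13e-9],
         [20e-9, 20e-9]]
print(tof_depth_map(times))  # ~1.95 m and ~3.00 m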
 
This is why I keep on mentioning ToF camera technology. These are optical cameras that have digital sensors, with pixels, but each pixel can measure the time it takes for light (probably a laser) to be transmitted from the camera and reflected back. Every pixel records a distance so a 3D image can be formed in real time. High-end smartphones already have them to help cameras fake blurry backgrounds more precisely. There are also apps that let you use the camera to measure things out of reach in three dimensions. It's exciting stuff.

What you describe sounds exactly like LIDAR. The problem with LIDAR is cost and the potential for degradation both on transmission and reception of the light pulses.
Tesla's cameras do have overlap, and they are also of different magnifications. I suspect there is potential for some clever way to work out distance by virtue of the different magnifications, and because the car is moving the changing images give some equivalence to 3D.
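
For the overlap idea, the textbook version is stereo disparity: a feature seen at slightly different pixel positions by two cameras a known distance apart gives depth directly. A minimal sketch (standard pinhole-stereo maths, not Tesla's actual, unpublished method):

# Depth from two overlapping cameras via the standard stereo relation
# depth = focal_length * baseline / disparity. Illustrative only.
def stereo_depth(focal_px, baseline_m, disparity_px):
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# Hypothetical numbers: 1000 px focal length, 10 cm camera separation,
# feature shifted 20 px between views -> roughly 5 m away.
print(stereo_depth(1000.0, 0.10, 20.0))  # 5.0

The same relation underlies structure-from-motion: the car's own movement supplies the baseline between successive frames.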

Human vision has some interesting aspects of memory: when strolling in the countryside, say, you remember and assume much of the side view - refreshing it with a new glance if necessary - but process the image directly in front of you with greater clarity. The same occurs when you concentrate on an object: you assume the peripheral detail and magnify the area of concentration by deeper study.
 
ToF has similar functionality to LIDAR but it doesn't require a moving scanner. It's not expensive; my Huawei P30 Pro phone has a ToF camera.

They aren't perfect; interference from other ToF cameras can be an issue and they have to compete with the brightness of the sun in daytime. But then so do conventional optical cameras as used in Teslas.

ToF sensors are a relatively immature technology so they will undoubtedly improve.
 
GPS is also beneficial if a sign is missing or obscured. Imagine careering around a village at 60mph because a sign had been knocked and spun out of alignment.

Yes, GPS should be the backup, but it's currently a problem when used on its own. During a single 12-mile journey from my house the car goes through two sections of 30mph limit that it believes are 60mph ... not good.
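
For what it's worth, the fallback logic itself is simple to express; the hard part is the quality of the inputs. A hypothetical sketch (not Tesla's implementation) that prefers a sign read by the cameras and takes the safer value on conflict:

# Hypothetical vision/map speed-limit fusion, illustrative only.
def effective_speed_limit(sign_mph, map_mph):
    """Prefer a limit read from a sign; fall back to map data.
    When both exist but disagree, take the lower (safer) value."""
    if sign_mph is not None and map_mph is not None:
        return min(sign_mph, map_mph)  # conservative on conflict
    return sign_mph if sign_mph is not None else map_mph

print(effective_speed_limit(30, 60))    # 30: stale map data loses
print(effective_speed_limit(None, 60))  # 60: sign missing or obscured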
 
Indeed, AP1 (Mobileye) TACC is actually very good and much more refined than the AP2 S75 loaner I had for a week a couple of months ago. But owners do report that Tesla is gradually improving.

Have you looked at the resolution of the camera feed to the AP1 Mobileye tech? It's sub-VGA, and something like 20fps max.

More sensors aren't what's needed; it's more intelligent software.

Here is a 2D low-res picture of a typical UK street. How many people here cannot work out how to navigate the potential obstacles without radar/USS/LIDAR/GPS?

The hard part of FSD is the software not the hardware.

[Image: low-res photo of a typical UK street]
 
Have you looked at the resolution of the camera feed to the AP1 Mobileye tech? It's sub-VGA, and something like 20fps max.

More sensors aren't what's needed; it's more intelligent software.

Here is a 2D low-res picture of a typical UK street. How many people here cannot work out how to navigate the potential obstacles without radar/USS/LIDAR/GPS?

The hard part of FSD is the software not the hardware.

[Image: low-res photo of a typical UK street]

I do get the AI visualisation strategy. Indeed, its proponents point to how we use our own eyes and our brains to drive cars. I actually don't have an issue with the fundamentals and I also believe Tesla is right to pursue this avenue instead of the LIDAR geo-fenced architecture that the competition has chosen.

Turning a 2D view into a 3D world obviously works, but is it enough for operating a car 100% adequately? It's all very well saying a Tesla's vision is like human vision, but there are lots of situations where that's not enough. I was driving in London at the weekend, at night, in the rain, and navigating around traffic islands and kerbs takes a lot of concentration. I was using spatial awareness in my head to avoid obstacles I couldn't actually see and at the same time anticipating other traffic. AI can anticipate and obviously has spatial awareness, but what if it isn't aware of an obstacle because it never saw it?

And how often do you see other drivers do stupid things - even at low speed?

Coming back to the kerbing thing: as I understand it, the ultrasonic sensors can't detect kerbs, so I will be very interested to see how an FSD car will cope with them if it can't detect them using the cameras. The pessimist in me thinks there will be lots of FSD Teslas with damaged wheels and suspension. Additional cheap (e.g. ToF) sensors could eliminate that risk.

But let's see - the neural-net nature of the beast should mean the system gets significantly better over time. Elon wants FSD to be 'feature complete' in less than a year. He could well be right, and I hope he is, but unexpected limitations must be a concern. The proof will be in the pudding!
 
I was using spatial awareness in my head to avoid obstacles I couldn't actually see and at the same time anticipating other traffic.

Actually what your brain was doing was making guesses based on experience and hoping for the best.

This is the hardest bit for true AI. Our brains are amazing at making decisions without all the information; this allows us to operate quickly and efficiently at the cost of a very small margin for error.

For an AI to really interact in human society it has to be able to do the same. How to program an AI to take risks is the real challenge.
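
One classic way to frame it (a toy sketch, nothing like a production planner) is risk-penalised expected utility: score each candidate action by its benefit minus a tunable penalty on the probability and severity of a bad outcome, so the "risk appetite" becomes a single knob:

# Toy risk-penalised decision making; all numbers are made up.
from dataclasses import dataclass

@dataclass
class Action:
    name: str
    benefit: float        # e.g. time saved
    p_incident: float     # estimated probability of a bad outcome
    incident_cost: float  # severity if it happens

def score(a, risk_aversion=10.0):
    return a.benefit - risk_aversion * a.p_incident * a.incident_cost

actions = [
    Action("wait for a clear gap", 0.0, 0.0001, 100.0),
    Action("pull out now", 5.0, 0.01, 100.0),
]
print(max(actions, key=score).name)  # with this risk_aversion, waiting wins

The open question the post raises is exactly where those probabilities and that risk_aversion knob come from - humans tune them by experience.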
 

Google is clearly very close, so maybe AI is progressing much faster than we, the public, realise.
Yes, but Waymo is limited to areas that have been mapped very precisely; the cars are not able to drive outside them. Tesla FSD should be able to cope with roads that no other Tesla has driven before.

It's also suggested that the Waymo hardware is very expensive and saps the car's battery pack, significantly reducing range. It's also ugly and doesn't appear to do the vehicle's aerodynamics any favours, so again less efficiency.
 
I can't deny this is impressive:

Yes - that is indeed impressive. I see it was posted in April of this year. I wonder what the test conditions were. Was it a route that had been sweated over as a showcase, or a true "unique" journey that the FSD figured out for itself? If the latter, then there must be many more features that Tesla have developed that have never seen the light of day in customers' cars.
 
As an example of just how well the neural net tech is working, the section of footage around 1:05:16 shows the FSD computer very successfully predicting the bends on a country road. Tesla take vast quantities of real-world video data (i.e. from real Tesla cars, driven by real drivers, in the real world) and annotate it, by hand (with humans!), to train the machine as to what's actually happening. Then, when you and I are driving, the computer is interpreting the video images in real time against this pre-learned data. To do the necessary computations quickly enough, Tesla have designed a dedicated chip focused entirely on the needs of self driving.
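
The general shape of that supervised loop is worth seeing, even as a toy (this PyTorch sketch is purely illustrative - Tesla's actual network, labels, and pipeline are not public):

# Toy version of the loop described above: human-annotated frames in,
# a predicted road property (say, curvature ahead) out.
import torch
import torch.nn as nn

model = nn.Sequential(                # tiny stand-in for a real vision net
    nn.Conv2d(3, 8, 5, stride=2), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(),
    nn.Linear(8, 1),                  # predicted curvature of the road ahead
)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

frames = torch.randn(16, 3, 64, 64)  # stand-ins for annotated video frames
labels = torch.randn(16, 1)          # stand-ins for human-made labels

for _ in range(3):                    # training: fit predictions to labels
    opt.zero_grad()
    loss = loss_fn(model(frames), labels)
    loss.backward()
    opt.step()

with torch.no_grad():                 # at drive time, the same forward pass
    print(model(frames[:1]))          # runs on live camera frames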

AP can do lane keeping with no lines now?!
 
Seems absolutely years off to me. AP in my M3 is good but it's just cruise control with lane keep.

I had an AP2(?) 2017 Model X with FSD for a few days, and the NoA was definitely interesting but ultimately not very useful. Lane changing needed my input and was so slow as to be essentially dangerous.

I think this is typical Tesla - overpromise and underdeliver. It's still the best car I have owned but the reality is much worse than the promise in nearly all aspects - FSD/AP just one of them.
 
With even HW2.5 cars (still being fitted as of 11/18, apparently in some UK cars too?) missing out on new features, and no established upgrade path to HW3+, I think you have to buy FSD for what it is now and accept that you're funding future development, but that you might not (probably won't) see it yourself without further outlay.
 
I had an AP2(?) 2017 Model X with FSD for a few days, and the NoA was definitely interesting but ultimately not very useful. Lane changing needed my input and was so slow as to be essentially dangerous.

I guess you have not used a version from last month or so then?

I just did two stints covering around 225 miles, most of which was on NoA in very busy conditions - my longest continuous runs so far: M3/M25/M40 and M56/M6/M42/M40 (so Birmingham). I didn't use AP/NoA in the M25 or M6 roadworks (but used TACC in the latter, as we were running with a massive Hermes lorry that was too big for comfort in the lanes).

Four disengagements: one where the car was probably at its lateral limit (M56/M6 merge), two were aborted lane changes where I unintentionally took over, and the final one was an off-ramp flare (single lane splitting into two), which I have come to expect.

Whilst I had a handful of reluctant/refused lane changes, it was obvious why in every case. The majority were because it seems to delay a lane change until it is sure you are holding the wheel (like a nag but without the nag), so a light wiggle of the wheel allows it to complete (it becomes instinctive after a short while). I think three were down to cars suddenly coming out from behind and either overtaking (it actually pulled out but didn't pass) or heading for the off-ramp.

I didn't do the M25 because, when joining from the M3, my lane was to merge with the M25 and I was running parallel with a lorry. I didn't know how the car was going to react, so I played it cautious and took over, and due to the nose-to-tail weight of traffic didn't go back to NoA until the M40.

Other than that, and it not being speed-limit or corner-speed aware (M56/M6), it worked very well: no hesitation, no getting halfway through a manoeuvre and timing out, etc.

My only criticisms, which my passenger also mentioned, were that on an aborted lane change the indicator stops, so when a second attempt is made it has probably confused the car behind with a second set of indicating. And my continued biggest criticism: not handling lane flares.

I also witnessed TACC for extended periods, as driver (M6 roadworks) and passenger (M40), and it worked flawlessly. SWMBO on the M40 had no issues with hesitation when overtaking (something that put her off before, and which others also mention on here) and was very impressed when in stop-start traffic (she is a bad passenger and an even worse driver when not in control of the car). She was not sure if the smoother overtaking was due to it leaving a bit more room when pulling out.

Would be interested to know what other people's experiences are with TACC overtaking on the latest software releases.
 
Elon has promised that FSD will be "feature complete" by the end of the year, which is tomorrow. Even given that much of its function is restricted by law, would anyone describe it as feature complete?

My definition of full self driving is not the same as Elon’s. For me it means at least level 4 autonomous driving, and even without regulations it’s still a long way from that.
 
I may as well interject with Cruise Automation, which GM bought a few years back.

They test in the city of SF (Why testing self-driving cars in SF is challenging but necessary) plus other places, and have taken reporters around before, albeit for imperfect rides. They've put out a bunch of videos at Cruise. I haven't seen some of the latest ones, but take a look at, say, the 1,400 left turns, the double-parked, or the San Francisco Maneuvers videos.

Also look at Waymo, Cruise and others at UPDATE: Disengagement Reports 2018 – Final Results. The article Waymo and GM still lead the pack in California's new self-driving report cards is for 2017.

Unfortunately, the CA DMV has made it more difficult to access the disengagement reports for autonomous testing on CA public roads, and most of the links at Autonomous Vehicle Progress are dead, though the Google publicity stunt video from 2012 still works.

One can still get to copies of previous submissions thanks to archive.org. In particular, pay attention to Tesla :), Google/Waymo and Cruise.
Autonomous Vehicle Disengagement Reports 2018
Autonomous Vehicle Disengagement Reports 2017
Autonomous Vehicle Disengagement Reports 2016
Autonomous Vehicle Disengagement Reports 2015

Autonomous Vehicles in California has background info. You might want to look at archive.org for older copies. Keep in mind the CA autonomous vehicle testing stats only reflect autonomous testing done on California public roads.
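
The headline figure in those filings is autonomous miles per disengagement, which is trivial to compute once you have the numbers (the figures below are made up for illustration):

# Miles per disengagement from CA DMV-style numbers; figures invented.
def miles_per_disengagement(miles, disengagements):
    return miles / disengagements if disengagements else float("inf")

fleet_reports = {  # hypothetical (miles, disengagements) per company
    "Company A": (1_200_000, 110),
    "Company B": (450_000, 90),
}
for name, (mi, dis) in fleet_reports.items():
    print(f"{name}: {miles_per_disengagement(mi, dis):,.0f} miles/disengagement")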

BTW, it seems almost every time I'm in Mountain View or the surrounding area (not that often), I see at least one Waymo Pacifica (plug-in) Hybrid minivan. I've seen them merge onto the highway as well as drive around city streets. Unfortunately, there's no indicator to tell whether the safety driver is driving/has taken over or whether it's running in autonomous mode.

Also, remember this is a deadline that Tesla has missed: Navigant: GM Cruise, Waymo Lead Tesla And Others On Full Autonomy.
 
We are only on the week-40 release. There are still another three months of releases before we can judge their end-of-year status. Plus, 'feature complete' does not necessarily mean all use cases/stories are done and dusted.

But yes, as with pretty much any complex project making predictions far in advance, actual vs predicted will not correlate as well as some might like. That's why many projects run short-timeframe iterations, where it's easier to predict what will be done by the end of each iteration.

What is clear is that every iteration shows a few steps of progress, even if there is a step backwards too. But even that is difficult to say for certain when, outside of Tesla, we don't know exactly what part of the machine-learning neural network we are comparing with what - it is known that Tesla compare behaviour between (and even within) releases.
 