Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

Will HW3 REALLY deliver FSD? Some questions

Indeed, and I posted this almost a year back:
Solid state Lidar progressing

LIDAR's laser can also have disadvantages. Imagine a scenario where every car on the road was emitting multiple LIDAR lasers... there are any number of IR sensors in this world that might be affected. Law of unintended consequences?

Man says CES lidar’s laser was so powerful it wrecked his $1,998 camera

My greater question is why Tesla have eschewed stereoscopic vision. Cameras are not expensive. Tesla may be able to discern distance by processing information from single cameras quite successfully; as I understand it, this largely relies on processing the lateral movement of objects across successive frames, so objects dead ahead are a problem, hence in turn the use of radar. (Note that the radar used is Doppler and typically only sees moving objects, which is why a stationary vehicle straight ahead is a problem, and why Tesla and others have struggled with, for example, emergency vehicles stopped partly in the lane ahead.) But it has to be irrefutable that stereoscopic vision would be more accurate. Subaru's EyeSight is a good example.
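For anyone unfamiliar with why stereo helps with objects dead ahead: two cameras a known distance apart see the same object at slightly different horizontal pixel positions, and that disparity gives range directly, with no need for motion between frames. A minimal sketch of the standard depth-from-disparity relation (illustrative numbers only, not Tesla's or Subaru's actual pipeline):

```python
# Depth from stereo disparity: depth = focal_length * baseline / disparity.
# Works even for a stationary object straight ahead, which is exactly the
# case single-camera motion parallax struggles with.

def stereo_depth(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Return distance in metres to a matched feature.

    focal_px     -- camera focal length expressed in pixels
    baseline_m   -- distance between the two camera centres in metres
    disparity_px -- horizontal pixel shift of the feature between the two images
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive (object at infinity or bad match)")
    return focal_px * baseline_m / disparity_px

# Example: 1000 px focal length, 30 cm baseline, 6 px disparity -> 50 m range.
print(stereo_depth(1000.0, 0.3, 6.0))  # 50.0
```

Note how accuracy degrades with range: at long distances the disparity shrinks toward the sub-pixel level, which is one practical limit of stereo systems like EyeSight.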
 
Doppler radars do see still objects.
But there are too many signals mixed together corresponding to still objects (road, signs, vegetation, fences, etc.), not well separated spatially (because the radar only sees space in 1D, distance), so they are not really usable and the computer discards them.

Basically a Doppler radar sees a point cloud on a 2D graph, one axis is relative speed and the other is distance.

Also, from the radar point of view, still objects are not still, as it measures speed relative to the vehicle.

Just my 2 cents...
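The point-cloud picture above can be sketched in a few lines. This is a toy model with made-up numbers, not real radar firmware: each return is a (relative speed, distance) point, stationary objects all close in at exactly the ego car's speed, and discarding that clutter bin is why a stopped car ahead can get thrown away with the road signs.

```python
# Toy Doppler point cloud: each return is (relative_speed_mps, distance_m).
# From the radar's point of view, "still" objects approach at -ego_speed,
# so all stationary clutter collapses onto one speed value and is discarded.

EGO_SPEED = 30.0  # m/s, our own speed (hypothetical)

returns = [
    (-30.0, 80.0),   # overhead sign: stationary
    (-30.0, 45.0),   # stopped car ahead: stationary, indistinguishable from clutter!
    (-5.0, 60.0),    # slower car ahead doing 25 m/s
    (-30.0, 12.0),   # guardrail: stationary
]

def is_stationary(rel_speed: float, tol: float = 0.5) -> bool:
    """A target whose closing speed equals our own ground speed is not moving."""
    return abs(rel_speed + EGO_SPEED) < tol

# What a simple tracker keeps after throwing away the stationary bin:
moving = [r for r in returns if not is_stationary(r[0])]
print(moving)  # [(-5.0, 60.0)] -- the stopped car ahead was filtered out too
```

The filter keeps only the 25 m/s car; the stopped car at 45 m is lost along with the guardrail and the sign, which matches the emergency-vehicle problem described earlier in the thread.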
 
One of Tesla's arguments against Lidar was poor performance in rain. The latest AP versions were not super convincing either, as NoA disengages with a single little droplet on the side cameras, with annoying warnings popping up all the time on the instrument cluster...
 
trying not to laugh at that.

Sooo many people state over and over that Tesla have it all wrong and are going to fail or get left behind, yet somehow they continue to lead the pack. Self-driving is quite probably the area they have the greatest lead of all and this will increasingly show over the next 12 months and beyond.

I'll spare you a disagree but LIDAR is a complete red herring and is precisely the sort of thinking that will get solutions to 99% and get stuck.
Vision-based AI DNN learning is clearly the way to go, possibly in time with extended frequency range (UV/IR), higher resolution and higher dynamic range cameras, quite simply because it has the resolution and frequency bandwidth that LIDAR doesn't. The only argument for LIDAR is that it range-finds in the visible spectrum. Tesla have already proved that camera-based vision recognition is perfectly adept at evaluating the range of objects. Ergo LIDAR is immediately negated.

Elon is entirely correct in stating that humans have been driving quite successfully for years with vision alone.
Computers can process images even faster, but deep learning is required to match humans' ability to analyse.
If Elon was trying to duplicate humans, there would be 2 cameras and 2 microphones on a servo driven swivel, plus some accelerometers (3d linear and angular) - that's it. It seems even Elon thinks radar is useful even though no humans have radar vision. Lidar is the same (actually closer to human vision than radar) - it gives the computers more information to use for driving decisions.

In any case, the leader in self driving, Waymo, already has cars driving without any human safety drivers, and they do have lidars, radars, cameras, and more. So, hold off on the laughing until Tesla FSD with current hardware (or any hardware Tesla is willing to upgrade all AP2.0 cars to for free) is capable of that. You might want to get real comfortable waiting... and maybe find something else to laugh at in the meantime. ;)
 
If Elon was trying to duplicate humans, there would be 2 cameras and 2 microphones on a servo driven swivel [...]
Tesla is trying to integrate the FSD hardware into the car so it doesn’t look like the entire JC Whitney catalog has been tackily added.
 
No. FSD is an option you can check when purchasing a car or buy later. It is contractually described here:

https://www.tesla.com/models/design

There is absolutely no mention of L5 or any possibility of having nobody in the driver's seat. Just that it works on city streets.

Here is what it said when I bought FSD; it was a bit more comprehensive back then. I am supposed to be able to get in the car and have it drive me anywhere with zero input on my part.

[attached screenshots of the FSD description at time of purchase]
 
Don’t forget that Tesla also uses forward looking radar in addition to vision. Radar should be better than Lidar in fog etc.
AND 12 ultrasonic sensors on all Tesla vehicles.

========
side note: HW3 currently refers to FSD computer version 1.
FSD computer version 2 will be at least 3x faster, and who knows what other improvements (per the Autonomy Day presentation). I think this might be the 3rd year of version 2 development?
An example of Tesla's continuous-improvement methodology, right? So no surprise.

side side note: previous computers from Mobileye (since bought by Intel) and then Nvidia were used before Tesla developed its own FSD computer.
 
I don't believe HW3 and the current camera system have any chance of realistic FSD. The number of faults triggered by low winter sun, the inability to clean the rear camera remotely, and the tendency (at least in my 2018 S) of the B-pillar cameras to fog up (service pending) are basic faults that need addressing. Also, here in the UK we have so-called smart motorways with overhead gantries displaying variable speed limits and lane restrictions, with no evidence that Tesla will even read speed signs soon. That, and the complexity of some roundabouts with intermittent traffic lights, can be confusing enough to human drivers, and maps aren't updated often enough to allow for changes. Finally, the cars don't look far enough ahead. If you ever test current TACC on a winding country A road you instantly recognise that any speed change during a bend is reactive rather than proactive, as indeed is the steering, which can get quite squirrelly on bends that aren't a set curvature. That's not to say they will never get there, but the lead time will be many years, and I cannot believe that HW3 will be up to it. The current camera system certainly isn't - at a minimum it requires redundancy, heaters, wipers and washers.
 
today, I was driving on a highway (Rt 101) and there was a box dropped in the middle of the road. if I had left the car in L2 Autopilot, I would likely have hit that thing, or maybe emergency-braked, hit it and also been rear-ended.

I saw it enough in advance with my eyes and I changed lanes to avoid it.

I'm not convinced that the hardware on the car (sensors) would be able to 'see' that box fast enough and to know how to avoid it.

there is still too much judgement going on, when driving. that's my main concern.

now, if we can leverage 'group smarts' and have vehicles linked to each other so they can 'tell' each other stuff, that will go a long way toward helping us get to level 4. once enough drivers swerve out of the way and enough cameras capture it (as a group: in series, as cars avoid it one after another; and in parallel, as several cars see it at the same time and do different things to avoid it), that info can be broadcast and relayed with V2X I/O so that cars coming up behind can think 'in advance' and get the benefit of group vision.
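The 'group smarts' idea above can be sketched as a hazard feed that pools reports from multiple cars. Everything here (class names, message fields) is invented for illustration; real V2X stacks such as DSRC or C-V2X define their own message formats (e.g. SAE J2735):

```python
# Toy sketch of pooled hazard reports: cars that swerve around an obstacle
# broadcast where it was, and following cars treat independently-confirmed
# reports as advance warning before their own cameras can see anything.

from dataclasses import dataclass

@dataclass
class HazardReport:
    road: str
    mile_marker: float   # where the obstacle was seen
    lane: int            # which lane is blocked
    reporter_id: str

class HazardFeed:
    """Aggregates reports; multiple independent reports raise confidence."""

    def __init__(self):
        self.reports = []

    def broadcast(self, report: HazardReport) -> None:
        self.reports.append(report)

    def confirmed_hazards(self, min_reports: int = 2):
        """Return (road, mile_marker, lane) keys seen by enough distinct reports."""
        counts = {}
        for r in self.reports:
            key = (r.road, round(r.mile_marker, 1), r.lane)
            counts[key] = counts.get(key, 0) + 1
        return [key for key, n in counts.items() if n >= min_reports]

feed = HazardFeed()
feed.broadcast(HazardReport("US-101", 412.3, lane=2, reporter_id="car_a"))
feed.broadcast(HazardReport("US-101", 412.3, lane=2, reporter_id="car_b"))
print(feed.confirmed_hazards())  # [('US-101', 412.3, 2)]
```

Requiring a minimum number of independent reports is one simple way to keep a single faulty (or malicious) car from triggering phantom braking in everyone behind it.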

I don't believe Tesla has bought into V2X. that's going to be a serious limit once it really becomes used. yes, it's very early, but you need to develop and test that stuff now so that when it's mainstream, you are there and ready.

active roads are another thing that will help. beacons that can give cars hints.

bottom line: a car with cameras and radar (etc) is not going to be enough to drive on its own in the street system that was designed for PEOPLE. it will need extra help, and I'm not seeing anyone put up money to enhance the roads to make them 'active'.
 
Tesla is trying to integrate the FSD hardware into the car so it doesn’t look like the entire JC Whitney catalog has been tackily added.
I get that, but because of this they had to redefine Full Self Driving to what EAP used to be. I think Elon may be right that some day we will have cars that drive themselves without lidar, but it will be a while, and those with lidar will get there first. They will also get to no-lidar versions faster, as the lidar-equipped cars will generate the immense data set needed to train AI to drive without lidar. Of course, by then lidar may be small and cheap, and it will always add an additional layer of safety for driving.
 
The AP2/AP3 sensors provide considerably more information than a human driver has available, with the exception of audible sensors, since there's no indication that Tesla or any of the other FSD systems are listening for audible signals (such as emergency vehicles or trains that are not in line of sight).

The challenge is in interpreting the images, radar and proximity sensors to classify all of the objects, their relative location & speed and how the software should handle each object. The increased processing power of HW3 will allow more software to run in real-time to process and interpret the data - and until Tesla (or anyone else) has FSD working, they can't be sure they have enough processing power.

FSD demonstrations have focused on the simple challenges - not the unusual circumstances that drivers encounter (such as avoiding a pothole or an object in the roadbed ahead). In the near term, FSD will work best in relatively simple environments - likely first on limited-access highways, or on urban streets where the vehicle drives at very slow speeds and immediately stops if anything unusual is encountered. FSD in almost any condition on almost any road is much more complicated - and we don't know today if Tesla's sensors or the lidar-based systems from other manufacturers are capable of replacing human drivers.

It's likely Tesla has software that is already recognizing speed limit signs. The signs are pretty easy to identify - and the speed limit numbers are easy to read. The challenge is in determining if those speed limits apply to the vehicle - or to vehicles in other lanes (such as express lanes, exit lanes, frontage roads, ...). Plus, Tesla has to overcome Mobileye's patent. The reason why we don't have speed limit detection is likely more legal than technical right now.

Unfortunately, we can't try the "FSD preview" that was included in the "holiday" release in either our 2017 S 100D or 2018 X 100D. Both vehicles were purchased with FSD - last year Musk stated that FSD owners would get "early access" to new software releases. But the holiday release FSD preview appears to work only on MCU2/HW3 vehicles - and because our 2017 S has MCU1/AP2 it hasn't even received the holiday release yet - and while our 2018 X has MCU2 it still has the HW2 processor and even after 3 updates for the holiday release, still doesn't have access to the FSD preview.
 
When did Tesla claim AP1 would support Smart Summon or NOAP?

The AP1 announcement mentioned pulling out of your garage and driving to your front door.

It mentioned lane keeping, driver-assisted lane change, and managing speed by reading road signs (using Mobileye's technology). That's very different from NOAP that initiates the lane changes and navigates through interchanges and to highway exits.

Since AP1 didn't have the side-facing cameras, it doesn't have enough information to implement Smart Summon and NOAP - which need better object detection on the sides of the vehicle than the proximity sensors can provide.

Tesla (Musk) believes the AP2 (and later) sensor suite should provide enough data to implement FSD - and with the HW3 processor they should have enough processing power to interpret that data.

It's still too early to declare that Tesla will or will not be able to achieve FSD with the current hardware.
 
When did Tesla claim AP1 would support Smart Summon or NOAP?

Actually, they claimed even more than Smart Summon. Turns out "your Tesla" was that future Tesla you will buy, not the car that got 7.1 released.

Summon Your Tesla from Your Phone

"During this Beta stage of Summon, we would like customers to become familiar with it on private property. Eventually, your Tesla will be able to drive anywhere across the country to meet you, charging itself along the way. It will sync with your calendar to know exactly when to arrive."
 
Actually, they claimed even more than Smart Summon. [...]

Tesla is like a drunk uncle - you don't believe everything they say :)
 
Let me start by saying I hope the answer is yes - and I'm rooting for Tesla's success.

I have paid for FSD already, in that spirit.

I have been reading through a lot of posts / threads on AP3 computer and FSD, and it brought some questions to mind.

1. I seem to recall hearing somewhere (can't recall where) that they are working on HW4. Why would they be upgrading people to HW3 if HW4 is in the pipeline, and FSD isn't working yet anyway? Would this not result in double the cost, swapping HW2 & 2.5 to HW3 and then swapping all the HW3 to HW4?

2. Why are they working on a new chip/hardware (HW4) if HW3 is plenty for FSD? Why not focus those resources on the AI / solving the vision side of FSD to ensure maximum "bang for the buck"?

Thoughts from the TMC intelligentsia?

First lesson: don't trust anything Tesla says.
But to be honest I don't think it's going to matter, since I don't believe FSD will happen for a long time, if ever.
In fact, right now there is a good chance you may see regular Autopilot get pulled, or at least get some major safety changes. It has been a huge topic of concern lately, and from what I've heard it doesn't look good.
It also never helps when another idiot just got caught recording his car driving while he was in the back seat.
I would say it's about 5 years away minimum before we see full FSD cars. Most likely longer. You will see cars that read the driver's BAC level before you see any FSD cars. So just don't worry about it, because you probably won't own the car when it happens. And the fact you paid the money for it? It just means you will find a buyer who will only buy one with the upgrade, regardless of where FSD stands when you do sell it. So it won't be a complete waste of money. You may never use it, but it will add value to a potential buyer as a worst-case scenario.
 
I don't believe HW3 and the current camera system has any chance of realistic FSD. [...]

Yes to what this poster says. YES.

As another UK-based owner, all this is true. Roundabouts. Central reservations. Road surface reflections. Fogging cameras. Parked cars that look just like queuing traffic. Blind bends on a hill. Crazy aggressive buses and taxis. A huge range of non-standard road markings. And smart speed limits and lane use that can change as you approach them.

Driving recently on French freeways, FSD was determined that I needed to 'merge left' at every off-ramp, following a US road convention that I haven't seen in Europe.

Traffic lights are also super inconsistent in their placement. Left hand drive and right hand drive cars have to work all over Europe. Temporary traffic lights at construction sites / roadworks often don't work.

As others have stated, bending rules is hard if not impossible for an algorithm to do, and even more 'impossible' for multiple copies of the same algorithm to successfully arbitrate between themselves.

I regularly experience very heavy auto braking for no obvious reason when AP / TACC decides it doesn't like what it thinks it sees.

Another crazy thing is lane changes timing out just as, or just after, the move is commenced.

The car can't even read speed limit signs, and if it could, it would almost certainly make a lot of errors because of the myriad road sign layouts.
 