Welcome to Tesla Motors Club

Will Tesla be able to deliver FSD with HW3.0 and current Model 3 sensor suite, ever?

@Knightshade, thanks for keeping the screenshots, btw. This is helpful.

Now I just realized something about how Tesla worded this part: "Traffic Light and Stop Sign Control: assisted stops at traffic controlled intersections". I think it is again fair to assume that the current (latest) version of FSD is not intended to deliver turns at intersections. This part explicitly mentions "stops" only, for what I think is an obvious reason: Tesla is not going to implement turns at intersections for current FSD buyers.

The more obvious reason is that the feature is only about stopping.

Turns would be handled by the NEXT listed feature "automatic driving on city streets" in V2... or autosteer on city streets in V3.


We know that because they've repeatedly said so.

First there was this:

Elon Musk, right around when the V2/V3 change happened, said:
I feel extremely confident that it would be possible to do a drive from your home to your office most of the time with no interventions by the end of the year.

Which obviously would not be possible without turns at intersections, unless your job's parking lot was on the same street you lived on.

But then he explicitly confirmed it would do turns. This one was about a week AFTER the V3 change:


Elon Musk on Twitter

Elon Musk replying to question about next FSD feature coming said:
Turns on city street intersections is the big one


So as recently as about 2.5 weeks ago they still explicitly stated that they plan to offer that.

For sure (initially anyway) at L2 (which, for V2 and V3 FSD buyers, would make their purchase feature complete).
 
I think it is again fair to assume that the current (latest) version of FSD is not intended to deliver turns at intersections
See, those questions are actually easy to answer if you just use your bean AND stop creating questions based on your imagination and dreams rather than simply going with the actual text before you.

For example, pothole avoidance is a very late-game thing. You really don't see Tesla talk much about that scenario. Swerving, frankly, is a very dodgy approach to handling such a situation, compared to just taking the hit and, if it blows the tire, handling that (which AP has demonstrated it is very good at; it is better than most humans at driving to a halt on only three wheels).

P.S. Although there is one intersection I know of where I can let my car take a left by itself: a dedicated left lane from the base of a T intersection, with a lane painted very close to all the way through on its righthand side, because there's a second left-turn lane to the right of it. But that's more a curiosity and coincidence; it doesn't require the synthesizing of lane boundaries. Synthesized lane boundaries are generated internally in the software, but the software normally doesn't place enough confidence in them for them to get used. They can be used occasionally, though, on straight stretches.

EDIT: A real-world example of FSD using fully synthesized lanes: per Google Maps, southbound on S Mount Mariah Rd there is a center line and you can engage AP. Near the 3-way intersection with Old Hwy 105 and Dobbin Rd (traffic through from Mt Mariah has right of way there, which is hard to see without going to satellite view) the center line disappears and doesn't reappear before the stop sign at the Dobbin Saloon. But AP will stay engaged, not even complain, holds to its side of the road (while avoiding the mailboxes that are right on the edge of the pavement), and makes it all the way to a stop at the stop sign.
 
See, those questions are actually easy to answer if you just use your bean AND stop creating questions based on your imagination and dreams rather than simply going with the actual text before you.

For example, pothole avoidance is a very late-game thing. You really don't see Tesla talk much about that scenario. Swerving, frankly, is a very dodgy approach to handling such a situation, compared to just taking the hit and, if it blows the tire, handling that (which AP has demonstrated it is very good at; it is better than most humans at driving to a halt on only three wheels).

P.S. Although there is one intersection I know of where I can let my car take a left by itself: a dedicated left lane from the base of a T intersection, with a lane painted very close to all the way through on its righthand side, because there's a second left-turn lane to the right of it. But that's more a curiosity and coincidence; it doesn't require the synthesizing of lane boundaries. Synthesized lane boundaries are generated internally in the software, but the software normally doesn't place enough confidence in them for them to get used. They can be used occasionally, though, on straight stretches.

EDIT: A real-world example of FSD using fully synthesized lanes: per Google Maps, southbound on S Mount Mariah Rd there is a center line and you can engage AP. Around the 3-way intersection with Old Hwy 105 and Dobbin Rd the center line disappears and doesn't reappear before the stop sign at the Dobbin Saloon. But AP will stay engaged, not even complain, holds to its side of the road (while avoiding the mailboxes that are right on the edge of the pavement), and makes it all the way to a stop at the stop sign.

There are many things you can do with potholes. There is also debris on the road: smaller, larger, animals. Many scenarios. You can change lanes, slow down, and yes, if safe, avoid the obstacle without changing lanes. It is different for a human because of slower reaction times. FSD can actually be better.
 
There are many things you can do with potholes. There is also debris on the road: smaller, larger, animals. Many scenarios. You can change lanes, slow down, and yes, if safe, avoid the obstacle without changing lanes.
Yeah, you can just ignore them. My dad does it all the time. ;) Frankly, I don't do much for animals either, unless they have hooves and thus are quite sizable.
It is different for a human because of slower reaction times. FSD can actually be better.
Yeah, I agree that line you quoted was garbage. :p Built toward its strengths, AI has the potential to surpass the overall safety of nearly all humans, and in some scenarios all of them.

Again, in the big scheme it is small potatoes. The granularity you need, the pattern-matching level required to make it meaningful without turning it into a "phantom brake" gong show, is huge. A fresh patch on a pothole looks a lot like a pothole, for example. It is extremely difficult for humans to discern debris you can ignore from solid chunks that'll do damage, much less for an AI NN to do so. It screams "late game", polishing.
 
Interesting. From this podcast (www.youtube.com/watch?v=FIbvt4_InyU&t=1893): "Most humans are good drivers. ... Some people say - "Oooh, machines, they are going to be so much better than humans"... They are not even close to humans. Humans are incredibly good drivers."

This video really is a reality check. Though I am very interested to try out comma.ai in a Tesla to compare the difference in approach. It uses far less compute power than HW3 has at its disposal. I think the way the steering wheel does not fight you is fantastic; it would totally take out the jerkiness of taking back control when AP tries to go somewhere you do not want it to. You never know; maybe George Hotz's view will be busted in 2020, with a Battery Day where a Tesla talks back to you and drives from A to B. But so far the reasonable view seems to be that we are nowhere near Level 4 or 5. Also, Volvo just made LIDAR look sexy; the race is heating up big time.


[Image: Volvo SPA2 Luminar roofline integration]
 
The more I see how slowly Tesla is delivering new FSD features, the more I think it will never happen, at least on the current HW3.0 and sensor suite.

I mean, OK, stop signs. Great. On a 5-year development timeline? How long will it be before Tesla releases unprotected left turns at intersections? Another 5 years?

Bottom line: my prediction is that Tesla will never be able to release FSD, even in the USA, on HW3.0 and the current sensor suite.

Now I guess 99% of the forum would disagree with me. I’m curious: is there anyone else who agrees with my prediction?

I think everyone would agree FSD is not simple. The long tail is very long. I am flattered that my ability to drive is hard to replace with AI. I do not pretend to know whether the current FSD sensors and computer have the power to achieve the goal. I wish them success and support them in the effort. Even ARK Invest puts FSD success in terms of a percentage.

When you look at what has been achieved: WOW. Soon SpaceX will fly astronauts to space from US soil. So you go from a startup to PayPal and make your $150 million. Do you then stop and enjoy the rest of your life, or risk it all to go up against the forever-established big oil, car manufacturing, and aerospace industries, and then start a solar company to challenge the entire established electrical production and distribution network? They have managed incredible success in every one of these businesses (also some failure).

I am not a fanboy; my eyes are wide open. The challenges are great, but if anyone can do it, they can. They are now like Google in that they attract the best and brightest. They think outside the box. I have just scratched the surface of the positive side of the argument; there is also a massive argument to evaluate against FSD. I paid for FSD to evaluate it for myself and to watch the historic progress of the most innovative company since Apple. I am a mechanic by nature and knew in high school that the internal combustion engine was a complete waste of our resources (which will be depleted).
 
This is what $1,000 will get you; it seems like a very good review. Now, I have no idea how far this hardware set could go, but it seems very competent at night and in bad weather. What do you guys think? I think Tesla has way more sensors, though. But on the very same route, how good is it compared to the current FSD preview?

 
This is what $1,000 will get you; it seems like a very good review. Now, I have no idea how far this hardware set could go, but it seems very competent at night and in bad weather. What do you guys think? I think Tesla has way more sensors, though. But on the very same route, how good is it compared to the current FSD preview?

It is a budget version of the bundled base Autopilot, which is about in line with its pricing. Its price is $1,000; IIRC Tesla has about $1,500 allocated in their prices for base Autopilot.




It looks like a pretty cool driver assist boost for the non-Tesla vehicles it works with.
 
This is what $1,000 will get you; it seems like a very good review. Now, I have no idea how far this hardware set could go, but it seems very competent at night and in bad weather. What do you guys think? I think Tesla has way more sensors, though. But on the very same route, how good is it compared to the current FSD preview?



It's hard to deny that it's impressive for such a simple setup. I also like the idea of using the driver-facing camera to gauge attentiveness. IMO, that is the superior approach compared to just checking for pressure on the steering wheel. Of course, a combination of both would be ideal. Even though, for whatever bizarre reason, Elon didn't care for the camera approach, he had the foresight to put one in the 3 (I'm not sure if the other models have them), so that feature could possibly come at a later date.
 
Elon didn't care for the camera approach, he had the foresight to put one in the 3 (I'm not sure if the other models have them), so that feature can possibly come at a later date.
It is believed that the cabin-facing camera doesn't see into the IR, which is a requirement for tracking eyes behind sunglasses. I also don't know how good a vantage point the cabin-facing camera has for that use. So it would probably need a hardware replacement for the camera.

P.S. comma.ai pretty much had to use a camera approach for that anyway, because they have to use only the factory hardware found in the steering wheels of the vehicles they support. Thus, lowest common denominator for this feature. They use the lowest common denominator for a lot of stuff now, like braking distance and such. That could theoretically change with software updates; you could get more specific tweaking based on a given vehicle model.
 
It is believed that the cabin facing camera doesn't see into the IR, which is a requirement for tracking eyes behind sunglasses.

Also when it's dark in the car.

Caddy uses a camera that's not only much better placed than Tesla's, but also projects IR onto the driver's face to help the camera see better (a number of cell phone front cams use a similar system for face ID and such).


So yeah, the camera in the 3/Y is wholly inadequate in type, tech, and placement for this job.

It's another artifact of Tesla's thinking during the design of the 3 (carried through on the Y): that they'd get right to L4 or better so quickly they wouldn't need to CARE enough about whether the driver is paying attention to bother designing a better check.
 
It is a budget version of the bundled base Autopilot, which is about in line with its pricing. Its price is $1,000; IIRC Tesla has about $1,500 allocated in their prices for base Autopilot.




It looks like a pretty cool driver assist boost for the non-Tesla vehicles it works with.

It is one thousand dollars for the unit, if your car has the ability to function with it: lane assist, interactive cruise, braking, etc. Do the manufacturers approve the use of the product? There is a price for all those features being in your car. Tesla says Autopilot is $1,500 of the base price. Looks like a great advanced cruise control feature.
 
Also when it's dark in the car.

Caddy uses a camera that's not only much better placed than Tesla's, but also projects IR onto the driver's face to help the camera see better (a number of cell phone front cams use a similar system for face ID and such).


So yeah, the camera in the 3/Y is wholly inadequate in type, tech, and placement for this job.

It's another artifact of Tesla's thinking during the design of the 3 (carried through on the Y): that they'd get right to L4 or better so quickly they wouldn't need to CARE enough about whether the driver is paying attention to bother designing a better check.

Is there any way you could make the large screen on a Model 3 emit enough IR? But then, I suppose, the camera is not tuned for this.
 
Is there any way you could make the large screen on a Model 3 emit enough IR? But then, I suppose, the camera is not tuned for this.
Right. There could be sufficient ambient IR from body heat and such to give enough contrast to make out the eyes. My understanding is that the core problem is the camera's sensor not being built to detect into the IR spectrum. Or at least that is what is expected to be the case.

Theoretically, Tesla could have sourced a camera capable of detecting in the IR spectrum but configured the overall system not to collect that data and/or not to have that feature of the camera enabled. You'd have to talk to someone who has done the investigation of the camera to assess the chances of that possibility, as it might be nil. I wouldn't hold out a lot of hope of that being the case.
 
Do the manufacturers approve the use of the product?
Unless it somehow can damage the systems in the vehicle (it is pretty new, but I can't find any reports of that) and trigger a potential warranty claim, or an owner or third party tries to sue the OEM for damage (a crash or whatever) caused by supposedly errant mechanical/software behavior of the vehicle, what the OEM thinks of it is really irrelevant.

There could be issues of liability if you tried to use the system beyond its capabilities. Since it is open source, you could theoretically upload a modified version of the software to it and try to use it more like you would a Level 3 AI. But that's mostly covered under "you're the driver, so responsible for what the vehicle does" liability, and potentially some "you have no business operating a vehicle" traffic laws, or even misdemeanor or felony infractions.
 
It is believed that the cabin-facing camera doesn't see into the IR, which is a requirement for tracking eyes behind sunglasses. I also don't know how good a vantage point the cabin-facing camera has for that use. So it would probably need a hardware replacement for the camera.
Depending on the cam quality and field of view, it might work, but who knows. Even if it just tracks the person's head, it could still be useful. The lack of IR is unfortunate.
P.S. comma.ai pretty much had to use a camera approach for that anyway, because they have to use only the factory hardware found in the steering wheels of the vehicles they support
Good point.
Also when it's dark in the car.
It's another artifact of Tesla thinking during design of the 3 (and carrying through on the Y) that they'd get right to L4 or better so quickly they wouldn't need to CARE if the driver is paying attention enough to bother designing a better check.
That's pretty much it, it seems:

Elon Musk on Twitter
 
Depending on the cam quality and field of view, it might work, but who knows. Even if it just tracks the person's head, it could still be useful. The lack of IR is unfortunate.
Lack of IR spectrum support makes it nigh unusable for this task. Sunglasses are transparent in the IR spectrum; without IR support, the driver cannot wear glasses that obscure the visible spectrum. "No sunglasses for the driver" is not a viable constraint.
 
Lack of IR spectrum support makes it nigh unusable for this task. Sunglasses are transparent in the IR spectrum; without IR support, the driver cannot wear glasses that obscure the visible spectrum. "No sunglasses for the driver" is not a viable constraint.

I think Tesla was banking on their system being so good that driver monitoring would become less of an issue (hence extending the nag interval).