Autonomy Investor Day - April 22 at 2pm ET

Now we're getting somewhere. If he said the car can drive anywhere in real traffic with X miles between disengagements, that would be an actual statement. This "feature complete" mumbo jumbo is not.

Disagree. Feature complete, to me, means it can handle city and highway driving and navigate by itself to your destination while obeying traffic signs and lights, but requires your supervision like AP does right now.
 
A few of the things revealed/confirmed on the technical side are confidence-building (not if you're a Lidar fan, though).

-- Many algorithms can be used to obtain depth information from vision.
-- Vision can also figure out distance from perspective cues. Tesla uses radar distance to automatically train the vision neural net without needing human intervention (a rough sketch of the idea is at the end of this post). This is the first time I've heard of anyone doing that, and it's super smart.
-- The Tesla NN is trained to use small car/pedestrian movements (body language) to infer their intentions. This is also news to me and probably the remaining missing link imo. Only vision has the resolution to do that; Lidar cannot, no matter how much training you give it.
-- The Lidar + HD GPS mapping approach many use will not work. A small change in road features will collapse the whole system.

Our understanding has come full circle: from vision can be as good as Lidar, to vision is better, to vision is needed regardless. Elon's point was: why put expensive, power-hungry Lidar on the car if vision is still needed anyway? None of these points is bulletproof, nothing is, but they are all very good points. We'll see if there will be Lidar converts, but I'm sure there will be when Tesla turns on FSD and proves it works.
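
Here's a rough sketch of what that radar-as-teacher idea could look like, just to make it concrete. To be clear, this is my own illustration, not Tesla's actual pipeline: a tiny network predicts distance from an image crop, and the radar's range measurement serves as the training label, so no human ever annotates anything.

```python
# Illustrative only: radar range used as an automatic label for a vision depth net.
# All names and shapes here are made up for the sketch, not Tesla's code.
import torch
import torch.nn as nn

class TinyDepthNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(32, 1)  # predicted distance to the object, in meters

    def forward(self, x):
        return self.head(self.features(x).flatten(1))

net = TinyDepthNet()
opt = torch.optim.Adam(net.parameters(), lr=1e-3)

# Stand-in batch: image crops of lead vehicles paired with the radar range to each.
crops = torch.rand(8, 3, 64, 64)        # camera crops (fake data)
radar_range = torch.rand(8, 1) * 100.0  # radar-measured distance acts as the label

opt.zero_grad()
loss = nn.functional.smooth_l1_loss(net(crops), radar_range)
loss.backward()
opt.step()
print(f"training loss on this batch: {loss.item():.3f}")
```

The whole point is that every radar return the fleet collects is a free label for the vision net, which is why no human labeling is needed.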
How is combining radar with cameras any different than combining cameras with lidar? The reason not to use lidar is that it’s expensive which is a perfectly good reason! There’s no need to claim that adding lidar makes a system technically inferior.
Waymo does pedestrian prediction using cameras too.
Pedestrians and Self-Driving Vehicles | Let's Talk Self-Driving
 
Arrghh.

I thought we'd see the test drives, but they NDA'd those.

Bah.

More than anything else the test drives would have told us how far along they really were.

Edit: They didn't do an NDA, but they didn't allow the riders to record any video. We do have some really descriptive experiences though like this one.


It sounds impressive and reassuring for my FSD purchase, but it would have been even more impressive and reassuring if the passengers could have entered any destination they wanted within a 10-20 mile radius and had the car drive itself there. A set route gives them a bit more leeway to use some tricks to smooth things out (though having the car adapt to other cars and traffic conditions by itself, even on a set route, is super impressive on its own), while a variable route would have shown a pretty crazy level of confidence and capability after three months of NN training.
 
You should worry more about air quality than about Autopilot.
Got that covered. I have 6 air cleaners in my house and a HEPA filter in my central HVAC system. My in house PM2.5 levels are outstanding.

Get the car to drive itself through a car wash...
I use a rinse-less wash and it works great.

I can leave my garage with a spotlessly clean car, encounter rain, and lose NoA 50 miles into my 130-mile trip. It has happened about a dozen times, and I make this trip often. I paid for FSD at the maximum price. I never complained about the price, but I'll keep complaining about the cameras.
 
Got that covered. I have 6 air cleaners in my house and a HEPA filter in my central HVAC system. My in house PM2.5 levels are outstanding.

I use a rinse-less wash and it works great.

I can leave my garage with a spotlessly clean car, encounter rain, and lose NoA 50 miles into my 130-mile trip. It has happened about a dozen times, and I make this trip often. I paid for FSD at the maximum price. I never complained about the price, but I'll keep complaining about the cameras.
That’s nuts. I guess the complaint that Teslas are designed for California is true. I’ve never had the cameras get dirty and I wash my car once a month at most.
 
That video from Tesla was much cooler once I slowed it down and realized what the visualization was actually showing on the left side of the 3's screen. That would be very cool to have in production vehicles. I wonder if it was at all analogous to what investors saw during their test drives? It sounds like they definitely had a rear-view camera window up that they could swipe to show whichever camera view they wanted, along with the relevant visualizations showing what the car was tracking on that camera.
 
How is combining radar with cameras any different than combining cameras with lidar? The reason not to use lidar is that it’s expensive which is a perfectly good reason! There’s no need to claim that adding lidar makes a system technically inferior.
Waymo does pedestrian prediction using cameras too.
Pedestrians and Self-Driving Vehicles | Let's Talk Self-Driving

The point is whether Lidar or vision is your primary system. It does appear that Lidar as the primary system is inferior to vision plus a capable NN as the primary system, even without regard to cost. If vision is your primary system, using radar to complement it is much better than using Lidar. The only shortcoming of radar compared to Lidar is resolution, but vision does not need help there.

BTW, the way Karpathy describes it, it's not that vision and radar work together all the time. Vision does most if not all of the work once it has been properly trained against radar. Vision-only is a much more elegant solution than running several types of sensors at the same time.

Elon's answer to the question of whether there is any use for Lidar was no, and he said it with a grin. If you can think of a reason why Lidar is needed or why it helps, please let the world know.
 
That was the most remarkable presentation I have ever watched, and not because it was some four hours long. The technical detail was fantastic and worth a hefty price of admission, but then Elon's vision for AP was just stunning. I'm still trying to catch my breath from Elon's closing remarks, which I take to mean that Tesla will not be profitable as a car company per se, but he expects it to be highly profitable as a car-sharing platform. And in case anybody doubted his resolve, he straight out said, in classic Elon fashion, that the Tesla cost structure can be summed up as AP R&D.

Wow.
 
How is combining radar with cameras any different than combining cameras with lidar? The reason not to use lidar is that it’s expensive which is a perfectly good reason! There’s no need to claim that adding lidar makes a system technically inferior.
Waymo does pedestrian prediction using cameras too.
Pedestrians and Self-Driving Vehicles | Let's Talk Self-Driving
Doesn’t LIDAR have basically the same limitations as vision? From everything I’ve read, LIDAR is terrible in rain, fog, and water kick-up from leading cars. Its main benefit over traditional vision systems is evidently accuracy. But if you’re able to tune and process vision to reasonable parity with LIDAR, then LIDAR becomes redundant. If that redundant system is also very expensive and hard to gracefully integrate into the car, then the argument for LIDAR is hard to make.

RADAR, on the other hand, has abilities that LIDAR and vision do not. It can see through things that are visually opaque, like the aforementioned weather conditions. The fact that it’s cheap and easily integrated makes it a good complement to a robust vision system. Neither LIDAR nor vision can do the “two cars ahead” prediction that we already see in AP2-and-above cars now.

I’m certainly not an expert, but it seems like Tesla is claiming they’ve made LIDAR mostly redundant based on their current and projected abilities with the Vision NN, thus the cost of time and dollars to integrate it isn’t worth the return.
 
Doesn’t LIDAR have basically the same limitations as vision? From everything I’ve read, LIDAR is terrible in rain, fog, and water kick-up from leading cars. Its main benefit over traditional vision systems is evidently accuracy. But if you’re able to tune and process vision to reasonable parity with LIDAR, then LIDAR becomes redundant. If that redundant system is also very expensive and hard to gracefully integrate into the car, then the argument for LIDAR is hard to make.

RADAR, on the other hand, has abilities that LIDAR and vision do not. It can see through things that are visually opaque, like the aforementioned weather conditions. The fact that it’s cheap and easily integrated makes it a good complement to a robust vision system. Neither LIDAR nor vision can do the “two cars ahead” prediction that we already see in AP2-and-above cars now.

I’m certainly not an expert, but it seems like Tesla is claiming they’ve made LIDAR mostly redundant based on their current and projected abilities with the Vision NN, thus the cost of time and dollars to integrate it isn’t worth the return.
They plan to make lidar redundant. It certainly seems theoretically possible but you’re relying on your neural net being able to get depth information reliably. How humans are able to do that is incredibly complicated (having two eyes does nothing for you beyond 30 feet). I’m not saying it can’t be done but I think people here are always underestimating how much reliability you need for self driving. Humans go 150,000 miles between accidents.
I think lidar does have limitations that would limit the weather conditions under which it can be used. Not a problem here in Southern California :cool:
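
Just to put rough numbers behind the "two eyes don't help past 30 feet" point: with a fixed baseline, the depth uncertainty you get from stereo grows with the square of distance. The 6.5 cm eye spacing and ~20 arcsecond disparity resolution below are my own ballpark assumptions, purely illustrative.

```python
# Back-of-envelope: stereo depth uncertainty grows as distance squared (dz ≈ z^2 * dθ / b).
import math

baseline = 0.065                        # ~6.5 cm human interpupillary distance (assumed)
acuity_rad = 20 / 3600 * math.pi / 180  # ~20 arcsec disparity resolution (assumed)

for z in (3, 10, 30, 100):              # distance to the object, in meters
    depth_error = z ** 2 * acuity_rad / baseline
    print(f"at {z:>3} m, stereo depth uncertainty is roughly {depth_error:.2f} m")
```

At a few meters the error is centimeters; at highway distances it's meters to tens of meters, which is why a driving NN has to lean on other cues (size, perspective, motion) rather than binocular disparity.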
 
They plan to make lidar redundant. It certainly seems theoretically possible but you’re relying on your neural net being able to get depth information reliably. How humans are able to do that is incredibly complicated (having two eyes does nothing for you beyond 30 feet). I’m not saying it can’t be done but I think people here are always underestimating how much reliability you need for self driving. Humans go 150,000 miles between accidents.
I think lidar does have limitations that would limit the weather conditions under which it can be used. Not a problem here in Southern California :cool:
Right. I don’t know how it compares to LIDAR now, but if the roadmap gives you confidence that you can reach that goal, then there’s no point in taking the effort to integrate it, given its drawbacks. The car being able to process more frames means it can infer depth data from many, many perspectives (three from the front cameras, and as many as the frame rate will allow from each individual camera).

Look at the accuracy of Photogrammetry in something like Meshroom for 3D scanning objects now with a single cell-phone camera that’s moved around an object compared to the older 3D scanning technique of using lasers. Photogrammetry isn’t quite as accurate but it’s very close and useful for the use-case it’s designed for.

TeslaVision is similar to photogrammetry. Having all the telemetry from the car regarding speed and relative position, plus camera position and focal length, gives them a wealth of data for determining depth. It seems like it’s largely a problem of processing power and data.
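
Here's a toy version of that photogrammetry idea, purely my own illustration and not anything Tesla showed: if you know how far the camera moved between two frames (from vehicle speed and timestamps), the two viewing rays to the same feature can be intersected to recover its 3D position. In a real system the ray directions would come from matched pixel coordinates and the camera intrinsics; here they're synthesized from a known point just to show the geometry.

```python
# Toy two-view triangulation: known camera motion + two viewing rays -> 3D point.
import numpy as np

def triangulate(c1, d1, c2, d2):
    """Least-squares intersection of rays (c1 + t*d1) and (c2 + s*d2)."""
    d1, d2 = d1 / np.linalg.norm(d1), d2 / np.linalg.norm(d2)
    A = np.stack([d1, -d2], axis=1)                 # columns: d1 and -d2
    b = c2 - c1
    (t, s), *_ = np.linalg.lstsq(A, b, rcond=None)  # solve t*d1 - s*d2 ≈ c2 - c1
    p1, p2 = c1 + t * d1, c2 + s * d2
    return (p1 + p2) / 2                            # midpoint of closest approach

# Ground-truth feature 40 m ahead and 2 m to the left (x=lateral, y=height, z=forward).
point = np.array([-2.0, 1.0, 40.0])
c1 = np.array([0.0, 1.4, 0.0])   # camera position at frame 1
c2 = np.array([0.0, 1.4, 5.0])   # 5 m further down the road at frame 2 (speed * dt)

# Rays synthesized from the known point for the demo; in practice they come from pixels.
est = triangulate(c1, point - c1, c2, point - c2)
print("recovered 3D point:", np.round(est, 2))      # ≈ [-2, 1, 40]
```

More frames just means more ray pairs to average over, which is why frame rate and compute matter so much.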
 
Elon's answer to the question of whether there is any use of Lidar was no and he said that with a grin. If you can think of a reason why Lidar is needed or why it helps please let the world know.
Right now AP will run into stopped cars, jersey barriers, and semi trucks. Obviously they plan on fixing that, but I would assume that having a secondary sensor that could detect those things would help. Once you start operating in much more complicated urban environments, I would imagine you’ll find more situations where the camera and neural net miss things.
Again, I’m not saying lidar is necessary, but it sure doesn’t hurt. This whole lidar argument is sort of silly, and it sounds like it’s been going on in this forum forever. There’s no way to prove anything until someone achieves safety greater than a human with cameras alone.
 
OK it's settled. The guy who started the Lidar stuff at Google just said Elon was right about Lidar. What else is new?


We'll see if there will be Lidar converts, but I'm sure there will be when Tesla turns on FSD and proves it works.

Looks like we might start to see that happening beginning tomorrow.
 
Right now AP will run into stopped cars, jersey barriers, and semi trucks. Obviously they plan on fixing that, but I would assume that having a secondary sensor that could detect those things would help. Once you start operating in much more complicated urban environments, I would imagine you’ll find more situations where the camera and neural net miss things.
Again, I’m not saying lidar is necessary, but it sure doesn’t hurt. This whole lidar argument is sort of silly, and it sounds like it’s been going on in this forum forever. There’s no way to prove anything until someone achieves safety greater than a human with cameras alone.
I’ve never personally argued for or against LIDAR before today. But the presentation was informative for a layman to watch. The point cloud Karpathy showed, generated only from vision data, looked really good. The failings of AP up until now are likely software failures rather than failures of the Vision system’s eventual capability. That’s not an excuse, but it’s a consideration when deciding whether or not to augment the system with expensive additional hardware.

One of Tesla’s main advantages is the massive dataset, which is only possible when generated by cars with a reasonably priced AV system. If they had to also integrate LIDAR into the AP suite, they sure wouldn’t be able to get away with sticking it on all the cars and only activating it via software unlock. Tesla now gathers information from my Model X (which has EAP on AP2.0 hardware) AND our Model 3, which has no purchased Autonomy features on HW2.5. That’s a brilliant strategy that only works if they can produce the AP hardware cheaply.

Frankly, if Tesla had been charging $9,000 for EAP when I bought my Model X, I probably would have skipped the option. They have to balance the benefit of having more cars in the fleet cheaply vs. the short-term benefit of having LIDAR on fewer cars while they develop the Vision-only system. By choosing Vision-only for the hardware requirement they probably collect an order of magnitude more data, especially considering the cost of an AP system with LIDAR compared to the overall vehicle cost of a Model 3, which is their most popular model by sales. Look at how angry people got that the Model 3 only reached a base price of $37,500 rather than the promised $35,000. A LIDAR unit, from my understanding, is several thousand dollars just by itself. I think a very, very small percentage of M3 buyers would opt for the extra cost. I don’t think the uptake rate on FSD is very high in general, even for the higher-priced models.
 
Some more comments from Elon on Tesla FSD Computer vs NVIDIA Pegasus
Someone asked him about this on Twitter and this is his response:
@valueanalyst: Can you prove that @Tesla HW3's 144 TOPS will be enough for FSD with the safety level necessary for regulatory approval and that @nvidia DRIVE Pegasus' 320 TOPS is excessive?

@scottwww: Pegasus is 500W vs HW3’s 72W

@elonmusk: Exactly. Also, you can’t actually use computation from a separate GPU effectively, as you get choked on the bus, so most of the computation is irrelevant. High power, high cooling, but low true, usable TOPS. Worst of all worlds.

@xandriteme: And the FSD computer is cheaper!

@elonmusk: Oh yeah, that too
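
For what it's worth, taking the TOPS and wattage figures quoted in that exchange at face value, the power efficiency works out to roughly a 3x advantage for HW3 (this is just that ratio; it says nothing about the bus-bandwidth point Elon raises):

```python
# Simple ratio of the numbers quoted in the tweets above; nothing more is implied.
chips = {"Tesla HW3": (144, 72), "NVIDIA Pegasus": (320, 500)}  # (TOPS, watts)

for name, (tops, watts) in chips.items():
    print(f"{name}: {tops / watts:.2f} TOPS per watt")
# Tesla HW3: 2.00 TOPS per watt
# NVIDIA Pegasus: 0.64 TOPS per watt
```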