Tesla Autopilot HW3

See here:

[attached image]
Those are cameras, not ultrasonic sensors? I was responding to the comment saying the ultrasonic sensors cover 360 degrees.
 
I don’t understand why everyone is so focused on the images of surrounding cars displayed on the screen. In my experience, they are pretty flawless. I took this picture this morning when I pulled into work to illustrate what I’m talking about:

[attached photo of the car's display]

I’m parked straight in my spot, with a car to my left and nothing to my right. Since I paid for FSD (Full Specter Detection), it is properly showing me the ghost vehicles that surround my car while ignoring the real ones that I'm clearly not interested in.

Feature: complete!
 
FW 8.4 cleans up the binnacle display quite a bit. I can’t recall exactly which V8 point release improved it last time (back when dancing cars were always and everywhere on the older neural net), but it’s that substantial an improvement. Again. :)

I wasn’t driving by watching the screen, so it never really bothered me in V9 when they started using the repeater cameras, but I paid special attention today and, both in motion and stopped, it was very good. The ghost cars are gone... or are they... bwahahaha!!!
 
Tesla just confirmed HW3 is in production in an IR release. Also looks like they're planning an investors autonomy demo on April 19.

Tesla To Host Autonomy Investor Day
PALO ALTO, Calif., April 03, 2019 (GLOBE NEWSWIRE) -- Tesla is making significant progress in the development of its autonomous driving software and hardware, including our FSD computer, which is currently in production and which will enable full self-driving via future over-the-air software updates. With a number of very exciting developments coming in the weeks and months ahead, Tesla will host investors on the morning of April 19th at our headquarters in Palo Alto to provide a deep dive into our self-driving technology and road map.

Investors will be able to take test-drives to experience our Autopilot software first-hand, including features and functionality that are under active development. Investors will also hear directly from Elon Musk, as well as VP of Engineering, Stuart Bowers, VP of Hardware Engineering, Pete Bannon, and Sr. Director of AI, Andrej Karpathy.

The event will be webcast. Additional details forthcoming.
 
Tesla just confirmed HW3 is in production in an IR release. Also looks like they're planning an investors autonomy demo on April 19.

Finally Tesla is showing the real stuff. It's going to be really interesting to see the capabilities of the HW3 computer/NN. Since this will be a demo in a very well-known area, it might not reflect how it will work in the wild.
"...deep dive into our self-driving technology and road map..." With Karpathy on stage, hopefully we will get many new details on the NN, etc., that will make the forums here explode ;)
 
No word of HW3 at any opportune moment these past few months. Maybe it’s just still not in production and they will pull a Q2 lever by announcing it and updating the self-driving language **again** to suggest Level 5?

Who knows; I guess, like last night while we waited for the show to start, we must “stay tuned”.

Q2 lever confirmed.
 
Is there a way to check which hardware is installed in your car by using the Dev Tools in Chrome on the Tesla account page?

Yes, go to myTesla and inspect your car's photo; it will list some design codes. You can also use the JSON API. My understanding is that while some HW3 cars have been produced, most are still AP2.5 (or at least were as of last week). Ordering FSD does not seem to affect whether you get HW3, as of last week.
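
For the API route, here's a rough sketch of pulling the option codes with Python. This is my own example against the community-documented owner API, not an official tool; the endpoint, token handling, and the meaning of the codes are assumptions based on third-party docs and can change at any time, and the reported codes are not always an authoritative indicator of which Autopilot computer is installed.

```python
# Hedged sketch: list the option codes the (unofficial) owner API reports
# for each car on the account. Assumes you already have a valid OAuth
# bearer token for the owner API.
import requests

TOKEN = "your-owner-api-bearer-token"   # placeholder
BASE = "https://owner-api.teslamotors.com/api/1"

resp = requests.get(f"{BASE}/vehicles",
                    headers={"Authorization": f"Bearer {TOKEN}"})
resp.raise_for_status()

for car in resp.json()["response"]:
    codes = (car.get("option_codes") or "").split(",")
    print(car["display_name"], codes)
    # Community option-code lists map the APH*/APF*-family codes to
    # Autopilot hardware revisions; check a current reference before
    # trusting them, since stale codes have been reported.
```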
 
The computer knows the direction and approximate location of a car. What it cannot do is say with certainty that the red car whose front is seen by one camera and whose back is seen by a different camera is actually one car and not a car with a flat back and another car with a flat front.

Image fusion is hard.

One thing I think you are missing in addition to this: ranging from a camera image is hard too.

I believe this is a much bigger issue for Tesla than image fusion, given how much the car distances dance around both on the IC and in @verygreen's videos.
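
To make that concrete, here is a toy illustration (entirely my own, not anything from Tesla's stack) of a naive fuser that merges per-camera detections by projected position: when the monocular range estimates disagree by a few meters, the same physical car splits into two tracks.

```python
# Toy sketch: each camera gives a bearing plus a noisy monocular range.
# Project detections into the car frame and merge ones that land close
# together; a bad range estimate makes one car look like two.
import math

def to_car_frame(bearing_deg, range_m):
    """Convert a (bearing, range) detection into x/y in the car frame."""
    b = math.radians(bearing_deg)
    return (range_m * math.cos(b), range_m * math.sin(b))

def fuse(detections, gate_m=1.5):
    """Greedy nearest-neighbour merge of detections from different cameras."""
    tracks = []
    for cam, bearing, rng in detections:
        x, y = to_car_frame(bearing, rng)
        for t in tracks:
            if math.hypot(t["x"] - x, t["y"] - y) < gate_m:
                t["cams"].append(cam)
                break
        else:
            tracks.append({"x": x, "y": y, "cams": [cam]})
    return tracks

# Same red car seen by two cameras: front by the pillar cam, back by the
# repeater. With ~3 m of range error the detections fall outside the gate
# and the fuser reports two cars instead of one.
print(fuse([("pillar", 40.0, 8.0), ("repeater", 42.0, 8.4)]))   # merged: 1 track
print(fuse([("pillar", 40.0, 8.0), ("repeater", 42.0, 11.0)]))  # split: 2 tracks
```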
 
The industry in general is sure getting better at many things. Too bad this is not always or currently reflected in how Teslas work.
I think we should give them some slack. It is one thing to write a paper about something; it is another to build a safety-critical system or a commercially viable product.

I assume that Tesla has put lidars on some cars, used the lidar to generate ground truth, trained their big neural network on this with radar/sonar/camera as input and range/occupancy as output, and made this one of the many subtasks of the big network, where the subtask shares some mid layers with the main tasks such as steering and acceleration.
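
For what it's worth, here is a toy PyTorch sketch of the kind of setup I'm speculating about: a shared trunk feeding both a range/occupancy head (supervised by lidar-derived ground truth) and a control head. All names, shapes, and losses are illustrative assumptions, nothing known about Tesla's actual network.

```python
# Toy sketch of a shared-trunk multi-task network: one vision backbone,
# one per-pixel range/occupancy head (lidar-supervised in training),
# one control head (steering + acceleration). Shapes are illustrative.
import torch
import torch.nn as nn

class SharedTrunkNet(nn.Module):
    def __init__(self):
        super().__init__()
        # Shared mid layers used by every subtask.
        self.trunk = nn.Sequential(
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
        )
        # Per-pixel range/occupancy head (trained against lidar ground truth).
        self.range_head = nn.Conv2d(64, 1, 1)
        # Control head (steering angle + acceleration).
        self.control_head = nn.Sequential(
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(64, 2)
        )

    def forward(self, image):
        feats = self.trunk(image)
        return self.range_head(feats), self.control_head(feats)

model = SharedTrunkNet()
image = torch.randn(1, 3, 128, 256)      # camera frame (toy size)
lidar_range = torch.rand(1, 1, 32, 64)   # lidar-derived range target
driver_cmd = torch.tensor([[0.1, 0.3]])  # logged steering/accel

pred_range, pred_cmd = model(image)
loss = nn.functional.mse_loss(pred_range, lidar_range) \
     + nn.functional.mse_loss(pred_cmd, driver_cmd)
loss.backward()
```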
 
I think we should give them some slack. It is one thing to write a paper about something; it is another to build a safety-critical system or a commercially viable product.

I assume that Tesla has put lidars on some cars, used the lidar to generate ground truth, trained their big neural network on this with radar/sonar/camera as input and range/occupancy as output, and made this one of the many subtasks of the big network, where the subtask shares some mid layers with the main tasks such as steering and acceleration.

I would give Tesla a lot of slack had they not pre-sold me Level 5 capable hardware in 2016, Tesla Network details in 2017, coast-to-coast Summon and whatnot, all while dissing those using lidar in the process. But because they have done all that, the perspective changes on what is expected of Tesla.

They set this bar themselves and a lot of people believed them for a long time.
 
I would give Tesla a lot of slack had they not pre-sold me Level 5 capable hardware in 2016, Tesla Network details in 2017, coast-to-coast Summon and whatnot, all while dissing those using lidar in the process. But because they have done all that, the perspective changes on what is expected of Tesla.

They set this bar themselves and a lot of people believed them for a long time.

Tesla Network details coming April 22nd...
 
Source?

The April 1st, sorry 19th, sorry 22nd event invites I’ve seen make no mention of Tesla Network or ride sharing. Did I miss one?

Twitter seems favorable in the last 9 hours..
It’s there for when we start competing with Uber/Lyft & people allow their car to earn money for them as part of the Tesla shared autonomy fleet. In case someone messes up your car, you can check the video.

And this exchange regarding shared autonomy (edit: ninja'd by @Randy7fx):
Twitter
That’s exactly the idea. What’s not well understood is that Tesla cars being made *today* will be able to do that for you. Just a matter of finishing the software & going through regulatory approval. Will be explained in depth via live webcast on April 22.
 
It may be hard, but we are pretty good at it already:
Papers With Code : Monocular Depth Estimation

Monocular depth estimation where you can see the entire object and see it in context is one thing. Depth estimation of a vehicle in the side cameras in a nearby lane -- where you may not see the whole vehicle and can see very little context -- is very different. By "context" I mean you can see depth cues all around the object; like for example the ground, the background, other objects between the camera and the object you're estimating depth to. Without a bounding box around the entire vehicle and without other depth cues, this becomes a very difficult task to perform with the required accuracy. Basically you can say "hey that thing is really close", but you won't know with accuracy which side of the lane line it's on -- unless you have enough context to see both its tires and the lane line.
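
A quick pinhole-camera back-of-the-envelope (illustrative numbers, not Tesla's camera parameters) shows why the full bounding box matters so much:

```python
# Back-of-the-envelope pinhole ranging: with the whole vehicle in view,
# range ≈ focal_px * real_height / pixel_height. Numbers are illustrative.
def range_from_bbox(pixel_height, real_height_m=1.5, focal_px=1000.0):
    return focal_px * real_height_m / pixel_height

print(range_from_bbox(100))   # ~15.0 m
print(range_from_bbox(105))   # ~14.3 m -- 5 px of error already shifts ~0.7 m
# If the vehicle is only partially visible (no full bounding box), there is
# no pixel height to plug in, so this cue disappears and only the weaker
# context cues remain.
```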

I know that the current software is incapable of doing this because literally every day during my commute, TACC brakes for vehicles in the neighboring lane that are giving no indication they might come into my lane and are, in fact, not particularly close to the lane line.
 
Elon's tweet this morning...

Citing Elon's tweets as a source of facts is rather amusing...

The fact that Elon is talking about the rear-facing camera in the 3 being for autonomous ride sharing tells me that this whole thing is about juicing sales and the stock price. I mean, seriously, who believes that these things will ever be robo-taxis? And by "these things" I specifically mean AP2 cars produced since Dec 2016, not some future product.