Firmware 8.0

I'm still on firmware 17.3.15 (since Jan 24) and have inoperable Cruise Control and intermittent Autosteer (mostly inoperable). Should I just suck it up and wait for an update, or take it back in to the SC? I took it in on Jan 25, and they said the engineers in Fremont were aware and working on it.
Autosteer, as you might have noticed in the release notes, is only available on divided freeways below 45 MPH (now 50 MPH) and where the lanes are clearly marked. If you're using Autosteer at all, then cruise control is operating; it doesn't work without it. So I don't understand why you say it is inoperable.
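
For what it's worth, the dependency is easy to picture in code. This is only a toy sketch of the gating described above, not Tesla's actual logic; the cap value and the condition names are assumptions:

# Toy sketch of the Autosteer gating described above; assumptions, not Tesla's actual logic.
AUTOSTEER_SPEED_CAP_MPH = 50   # was 45 in earlier builds, per the release notes

def autosteer_available(tacc_engaged, divided_freeway, lanes_clearly_marked, set_speed_mph):
    """Return True if Autosteer could be offered under these assumed rules."""
    if not tacc_engaged:
        return False   # Autosteer rides on top of cruise control
    if not (divided_freeway and lanes_clearly_marked):
        return False
    return set_speed_mph <= AUTOSTEER_SPEED_CAP_MPH

print(autosteer_available(True, True, True, 48))    # True
print(autosteer_available(False, True, True, 48))   # False: no TACC, no Autosteer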
 
Well, I'm not sure if you're being a wise guy or sincerely asking, so I'll assume you're a fellow engineer and asking. I drive daily on Route 15 (Leesburg, VA) to the Dulles Greenway, which transitions to the VA-267 Toll Road, to Tysons Corner. I updated to 17.3.15 on 26-JAN, and pretty much ever since I've had the warning that Autosteer is disabled, plus no cruise control. I think I already said all this, though; hence the original post.

As a matter of fact, all of a sudden today, I was on a divided highway (Route 15 heading south from Frederick, MD back to Leesburg, VA) and I was actually able to engage cruise control, but no Autosteer. So me and my 300 Mbps FiOS connection are waiting, but I'm driving my car like a boss in the meantime!
 
No, I just listen to the song from the beginning again or go to the next song. Why would you want to start listening to a song from somewhere in the middle of it?

It becomes really annoying if you are out running errands: you get most of the way through a song, then it restarts when you get back in; then you might get most of the way through again, get out again, and it restarts. It's not an issue with songs around 3 minutes long, but with longer pieces they just keep restarting, and maybe I want to hear the second half and not the beginning three times. Audiobooks are one example.

Also, it's annoying when I'm loading stuff into the car that every time I open a door the song restarts. It would be a simple option to put a resume setting in the menu, or just turn the restart behavior off. Yes, you can advance to the next song, but why force that? It's just an annoyance that wasn't there a few builds ago.
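
To be concrete about how small an ask this is, here is a minimal sketch of a resume option; the state file and function names are hypothetical, not anything in Tesla's actual media player:

# Minimal sketch of the requested "resume" option. Hypothetical names, not Tesla's API.
import json
from pathlib import Path

STATE_FILE = Path("last_playback.json")  # hypothetical persistence location

def save_position(track_id, position_s):
    """Persist the current track and offset, e.g. when a door opens or the car parks."""
    STATE_FILE.write_text(json.dumps({"track": track_id, "position": position_s}))

def resume_or_restart(resume_enabled):
    """Return (track, offset) to play next; with resume off, offset is always 0."""
    if not STATE_FILE.exists():
        return None
    state = json.loads(STATE_FILE.read_text())
    offset = state["position"] if resume_enabled else 0.0
    return state["track"], offset

save_position("long_audiobook_chapter_07", 1520.0)   # 25:20 into the chapter
print(resume_or_restart(True))    # resumes at 25:20
print(resume_or_restart(False))   # today's behavior: start over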
 
Quick question: Does AP1 run on the same processor/OS as the main touchscreen, or does it have its own dedicated Mobileye processor?
Hank, I agree with the other guys that much of "AP" runs on dedicated processors, and while I hate to split hairs, I'd offer that "AP," as we think of the end-to-end set of capabilities, runs in a distributed-processor environment in our MS, beyond just Mobileye...

My thought being: Mobileye for AP1 may be doing a lot of the heavy work directly related to interpreting the radar, sensors and camera, but something is then coordinating all that, as well as linking it with, say, Nav data as to which roads TACC/AP can be enabled on, and with the continual tweaks regarding speed limits and how those influence real-time AP speed maximums... It's not just the Mobileye camera reading signs, from what I experience, so integration with the Nav data is happening somewhere else. The IC is also displaying AP data for the driver; how much of that is calculated in another processor and merely integrated for the visual presentation there, versus the IC doing more of the data transformation, I doubt any of us knows for a fact. Also, doesn't the CID have responsibility for pulling down and packaging up "AP data" that goes to/from the fleet and the mothership and Elon's database in the sky, data that is also used to influence how the Mobileye processors interpret the real world?

My point being, again: when we talk processors in our Tesla, IMHO it's sort of like a symphony, with many parts contributing to the whole that makes AP what it is. Some, like Mobileye, are perhaps more prevalent when AP is active (and even when not), but without the others there is no complete AP.
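
To illustrate the kind of coordination I mean (purely illustrative, not Tesla's implementation; the function name and the +5 MPH margin are made up), something has to reconcile the nav database's speed limit with what the camera reads and turn that into an AP speed ceiling:

# Illustrative only: reconciling a nav-database speed limit with a camera-read sign.
def ap_max_speed(nav_limit_mph, camera_sign_limit_mph, divided_highway, margin_mph=5):
    """Pick the more conservative of the two limit sources, plus an allowed margin."""
    limits = [l for l in (nav_limit_mph, camera_sign_limit_mph) if l is not None]
    if not limits:
        return None                      # neither source knows; fall back elsewhere
    base = min(limits)                   # trust the lower (more conservative) reading
    return base + (margin_mph if divided_highway else 0)

# Nav database says 65, camera just read a temporary 55 sign on a divided highway:
print(ap_max_speed(65, 55, True))        # 60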

I probably just muddied the waters. Sorry 'bout that!
 
Having worked with video analytics in my last job (note, for a defense solution, not automotive), the way the system works is that a video signal is delivered to a video card where the analytics take place: the computer interprets the image and identifies things to trigger on (i.e., a vehicle, a pedestrian, a sign, whatever).

There are two architectures: server-based and edge-based. In a server-based solution, the video image is transmitted all the way to the video server, where the processing takes place and digital data ("car detected," whatever) is generated. In an edge-based solution, the processing takes place near the camera, and just the alerts and a de-rez'd image are transmitted back to the main server (which uses less bandwidth when you are potentially transmitting data from many cameras to a central location). You want very high-definition, high-contrast images for the video analytics to interpret, but images that good aren't necessary for a human to look at in, say, a security command center. I am not aware of a camera where the video analytics actually take place in the camera itself (for our edge-based solutions we took a video signal out of the camera, ran it to a box co-located with the camera for the analytics, and then ran fiber from the analytics box back to the main server).
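
A rough sketch of the difference, in generic terms (no particular vendor's product; the frame sizes, downscale factor and alert payload are assumed round numbers), is what crosses the link back to the server per frame in each architecture:

# Generic sketch of the two architectures: bytes crossing the link per frame.
def server_based_payload(width, height, bytes_per_px=3):
    """Server-based: ship the full-resolution frame; analytics run centrally."""
    return width * height * bytes_per_px

def edge_based_payload(width, height, bytes_per_px=3, downscale=4, alert_bytes=200):
    """Edge-based: analytics run near the camera; send alerts plus a de-rez'd frame."""
    return alert_bytes + (width // downscale) * (height // downscale) * bytes_per_px

print(server_based_payload(1920, 1080))  # ~6.2 MB per raw 1080p frame
print(edge_based_payload(1920, 1080))    # ~0.39 MB: thumbnail plus alert metadata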

Obviously distance isn't an issue in a car, and from the pictures I've seen of people doing dash-cam installs and pulling the covers off where the forward-looking camera(s) are, it appears that it is a video (and not digital) signal coming out of the camera(s), with the video analytics taking place somewhere else (presumably the GPU(s) behind the dashboard).

I agree with the other comments that the entire autopilot/EAP/FSD system is actually a number of computers/processors, likely on different circuit boards all working together. At a guess, there are at least:
- video boards with GPUs processing the video signal(s) and digitizing them.
- video analytics processors doing the object identification and generating various alerts/triggers (may also be on the video board)
- the navigation system processor and route data (coming from the Garmin nav database and what is displayed on the dashboard, not the 17" screen)
- a steering processor which tracks wheel position
- the TACC computer/processor which tracks pedal position and controls speed/braking
- the radar processor which processes the radar data and turns it into alerts/triggers
- the ultrasonic sensor processor(s) which process the sensor data and turn it into alerts/triggers
and finally
- the autopilot processor which takes the map data (which I suspect is totally separate from the 17" Google Maps and Garmin navigation databases) and combines it with (a) the video analytics alerts/data, (b) the route data from the Garmin system, (c) the radar alerts/triggers and (d) the ultrasonic alerts/triggers, interprets it all to create a "picture" around the car, and then issues steering and speed commands to the car (a rough sketch of this fusion step follows below).
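
Here is the rough sketch promised in that last bullet: a hypothetical fusion step (emphatically not Tesla's code) where one process folds the map, vision, radar and ultrasonic inputs into a world model and emits steering/speed commands:

# Hypothetical fusion step (not Tesla's code): fold map, vision, radar and
# ultrasonic inputs into one picture, then emit steering/speed commands.
from dataclasses import dataclass

@dataclass
class WorldModel:
    lane_offset_m: float = 0.0          # from video analytics
    lead_vehicle_gap_m: float = 999.0   # from radar
    obstacles_near: bool = False        # from ultrasonics
    speed_limit_mph: int = 45           # from the map / nav data

def fuse_and_command(model, set_speed_mph):
    """Turn the fused picture into (steering_deg, target_speed_mph); toy logic only."""
    steer = -0.5 * model.lane_offset_m                  # nudge back toward lane center
    speed = min(set_speed_mph, model.speed_limit_mph)   # respect the map's limit
    if model.lead_vehicle_gap_m < 30 or model.obstacles_near:
        speed = min(speed, 25.0)                        # slow for traffic or obstacles
    return steer, speed

print(fuse_and_command(WorldModel(lane_offset_m=0.4, lead_vehicle_gap_m=20.0), 50))
# -> (-0.2, 25.0): steer slightly left, slow down behind the close lead vehicle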

That's part of the complexity of a system like this...and my guess is that Mobileye (AP1) was only providing the video analytics part of the system...possibly the radar and ultrasonic as well but I suspect the overall "brain" of the autopilot (what I call the autopilot processor) has always been Tesla's. Of course I could be all wet on the architecture Tesla has implemented.
 
Interesting post. I thought it was pretty well established that on AP1 Model Ss, the Mobileye chip was mounted behind the rear-view mirror and only connected to anything else by a relatively low-speed CAN bus, not enough bandwidth to carry the video, so presumably all the analysis was happening right there (though it turns out the car does pass some low-resolution frames of an accident back). wk057 both described this architecture for us and found the hidden footage on a salvage car he was working on.

The AP1 X is different, though: that camera has what appears to be a coax connection to a processor elsewhere in the car (but everything I've read suggests it's the same processor and edge-based architecture, just with a long wire in the middle).
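
A back-of-envelope check supports the bandwidth point (round, assumed numbers; the actual camera specs and bus speeds may differ):

# Back-of-envelope only; assumed round numbers, actual specs may differ.
CAN_BUS_BPS = 500_000                                # typical high-speed CAN: 0.5 Mbit/s
width, height, fps, bits_per_px = 1280, 960, 30, 8   # assumed monochrome camera stream

video_bps = width * height * fps * bits_per_px
print(f"raw video:  {video_bps / 1e6:.0f} Mbit/s")   # ~295 Mbit/s
print(f"CAN budget: {CAN_BUS_BPS / 1e6:.1f} Mbit/s") # 0.5 Mbit/s
print(f"video needs roughly {video_bps // CAN_BUS_BPS}x what CAN can carry")  # ~589x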
 
You should charge for that answer. Great overview.
 
Went to the SC in Tysons yesterday and they updated my firmware to 17.3.28. Guess what... no more 'Autosteer disabled' or 'Cruise control is disabled'. And cruise control / Autosteer are available on the same roads I've been traveling.