ccharleb
Member
Is anyone bothered by the fact the songs restart instead of just resuming? And even if you pause the song (on Slacker), the song will restart the next time you come back.
> I'm still on Firmware 17.3.15 (since Jan 24) and have inoperable Cruise Control and intermittent Autosteer (mostly inoperable). Should I just be sucking it up and waiting for an update, or take it back in to the SC? I took it in on Jan 25, and they said the engineers in Fremont were aware and working on it.

Autosteer, as you might have noticed in the release notes, is only available on divided freeways below 45 MPH (now 50 MPH) and where the lanes are clearly marked. If you're using Autosteer at all, then cruise control is operating; it doesn't work without it. So I don't understand why you say it is inoperable?
> And even if you pause the song (on Slacker), the song will restart the next time you come back.

Ughhhh, didn't realize. I hate it!
Anyone else constantly getting "Autopark not available" or "disabled" warnings every time you slow down?
> Is anyone bothered by the fact the songs restart instead of just resuming?

No, I just listen to the song from the beginning again or go to the next song. Why would you want to start listening to a song from somewhere in the middle of it?
> No, I just listen to the song from the beginning again or go to the next song. Why would you want to start listening to a song from somewhere in the middle of it?

Some songs are actually chapters of audiobooks and 30 minutes long.
Quick question: Does AP1 run on the same processor/OS as the main touchscreen, or does it have its own dedicated Mobileye processor?
> Quick question: Does AP1 run on the same processor/OS as the main touchscreen, or does it have its own dedicated Mobileye processor?

Hank, I agree with the other guys that much of "AP" runs on dedicated processors, and while I hate to split hairs, I'd offer that "AP", as we think of the end-to-end set of capabilities, runs in a distributed processor environment in our MS beyond just Mobileye...
AP1 runs on a separate Mobileye EyeQ3 chip. The main touchscreen runs on an Nvidia Tegra chip.
Having worked with video analytics in my last job (note, for a defense solution, not automotive), the way the system works is that a video signal is delivered to a video card where the analytics take place: the computer interprets the image and identifies things to trigger on (i.e., a vehicle, a pedestrian, a sign, whatever).
There are two architectures: server-based and edge-based. In a server-based solution, the video image is transmitted all the way to the video server, where the processing takes place and then digital data ("car detected," whatever) is generated. In an edge-based solution, the processing takes place near the camera, and just the alerts and a de-rez'd image are transmitted back to the main server (which uses less bandwidth when you are potentially transmitting data from many cameras to a central location). You want very high-definition, high-contrast images for the video analytics to interpret, but images that good aren't necessary for a human to look at in, say, a security command center. I am not aware of a camera where the video analytics actually take place in the camera (for our edge-based solutions, we took a video signal out of the camera and ran it to a box co-located with the camera for the analytics, and then ran fiber from the analytics box back to the main server).
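The bandwidth argument for edge processing can be sketched with some back-of-the-envelope numbers. Everything below (camera count, bitrates) is an illustrative assumption, not a measurement from any real deployment:

```python
# Illustrative bandwidth comparison: server-based vs. edge-based
# video analytics. All figures are assumed round numbers.

NUM_CAMERAS = 50
HD_STREAM_MBPS = 8.0      # assumed bitrate of one high-def camera feed
DEREZ_STREAM_MBPS = 0.5   # assumed bitrate of one de-rez'd review feed
ALERT_KBPS = 1.0          # assumed bitrate of alert metadata per camera

def server_based_mbps(cameras: int) -> float:
    """Every camera ships its full HD feed to the central server."""
    return cameras * HD_STREAM_MBPS

def edge_based_mbps(cameras: int) -> float:
    """Only alerts plus a low-res feed travel back to the server."""
    return cameras * (DEREZ_STREAM_MBPS + ALERT_KBPS / 1000.0)

if __name__ == "__main__":
    print(f"server-based: {server_based_mbps(NUM_CAMERAS):.1f} Mbps")
    print(f"edge-based:   {edge_based_mbps(NUM_CAMERAS):.2f} Mbps")
```

With these made-up numbers the central link needs roughly 16x less capacity in the edge-based design, which is the whole point of pushing analytics out toward the cameras.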
Obviously distance isn't an issue in a car, and (from the pictures I've seen of people doing dash-cam installs and pulling the covers off where the forward-looking camera(s) are) it appears that it is a video (and not digital) signal coming out of the camera(s), with the video analytics taking place somewhere else (presumably the GPU(s) behind the dashboard).
I agree with the other comments that the entire autopilot/EAP/FSD system is actually a number of computers/processors, likely on different circuit boards all working together. At a guess, there are at least:
- video boards with GPUs processing the video signal(s) and digitizing them.
- video analytics processors doing the object identification and generating various alerts/triggers (may also be on the video board)
- the navigation system processor and route data (coming from the Garmin nav database and what is displayed on the dashboard, not the 17" screen)
- a steering processor which tracks wheel position
- the TACC computer/processor which tracks pedal position and controls speed/braking
- the radar processor which processes the radar data and turns it into alerts/triggers
- the ultrasonic sensor processor(s) which process the sensor data and turn it into alerts/triggers
and finally
- the autopilot processor which takes the map data (which I suspect is totally separate from the 17" Google maps and Garmin navigation databases) and combines it with (a) the video analytics alerts/data, (b) the route data from the Garmin system, (c) the radar alerts/triggers and (d) the ultrasonic alerts/triggers, interprets it all to create a "picture" around the car and then issues steering and speed commands to the car.
That's part of the complexity of a system like this...and my guess is that Mobileye (AP1) was only providing the video analytics part of the system...possibly the radar and ultrasonic as well but I suspect the overall "brain" of the autopilot (what I call the autopilot processor) has always been Tesla's. Of course I could be all wet on the architecture Tesla has implemented.
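The fusion step described above (the "autopilot processor" merging camera, radar, and ultrasonic triggers into one picture and issuing commands) could be sketched, in heavily simplified form, like this. Every class, field, and threshold here is hypothetical, chosen only to illustrate the data flow, not how Tesla actually implements it:

```python
from dataclasses import dataclass, field

@dataclass
class Detection:
    """One alert/trigger from a sensor subsystem (hypothetical schema)."""
    source: str        # "camera", "radar", or "ultrasonic"
    kind: str          # e.g. "vehicle", "lane_line", "obstacle"
    distance_m: float  # distance to the detected object

@dataclass
class WorldPicture:
    """The fused 'picture' around the car."""
    detections: list = field(default_factory=list)

    def closest_obstacle_m(self) -> float:
        """Nearest obstacle in meters, or infinity if the path is clear."""
        dists = [d.distance_m for d in self.detections
                 if d.kind != "lane_line"]
        return min(dists) if dists else float("inf")

def fuse(camera, radar, ultrasonic) -> WorldPicture:
    """Merge per-sensor alert lists into one picture (no real filtering)."""
    return WorldPicture(detections=list(camera) + list(radar) + list(ultrasonic))

def speed_command(picture: WorldPicture, set_speed: float) -> float:
    """Toy policy: slow down proportionally when an obstacle is within 50 m."""
    gap = picture.closest_obstacle_m()
    if gap >= 50.0:
        return set_speed
    return set_speed * max(gap, 0.0) / 50.0
```

A real system would of course weight sensors by confidence, track objects over time, and run hard real-time control loops; the sketch only shows the shape of "many alert streams in, one picture out, commands derived from the picture."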
Well, I'm not sure if you're being a wise guy or sincerely asking, so I'll assume you're a fellow engineer and asking. I drive daily on Route 15 (Leesburg, VA) to the Dulles Greenway, which transitions to the VA-267 Toll Road, to Tysons Corner. I updated to 17.3.15 on 26-JAN, and pretty much ever since I've had the warning that Autosteer is disabled, and no cruise control. Think I already said all this, though, hence the original post.
As a matter of fact, all of a sudden today, I was on a divided highway (Route 15 heading south from Frederick, MD back to Leesburg, VA) and I was actually able to engage cruise control, but no Autosteer. So me and my 300 Mbps FiOS connection are waiting, but I'm driving my car like a boss in the meantime!