
Autopilot slowed down driving towards a red light

Personally I'd rather not have any traffic light engagement until traffic lights are made more intelligent.

One where the traffic light talks to the car and gets information from cars well before they reach the stop light. That way the light can optimize traffic flow.

Plus, if someone doesn't stop for the light, the light can curse out the driver through the car's speaker system. If that's not enough, the light can remember the car and purposely mess with that person from then on.
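For the curious, here's a rough sketch of what that light-to-car handshake could look like, loosely modeled on the SPaT (Signal Phase and Timing) messages used in real V2X pilot programs. Every field name and number below is invented for illustration; nothing like this exists in the car today.

```python
from dataclasses import dataclass

# Hypothetical light-to-car message, loosely modeled on SPaT
# (Signal Phase and Timing) messages from V2X pilots.
# All field names are invented for illustration.
@dataclass
class SignalPhase:
    intersection_id: int        # which intersection is broadcasting
    phase: str                  # "red", "yellow", or "green"
    seconds_to_change: float    # time left in the current phase

def should_coast(msg: SignalPhase, distance_m: float, speed_mps: float) -> bool:
    """Coast if we'd reach the stop line while the light is still red."""
    if msg.phase == "green":
        return False
    eta_s = distance_m / max(speed_mps, 0.1)    # avoid divide-by-zero
    return eta_s < msg.seconds_to_change

# e.g. 200 m out at 20 m/s with 15 s of red left -> coast (ETA 10 s < 15 s)
print(should_coast(SignalPhase(42, "red", 15.0), 200.0, 20.0))
```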
 
I don't know how they can implement color detection of a red traffic light when the front camera only sees in black & white!
Here is an example of the picture quality of the front camera retrieved by Jason Hughes from a salvaged Model S (original tweet: Jason Hughes on Twitter):
[embedded GIF of the front camera footage]
By the position of the lit lamp in the traffic light.
 
I was thinking the same thing, but there are many different types of traffic lights, including lights with only two lamps (red and blinking yellow), and horizontally mounted ones where the red is on the left, on the right, or both!
Just look at these:
[attached images: examples of different traffic light heads, including a double-red configuration]
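For what it's worth, here's a toy Python sketch of the position-based idea on a grayscale crop, and exactly why those oddball heads break it. This is purely hypothetical; it's not how Mobileye or Tesla actually do it.

```python
import numpy as np

def lit_lamp(roi_gray: np.ndarray) -> str:
    """Guess which lamp is lit in a *vertical* three-lamp head.

    roi_gray: HxW uint8 grayscale crop of the traffic light housing.
    Splits the crop into thirds and picks the brightest band - the
    heuristic that falls apart on two-lamp or horizontal heads like
    the ones pictured above.
    """
    h = roi_gray.shape[0] // 3
    bands = (roi_gray[:h], roi_gray[h:2 * h], roi_gray[2 * h:])
    brightest = int(np.argmax([band.mean() for band in bands]))
    return ("red", "yellow", "green")[brightest]

# Toy example: a dark 30x10 housing with a bright blob in the top third
roi = np.zeros((30, 10), dtype=np.uint8)
roi[3:8, 3:7] = 255
print(lit_lamp(roi))   # -> "red"
```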
 

Interesting. We'd been told it was monochrome before, but this is the first footage I've seen.

We were also told it was sensitive to near IR, which is clearly not the case (foliage reflects near IR very strongly, so in a daylight scene like this the leaves would all look white if it were IR sensitive).

I was surprised at how low the frame rate appears to be, but that could be from the Twitter/web side rather than the original source.
 
He said in his tweet response that it was converted to low-res B&W to get it through the CAN bus. So I assume it is actually a higher-res color video that is saved to the MCU. Whatever it is, it's very interesting.
 

I didn't think any images were passed over CANBus - just steering commands and object data.

Lots of folks have said it was a monochrome camera, which has a number of inherent advantages: mostly far less image processing, higher effective resolution from the same sensor, and a much lower data rate.

Then again, lots of folks said it was IR sensitive, too.
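A quick back-of-the-envelope on that data-rate point, with made-up numbers (the actual Autopilot camera's resolution and frame rate aren't public):

```python
# Back-of-the-envelope data rates for the same sensor read out as
# monochrome vs. debayered RGB. Resolution and frame rate are
# illustrative guesses, not the real Autopilot camera specs.
width, height, fps = 1280, 960, 30

mono_mbps  = width * height * fps * 8  / 1e6   # 8 bits per pixel
color_mbps = width * height * fps * 24 / 1e6   # 8 bits each for R, G, B

print(f"mono : {mono_mbps:6.1f} Mbit/s")    # ~294.9 Mbit/s
print(f"color: {color_mbps:6.1f} Mbit/s")   # ~884.7 Mbit/s, 3x the data
# A Bayer sensor puts a color filter over every photosite, so each pixel
# sees only one color and full-color values must be interpolated
# (demosaiced); a mono sensor skips that and keeps its full native
# resolution plus more light per pixel.
```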
 
I have no idea, I was just summarizing what I thought Jason said about it in that link above.
 
The event log is stored in the MCU, so the EyeQ3 (in the camera housing, IIRC) has shrunk the images to low-res B&W before transmitting them over the CAN bus to the MCU, along with any other relevant data.
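If that's right, a toy calculation shows why the frames have to be shrunk first. Assuming classic 8-byte CAN payloads and a 500 kbit/s bus (illustrative guesses; nothing here is Tesla's actual frame size or format):

```python
import numpy as np

# Toy illustration of why a frame must be shrunk before it fits on CAN:
# classic CAN payloads are 8 bytes, and typical automotive buses run at
# 500 kbit/s. Numbers are illustrative only.
frame = np.zeros((64, 80), dtype=np.uint8)    # low-res 8-bit B&W image

payload = frame.tobytes()                     # 64 * 80 = 5120 bytes
chunks = [payload[i:i + 8] for i in range(0, len(payload), 8)]

print(f"{len(payload)} bytes -> {len(chunks)} CAN frames")   # 640 frames
# At ~110 bits per standard CAN frame and 500 kbit/s, that's roughly
# 0.14 s of bus time per image - tolerable for a crash-log snapshot,
# hopeless for live video.
```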
 
My Model S, in full Autopilot, initiated a deceleration towards a red light with no vehicles in front of it nor in the adjacent left lane.

I cancelled in the last moments before seeing what would happen, but it really was satisfying, even if just a sensor glitch, to see this promised capability "in action" for a moment. For me Autopilot has been a major safety feature and stress reliever, and I look forward to using it more often in more of my driving.

Has anyone else experienced this? I am on 7.1, I think 2.13.x, still with no recent updates.

It was most likely a coincidence, as the weather has been unusually cold, causing some sensor issues at times.

But I am really hopeful this is coming soon. This and stopping and resuming at stop signs, at *least* when there are no other cars around, would be yet another one of those special moments we get to share with our cars.

Interesting Autopilot-related threads:
This *might* be the data that Mobileye passes to the Tesla computers? Without the video, though, from what we've always been told:
This is what Autopilot sees. • /r/teslamotors

Recent "secret" Elon visit to Mobileye to see the next gen:
Elon Musk reportedly visited Mobileye to test tech for next gen Tesla Autopilot

Every time I read about a newly discovered AP function, it makes me think that Tesla should really communicate better about what AP will and will not do. Guessing at why AP acts a certain way decreases safety, and Tesla should do all they can to avoid that.
 
I see, so Jason didn't do it, that's just the way it works.

What he has done is access the accident event log and extract / decode the log data - which includes these screenshots. Am guessing that he made the GIF himself (hence the framerate).

Wouldn't be surprised if this is a standard feature of the MobilEye solution that other OEMs like Mercedes also take advantage of.
 
He accessed the crash log of the car he's tinkering with.

From what I know this is the first report of any video pass-through capability from MobilEye to the MCU. And the first time I've ever heard that Tesla might have video or stills from accidents. Impressive, wk, wherever you are :)

I actually think the best way to approach today's Autopilot is incredibly simple:

YOU ARE DRIVING! Let it try for as long as you are comfortable, especially during long, boring drives, but never, ever assume it is doing anything other than assisting you.
 

The point is that when you learn as you go, you have to allow a larger "gray zone" to detect false positives - cases where you think AP is wrong but it turns out it's just driving or reacting a bit differently than you would.