Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

Crazy/Jittery Cars on Blindspot Monitor?

640k

Member
Jul 15, 2019
928
623
Cincinnati
is it normal for the cars to kind of "freak out" and spin around/flutter all over the place? are my sensors ok?

it also thinks there are four motorcycles stacked on top of each other in my garage. it does some funky things.
 

RTPEV

Member
Mar 21, 2016
811
791
Durham, NC
Yeah, that's pretty normal, at least when you are stopped, although I've noticed that with more recent updates it's a bit more stable than it used to be.

In the garage it thinks my wife's Bolt is a semi-truck!
 

Octo

Member
Jun 28, 2019
386
402
Dallas
It’s the car’s current object detection capability. Should be fixed before it becomes a robotaxi as it won’t be able to drive by itself if it thinks the real world is full of frantically spinning and teleporting vehicles.
 

jebinc

M3 LR AWD w/FSD and white premium interior
Jun 19, 2019
3,407
1,684
Seattle area
It's "normal," in that we all seem to suffer from that affliction... It's "not normal" in that it really shouldn't be happening. Tesla has yet to release an update to address this, but we continue to get regular updates with new games and other parlor tricks!
 

Wingsy

Member
Apr 30, 2019
124
158
Mocksville, NC
My M3 did not do this until it spent a couple of days at the repair facility a few weeks ago. The cars on the display jittered a little before, but I never saw them spin and dance around like they do now. To quote a passenger new to Tesla: "Jesus." It's not something Tesla should be proud of, and it needs fixing if they want to rely on word-of-mouth for advertising.
 

RTPEV

Member
Mar 21, 2016
811
791
Durham, NC
It’s the car’s current object detection capability. Should be fixed before it becomes a robotaxi as it won’t be able to drive by itself if it thinks the real world is full of frantically spinning and teleporting vehicles.

I think it's more of a visualization problem than an inherent problem with how the car decides to act upon its environment. Basically, the object recognition has a harder time determining the orientation of stationary vehicles, and when everything is stationary (both you and the other cars), the neural net isn't going to decide to steer around a car that looks like it's pointed in your direction but not moving. Even at slow speeds, when the cars are still jittery, the path planning doesn't take any adverse action. Actual closing-speed data showing the adjacent vehicles on a non-intersecting path would seem to carry more weight than the object recognition "seeing" a sideways car for a frame or two.

That said, it would be nice if the visualization software would "smooth" the movement somewhat so the cars looked more stable. At times, it seems like this has been done: when I first got my car, adjacent cars were always shaking; in later software updates they were rock solid, only for the jitter to come back in updates after that.
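For what it's worth, the kind of display smoothing being suggested here is a pretty standard trick. A minimal sketch (all names and the `alpha` value are my own, not anything from Tesla's software) would be an exponential moving average over each rendered car's position and heading, with the heading kept as a unit vector so a single glitched frame can't spin the car 180 degrees:

```python
import math

# Hypothetical sketch of display-side smoothing: an exponential moving
# average over a detected car's position and heading, so a single-frame
# glitch from the object detector doesn't make the rendered car jump
# or spin. Class name and alpha value are illustrative, not Tesla's.
class SmoothedPose:
    def __init__(self, alpha=0.2):
        self.alpha = alpha          # 0 < alpha <= 1; lower = smoother, laggier
        self.x = self.y = None
        self.hx = self.hy = None    # heading as a unit vector (avoids 359°->0° jumps)

    def update(self, x, y, heading_rad):
        cx, cy = math.cos(heading_rad), math.sin(heading_rad)
        if self.x is None:
            # First observation: adopt it directly.
            self.x, self.y, self.hx, self.hy = x, y, cx, cy
        else:
            a = self.alpha
            self.x = a * x + (1 - a) * self.x
            self.y = a * y + (1 - a) * self.y
            self.hx = a * cx + (1 - a) * self.hx
            self.hy = a * cy + (1 - a) * self.hy
        return self.x, self.y, math.atan2(self.hy, self.hx)
```

With `alpha=0.2`, one frame where the detector reports the car flipped backwards barely moves the rendered heading, which is exactly the "stable at a stoplight" behavior people are describing.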
 

Octo

Member
Jun 28, 2019
386
402
Dallas
I think it's more of a visualization problem than an inherent problem with how the car decides to act upon its environment. Basically, the object recognition has a harder time determining the orientation of stationary vehicles, and when everything is stationary (both you and the other cars), the neural net isn't going to decide to steer around a car that looks like it's pointed in your direction but not moving.

From what I understand (based on NN demos), the NN assigns probabilities to what it detects and does so several times a second.

I’m less concerned about the random rotation from a safety/practical POV (I do think it’s a massive mistake and lapse of judgment by the responsible people at Tesla to release something that looks so broken for your marquee feature).

I’m more concerned that there seems to be a lack of SW layer that intelligently interprets the NN output.

The visualization suggests that there’s no plausibility filtering and the car simply renders (and likely uses) what the NN spits out. This is especially annoying if the probability of an object hovers around the visualization (or detection, who knows) threshold. Then cars pop in and out of existence or morph from a car to a truck and back.

I’m most concerned that some of the decisions the car makes are based on spurious, badly filtered NN output: for example, phantom braking, collision alerts for objects that don’t exist in reality (and vice versa), and poor continuation of broken lane markers. That last one is why I don’t even use AP anymore; it’s too dangerous/stressful on Dallas highways, which apparently aren’t as cleanly marked as the ones that people on this forum use with NOA.

So I don’t think it’s only a visualization problem. Right now, at least my Model 3, makes decisions based on flawed NN vision.
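The "hovering around the threshold" complaint has a textbook fix: hysteresis. A minimal sketch (class name and the 0.7/0.4 thresholds are my own illustration, not anything known about Tesla's stack) requires a high confidence for an object to appear and a lower one for it to vanish, so a score oscillating near a single cutoff can't make cars pop in and out:

```python
# Hypothetical sketch of "plausibility filtering" via hysteresis:
# an object must exceed a high confidence to appear, and only
# disappears once confidence drops below a lower bound. Scores
# hovering near either threshold then can't toggle visibility
# every frame. Thresholds here are arbitrary examples.
class HysteresisFilter:
    def __init__(self, appear=0.7, vanish=0.4):
        self.appear, self.vanish = appear, vanish
        self.visible = False

    def update(self, confidence):
        if not self.visible and confidence >= self.appear:
            self.visible = True
        elif self.visible and confidence <= self.vanish:
            self.visible = False
        return self.visible
```

A detection bouncing between 0.65 and 0.75 flickers constantly under a single 0.7 threshold, but under hysteresis it appears once and stays put until confidence genuinely collapses.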
 

OCR1

Active Member
Jan 28, 2018
3,748
4,093
Southern California
It’s actually pretty fun to watch. I’ve been meaning to take some videos when it’s doing crazy stuff and add a music track to it. Just haven’t got around to doing it yet. Haven’t figured out the right song either.
 

640k

Member
Jul 15, 2019
928
623
Cincinnati
From what I understand (based on NN demos), the NN assigns probabilities to what it detects and does so several times a second.

I’m less concerned about the random rotation from a safety/practical POV (I do think it’s a massive mistake and lapse of judgment by the responsible people at Tesla to release something that looks so broken for your marquee feature).

I’m more concerned that there seems to be a lack of SW layer that intelligently interprets the NN output.

The visualization suggests that there’s no plausibility filtering and the car simply renders (and likely uses) what the NN spits out. This is especially annoying if the probability of an object hovers around the visualization (or detection, who knows) threshold. Then cars pop in and out of existence or morph from a car to a truck and back.

I’m most concerned that some of the decisions the car makes are based on spurious, badly filtered NN output: for example, phantom braking, collision alerts for objects that don’t exist in reality (and vice versa), and poor continuation of broken lane markers. That last one is why I don’t even use AP anymore; it’s too dangerous/stressful on Dallas highways, which apparently aren’t as cleanly marked as the ones that people on this forum use with NOA.

So I don’t think it’s only a visualization problem. Right now, at least my Model 3, makes decisions based on flawed NN vision.
i'm 100% with you on this. who cares what the car is doing/thinking in the background? the visualization of where the computer calculates the cars to be should employ a layer of basic logic, even if it's "late". on the road they could trade accuracy for timing, but when you're at a traffic light or in a parking lot, it either needs to not be there or use some logical assumptions about the positioning of objects/vehicles.
 

Niroc

Member
Apr 28, 2019
110
68
Portland Oregon
I am relieved to know that this isn’t an issue isolated to my M3. Sitting at a stop light, sometimes a car in the other lane jitters like it’s going to ram into me. It is a bit unsettling.

I no longer trust the images on that display and just ignore it. As for staying aware of what’s around me, I went old school and bought some small, cheap octave mirrors. They really help, and I can monitor my blind spots.

I am hoping that Tesla has a fix for this in the future.
 

Niroc

Member
Apr 28, 2019
110
68
Portland Oregon
I am relieved to know that this isn’t an issue isolated to my M3. Sitting at a stop light, sometimes a car in the other lane jitters like it’s going to ram into me. It is a bit unsettling.

I no longer trust the images on that display and just ignore it. As for staying aware of what’s around me, I went old school and bought some small, cheap octave mirrors. They really help, and I can monitor my blind spots.

I am hoping that Tesla has a fix for this in the future.

Sorry not octave, blind spot mirrors. Need more coffee before posting
 

RDoc

S85D
Aug 24, 2012
2,719
1,567
Boston North Shore
i'm 100% with you on this. who cares what the car is doing/thinking in the background? the visualization of where the computer calculates the cars to be should employ a layer of basic logic, even if it's "late". on the road they could trade accuracy for timing, but when you're at a traffic light or in a parking lot, it either needs to not be there or use some logical assumptions about the positioning of objects/vehicles.
IMHO this is indicative of a major software architectural design problem. Tesla seems to be using the sensor data directly rather than building a model of the world that is updated by the sensors and that the car uses to make driving decisions.

My concern is that without the model, it's going to be subject to errors like this: the car sees a truck in the opposite lane, and then, when the truck illegally turns left across the car's lane, the truck disappears because the camera is fooled by lighting, etc. In a model-based system, the car would know that trucks don't simply vanish; in a sensor-only system, it's out of sight, out of mind.
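The "trucks don't simply vanish" idea maps onto standard object tracking: keep a track alive for a short grace period when detections drop out, rather than deleting it the instant the sensor loses sight. A minimal sketch (all names and the `max_missed` value are my own, and real trackers would also predict motion and associate detections by position rather than by ID):

```python
# Hypothetical sketch of the "world model" idea: a track survives a
# few consecutive frames of missed detections instead of vanishing
# the moment a camera is fooled. Names/parameters are illustrative.
class Track:
    def __init__(self, obj_id, x, y, max_missed=5):
        self.obj_id = obj_id
        self.x, self.y = x, y
        self.missed = 0
        self.max_missed = max_missed

class WorldModel:
    def __init__(self):
        self.tracks = {}

    def update(self, detections):
        """detections: dict of obj_id -> (x, y) seen this frame.
        Returns the set of obj_ids the model still believes exist."""
        for obj_id, (x, y) in detections.items():
            if obj_id in self.tracks:
                t = self.tracks[obj_id]
                t.x, t.y, t.missed = x, y, 0   # refreshed by the sensors
            else:
                self.tracks[obj_id] = Track(obj_id, x, y)
        # Age out a track only after several consecutive misses.
        for obj_id in list(self.tracks):
            if obj_id not in detections:
                t = self.tracks[obj_id]
                t.missed += 1
                if t.missed > t.max_missed:
                    del self.tracks[obj_id]
        return set(self.tracks)
```

In this scheme the truck that the camera loses mid-turn persists in the model for a few frames (ideally with a predicted position), which is exactly what a pure per-frame detector can't provide.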
 
