
GPS-Based Systems' Vulnerability, Tesla NoA Hacked

I do think concerns about these types of attacks are a little overblown. I'd be much more worried about someone throwing a brick through my windshield, or about a widespread attack through the OTA update software.

While I don't think this attack vector is likely to be a major problem, I have to disagree with your examples. GPS spoofers are commercially available and require no access to the car. They seem a far more likely weapon of choice for a casual vandal or troublemaker than throwing bricks or hacking OTA updates. I mean, anyone can operate a GPS spoofer, it doesn't feel as much like a crime, and it doesn't feel as easy to get caught with as throwing a brick. And hacking OTA updates requires far more expertise. No?
 
It's much easier to obtain a brick anonymously than it is to obtain a GPS spoofer. I don't know. I feel like someone using a GPS spoofer is just as likely to get caught. There are a million ways to cause random carnage that don't require any outlay of money.
A hack of the OTA update system could potentially kill tens of thousands of people. That's pretty scary to think about. Damn, now I'm thinking about it and it's terrifying what could be possible. Imagine every Tesla in the country starting up and driving full speed into pedestrians.
Obviously this would require a very skilled group of hackers with knowledge of many of the systems in the car.
EDIT: This should definitely be a Black Mirror episode!
 
This sort of attack kind of validates Elon's assertion that vision should be the primary sensor input for FSD. If your system relies on HD maps and GPS, a spoofing attack is really bad. If your vision system (and related NN) is really good at interpreting the environment, this should be much less of an issue. Obviously, Tesla hasn't yet moved to a vision-only (or -mostly) system, given the outcome of the test. It will be interesting to see how they progress over the rest of this year and into next year. Clearly, the NN needs to get a lot smarter. Andrej and his team will be busy.
 

That's not exactly true, though, because HD maps can also rely on visual landmarks, not just GPS, and thus can offer a layer of protection against spoofing that reliance on regular maps does not.
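To make that concrete, here's a minimal sketch in Python of how a visually recognized landmark from an HD map could sanity-check a GPS fix. The map data, names, and thresholds are entirely made up for illustration; this is not anything Tesla actually does.

    import math

    # hypothetical HD-map landmark database: landmark_id -> (x, y) in meters
    HD_MAP_LANDMARKS = {"exit_sign_42": (1520.0, 310.0), "gantry_7": (1610.0, 305.0)}

    def validate_gps_fix(gps_xy, observed, max_error_m=5.0):
        # observed: list of (landmark_id, range_m, bearing_rad) reported by
        # the vision system, with bearing in the global frame for simplicity
        for lid, rng, brg in observed:
            lx, ly = HD_MAP_LANDMARKS[lid]
            # position the car would have to be at for this observation
            implied = (lx - rng * math.cos(brg), ly - rng * math.sin(brg))
            err = math.hypot(gps_xy[0] - implied[0], gps_xy[1] - implied[1])
            if err > max_error_m:
                return False  # GPS fix disagrees with what the cameras see
        return True

If the fix disagrees with where the landmarks say the car is, the spoofed position can simply be rejected.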
 
Is the GPS on Teslas really accurate enough that they're using it to navigate off-ramps?
I always thought that type of accuracy required inertial guidance, which would also provide protection against spoofing.
Also, this is suspicious:
"The driver immediately took manual control but couldn't stop the car from leaving the road."
This seems like a much bigger problem than GPS spoofing!
 
And on a personal note, I have to say Tesla did an amazing job with Autopilot and Navigate on Autopilot, and they are truly paving the way for the entire automotive industry, pushing everyone to get this technology out there.
Our entire team truly admires their tech, and we hope to see them succeed in releasing safe, reliable, autonomous systems into the market.
 

"The driver immediately took manual control but couldn't stop the car from leaving the road."

Agreed that this seems like the bigger problem. Do you guys have video from inside the car of the intervention? I'm confused about why you could not stop the car from leaving the road! It seems like it would be straightforward with extremely fast reactions (which presumably you would have if you were expecting the car to veer off the road).
 
Seems like Tesla could shut this down fairly easily if they put some effort into it.

A couple of points to ponder:

The car already keeps an inertial location solution, based on speed and course, for tunnels and other areas where it loses GPS reception, and it monitors speed, acceleration, and a thousand more parameters continuously.

So if the car were looking for it, it should have seen an impossible jump in GPS location that the inertial solution didn't support, and known it was being spoofed. Then it could yell for help, fall back to inertial nav only, use one of the other two satellite networks (if only one is being spoofed), or take other action.
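Something like this, purely as an illustration of the check (all names and thresholds invented, not actual firmware):

    import math

    def gps_jump_plausible(prev_fix, new_fix, dr_delta, dt_s, noise_m=3.0):
        # prev_fix, new_fix: (x, y) GPS fixes in meters; dr_delta: (dx, dy)
        # displacement integrated from wheel speed / IMU over the same interval
        residual = math.hypot(new_fix[0] - prev_fix[0] - dr_delta[0],
                              new_fix[1] - prev_fix[1] - dr_delta[1])
        # allow for GPS noise plus dead-reckoning drift that grows with time
        tolerance = noise_m + 0.02 * dt_s  # 2 cm/s drift allowance, assumed
        return residual <= tolerance

    # if this returns False: alert the driver, fall back to inertial nav,
    # or try the other satellite constellations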

The car is also always connected to a cell network, which means it always knows the exact signal strength to all of the cellular towers in range. If necessary, this can provide a location from the tower IDs/locations and signal strengths, like cell phones sometimes use, though it isn't anything like GPS accuracy, of course. That solution could be used to validate the GPS fix, or the three satellite systems could be compared against each other (though they might all be spoofed together, and I'm also not positive Tesla has hardware for all three).
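Again just a sketch of the idea, with made-up numbers: weight the towers by signal strength to get a crude position, and reject GPS fixes that land implausibly far from it.

    import math

    def tower_centroid(towers):
        # towers: list of ((x, y) in meters, rssi_dbm); convert dBm to linear
        # power so stronger (usually closer) towers dominate the average
        weighted = [(xy, 10 ** (rssi / 10.0)) for xy, rssi in towers]
        total = sum(w for _, w in weighted)
        cx = sum(xy[0] * w for xy, w in weighted) / total
        cy = sum(xy[1] * w for xy, w in weighted) / total
        return cx, cy

    def gps_within_cell_sanity(gps_xy, towers, max_m=2000.0):
        cx, cy = tower_centroid(towers)
        # cell positioning is coarse, so only reject wildly implausible fixes
        return math.hypot(gps_xy[0] - cx, gps_xy[1] - cy) <= max_m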

I am really surprised that NoA jumped over lane lines to exit when it thought it was at its exit. AP normally refuses to cross solid lines under any circumstances, and I know I've seen the car start planning a lane change coming up to an interchange and then give up when the line goes solid (there's a construction zone with solid lane lines on my commute, a quarter mile short of an interchange).

It looks to me like whatever logic caused it to bypass the AP rules about following the lines is the only reason this test had any dramatic results.
 

Of course, there is the biggest thing it could do: use vision to understand the road, the signs, and the rules of the road, so that no amount of spoofing would work (at least unless it was done visually).

But Tesla is probably a really long way from that, which is why NoA is based on rudimentary lane-keeping vision plus GPS/maps.
 
Looks like this spoofing is more sophisticated than that. In this example there is no jump in the position.

 
Impressive and disturbing that they were able to spoof it that precisely. You'd still see a disconnect between the inertial solution and the GPS one, and a deviation to the side like that should be especially obvious, since the car knows very well whether the wheels are turned.

But you're right that it'll require much closer review than I was anticipating, and might require some work to enhance the inertial solution to make it accurate and reliable enough to catch the spoof as it happens.
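For instance, something along these lines (a toy check with invented window sizes and thresholds, not anything from the actual firmware): if the GPS track says the car is turning while the steering angle says the wheels are straight, flag it.

    def track_vs_steering_consistent(gps_headings_deg, steering_deg,
                                     max_mismatch_deg=2.0):
        # gps_headings_deg: consecutive heading samples derived from GPS over
        # the last second or so; steering_deg: mean road-wheel angle over the
        # same window (ignoring heading wrap-around for simplicity)
        gps_turn = gps_headings_deg[-1] - gps_headings_deg[0]
        if abs(steering_deg) < 0.5 and abs(gps_turn) > max_mismatch_deg:
            return False  # GPS says we're turning, the wheels say we aren't
        return True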
 
That's a cool trick, but if you're really intent on causing someone harm, it seems like it would be more effective and cheaper to just attach a bomb to the car.
Anyway, the part of the story where the car makes the wrong turn is explained by this:
[attached image]

The part about not being able to avoid going off the road is not believable IMHO.
Tesla Model 3 Spoofed off the highway - Regulus Researches Hack Navigation System Causing Car to Steer off Road | Regulus Cyber
 
I understand these guys are trying to sell their protection solutions, but I do wish their marketing team wouldn't make up ridiculous things. It's completely absurd to say that they couldn't prevent the car from steering off the road - though if that were actually the case, it would be cause for great concern (and there are persistent questions about how strong the car's steering can be, which no one knows the answer to)!
 
Regulus Cyber - another bunch of wankers with nothing better to do

I think the activity of such groups is pretty important. Better to reveal security issues before someone misuses them.

Some sanity checks could be implemented to catch such attacks based on the car's acceleration, wheel-speed, and steering sensors, but all these devices have tolerances that accumulate over distance when you use them to estimate the car's trajectory. I think a small divergence in the driving path could go unnoticed (assuming the cameras are blind) - the toy simulation below illustrates why.
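A toy dead-reckoning simulation makes the point. Even with just a 1% wheel-speed error and tiny heading noise (both numbers invented for illustration), the estimated position drifts by meters over a kilometer:

    import math, random

    def drift_after(distance_m, step_m=1.0, speed_err=0.01,
                    heading_sigma_deg=0.05):
        # integrate noisy speed and heading along a truly straight path
        x = y = heading = 0.0
        for _ in range(int(distance_m / step_m)):
            heading += math.radians(random.gauss(0.0, heading_sigma_deg))
            d = step_m * (1.0 + random.gauss(0.0, speed_err))
            x += d * math.cos(heading)
            y += d * math.sin(heading)
        # error vs. the straight-line ground truth at (distance_m, 0)
        return math.hypot(x - distance_m, y)

    print(round(drift_after(1000.0), 1), "m of drift after 1 km (one random run)")

A spoofer that nudges the position more slowly than that drift grows would stay inside the plausible envelope.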
 
I stand by my scoff!