
GPS-Based Systems' Vulnerability, Tesla NoA Hacked

Agreed that this seems like the bigger problem. Do you guys have video from inside the car of the intervention? I'm confused about why you could not stop the car from leaving the road! It seems like it would be straightforward with extremely fast reactions (which presumably you would have if you were expecting the car to veer off the road).

The car's deviation started the instant the spoofing signal was picked up by the Tesla's GNSS receiver; the right turn, slowdown, and wheel movement took only 1-2 seconds because the Model 3 assumed it was just about to miss its exit. Combined with the fact that there was a dotted white line to the right of the car at that exact moment (the turn-off to the pit stop), this meant the car was instantly turning, and it was too late to grab the wheel and steer left back onto the highway. It all happened very fast, and by the time the car made the small right turn it had already crossed the dotted line, so it was no longer possible to return to the highway lane at that point. I hope that makes sense; the wording in the media has some dramatization, and that creates confusion. (This blog post makes it a lot clearer.)

And to be honest, we did not expect it to happen so fast; the spoofing attempts we did on a Model S, for example, weren't instant, and there was some delay between signal transmission and the navigation effect. So even an alert Tesla driver would be taken by surprise.
As you would probably agree, many veteran Tesla drivers who use AP often tend to be less alert, since they trust the system to perform properly. This is true of all Level 2+ autonomy drivers. You can't expect people to be 100% alert at all times; at some point, attention to the road gives way to fatigue, mobile phone usage, or just drifting in thought... we can't count on the human factor in the long run if we want the AV revolution to succeed.

And remember, the main issue here is not the current Tesla AP systems, which require driver attention at all times. The main concern is that this shows ALL automotive companies how dangerous it is to use GNSS for navigation decisions. It might not jeopardize driver safety directly, but it means an attacker could externally and remotely force a car to take a turn it is not supposed to take (and accelerate/decelerate as well). And that is the good scenario: some cars we tested use GNSS for speed-limit information, and spoofing can have them doing 100 MPH on a 30 MPH road, which means that by the time they reach that unexpected little town intersection they won't be able to brake in time.

This is something both regulators and the automotive industry should prepare for and plan to protect against, either by finding mitigation methods that rely on other sensors or by using anti-spoofing technologies on the GNSS receiver.



P.S. Cameras, radar, and LiDAR can also be spoofed, some quite easily, so sensor cybersecurity in general is a crucial aspect of AV safety.
 
And remember, the main issue here is not the current Tesla AP systems, which require driver attention at all times. The main concern is that this shows ALL automotive companies how dangerous it is to use GNSS for navigation decisions.
Any autonomous vehicle that does not require driver attention can’t rely on GNSS systems anyway. To achieve safety greater than a human driver's, it would have to be able to recognize that the pullout is not in fact an exit.
Also, this attack involves attaching a transmitter to the car. If you have physical access to the car, there are a million other attacks one could do.
 
Any autonomous vehicle that does not require driver attention can’t rely on GNSS systems anyway. To achieve safety greater than a human driver's, it would have to be able to recognize that the pullout is not in fact an exit.
Also, this attack involves attaching a transmitter to the car. If you have physical access to the car, there are a million other attacks one could do.

You are absolutely right, but since all sensors can be compromised, why rule out GNSS specifically? Instead, each sensor requires its own protection; GNSS is just one piece of the sensor-fusion puzzle. Furthermore, at the moment there are no real navigation/pathing alternatives to GNSS (in my first comment I explained why localization is problematic).

During the Tesla Model 3 experiment, the spoofing antenna was mounted on the roof because we used a very small, low-gain antenna so that we wouldn't affect any nearby cars, phones, or infrastructure. The effective radius of that antenna was less than a meter, forcing us to attach it to the roof for safety reasons. A real attacker wouldn't care about the collateral effects.

A spoofer can easily use an off-the-shelf high-gain directional antenna to get a range of up to a mile. With an added amplifier, a range of a few miles is very much possible. It has already been shown that spoofing can occur across dozens of miles, as in the Black Sea spoofing incident in June 2017.
As you can see in the image below, for example, we used an off-the-shelf directional antenna that can spoof from up to a mile away without attaching anything physical to the car:
[Photo: off-the-shelf directional spoofing antenna setup]
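For a rough sense of why a directional antenna (and, optionally, an amplifier) buys that kind of range, here is a back-of-envelope link-budget sketch. The transmit power, antenna gain, and nominal GPS signal level below are assumed round numbers for illustration, not the parameters of our actual setup:

```python
# Back-of-envelope link budget: can a small spoofer overpower authentic GPS at ~1 mile?
# All figures are illustrative assumptions (100 mW transmitter, 10 dBi directional antenna).
import math

FREQ_HZ = 1575.42e6          # GPS L1 carrier frequency
C_MPS = 299_792_458.0        # speed of light
AUTHENTIC_GPS_DBM = -128.5   # roughly the specified L1 C/A received power at the surface

def fspl_db(distance_m: float, freq_hz: float = FREQ_HZ) -> float:
    """Free-space path loss (Friis) in dB."""
    return (20 * math.log10(distance_m)
            + 20 * math.log10(freq_hz)
            + 20 * math.log10(4 * math.pi / C_MPS))

tx_power_dbm = 10 * math.log10(100)   # 100 mW transmitter -> 20 dBm
antenna_gain_dbi = 10.0               # assumed off-the-shelf directional antenna
distance_m = 1609.0                   # ~1 mile

rx_dbm = tx_power_dbm + antenna_gain_dbi - fspl_db(distance_m)
print(f"Spoofer signal at 1 mile: {rx_dbm:.1f} dBm "
      f"(~{rx_dbm - AUTHENTIC_GPS_DBM:.0f} dB stronger than authentic GPS)")
```

Even with these modest assumed numbers, the spoofed signal arrives tens of dB above the authentic satellite signal, which is why range is not the hard part of the attack.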
 
The car's deviation started the instant the spoofing signal was picked up by the Tesla's GNSS receiver; the right turn, slowdown, and wheel movement took only 1-2 seconds because the Model 3 assumed it was just about to miss its exit. Combined with the fact that there was a dotted white line to the right of the car at that exact moment (the turn-off to the pit stop), this meant the car was instantly turning, and it was too late to grab the wheel and steer left back onto the highway.
So was it "instantly" turning, or did it trigger the turn signal and then, over the course of 1-2 seconds, slow down and turn the wheel? Were the driver's hands on the wheel like they are supposed to be? If they were, I don't see how it was impossible to take control. As soon as the car turns the wheel in a way you don't want it to, you stop letting it; this should take less than a second. Was this on a public road or a closed track?
 
The car's deviation started the instant the spoofing signal was picked up by the Tesla's GNSS receiver; the right turn, slowdown, and wheel movement took only 1-2 seconds because the Model 3 assumed it was just about to miss its exit.

I hope that makes sense; the wording in the media has some dramatization, and that creates confusion. (This blog post makes it a lot clearer.)

I guess. The blog post says: "Although the car was three miles away from the planned exit when the spoofing attack began, the car reacted as if the exit was just 500 feet away—abruptly slowing down, activating the right turn signal, and making a sharp turn off the main road. The driver was not prepared for this turn and by the time he regained manual control, it was too late to attempt to maneuver back to the highway."

This seems overly dramatic as well... I think if I had been your marketing department, I would have written: "The driver did not have his hands on the wheel when the attack was initiated, and chose not to intervene for 1-2 seconds, to see what would happen, so by the time he reacted to the car's steering input it was too late to safely return to the highway". That seems like a nice accurate description based on what you said. It's better than the blog post description, and better than "The driver immediately took manual control but couldn't stop the car from leaving the road" (which is, approximately, completely false, per your description above).

The situation is totally different with a hand on the wheel and an appropriate response to unexpected car steering inputs. I think most Tesla drivers are used to undesired steering inputs now with LDA/ELDA. ;) Maybe Tesla is training us to react to GNSS spoofing?
 
As far as I am concerned, this is all marketing until we see a video from inside the car during the attack, showing the car’s reactions and the driver’s reactions.

I’m not saying satellite navigation spoofing or jamming is not a big concern; I just think everyone has something they want to sell. Clearly this is a proof of concept, and a slight concern for the extremely primitive systems that Tesla and others have currently implemented. But for the systems we have today, which are not autonomous, it’s a pretty minor issue. The cars themselves can do much more dangerous things with their current software, without even requiring spoofing!
 
You are absolutely right, but since all sensors can be compromised, why rule out GNSS specifically? Instead, each sensor requires its own protection; GNSS is just one piece of the sensor-fusion puzzle. Furthermore, at the moment there are no real navigation/pathing alternatives to GNSS (in my first comment I explained why localization is problematic).
If the Model 3 were a military vehicle I would agree. Real autonomous vehicles do not rely on GNSS to avoid going off the road. Even if such attacks did become common in the future they could be countered by very primitive inertial guidance using the speed and steering wheel angle sensors. It would require a very high level of technical skill to track the movement of a car and spoof a signal that was close enough to the inertial measurements to not be detected.
I could order an amplifier and RF signal generator online and knock out GPS for a city. It would require much less skill than spoofing and yet no one seems to do it.
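To make that concrete, here is a rough sketch of the kind of check I mean (hypothetical names, toy thresholds, and an assumed ~2.7 m wheelbase): dead-reckon from the wheel speed and steering angle, and refuse to act on any GNSS fix that lands far from where the wheels say the car should be.

```python
# Sketch: gate GNSS fixes against wheel-speed/steering dead reckoning.
# Toy kinematic bicycle model; names, thresholds, and wheelbase are illustrative.
import math
from dataclasses import dataclass

WHEELBASE_M = 2.7          # assumed, roughly Model 3-sized
MAX_DISAGREEMENT_M = 15.0  # arbitrary gate: larger jumps are treated as implausible

@dataclass
class State:
    x: float        # east position in a local frame, m
    y: float        # north position, m
    heading: float  # radians

def dead_reckon(state: State, speed_mps: float, steer_rad: float, dt: float) -> State:
    """Propagate position from wheel speed and steering angle (bicycle model)."""
    yaw_rate = speed_mps * math.tan(steer_rad) / WHEELBASE_M
    heading = state.heading + yaw_rate * dt
    return State(
        x=state.x + speed_mps * math.cos(heading) * dt,
        y=state.y + speed_mps * math.sin(heading) * dt,
        heading=heading,
    )

def accept_gnss_fix(predicted: State, gnss_x: float, gnss_y: float) -> bool:
    """Accept the fix only if it is close to where the wheels say we should be."""
    return math.hypot(gnss_x - predicted.x, gnss_y - predicted.y) <= MAX_DISAGREEMENT_M

# Example: cruising straight at 30 m/s; a spoofed fix claims the car teleported
# ~800 m down the road (as if the exit were suddenly 500 feet away).
state = State(0.0, 0.0, 0.0)
for _ in range(10):                       # 1 second of dead reckoning at 10 Hz
    state = dead_reckon(state, speed_mps=30.0, steer_rad=0.0, dt=0.1)

print(accept_gnss_fix(state, gnss_x=32.0, gnss_y=1.0))   # True  - consistent fix
print(accept_gnss_fix(state, gnss_x=830.0, gnss_y=0.0))  # False - spoofed jump
```

Normal GNSS noise passes a gate like this easily, while a fix that jumps the car hundreds of meters fails it immediately; the attacker would have to walk the spoofed position away gradually, which is much harder to do without being noticed.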
 
Favo - The wheel turning and the blinker happened at the same time. Within 1-2 seconds, the car was already in the pit stop. The driver's hands were resting on his lap above the wheel at that instant, in order to see clearly, without any unintentional driver involvement, how the car behaves during spoofing. By the time the driver reacted, grabbed the wheel, and regained manual control, it was too late to maneuver back to the highway safely.

It was done on a remote two-lane public road that eventually leads to an exit, so the NoA feature could be engaged.

Alan - good advice regarding the wording, thank you.
 
Within 1-2 seconds, the car was already in the pit stop. The driver's hands were resting on his lap above the wheel at that instant, in order to see clearly, without any unintentional driver involvement, how the car behaves during spoofing. By the time the driver reacted, grabbed the wheel, and regained manual control, it was too late to maneuver back to the highway safely.

We'd definitely all love to see the video from inside the car of this sequence, which presumably you guys have.
 
Everyone has something to sell. Tesla included.

So just because someone has something to sell doesn’t automatically mean it is nonsense. But of course it can be nonsense (see: FSD features in 3-6 months, buy a Tesla now). We need to calmly and fairly assess the whole picture.
 
Following the extensive publicity and the massive number of questions that followed, we decided to do a live webinar on the Tesla Model 3 experiment and share some insights with you all, including some unseen footage. We might have industry experts/Tesla reps too. You are all invited to register here -
Welcome! You are invited to join a webinar: The Forgotten Side of Automotive Cybersecurity - Protecting Navigation & Sensors. After registering, you will receive a confirmation email about joining the webinar.
 
The driver's hands were resting on his lap above the wheel at that instant, in order to see clearly, without any unintentional driver involvement, how the car behaves during spoofing. By the time the driver reacted, grabbed the wheel, and regained manual control, it was too late to maneuver back to the highway safely.

Note that the driver violated the "Always Keep Your Hands on the Wheel" warning that Tesla Autopilot displays prominently:

[Image: Tesla Autopilot "keep your hands on the wheel" warning]


"Driver's hands resting on his lap" is a direct violation of that rule and warning.

Nevertheless I agree that Tesla should take GPS spoofing seriously - they should probably OTA update their driving logic to prioritize the inertial integration of the accelerometers over the GPS signal to prevent "abrupt" changes to the driving solution.
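A minimal version of that kind of gating might look like the sketch below (the margin is a made-up assumption, not Tesla's actual logic): bound how far the position solution may move between fixes given the speed the inertial/odometry side reports, and hold the last good fix when a new one violates that bound.

```python
# Sketch: ignore GNSS fixes that imply a physically implausible jump.
# The margin is an illustrative assumption, not Tesla's actual logic.

def plausible_fix(prev_fix_m, new_fix_m, speed_mps, dt_s, margin_mps=10.0):
    """Reject a fix if the car would have had to move faster than it plausibly could."""
    dx = new_fix_m[0] - prev_fix_m[0]
    dy = new_fix_m[1] - prev_fix_m[1]
    jump_m = (dx * dx + dy * dy) ** 0.5
    return jump_m <= (speed_mps + margin_mps) * dt_s

# At 30 m/s with 1 s between fixes, anything beyond ~40 m of movement is suspect:
print(plausible_fix((0, 0), (31, 2), speed_mps=30.0, dt_s=1.0))   # True
print(plausible_fix((0, 0), (800, 0), speed_mps=30.0, dt_s=1.0))  # False
```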

But as others have said, at this moment this attack is probably a low-priority concern for anyone but high-value targets, who probably use professional drivers anyway and who probably won't be travelling in an unarmored Tesla within gunsight distance of hostiles. ;)
 
GPS spoofing attack scenarios have been around for as long as the system has been deployed. While an issue, it is certainly not on my list of things to ever worry about as the chances of someone targeting me with this attack is only slightly higher than having an asteroid land on my house. There are many easy ways to mitigate the effects of a GPS attack especially since GPS receivers now ship with GLONASS and Galileo receivers also. A manufacturer could compare the coordinates from all three and throw out erroneous data. There are also the visual cues mentioned above that car manufacturers could you to verify GPS results.