Security researcher disabled self-driving car with a laser and a Raspberry Pi

Discussion in 'Cars and Transportation' started by Urgo, Sep 7, 2015.

  1. Urgo

    Urgo Member

    Aug 25, 2015
    Raleigh, NC
    Saw this in the news today. Autopilot in the Tesla shouldn't be affected by this specifically, since it uses cameras instead of these lidar sensors, but it's still worth noting when thinking of all the things Tesla needs to consider before releasing Autopilot to the world.

    Self-driving cars can be fooled by fake cars, pedestrians and other bogus signals

  2. pmadflyer

    pmadflyer Member

    Jan 8, 2015
    Shawnee, KS
    I believe in most places in the US it is illegal to shine visible lasers in the direction of a person's face, as it can cause temporary blindness and lead to an accident. Shining a visible laser at an aircraft is a felony, and there was recently a push to enforce it after an aircraft (possibly in NY) was forced to abort a landing because laser light from a ground-based laser pointer entered the cockpit. Although I expect a filter to be put on future lidar systems, it's interesting that the sensors are sensitive to visible light.
  3. FlasherZ

    FlasherZ Sig Model S + Sig Model X + Model 3 Resv

    Jun 21, 2012
    Or permanent.

    There's an IT networking saying for those people who deal with long-haul fiber optic cables... "Do not look into laser with remaining good eye."
  4. RichardL

    RichardL Member

    Oct 6, 2013
    San Carlos, California
    Hard to think of this as any more of a 'vulnerability' than suddenly holding up a large cardboard cutout of a person in front of a 'regular' car and calling that a vulnerability!
  5. RDoc

    RDoc S85D

    Aug 24, 2012
    Boston North Shore
    IMHO this is a real vulnerability that should get fixed before release. In answer to the above comments:

    a) I didn't see anything that said visible lasers were used; most likely he used the same frequency as the lidar.
    b) The restrictions on lasers have to do with power output; low-powered lasers such as the pointers used in classrooms, and likely the ones used in this case, aren't restricted.
    c) This is nothing like holding up a cutout. Lasers are very small and easily concealed, so a hidden hacker, or one in another moving vehicle, could spoof the lidar.

    I suppose the fix would be putting a code into the lidar pulse and only accepting pulses with the same code.
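    That pulse-coding idea could be sketched roughly like this (a hypothetical toy model, not how any real lidar unit works; the class and field names are made up for illustration). The transmitter tags each outgoing pulse with a random code, and the receiver only accepts echoes carrying a code it recently emitted, so an attacker firing arbitrary pulses at the sensor would have to guess the code:

    ```python
    import secrets

    class CodedLidar:
        """Toy model of a lidar that tags each pulse with a one-time code."""

        def __init__(self, window=8):
            # Remember codes for the last few in-flight pulses.
            self.window = window
            self.outstanding = []

        def fire_pulse(self):
            # Tag the outgoing pulse with a random 32-bit code.
            code = secrets.randbits(32)
            self.outstanding.append(code)
            if len(self.outstanding) > self.window:
                self.outstanding.pop(0)
            return code

        def accept_echo(self, code):
            # A genuine echo carries a code we transmitted; a spoofed
            # or replayed pulse does not. Each code is single-use.
            if code in self.outstanding:
                self.outstanding.remove(code)
                return True
            return False

    lidar = CodedLidar()
    real = lidar.fire_pulse()
    print(lidar.accept_echo(real))  # genuine return is accepted
    print(lidar.accept_echo(real))  # a replay of the same code is rejected
    ```

    A real implementation would presumably encode the tag in the pulse timing or modulation rather than as digital data, but the filtering principle is the same.
    
    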

    Then there's the question of existing radar-equipped cars such as Teslas. While accidental interference doesn't seem to be a major issue, I wonder if a malevolent person could cause a car to suddenly brake on the highway by spoofing the radar?
  6. Johan

    Johan Took a TSLA bear test. Came back negative.

    Feb 9, 2012
    Drammen, Norway
    The principal discussion here is: do self-driving cars have to be perfect, non-manipulable, and foolproof, or do they only need to be (far) better and safer than human drivers? Shining a laser or a bright light in the eyes of a human driver has the potential to cause accidents. This is basically the same, regardless of the frequency of the light. However, it's clearly malevolent behaviour. Here, let me come up with some other things that will impede self-driving cars and can cause accidents with them: strong electromagnetic pulses that disable all the on-board electronics, projecting fake imagery on the road using projectors, spray painting over speed signs and other traffic signs, dressing children or animals up in camouflage of some kind, etc. etc.

    Of course these systems will be prone to manipulation, sometimes in the same ways a human could be targeted, sometimes in new ways. So what?
  7. davewill

    davewill Member

    Feb 5, 2014
    San Diego, CA, US
    I would be more concerned if it could keep the car from seeing an obstruction. If something is blinding the sensors I certainly want the car to stop, not keep going.
  8. David99

    David99 Active Member

    Jan 31, 2014
    Brea, Orange County
    +1 to Johan. Self driving cars are not perfect. Of course any system can be fooled. Any security camera can be disabled by putting a plastic bag over it. A lock can be broken, a window glass can be smashed, a tire can be deflated with a nail, a driver can be blinded with a strong light or laser, a traffic light can be disabled, ...
    I hate those stupid articles that try to make it sound like this was an issue.
