
Researchers Discover How To Fool Tesla's Autopilot System (cnet.com)

An anonymous reader shares a report via CNET: Researchers from the University of South Carolina, Zhejiang University, and Qihoo 360 have discovered how to fool Tesla's Autopilot sensors, according to a report from Wired. The researchers were able to trick the system into thinking an object didn't exist when it did, and that an object existed when in fact it did not. This raises security concerns, since a deceived Autopilot could drive incorrectly and potentially put passengers and others in danger. CNET reports: "Two pieces of radio equipment were used to convince Tesla's radar sensor that a cart was not placed directly in front of it. One of those pieces, a signal generator from Keysight Technologies, costs about $90,000. The group also tricked the car's short-range parking sensors into malfunctioning using about $40 worth of equipment. Wired points out that this was, thankfully, a rather difficult feat. Most of the technological tomfoolery was done on a stationary car. Some of the required equipment was expensive, and it didn't always work. But it brings up an important point -- even though Autopilot is quite capable, there's still no substitute for an attentive human driver, ready to take control at a moment's notice."
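As a purely illustrative aside (none of this is Tesla's actual software; the class, function, and example values below are invented for the sketch), the usual defense against a single spoofed sensor is fusion: keep the most conservative reading across radar, camera, and ultrasonic sensors, so a jammed radar that suddenly reports "nothing ahead" cannot by itself erase an obstacle the other sensors still see.

from dataclasses import dataclass
from typing import List, Optional

@dataclass
class SensorReading:
    source: str                           # e.g. "radar", "camera", "ultrasonic"
    obstacle_distance_m: Optional[float]  # None means "nothing detected"

def fused_obstacle_distance(readings: List[SensorReading]) -> Optional[float]:
    """Return the closest obstacle distance reported by any sensor.

    Keeping the most conservative reading means one spoofed or jammed sensor
    reporting "clear" cannot hide an object the other sensors still report.
    """
    detected = [r.obstacle_distance_m for r in readings
                if r.obstacle_distance_m is not None]
    if not detected:
        return None  # every sensor agrees the path is clear
    if any(r.obstacle_distance_m is None for r in readings):
        # One sensor says "clear" while another sees something: suspicious,
        # so a real system might warn the driver or degrade gracefully here.
        print("warning: sensor disagreement -- keeping the obstacle")
    return min(detected)

# Example: a spoofed radar reports "clear" while the camera and ultrasonic
# sensors still see a cart roughly 3 m ahead; the fused result keeps it.
readings = [
    SensorReading("radar", None),
    SensorReading("camera", 3.2),
    SensorReading("ultrasonic", 3.0),
]
print(fused_obstacle_distance(readings))  # prints the warning, then 3.0

The real perception stack is of course far more involved; the point of the sketch is only that disagreement between sensors is itself a useful signal.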


  • Illusions (Score:5, Funny)

    by Anonymous Coward on Thursday August 04, 2016 @05:28PM (#52647301)

    Because it's so hard to make humans see or not see things.

    • Re:Illusions (Score:5, Insightful)

      by Guybrush_T ( 980074 ) on Thursday August 04, 2016 @05:47PM (#52647477)

      This actually shows how great autopilot can be, especially if it combines a camera, radar, and other sensors, compared to humans who can be sooo easily defeated.

      But in any case, this is just about attacking a car: it is illegal. There are many other (cheap) ways to cause an accident: blow a tire, use light, fumes, oil, ice, or use a missile. If someone wants to attack a car, there are plenty of choices.

      Maybe the only difference here is that it may be hard to understand afterwards what happened. The secret services may like that.

      • by Anonymous Coward

        I think this study is useful only if it tells us something about how Autopilot might fail in a real world situation. E.g. if the weather service doppler radars are possibly going to confuse Tesla sensors, it might be nice to know about that sooner rather than later. I can't see that the study does that. But maybe there's some useful substance there if one digs a bit.

        "there's still no substitute for an attentive human driver, ready to take control at a moment's notice." That seems to me a very silly noti

        • 1. If I have to give the car my full attention, why don't I just drive the damn thing?

          Drive one, then you'll understand. All my car has is adaptive cruise control. It significantly reduces the stress of driving. Even just being able to rest your foot helps. I'll never buy another car without it.

          • Re:Illusions (Score:4, Insightful)

            by fluffernutter ( 1411889 ) on Thursday August 04, 2016 @07:42PM (#52648197)
            But if you relax that much you're risking getting into an accident. At least, I'd have to relax that much.
          • I don't understand, I speed up and slow down using my fingers while on cruise control.
      • You can cut the brake lines or plant a bomb in any car; that will do the job quite well. You can also use a bigger car to ram the target off the road; that should work as well.

        However, messing with the car using radio waves (be it confusing the radar, hacking via Bluetooth or WiFi, or hacking the car via the internet) does not leave evidence. Or, at least, it does not leave any evidence that could be used to identify you.

        So, standing on a bridge and dropping bowling balls or bricks on cars that are driving under the bridge...

      • by DrXym ( 126579 )

        This actually shows how great autopilot can be, especially if it combines a camera, radar, and other sensors, compared to humans who can be sooo easily defeated.

        It's only great if it actually works in every circumstance, or if it largely works and the driver's attentiveness is enforced so they can act as a backup just in case. Then and only then can it be said to be safer than a driver alone in every scenario.

        But in any case, this is just about attacking a car: it is illegal. There are many other (cheap) ways to cause an accident: blow a tire, use light, fumes, oil, ice, or use a missile. If someone wants to attack a car, there are plenty of choices.

        And now there will be plenty more. It's an emerging technology, and techniques to grief or attack it will emerge too. A Sharpie pen could cripple a car. Strong light or radio interference might blind the car. Coated or reflective glass on buildings, or carried by pas

      • by rhazz ( 2853871 )

        Maybe the only difference here is that it may be hard to understand afterwards what happened.

        I don't know about that. When a serious accident occurs and you see a large group of nerds in lab coats hurriedly packing up strange equipment into briefcases, piling into a van, and speeding away, we can safely assume there was outside interference.

      • Indeed it is illegal. Saying this is a flaw with their system is like saying a Russian or Chinese guy who runs out and leaps on your hood for an insurance scam is a flaw in the driver.

    • Re:Illusions (Score:5, Insightful)

      by halltk1983 ( 855209 ) <halltk1983@yahoo.com> on Thursday August 04, 2016 @06:09PM (#52647663) Homepage Journal
      Yeah, like this car that crashed into a painted tunnel scene: https://i.imgur.com/mOTHgnfl.j... [imgur.com] People make mistakes. Machines make mistakes, because people made the machines. It's good to improve the machines through testing like this, but let's be honest: the time when the machine is the better driver is fast approaching, especially considering the number of distracted drivers there are on the road.
    • My daughter seems to have a high fail rate in (not)seeing. Equipment cost (pre-failure, discounting cost of the actual car): $0. I don't want to discuss equipment costs post-failure.

      If she rode in an auto-piloted car, she would have someone else to blame, and my lawyer would have someone else to sue.

      I'm all for the auto-pilot solution.

    • by torkus ( 1133985 )

      Malicious acts can potentially cause Bad Things. News at 10.

      Or go one further... a $10 laser pointer can temporarily blind pilots flying large planes.

      A driver swerving around can easily cause a crash.

      It's the paranoid 'what if' mentality that's resulted in so many pointless laws and regulations around new technology.

    • Human drivers have blind spots, so no illusion is required.
  • by roca ( 43122 ) on Thursday August 04, 2016 @05:31PM (#52647335) Homepage

    If you spent the same resources to fool a human driver, how hard would that be?

    • by MightyYar ( 622222 ) on Thursday August 04, 2016 @05:38PM (#52647401)

      I don't think you need to get very fancy... I would think a laser or extremely bright light bought from eBay would similarly blind a human "sensor".

      • I don't think you need to get very fancy... I would think a laser or extremely bright light bought from eBay would similarly blind a human "sensor".

        It is in fact very capable of doing so, and for that reason it is illegal to point one at aircraft.

      • Oh, it can be done with far less tech than that. When Anna Nicole Smith worked as an underwear model for H&M back in 1993 http://z.cdn-expressen.se/imag... [cdn-expressen.se] there were reports of drivers in Norway who drove off the road; somehow they were not looking at the road anymore :)
      • by torkus ( 1133985 )

        You've got one built into every single car...high beams are notorious for blinding drivers on dark roads. (not to mention being extremely annoying even when you've got other cars/lights around)

    • Re:How about humans? (Score:5, Interesting)

      by hawguy ( 1600213 ) on Thursday August 04, 2016 @05:38PM (#52647405)

      If you spent the same resources to fool a human driver, how hard would that be?

      Exactly, for far less than $90,000 you can set up a water curtain projection system that would fool any unsuspecting driver. Put one on a highway and show a film of an approaching wrong-way driving semi and let the hilarity commence.

      They are already used as hard-to-miss warning signs on some roads: https://youtu.be/Dk9DjO-_rT8 [youtu.be]

      • The key difference is that any system designed to fool a human is blindingly obvious to any passerby. They will spot the danger, call the cops, and the device will be torn down, and the person behind it arrested if they're hanging around nearby.

        A system designed to fool radar or sonar or lidar can be invisible to people, and go unnoticed and undetected even after it's caused an accident. If it's a portable system (like one mounted in a van), the perp can simply drive off. Meanwhile it takes the cops, NTSB
    • by Tablizer ( 95088 )

      I almost flattened a pedestrian the other night who was wearing clothing that blended into the background. His pants were the same color as the road, and his shirt the same color as the foliage on the other side of the road, and he happened to be lined up so that his shirt boundary matched the road/foliage boundary of the other side of the street. The stealth was probably not intentional, but effective nevertheless.

      Reminds me of a prank where actors wore stripes that blended into the Abbey Road crossing, ma

    • For WAAAY less than $90K I can easily cause a human to not see something that is there... because the $300 high-powered laser would have blinded them.

      This article is dumb. There are plenty of attacks against human drivers that are just as cheap or cheaper and would work better. Want to crash a car at highway speed? Get a slingshot and a piece of spark plug ceramic and take out the poor bastard's windshield on a tricky corner at night. Or shoot a tire out. Or hit them with a 20,000-lumen spotlight.

    • Comment removed based on user account deletion
    • Spend?! Hell, I can jump in front of a human controlled car at a blind intersection and earn money while I sleep*!

      *in traction of course.

  • Of course you can trick any sensor invented by man some way or other. That's nothing new. We even know tons of ways to trick the sensors made by god/nature aka our eyes as well. Shine a bright light into them for $10 or maybe $100 and the driver will be forced to drive blind. Or you can have a $0 natural snow storm and the driver will also be on literally very dangerous ground: zero visibility and icy roads.

    The point is not that either can be fooled; the point is, is the mechanical sensor better, or at least

    • by Anonymous Coward

      1) In a busy street, you could fuck with a lot of autopilots undetected, but you can't do the same thing with bright lights. What is more, a bright light doesn't trick your brain - you react to it accordingly. Since car autopilots are just (relatively) stupid computers that do no more and no less than what their developers programmed them to do, they are MUCH easier to game with more human cunning.

      2) Your snowstorm is irrelevant, as this is something one prepares for, or chooses to avoid.

      3) There has been no ind

  • Researchers have discovered a way to disable the human "autopilot" system using just $10 worth of equipment. By shining a flashlight in their eyes, they are able to totally disable the primary optical sensors, so preventing the navigation system from avoiding objects in the car's path. While this was only tested in a lab environment, it is feared that Russian agents and ISIS terrorists could use a similar technique to cause mass casualties.

    • Assuming the tech is developed to the point of being relatively portable, it'd be easier to get away with spoofing an autopilot system than a human. Someone standing on the side of the road shining a laser pointer at your car is pretty easy to spot by both you and other bystanders. Someone standing on the side of the road wearing a backpack with a radar jamming device isn't quite as obvious. Pick a convenient corner with a big dropoff, stand there and watch a tesla or two go over the edge, then walk away. N
    • by pakar ( 813627 )

      Just you wait... there will soon be a flashlight and laser-pointer licence.....

  • by Anonymous Coward

    Don't drive behind a truck full of signal generators on a bumpy road.

  • So for $90,000 you can set up a trap for a Tesla car running with autopilot, and possibly cause a crash and kill the driver. Lots of people can do that a lot cheaper using a gun. So what's the problem?
    • Tesla is one of the more popular cars in places like Seattle, Vancouver, and SF, that's what.

    • by LWATCDR ( 28044 )

      Of course it is silly since the autopilot has been fooled just by nature.

    • by dohzer ( 867770 )

      Guns are expensive. Why not just throw a brick through a speeding car's windshield?

    • by swb ( 14022 )

      It's reasonable research. The equipment may cost $90k now, but that doesn't mean that the techniques developed couldn't be refined, specialized and miniaturized for cheap in the future.

      Guns and bricks and homemade spike strips work, but they have limitations that make them limited-use techniques. Jamming autopilot in some way may be something extremely hard to detect, possible to do at a distance, more selectively, or more en masse, without much risk of detection.

      I think it's reasonable to think abo

      • It's reasonable research. The equipment may cost $90k now, but that doesn't mean that the techniques developed couldn't be refined, specialized and miniaturized for cheap in the future.

        How about when the gigafactory for autosensing disablement equipment gets going?

        • by swb ( 14022 )

          Isn't that the kind of thing the Chinese would turn out? AFAIK that's where people get cell jammers.

    • Using a gun is cheaper, but it leaves evidence, and if somebody sees you using the gun, they might tell the police.

      On the other hand, if the radio equipment can be placed inside a backpack (or two), then there is no evidence that I caused the accident. I am just standing there minding my own business, and then I walk (or drive) away. If the place is usually full of people, nobody would notice or remember me or be able to link me to the accident.

      Also, the researchers most likely used general purpose equi

  • Seriously, who auto-enables autonomous driving but a fool?

    Now, if you'll excuse me, I'll go back to using my 60 mpg stick vehicle that doesn't bleed security info as it was made in 1989.

    • by AK Marc ( 707885 )
      I'd rather walk than drive a Geo Metro. I'll get there faster, too.
    • Now, if you'll excuse me, I'll go back to using my 60 mpg stick vehicle that doesn't bleed security info as it was made in 1989.

      The bleeding comes after even a minor freeway crash in that coffin, which has to be a Geo Metro or a CRX HF; both complete deathtraps.

  • This is stupid (Score:4, Insightful)

    by AK Marc ( 707885 ) on Thursday August 04, 2016 @05:42PM (#52647435)
    Yes, someone going to great effort can cause a crash. I've known of cases where people stood on overpasses and threw down bricks to cause crashes. Nobody published papers on the "brick loophole" in car security. In most of the examples, it'd have been easier to just cut the brake lines. But we have to target the sensors to get media attention, for a non-story.
    • Yes, someone going to great effort can cause a crash. I've known of cases where people stood on overpasses and threw down bricks to cause crashes. Nobody published papers on the "brick loophole" in car security. In most of the examples, it'd have been easier to just cut the brake lines. But we have to target the sensors to get media attention, for a non-story.

      We're really good at spotting things that distract humans, not so much at spotting things that distract AIs. Even though this was a deliberate attack, there could be other things unintentionally causing interference and leading to crashes, or other classes of attack we haven't discovered yet that may be a lot easier to execute.

      • by AK Marc ( 707885 )
        Having to use $90k of specialty gear to trick a sensor doesn't sound like something that is a likely inadvertent operational condition, or they'd have used a cheaper/easier way to trigger the misoperation.

        We need an AI smart enough to identify its own misoperation before it has affected operation. That's how humans generally work. You recognize when you get tired, before it causes errors, but will still generate errors if you choose to operate machinery while impaired. You recognize when oncoming light
        • They most likely needed to figure out how to trick the autopilot. For that you need general-purpose equipment that can be configured to produce various signals. When you find out what signal causes the intended outcome, you can build much cheaper equipment that is only capable of producing that one signal.
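To make the workflow the parent describes concrete (probe with flexible, general-purpose equipment, then hard-code the one signal that works), here is a toy Python sketch. Nothing in it models real radar or real Tesla hardware; the "channels" and the fake sensor's weak spot are invented purely for illustration.

from typing import Optional

def simulated_sensor_sees_obstacle(channel: int) -> bool:
    """Toy stand-in for a sensor: in this made-up model it fails only on channel 42."""
    return channel != 42

def find_masking_channel(num_channels: int) -> Optional[int]:
    """Sweep every candidate setting, as configurable lab gear allows."""
    for channel in range(num_channels):
        if not simulated_sensor_sees_obstacle(channel):
            # A cheap single-purpose device would only need this one setting.
            return channel
    return None

print(find_masking_channel(100))  # -> 42

The expensive part of the attack is the search itself; once the effective signal is known, reproducing it is a much simpler engineering problem, which is the commenter's point.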

    • I've known of cases where people stood on overpasses and threw down bricks to cause crashes. Nobody published papers on the "brick loophole"

      Don't need bricks; get a scantily clad attractive female* to walk down the side of a busy road. I can think of several times where such distractions almost got me into a tangle. And it's legal, unlike bricks.

      I'd be happy to assist in such research by inspecting the applicants for free.

      * Or scantily clad males might work also, who knows. Didn't mean to be discriminatory. (In

    • Also, Tesla will probably fix these particular sensor loopholes in the future.
  • There are plenty of optical illusions for people, too.

    Unless this was done on purpose/malevolently (and that could be prosecuted regardless), this seems to me far different from things like hacking into the car's computer itself.

  • So it's easy to do a DoS attack on self-driving cars. Just wait for the day when you can drop a $40 box and shut down a major road.

  • For that much money (and even with an order of magnitude less), there are probably a ton of ways to fool humans. What exactly are they accomplishing here?
  • The last sentence sums up what Elon Musk has been saying about AutoPilot:

    "even though Autopilot is quite capable, there's still no substitute for an attentive human driver, ready to take control at a moment's notice."

    The technology is not called "self driving" - it is called autopilot. Similar to a plane, where course and speed are maintained. Tesla reminds users to keep their hands on the wheel and remain attentive.

    No news here. Couple that with the cost of the hack, and there is not much to report. I could fool a

    • The technology is not called "self driving" - it is called autopilot. Similar to a plane, where course and speed are maintained.

      0.o "Autopilot" literally means "self piloting".

      Seriously, Tesla/Musk apologists, words mean things - and autopilot does not mean what you keep claiming it does. It doesn't mean "an assistant which still requires constant human monitoring and supervision"; it means "an automatic system that replaces human operators". From the first line of the Wikipedia entry on autopilots [wikipedia.org] e

  • So not something you would ever encounter.
    We aren't worrying about people using high-end equipment to make a car crash.
    We are worrying about a passive object that is shiny and reflective in just the right way to make it invisible.
    Or an active object the size of a small house that, because of its light-colored paint, appears invisible on the horizon.

    Those are legitimate concerns at this point.
    ZOMG, the system crashes when exposed to light at 572 nm flashing at 13.37 Hz while the left rear window control is pressed in

  • Perspective (Score:5, Insightful)

    by HangingChad ( 677530 ) on Thursday August 04, 2016 @05:53PM (#52647539) Homepage
    That it takes $90,000 worth of equipment and even then doesn't always work right is pretty darn impressive to me. Where I live a good 30 percent of drivers are too old to be behind the wheel and another 10 percent are functional alcoholics. Share the road with South Florida drivers long enough and you'll be begging for autopilot.
  • Obviously, you want a self-driving car to have the best possible auto-pilot technology you can put into it. But purposeful attempts to trick it into not detecting objects, or into thinking objects are there that aren't really there? That means nothing, IMO. What matters is that it does a reliable job of these things in real-world situations where nobody is TRYING to fool the system.

    Human drivers see things all the time and misinterpret them. (There's that popular photo going around social media where someon

  • Did they paint a tunnel in a wall?
  • And that is the reference point here. Automatic driving systems do not need to be perfect to be a good replacement. They just need to be better than the average driver. They will start to save lives when they are better than bad drivers, though.

    Of course, everybody believes themselves to be good drivers, but the simple statistical reality is that most are in the range from somewhat above average to really bad.

  • Couldn't you just use black tape to cover the road markings and white tape to make new ones, to veer the car off the road and over a cliff?

  • For that kinda scratch, I'm pretty sure we could rig something up that fools a human driver too.
  • Wow. Talk about reaching. I'm no Tesla fanboy, but saying you can trick a radar system with RF equipment is like saying you can take down a building with dynamite. Of course you can! In fact, there exist RF test sets specifically for simulating radar targets for easier system validation.
  • People are susceptible to optical illusions. So are machines, once you understand the assumptions made. People and machines don't have to make the same assumptions, but each is fallible in its own way. As they say, nothing is perfect. No one and no thing is perfect. But is it good enough? Or which is better?

  • You can also blind a human driver by shining a sufficiently bright light in the eyes. This fact has not caused anyone to say, "even though human drivers are quite capable, there's still no substitute for a good horse."

  • >"there's still no substitute for an attentive human driver, ready to take control at a moment's notice."

    That would be utterly useless. Try "at a fraction of a second's notice". And that simply won't happen once one is in any type of autopilot mode.

    • Empirically false. I remain attentive while the cruise control is engaged, and that's a type of autopilot.

      • I would hardly call cruise control a type of auto-pilot. Auto-pilot requires the car to steer. Controlling speed is nothing in comparison.

        Once the car is doing the two things that require 99% of the driver's attention - steering and speed - almost no "driver" will be able to keep their attention available "at a fraction of a second's notice". It's just human nature. It would be like watching paint dry.

  • by l3v1 ( 787564 )
    Yay, signal sensing equipment can be fooled by sending it fake signals!

    Oh, and rain is wet.
  • You can always DoS a human driver too. Laser pointers, or even perfectly natural feminine breasts could distract a human driver.

  • Comment removed based on user account deletion
  • I'm sure there are much better ways to "fool" it with much more low-tech items.

    How does it distinguish the density of an object it detects? Does a paper bag blowing across the road get marked as something it can drive over or will it take evasive action? Where does it draw the line between running over the paper bag and steering into the car in the next lane to have what it might think is a "softer" crash? And how does it tell the difference between a bag and a child, or even a pigeon?

    Isn't most of its a

  • One could argue that you could easily blind a human pilot with a floodlight; he would also not know whether there was an object in front of him or not...
  • A crack that allows you to control a large number of vehicles IS important and needs to be solved.
    OTOH, a hack in which you modify a single car to interfere with another single car's ability to drive is absolutely WORTHLESS. Why?
    Assume that somebody is going to go after another person because they are rich or something. Then said person is likely driving themselves OR has a chauffeur.
    Let's assume that this is used to go after a kid or a neighbor, etc. IOW, you just want to punish somebody. Now, you have to

