Transportation / AI

A Sleeping Driver's Tesla Led Police On A 7-Minute Chase (sfchronicle.com) 346

"When a pair of California Highway Patrol officers pulled alongside a car cruising down Highway 101 in Redwood City before dawn Friday, they reported a shocking sight: a man fast asleep behind the wheel," reports the San Francisco Chronicle: The car was a Tesla, the man was a Los Altos planning commissioner, and the ensuing freeway stop turned into a complex, seven-minute operation in which the officers had to outsmart the vehicle's autopilot system because the driver was unresponsive, according to the CHP...

Officers observed Samek's gray Tesla Model S around 3:30 a.m. as it sped south at 70 mph on Highway 101 near Whipple Avenue, said Art Montiel, a CHP spokesman. When officers pulled up next to the car, they allegedly saw Samek asleep, but the car was moving straight, leading them to believe it was in autopilot mode. The officers slowed the car down after running a traffic break, with an officer behind Samek turning on emergency lights before driving across all lanes of the highway, in an S-shaped path, to slow traffic down behind the Tesla, Montiel said. He said another officer drove a patrol car directly in front of Samek before gradually slowing down, prompting the Tesla to slow down as well and eventually come to a stop in the middle of the highway, north of the Embarcadero exit in Palo Alto -- about 7 miles from where the stop was initiated.

Tesla declined to comment on the incident, but John Simpson, privacy/technology project director for Consumer Watchdog, calls this proof that Tesla has wrongly convinced drivers their cars' "autopilot" function really could perform fully autonomous driving...

"They've really unconscionably led people to believe, I think, that the car is far more capable of self-driving than actually is the case. That's a huge problem."

Comments Filter:
  • "I just had a hell of a dream. What the?!..."

  • by Anonymous Coward on Saturday December 01, 2018 @07:40PM (#57733928)

    Thank you, Elon

    • If the driver dies, does the car just keep going?
      • If the driver dies, does the car just keep going?

        That's an interesting question. Imagine a driver that had a sudden heart attack and died. I assume that a car that keeps driving, with a dead man behind the wheel, is preferable to a car veering wildly into traffic. The car will stop eventually after the fuel runs out or the battery dies (too), which I assume would result in the car puttering to a halt in the middle of the road. This might not be ideal but still preferable to many more likely alternatives where an auto-pilot is not present.

        • Comment removed based on user account deletion
          • Comment removed based on user account deletion
            • by mentil ( 1748130 )

              Most human drivers can't handle all weather/roads/obstacles. Autonomous cars can handle some roads/places now, and the 'coverage map' will gradually increase. The question is how long until it does better than the average human, in various conditions, and my WAG is 3 years for the leading solution.

        • by meglon ( 1001833 )
          I can see a tie-in with a new smartwatch (from Tesla, of course) that monitors vitals. If a person's vitals are in distress, it drives them to the closest hospital.
        • Imagine a driver that had a sudden heart attack and died. I assume that a car that keeps driving, with a dead man behind the wheel, is preferable to a car veering wildly into traffic.

          Hence my comment above.

          In my town we recently had a driver shoot himself in rush-hour traffic. His pickup veered across the center line and wiped out a whole family on the other side.

      • by whoever57 ( 658626 ) on Saturday December 01, 2018 @10:14PM (#57734538) Journal

        If the driver dies, it is unlikely that the driver's hands will stay on the steering wheel, which will prompt the Autopilot software to eventually stop the car.
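
        As a rough illustration of the kind of escalation being described here, below is a minimal sketch of a timeout-driven hands-on-wheel monitor. The stage names, thresholds, and return values are assumptions made up for this sketch; they are not Tesla's actual parameters or code.

        import time
        from dataclasses import dataclass, field

        # Illustrative thresholds, assumed for this sketch only.
        VISUAL_NAG_AFTER_S = 30.0       # show a "hold the steering wheel" message
        AUDIBLE_ALERT_AFTER_S = 60.0    # start chiming
        CONTROLLED_STOP_AFTER_S = 90.0  # slow to a stop with hazard lights on

        @dataclass
        class HandsOnMonitor:
            last_torque_time: float = field(default_factory=time.monotonic)

            def update(self, torque_on_wheel: bool) -> str:
                """Return the action to take, escalating the longer no wheel input is felt."""
                now = time.monotonic()
                if torque_on_wheel:
                    self.last_torque_time = now   # driver touched the wheel; reset the timer
                    return "none"
                idle = now - self.last_torque_time
                if idle >= CONTROLLED_STOP_AFTER_S:
                    return "hazards_and_controlled_stop"   # driver presumed unresponsive
                if idle >= AUDIBLE_ALERT_AFTER_S:
                    return "audible_chime"
                if idle >= VISUAL_NAG_AFTER_S:
                    return "visual_nag"
                return "none"

        The point of the sketch is simply that a long enough silence from the driver ends in a controlled stop, which matches the behaviour described above.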

  • Not Less Capable (Score:5, Insightful)

    by Anonymous Coward on Saturday December 01, 2018 @07:42PM (#57733942)

    "They've really unconscionably led people to believe, I think, that the car is far more capable of self-driving than actually is the case. That's a huge problem."

    And yet...nobody was hurt, no cars were wrecked...so it did much better than any other car with a sleeping driver.

    • by Luthair ( 847766 ) on Saturday December 01, 2018 @07:45PM (#57733964)
      This time, but it could just as easily have been Walter Huang, whose Tesla ran itself into the concrete barrier.
      • Re:Not Less Capable (Score:5, Interesting)

        by ceoyoyo ( 59147 ) on Saturday December 01, 2018 @11:30PM (#57734872)

        So a Tesla on autopilot with a sleeping driver has a worst case scenario that's about the same as the best case for a regular car with sleeping driver?

        • Re:Not Less Capable (Score:5, Interesting)

          by AmiMoJo ( 196126 ) on Sunday December 02, 2018 @06:35AM (#57735720) Homepage Journal

          A better comparison would be other cars with level 2 automation. They all check that the driver is attentive in various ways, and take action if they think the driver is asleep.

          Some use IR cameras to check that the driver is paying attention to the road, for example.

          There is also the issue of what to do if the driver is asleep. Some make more noise or vibrate the wheel/seat. Some, like Tesla, just stop in the middle of the road; others keep going on the assumption that it's better not to park in the fast lane of the motorway.

          Basically all of them have limitations and none of them handle the driver asleep failure mode very well.
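
          For the camera-based systems mentioned above, one widely used drowsiness measure is PERCLOS, the fraction of recent time the eyes are closed. Here is a minimal sketch of that idea; the window length and threshold are assumptions for illustration, and real systems combine this with gaze and head-pose cues.

          from collections import deque

          class PerclosMonitor:
              """Rolling estimate of the fraction of recent frames with eyes closed."""

              def __init__(self, window_frames: int = 1800, drowsy_threshold: float = 0.15):
                  # 1800 frames is about 60 s at 30 fps; 15% eyes-closed is an assumed cutoff.
                  self.frames = deque(maxlen=window_frames)
                  self.drowsy_threshold = drowsy_threshold

              def add_frame(self, eyes_closed: bool) -> bool:
                  """Record one frame's eye state; return True if the driver looks drowsy."""
                  self.frames.append(1 if eyes_closed else 0)
                  perclos = sum(self.frames) / len(self.frames)
                  return perclos >= self.drowsy_threshold

          When something like this flags the driver, the system would then pick one of the responses mentioned above: make noise, vibrate the wheel or seat, or hand the situation off to a controlled stop.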

    • Someone with a "Good Head" on his shoulders, and he still does. :)
    • by Ogive17 ( 691899 )
      Or maybe other drivers would not have got behind the wheel in that circumstance.
    • Re: (Score:3, Insightful)

      by quantaman ( 517394 )

      "They've really unconscionably led people to believe, I think, that the car is far more capable of self-driving than actually is the case. That's a huge problem."

      And yet...nobody was hurt, no cars were wrecked...so it did much better than any other car with a sleeping driver.

      So it sounds like he was also drunk at the time, but we're really dealing with 3 possible scenarios:

      a) He would have driven and passed out / fell asleep no matter what car he owned.

      b) He would have driven no matter what car he owned, but he only fell asleep in the Tesla since the autopilot was doing the driving.

      c) He only drove because he was relying on the Tesla to do the driving.

      So in scenario a) the Tesla definitely made things safer, but in b & c the Tesla caused the incident to occur. That's the problem.

  • by NonFerrousBueller ( 1175131 ) on Saturday December 01, 2018 @07:44PM (#57733952)

    Driver fell asleep at the wheel, and instead of crashing into things as in a conventional car, the semi-autonomous vehicle came to a complete stop with no loss of life or property.

  • Comment removed (Score:5, Informative)

    by account_deleted ( 4530225 ) on Saturday December 01, 2018 @07:45PM (#57733960)
    Comment removed based on user account deletion
  • by 93 Escort Wagon ( 326346 ) on Saturday December 01, 2018 @07:49PM (#57733990)

    I'm not going to lambaste Tesla over this.

    The guy was drunk. Has he driven drunk before, in the Tesla or in another car (whether he's gotten caught or not)? Did he intend to have the Tesla drive him home, or did he start driving himself and just fall asleep?

    It does seem obvious that the driver made some very bad decisions, regardless.

    • by Kjella ( 173770 )

      It does seem obvious that the driver made some very bad decisions, regardless.

      Well, duh. The question is weather he made even stupider decisions because he thought tech would save him. And the answer to that is, yes people do. A good example is cell phones: a lot of people think a rescue will come for them. When people were on their own, they prepared better for survival. They knew if they got lost or trapped in a storm they'd probably have to ride it out on their own. Today we see a lot of people who are completely unprepared for the unexpected. They have exactly what they need and no more.

      • The question is weather he made even stupider decisions because he thought tech would save him. And the answer to that is, yes people do.

        Indeed. As safety features increase, especially in vehicles, people tend to increase risky behavior to match. Airbags and crumple zones made people drive faster, etc.

        • At least airbags and crumple zones are destructive, so people usually do not want to damage their car on purpose, even though they drive less carefully now.

          On the other hand, I can totally see something like ABS and traction control abused for driving fast on a slippery road ("It's OK, my car has ABS and traction control, it drives on ice just as well as on asphalt"). The automatic emergency stop can be used in place of regular stopping, until it fails one day and you hit another car.

          Autopilot-type feature

      • The question is weather he made even stupider decisions because he thought tech would save him.

        Was there a thunderstorm? Blizzard? I find myself wondering what the weather had to do with what happened, or didn't.

        For what it's worth, the description of events in TFA makes it look like the only dangerous things happening that night were police officers swerving across traffic lanes to keep other cars away from the idiot driver (I won't say "drunk driver", since it was mentioned that he was given a field-sobriety test).

  • Just keep thinking that, that works. :)
  • by fred911 ( 83970 ) on Saturday December 01, 2018 @07:52PM (#57734004) Journal

    The real takeaway is that the driver was too exhausted to be driving in the first place. And that, had he not been driving a car equipped with as intelligent a safety system, there would have been a substantially higher probability of injury, loss of life or property.

    " calls this proof that Tesla has wrongly convinced drivers their cars' "autopilot" function really could perform fully autonomous driving... "

    Those of us who know better call this FUD.

  • by Archfeld ( 6757 ) <treboreel@live.com> on Saturday December 01, 2018 @07:53PM (#57734018) Journal

    Did he pass the FST? The title of the article read that he was a drunk driver, but nowhere in the article does it state he was. Did he just decide to take a nap? Why should Tesla be held to truth in advertising while other car manufacturers can show their cars doing rail slides on bridges and many other behaviors that a car cannot accomplish? Missing far too many basic facts to really render any sort of judgement. The Chronicle should fire whoever wrote this trash and hire a qualified writer/reporter, say your average 5th grader.

  • Tesla's Fault (Score:5, Insightful)

    by Dan East ( 318230 ) on Saturday December 01, 2018 @08:02PM (#57734078) Journal

    Tesla declined to comment on the incident, but John Simpson, privacy/technology project director for Consumer Watchdog, calls this proof that Tesla has wrongly convinced drivers their cars' "autopilot" function really could perform fully autonomous driving...

    "They've really unconscionably led people to believe, I think, that the car is far more capable of self-driving than actually is the case. That's a huge problem."

    So let me see if I have this straight. 10,000 people a year die in DUI crashes, yet all these drunk drivers are not the fault of Ford, Toyota, Chevy, Nissan, etc. The liability is totally on the driver that decided to operate a vehicle while intoxicated.

    However when someone drives a Tesla drunk, it is Tesla's fault. Yes, that makes perfect sense, Mr. Simpleton. I mean Simpson.

    • by Ksevio ( 865461 )

      However when someone drives a Tesla drunk, it is Tesla's fault. Yes, that makes perfect sense, Mr. Simpleton. I mean Simpson.

      Well, the driver was the one arrested in this case, not the Tesla.

      • by meglon ( 1001833 )
        That Tesla should be thankful it's not running on a gas/ethanol mixture. Hmm... I can foresee that possibility being the final straw for Skynet.
    • I agree with you. But Simpson's point is the distinction between a Ford owner saying "I'm too drunk to drive, I'd better call a cab," and a Tesla owner saying "I'm too drunk to drive, but I don't need to call a cab because my car has autopilot!"

      I agree with you because he hasn't proven this isn't a case of "I'm not too drunk to drive.... zzzzzzzzz" You need to disprove that possibility before you can arrive at the conclusion he's asserting. But contrary to what you claim, there is a (possible) distinction.
  • They have given us no reason to believe that the driver actually thought the car was fully capable of autonomous driving. People unintentionally fall asleep behind the wheel even in cars with no autonomous capability at all. Naturally they tend to crash.

    So maybe he thought the car was more capable, or maybe he meant to stay awake but failed. If the latter, the car likely saved his and perhaps others' lives.

  • Right... So the CIA can pilot your Tesla into a wall at high speed, but they can't stop the car of a sleeping driver?

    • by meglon ( 1001833 )
      Well sure they could... but then they'd have to kill you. So if you think about it, them driving your car into the wall is simply a way to be more efficient.
    • by mentil ( 1748130 )

      In Soviet Russia, KGB drive wall into YOU!

  • Yes, they wrongly convinced owners that this car could do exactly what it did.
  • he was "driving outside the box."

  • A story involving the police without police shooting someone? Or shooting someone's dog? Or choking someone? Or otherwise injuring them for no reason? Are you sure this happened in America? The description doesn’t sound like American police.

    If the story is true, I would like to thank the police for not opening fire on the car. Or the driver after the car was stopped. Or random others. Or dogs that might have been in the area.

    Good job police. Keep it up.

  • If the car had realized that no one was responding and then executed a safe pull-over, I would buy that autopilot could help *in this case*. However, this is too close to call. In a normal car, the driver may have crashed and been killed, or his foot may have left the gas pedal and the car slowed to a stop, or he may not have fallen asleep at all. In this case, the car kept driving for 7 minutes, during which a person could have been hit. The police had to intervene, so they may have been injured as well. It's really
  • Clearly John DeLorean had the better idea with a car you snort-started and that would then follow a white line anywhere. Now that is an autopilot.
    • by mentil ( 1748130 )

      Unfortunately, no matter what you input, the final destination would always be a river in France.

  • A weakness of autonomous vehicles in general is sensor failure. I know a guy who's become reliant on parking sensors since getting a new car a few months ago. Twice the sensor has flaked out - not warned of a nearby object - and he's bumped into it. This has led to thousands of dollars in damage. And it fails silently, and intermittently.

    So - it'll be quite important to stick to the old way of driving - i.e. you doing it visually - and not rely solely on a sensor and software, for, it seems to me, the foreseeable future.

  • Very complex operation indeed. They had to drive in front of it and slow down.

    • The final fix may be a one-line change, but the debugging methods can be "complex"; you can't expect an average CHP officer to know how the autopilot software works (a normal cruise control, the kind without any adaptive functions, would just keep going straight into you). Sure, moving in an S-curve is not complex once you know it is safe to do.

"If the code and the comments disagree, then both are probably wrong." -- Norm Schryer

Working...