A Sleeping Driver's Tesla Led Police On A 7-Minute Chase (sfchronicle.com)
"When a pair of California Highway Patrol officers pulled alongside a car cruising down Highway 101 in Redwood City before dawn Friday, they reported a shocking sight: a man fast asleep behind the wheel," reports the San Francisco Chronicle:
The car was a Tesla, the man was a Los Altos planning commissioner, and the ensuing freeway stop turned into a complex, seven-minute operation in which the officers had to outsmart the vehicle's autopilot system because the driver was unresponsive, according to the CHP...
Officers observed Samek's gray Tesla Model S around 3:30 a.m. as it sped south at 70 mph on Highway 101 near Whipple Avenue, said Art Montiel, a CHP spokesman. When officers pulled up next to the car, they allegedly saw Samek asleep, but the car was moving straight, leading them to believe it was in autopilot mode. The officers slowed the car down after running a traffic break, with an officer behind Samek turning on emergency lights before driving across all lanes of the highway, in an S-shaped path, to slow traffic down behind the Tesla, Montiel said. He said another officer drove a patrol car directly in front of Samek before gradually slowing down, prompting the Tesla to slow down as well and eventually come to a stop in the middle of the highway, north of the Embarcadero exit in Palo Alto -- about 7 miles from where the stop was initiated.
Tesla declined to comment on the incident, but John Simpson, privacy/technology project director for Consumer Watchdog, calls this proof that Tesla has wrongly convinced drivers their cars' "autopilot" function really could perform fully autonomous driving...
"They've really unconscionably led people to believe, I think, that the car is far more capable of self-driving than actually is the case. That's a huge problem."
When driver wakes up (Score:2)
"I just had a hell of a dream. What the?!..."
Re: (Score:2)
better than a dead driver (Score:5, Insightful)
Thank you, Elon
Re: (Score:2)
Re: (Score:2)
If the driver dies, does the car just keep going?
That's an interesting question. Imagine a driver that had a sudden heart attack and died. I assume that a car that keeps driving, with a dead man behind the wheel, is preferable to a car veering wildly into traffic. The car will stop eventually after the fuel runs out or the battery dies (too), which I assume would result in the car puttering to a halt in the middle of the road. This might not be ideal but still preferable to many more likely alternatives where an auto-pilot is not present.
Re: (Score:2)
Re: (Score:3)
Re: (Score:2)
Most human drivers can't handle all weather/roads/obstacles. Autonomous cars can handle some roads/places now, and the 'coverage map' will gradually increase. The question is how long until it does better than the average human, in various conditions, and my WAG is 3 years for the leading solution.
Re: (Score:3)
Before an arrow can move a distance, it must first move half that distance. Therefore, it will never move.
Re: (Score:3)
Re: better than a dead driver (Score:5, Informative)
It's not quite there. It doesn't yet read stop lights. It'll stop if there's a car ahead of you, but not if there isn't one. It also doesn't know how to handle small traffic circles or 90 degree turns. Aka: it's not yet intended for city driving.
But it's pretty dang close to being a home-to-destination solution. Navigate-On-Autopilot was a big step in that direction.
Note that even when the car "can" do everything on its own, that doesn't mean it's going to jump straight to Level 5 autonomy. For the foreseeable future, "human + vehicle" will continue to be safer than "vehicle alone".
Re: (Score:3)
Re: (Score:2)
Imagine a driver that had a sudden heart attack and died. I assume that a car that keeps driving, with a dead man behind the wheel, is preferable to a car veering wildly into traffic.
Hence my comment above.
In my town we recently had a driver shoot himself in rush-hour traffic. His pickup veered across the center line and wiped out a whole family on the other side.
Re:better than a dead driver (Score:5, Insightful)
If the driver dies, it is unlikely that the driver's hands will stay on the steering wheel, which will prompt the Autopilot software to eventually stop the car.
Not Less Capable (Score:5, Insightful)
"They've really unconscionably led people to believe, I think, that the car is far more capable of self-driving than actually is the case. That's a huge problem."
And yet...nobody was hurt, no cars were wrecked...so it did much better than any other car with a sleeping driver.
Re:Not Less Capable (Score:4, Insightful)
Re:Not Less Capable (Score:5, Interesting)
So a Tesla on autopilot with a sleeping driver has a worst case scenario that's about the same as the best case for a regular car with sleeping driver?
Re:Not Less Capable (Score:5, Interesting)
A better comparison would be other cars with level 2 automation. They all check that the driver is attentive in various ways, and take action if they think the driver is asleep.
Some use IR cameras to check that the driver is paying attention to the road, for example.
There is also the issue of what to do if the driver is asleep. Some make more noise or vibrate the wheel/seat. Some like Tesla just stop in the middle of the road, others keep going on the assumption that it's better not to park in the fast lane of the motorway.
Basically all of them have limitations and none of them handle the driver asleep failure mode very well.
Re: (Score:3)
All I need to know is there some way for lawyers to sue over it so they don't go hungry?
Because the tech is almost certainly better than humans right now, to say nothing of the near future, so any delays kill more than they save.
Re:And a Los Altos planning commissioner, (Score:2)
Re: (Score:2)
Re: (Score:3, Insightful)
"They've really unconscionably led people to believe, I think, that the car is far more capable of self-driving than actually is the case. That's a huge problem."
And yet...nobody was hurt, no cars were wrecked...so it did much better than any other car with a sleeping driver.
So it sounds like he was also drunk at the time, but we're really dealing with 3 possible scenarios:
a) He would have driven and passed out / fell asleep no matter what car he owned.
b) He would have driven no matter what car he owned, but he only fell asleep in the tesla since the autopilot was doing the driving.
c) He only drove because he was relying on the tesla to do the driving.
So in scenario a) the tesla definitely made things safer, but in b & c the tesla caused the incident to occur. That's the pr
Re: (Score:2)
Re:Not Less Capable (Score:4, Insightful)
Asleep or not, I think that if the best autonomous cars out there today replaced all drivers, they would cause far fewer than 35,000 fatalities, yes.
Think about that. The US turned its country upside down because of 9/11, but cars cause a 9/11 worth of death every month. Why no "war on car accidents" and a couple trillion for autonomous vehicle implementation?
Re: (Score:3)
If "another country" and by that you mean Russia or China lights off a nuke powered EMP that strong, self driving cars coming to a halt will be the least of everyone's worry. All "regular" cars manufactured later than the 90s will stop working as well, but again we wouldn't be giving a crap about that, we'd be more concerned with all the nukes flying and **civilization ending**. At least the carbon emissions and global warming will probably get sorted out with the nuclear winter...
Re: (Score:3)
But how many dipshits are going to [deliberately] take a nap behind the wheel of a non-"autopilot" car?
Probably the ones who get drunk enough to pass out while driving, and then start driving.
Or, the other side of the coin... (Score:4, Insightful)
Driver fell asleep at the wheel, and instead of crashing into things as in a conventional car, semi-autonomous vehicle came to complete stop with no loss to life or property.
Re:Or, the other side of the coin... (Score:5, Insightful)
Would he have fallen asleep if the car didn't have "autopilot"?
Of course not. This "falling asleep at the wheel" phenomenon is totally new and only happens in Teslas.
It certainly hasn't been going on all over this country for most of the past century causing deaths on a daily basis. No sir.
Re:Or, the other side of the coin... (Score:5, Insightful)
Re: (Score:3)
Yes, really. I saw it right here -> https://www.merriam-webster.co... [merriam-webster.com]
Re:Or, the other side of the coin... (Score:5, Insightful)
On the contrary. It enabled seven whole minutes of unsafe driving with nobody conscious behind the wheel. When you think about it, that's pretty remarkable. If you nod off in a normal car, you're pretty much dead, and there's a decent chance you'll take other people with you. If you nod off in a Tesla, there's a nonzero chance you'll get pulled over in a complex traffic break seven minutes later, not having killed anybody.
Re:Or, the other side of the coin... (Score:4, Insightful)
On the contrary. It enabled seven whole minutes of unsafe driving with nobody conscious behind the wheel. When you think about it, that's pretty remarkable. If you nod off in a normal car, you're pretty much dead, and there's a decent chance you'll take other people with you. If you nod off in a Tesla, there's a nonzero chance you'll get pulled over in a complex traffic break seven minutes later, not having killed anybody.
And probably quite likely to be off the road for a while, since it is unlikely that you will have your driver's license back anytime soon.
The thing is, the less work required for driving, the more likely it is that you are going to fall asleep--that's why the long-straight-into-the-horizon stretches tend to have problems here. If a car's autopilot functions take too much of the burden off a human driver without reaching the point where it's no problem if the driver falls asleep at the wheel, all it's doing is making it more likely that it'll happen.
Maybe the priority over 'can drive self at constant speed in constant direction until a solid object is collided with' should have been to have the car capable of pulling itself over to the side of the road safely should the driver become incapacitated? Or at least not requiring a complex traffic break to be pulled over by the cops?
Re:Or, the other side of the coin... (Score:5, Insightful)
The thing is, the less work required for driving, the more likely it is that you are going to fall asleep--that's why the long-straight-into-the-horizon stretches tend to have problems here.
But autopilot is the life-saver here, not the killer.
I've driven 600 km at night, after three full days of working long hours and it was damn tough to stay awake. In a car that had zero drive-assist systems.
Unless you want to install dead-man switches in cars, autopilot is really what you want.
Re: (Score:3)
You drove 373 miles after 3 days of little or no sleep??
No, working long hours. That means leaving the office around 8 pm, getting some dinner, dropping into bed around 10 pm, maybe an hour talking to the wife at home and surfing the Internet. Waking up at 7, getting breakfast in the hotel and taking the shuttle bus to the office. So I did get my 7-8 hours of sleep. Just not much else in the way of relaxing.
Parts of your brain were likely fully asleep while you soldiered on. Not cool.
I agree on that. Definitely wasn't an enjoyable experience. And I did indeed stop two or three times to get a short nap. I'm a very careful person, my wife would say ov
Re: (Score:2)
Would he have even gotten in the car if he didn't know he had autopilot to fall back on?
Of course not. This "overconfidently getting in the car when too drunk to drive" phenomenon is totally new and only happens in Teslas.
It certainly hasn't been going on all over this country for most of the past century causing SCORES of deaths on a daily basis. No sir.
Re: (Score:3)
Comment removed (Score:5, Informative)
Re: (Score:2)
Officers said that they were unable to get the man's attention.
Even if the man was asleep, how on earth do you sleep through a police siren right next to your car? Those things are made to be loud enough to cut through a car's noise insulation from a distance, let alone right alongside.
I grew up on a college campus and managed to routinely sleep through my neighbor's wild parties, including the one where there were firetrucks only a couple meters away from my bedroom.
In this case, though, my guess is alcohol played a role. You can do a lot of fun and interesting things with a person in a drunken stupor, especially if you know how to get into their phone & they keep their phone logged into Facebook. (Being known to take this view also can prevent people from drinking themselves into
Not enough info to blame Tesla... or not (Score:5, Insightful)
I'm not going to lambaste Tesla over this.
The guy was drunk. Has he driven drunk before, in the Tesla or in another car (whether he's gotten caught or not)? Did he intend to have the Telsa drive him home, or did he start driving himself and just fell asleep?
It does seem obvious that the driver made some very bad decisions, regardless.
Re: (Score:2)
It does seem obvious that the driver made some very bad decisions, regardless.
Well, duh. The question is whether he made even stupider decisions because he thought tech would save him. And the answer to that is, yes, people do. A good example is cell phones: a lot of people think a rescue will come for them. When you were on your own, people prepared better for survival. They knew if they got lost or trapped in a storm they'd probably have to ride it out on their own. Today we see a lot of people who are completely unprepared for the unexpected. They have exactly what they need and no
Re: (Score:2)
Indeed. As safety features increase, especially in vehicles, people tend to increase risky behavior to match. Airbags and crumple zones made people drive faster, etc.
Re: (Score:2)
At least airbags and crumple zones are destructive, so, people usually do not want to damage their car on purpose, even though they drive less carefully now.
On the other hand, I can totally see something like ABS and traction control abused for driving fast on a slippery road ("It's OK, my car has ABS and traction control, it drives on ice just as well as on asphalt"). The automatic emergency stop can be used in place of regular stopping, until it fails one day and you hit another car.
Autopilot-type feature
Re: (Score:2)
Was there a thunderstorm? Blizzard? I find myself wondering what the weather had to do with what happened, or didn't.
For what it's worth, the description of events in TFA makes it look like the only dangerous things happening that night were police officers swerving across traffic lanes to keep other cars away from the idiot driver (I won't say "drunk driver", since it was mentioned that he was given a field-s
Re: (Score:3)
I only have a problem with someone sitting in a control center and getting any car he wants to stop. A low-range signal that can be sent from a police car to a car they need to stop, however, is harder to argue against.
Re: (Score:3)
A low-range signal that can be sent from a police car to a car they need to stop, however, is harder to argue against.
No, it is not. All you need to do is steal one of those transmitters and now you can stop any car? Or reverse-engineer it and build your own? Argument accomplished.
Re: (Score:3)
By that argument, all you need to do is steal a police car or make a white car look like a police car.
That actually is a problem, and that's why there are laws about that. It's illegal both to impersonate the police, and to include those features on your vehicle. That differs substantially from the scenario which we are discussing because you can observe those features from a distance — many of them are passive, and cannot be disabled. Cops are on the lookout for other cops, and if they see something that looks like a cop but isn't a cop, they become quite incensed.
Re: (Score:3)
Make the system an integral part of the police car - you can't steal it without ripping the car apart at the very least.
That's not how this works. That's not how any of this works.
Re: (Score:3)
I'm sorry, am I using the wrong words?
You're being wholly unrealistic. You're using the wrong ideas.
If the physical machine the police car would need in order to send a signal to an autonomous vehicle to make it stop is sitting within the police car's instrument panel, or in some other not-immediately-possible-to-disassemble spot, you need to either steal the police car or strip it to get to this specific machine.
At least part of the communications device you're talking about is going to have to be outside the vehicle. But the first thing you need to understand about cop cars is that they always have been derived from ordinary vehicles. They're not purpose-designed to be cop cars, they just have some upgraded components — if that. They might just have a particular mix of components. They have to be modified into cop cars, and they have to be maintai
Must Stay Awake, must stay awake.. (Score:2)
Re: (Score:2)
Nah. Play the one eye game. Never fails (to turn into the two eye game).
The only thing this proves... (Score:4, Insightful)
is that the driver was too exhausted to be driving in the first place. And that had he not been driving a car equipped with as intelligent a safety system, there would have been a substantially higher probability of injury, loss of life or property.
" calls this proof that Tesla has wrongly convinced drivers their cars' "autopilot" function really could perform fully autonomous driving... "
Those of us who know better call this FUD.
Re: (Score:2)
Junk story (Score:3)
Did he pass the FST? The title of the article says he was a drunk driver, but nowhere in the article does it state he was. Did he just decide to take a nap? Why should Tesla be held to truth in advertising while other car manufacturers can show their cars doing rail slides on bridges and many other behaviors that a car cannot accomplish? Missing far too many basic facts to really render any sort of judgment. The Chronicle should fire whoever wrote this trash and hire a qualified writer/reporter, say your average 5th grader.
Comment removed (Score:5, Informative)
Tesla's Fault (Score:5, Insightful)
Tesla declined to comment on the incident, but John Simpson, privacy/technology project director for Consumer Watchdog, calls this proof that Tesla has wrongly convinced drivers their cars' "autopilot" function really could perform fully autonomous driving...
"They've really unconscionably led people to believe, I think, that the car is far more capable of self-driving than actually is the case. That's a huge problem."
So let me see if I have this straight. 10,000 people a year die in DUI crashes, yet all these drunk drivers are not the fault of Ford, Toyota, Chevy, Nissan, etc. The liability is totally on the driver that decided to operate a vehicle while intoxicated.
However when someone drives a Tesla drunk, it is Tesla's fault. Yes, that makes perfect sense, Mr. Simpleton. I mean Simpson.
Re: (Score:2)
However when someone drives a Tesla drunk, it is Tesla's fault. Yes, that makes perfect sense, Mr. Simpleton. I mean Simpson.
Well the driver was the one arrested in this case, not the Tesla
Re: (Score:2)
Re: (Score:2)
I agree with you because he hasn't proven this isn't a case of "I'm not too drunk to drive.... zzzzzzzzz" You need to disprove that possibility before you can arrive at the conclusion he's asserting. But contrary to what you claim, there is a (possible) distincti
Re: (Score:2)
Tesla is not responsible for drivers making stupid decisions. That's absurd.
Re: Tesla's Fault (Score:2)
Re: (Score:2)
Conclusion not supported. (Score:2)
They have given us no reason to believe that the driver actually thought the car was fully capable of autonomous driving. People unintentionally fall asleep behind the wheel even in cars with no autonomous capability at all. Naturally they tend to crash.
So maybe he thought the car was more capable, or maybe he meant to stay awake but failed. If the latter, the car likely saved his life and perhaps others' as well.
Right.... (Score:2)
Right... So the CIA can pilot your Tesla into a wall at high speed, but they can't stop the car of a sleeping driver?
Re: (Score:2)
Re: (Score:2)
In Soviet Russia, KGB drive wall into YOU!
It never happened, it can't happen! (Score:2)
He wasn't asleep, (Score:2)
he was "driving outside the box."
Re: (Score:2)
So the cops came along to put him in a box to sober up.
Problem solved.
Story is hard to believe (Score:2, Troll)
A story involving the police without police shooting someone? Or shooting someone's dog? Or choking someone? Or otherwise injuring them for no reason? Are you sure this happened in America? The description doesn’t sound like American police.
If the story is true, I would like to thank the police for not opening fire on the car. Or the driver after the car was stopped. Or random others. Or dogs that might have been in the area.
Good job police. Keep it up.
Too close to call (Score:2)
Delorean (Score:2)
Re: (Score:2)
Unfortunately, no matter what you input, the final destination would always be a river in France.
A weakness of autonomous vehicles in general (Score:2)
A weakness of autonomous vehicles in general is sensor failure. I know a guy who's become reliant on parking sensors since getting a new car a few months ago. Twice the sensor has flaked out - not warned of a nearby object - and he's bumped into it. This has led to thousands of dollars in damage. And it fails silently, and intermittently.
So - it'll be quite important to stick to the old way of driving - i.e. you doing it visually - and not relying on a sensor and software solely, for, it seems to me, the fo
"complex operation" (Score:2)
very complex operation indeed. They had to drive in front of it and slow down.
Re: (Score:2)
Re: (Score:3)
The CEO of Waymo!
Re: And some idiot just yesterday INSISTED... (Score:5, Insightful)
Nor anyone else for that matter. I'm not sure what this idiot's plan was, but except for getting pulled over, the car was doing its job. What I'm not sure of is why they didn't just merge in front of it and hit the brakes. The Drive On Nav feature still requires driver input to merge around slow traffic (by activating the indicators).
By the way, plenty of people fall asleep behind the wheel, and drive under terrible circumstances (drunk, high, exhausted). Whatever this driver did, he is responsible for what he is doing, not Tesla.
Re: And some idiot just yesterday INSISTED... (Score:5, Insightful)
Probably they didn't want to create a chain-reaction rear-end accident. By running a traffic break behind the Tesla before then slowing down and bringing the Tesla to a halt with a patrol car slowing in front of it, the cars behind were already slowed down a bit, there was more of a gap between following cars and the Tesla, and drivers behind anticipated that something was going on ahead. This reduced the chances of someone crashing into the Tesla when it stopped in the middle of a traffic lane for no apparent reason in the early morning hours on a freeway.
At least that's my guess.
Re: And some idiot just yesterday INSISTED... (Score:5, Insightful)
Nor anyone else for that matter.
Yep.
"They've really unconscionably led people to believe, I think, that the car is far more capable of self-driving than actually is the case."
Apparently the car was quite capable of not hitting anything and of coming to a complete stop when something was put in its way.
"That's a huge problem."
Given that people fall asleep at the wheel every single day and crash, I'd call it a huge win.
Re: And some idiot just yesterday INSISTED... (Score:5, Insightful)
One of the main points of an autonomy system is to improve safety via pairing the vehicle's constant attentiveness with a human's decision-making ability. The more annoying you make your system with its nagging, the less people will use it, defeating any safety advantages.
Tesla actually has spent money on the hardware that would be needed. Look [wp.com]. You see that? That's a driver-facing camera. Every Model 3 has one. So what "cost savings", exactly, do you think they're getting?
Tesla has experimented endlessly over the years with nag frequencies, types of nag, and types of driver monitoring. This is what they've arrived at as the best balance between "encouraging people to actually use it" and "discouraging inattentive driving". And by and large, it works very well - even if some drunk happened to pass out at the wheel. Which, while we're on that subject... what's the alternative? Have drunks ever been prone to not driving? When a drunk passes out at the wheel, would you rather the car just crash? It's still DUI either way, but in the former case, everyone walks away unscathed, while in the latter case some random person has a drunk crash into and possibly kill them.
That's not to say that the current approach is perfect - far from it. There's a difference between a naggy, "oh my god you looked away from the windshield" system, and a system that can detect if a person has passed out (but still had their hands on the wheel), for example. Implementing the latter would very much be a good thing. But with the former, if you drive people off of using it, you lose out on any potential for improving safety.
Re: (Score:3)
There's a difference between a naggy, "oh my god you looked away from the windshield" system, and a system that can detect if a person has passed out (but still had their hands on the wheel), for example. Implementing the latter would very much be a good thing. But with the former, if you drive people off of using it, you lose out on any potential for improving safety.
There's also a sensible middle ground. If someone looks away from the windshield for long enough, then they ought to be reminded to keep their eyes on the road. Just like the steering wheel nags, there's a sweet spot for eyes-on-road nags as well.
Re: And some idiot just yesterday INSISTED... (Score:4, Informative)
On a limited-access highway, "coming to a stop" is usually WORSE than "staying in the lane & maintaining a normal cruising speed". Autopilot lanekeeping & collision-avoidance is now better on average at avoiding accidents than most human drivers. It's UNUSUAL situations that create the danger.
A car staying in its lane & moving appropriately is the norm on a freeway. A car stopped on the shoulder creates an active road hazard for everyone else. Remember, his autonomous vehicle wasn't the only one on the road. It's not a safety improvement if your "safety feature" creates a situation that's MORE dangerous than letting the car just do what it does best -- follow the lane, avoid collisions, and behave in a predictable manner.
If you really need to punish drivers, add warning lights to visually communicate to other drivers that a car is operating without active human control. On a limited-access highway not under construction in good weather, inattentiveness with autopilot is a statistical non-issue... and in the real world, it's probably a net improvement over human drivers who are semi-distracted. On a non-freeway that has at-grade cross traffic, it's a genuine hazard. In bumper-to-bumper city-street gridlock, it's a non-issue (the car can stop within inches if necessary). On city streets at 30mph, it's likely to be dangerous.
The point is, there is no blanket "one size fits all" rule. Autonomous systems have known constraints. Stay within them, and you're fine. Deviate, and you create problems. Ignore the constraints entirely, and you're in uncharted territory risk-wise. Following lines & not colliding is easy. Finding a safe place to pull over is enormously harder.
Re: (Score:3)
Autopilot lanekeeping & collision-avoidance is now better on average at avoiding accidents than most human drivers.
This (along with the assumption that autopilot is always going to be operational and operate as intended) is clearly the assumption underlying your entire post, and I'd be surprised if you have any real data to back that. If you'd like to share some, I'd be happy to look at it. (Remember, it has to span a reasonable sample of all potential weather conditions, road conditions, and routes, not just a cherry-picked sandbox.)
A car stopped on the shoulder creates an active road hazard for everyone else.
Limited-access highways are designed with emergency breakdown lanes that put the car
Re: (Score:3, Insightful)
Whatever the case, Alexander Samek needs to be fired by the city of Los Altos and given the same punishment as any "commoner" would receive for a DUI (ie. suspended licence, large fines and mandatory DUI classes).
Stupid fucks like him are responsible for murdering people every day.
Re: (Score:3, Interesting)
Re: And some idiot just yesterday INSISTED... (Score:5, Informative)
Emergency services should have an override on automated vehicles, specifically for situations like this.
They've got one, the same one they use with human drivers. They used it. Problem solved.
Re: And some idiot just yesterday INSISTED... (Score:4, Insightful)
so can specified overrides
Re: And some idiot just yesterday INSISTED... (Score:4, Informative)
Besides, the police DO have an override- they did exactly that. They boxed in the car and forced it to stop.
They didn't even have to box it in. They just put one car in front of it and slowed down to a halt. If you were driving the car, you would have just changed lanes, but this car was on autopilot and doesn't change lanes on its own.
The reason it took so long is that you can't just stop a car on the motorway, because of the huge risk that others drive into it. One driver had to slow down traffic behind the car to create a huge gap for safety, that's what took the time.
Re: (Score:2)
We get it, the left fucking HATE Tesla / all things Elon.
Not just left-wingers. Right-wing Breitbart only says negative, sarcastic things about Musk and his companies.
I don't know why. Maybe Breitbart thinks his companies currently get government subsidies.
But besides subsidies, I think there's some emotional reason that they don't like Musk.
Re: (Score:2, Interesting)
Because the right-wing hate California with an irrational degree of passion.
I'll leave it to others to explain why a right-wing rag might hate a state that is highly successful yet has mostly liberal policies.
Re: (Score:2)
Re: (Score:3)
Plenty of people survive falling asleep behind the wheel.
Just not usually the ones who are in the path of the car.
The driver? Pretty much they stand a good chance of living in a modern car no matter what they hit.
Re: (Score:2)
Re: (Score:2)
If it isn't something like slamming into a wall, a drunk or asleep driver's actually got a better chance of survival, mostly because they'll be limp. This applies even at highway speeds--my parents got to see people walk away from managing to land their car in the neighbor's tree one Thanksgiving because they were all impressively drunk. (Not sure how fast they were going, except to pull off the accident they did, they had to have been going over 90mph...in a residential neighborhood in the middle of a ci
Re: (Score:2)
I'm pretty sure this isn't even the first Tesla driver to be pulled over for driving while unconscious. It's at least the second time in the Bay Area this year. There was one on the Bay Bridge back in January, another one back in May of 2016 (as seen in a video clip on YouTube), and at least one in between involving a Tesla mobile service vehicle.
Re: (Score:2)
My friend was driving home at night one time. He suddenly woke up to a loud screeching sound as the side of the car was dragging along the fence in the centre divider. He grabbed the steering wheel, got back into the lane and continued home - never to repeat that mistake again. His Volvo 240 was not pretty on that side.
Many years later I talked to a lady who gets off work at 10pm. She told me that she has caught herself nodding off several times driving home. I was horrified but suspect she's not the only
Re: (Score:2)
Found this: "The Autopilot feature includes machine steering, collision avoidance, assisted lane changing and adaptive cruise control. On a well-marked highway, the car can nearly drive itself, although the human driver is expected to remain alert and take over the controls when necessary. The system periodically warns drivers to put their hands on the steering wheel, and the car will slow down and eventually stop if they don't."
Notice that last line, I guess it didn't...
Re: (Score:2)
Normally falling asleep at the wheel at highway speeds is fatal
I'm not convinced that is true at all, especially in an $80K vehicle.
Re: (Score:2)
Re: (Score:2)
Full autonomy would require following all of the laws... including those that require drivers to give way to emergency vehicles and pull over for police cars flashing their lights. They aren't going to wait for every emergency vehicle in the nation to be equipped with some communications system, so they will train their networks to respond in the same way humans are supposed to.
I doubt this has been a priority for Tesla with the current requirement that a fully aware licensed driver be at the wheel. When it