Tesla Autopilot, Distracted Driving To Blame In Deadly 2018 Crash (theverge.com)
Slashdot readers TomGreenhaw and gollum123 are sharing the findings from a National Transportation Safety Board (NTSB) investigation into a fatal Tesla Model X crash that occurred in 2018 near Mountain View, California. The agency says Tesla's Autopilot system and the driver's distraction by a mobile device were two of the probable causes of the crash. The Verge reports: The safety board arrived at those probable causes after a nearly two-year investigation into the crash. NTSB investigators also named a number of contributing factors, including that the crash attenuator in front of the barrier was damaged and had not been repaired by California's transportation department, Caltrans, in a timely manner. Had the crash attenuator been replaced, NTSB investigators said Tuesday, the driver, Walter Huang, likely would have survived. The NTSB shared its findings at the end of a three-hour-long hearing on Tuesday. During the hearing, board members took issue with Tesla's approach to mitigating the misuse of Autopilot, the National Highway Traffic Safety Administration's lax approach to regulating partial automation technology, and Apple -- Huang's employer -- for not having a distracted driving policy. (Huang was playing a mobile game on a company-issued iPhone.) "In this crash we saw an over-reliance on technology, we saw distraction, we saw a lack of policy prohibiting cell phone use while driving, and we saw infrastructure failures, which, when combined, led to this tragic loss," NTSB chairman Robert Sumwalt said at the end of the hearing on Tuesday. "We urge Tesla to continue to work on improving their Autopilot technology and for NHTSA to fulfill its oversight responsibility to ensure that corrective action is taken where necessary. It's time to stop enabling drivers in any partially automated vehicle to pretend that they have driverless cars."
Rei Defending Tesla In 5 (Score:3, Funny)
4
3
2
1
Re: (Score:2)
I mean, I've weaned myself off wanting a Tesla, but I must say: if someone put his mobile home on cruise control and went to take a nap, we'd all go "What a complete moron".
I'm not sure why it should be different with autopilot. The term autopilot is only used in movies to describe something that can make a pilot obsolete. Real world aviation doesn't work that way either.
I mean why does nobody sue Tesla because the cars don't go plaid when they activate ludicrous speed?
Re: (Score:2, Informative)
Hi, I'm Troy McClure! You may remember me from such films as....
Wait, wrong thing :)
The Slashdot summary is of course incomplete. The things the NTSB blamed were:
* Tesla, for not doing more to stop distracted driving and for the software not preventing the accident
* The driver, for becoming overconfident, to the point that he was playing a video game when the crash happened and never once applied the brakes or steered
* Apple and other smartphone makers, for not locking out games, texts, etc. when the device detects it is being used by a driver
Re: (Score:2)
Because they already answered that question in the most embarrassing way possible -- no, it does not [arstechnica.com].
Re: (Score:2)
1) This was about the NTSB, not NHTSA.
2) QCS is one guy (Randy Whitfield) working out of his house, nitpicking and attempting to throw out the vast majority of the data because trivial details were incorrectly reported, until what remained inverted the conclusion. It's like citing Watts Up With That in a climate change discussion. It's amazing that anyone ever gives him the time of day, but when it comes to Tesla...
That said, Whitfield sometimes works with plaintiffs' attorneys, so he knows
Re: (Score:2)
I literally presented you with a link to a court of law stating exactly what I said re: Whitfield's deliberate cherry-picking methodology. How much better a reference could I have presented than a court of law? Certainly not "Timothy Lee at Ars Technica presents everything Randy says as if it's gospel".
Re: (Score:2)
Tesla did defend itself, so much so that it decided to front-run the investigation [washingtonpost.com]. Now it has 90 days to respond to the NTSB report [theverge.com], which, if it squanders them like the previous comment period on driver-assistance safety systems, will remain its own damn fault.
Re: (Score:2)
Tesla's release of information was in response to the fact that the NTSB had already been making public claims about the case, yet was banning Tesla from doing likewise.
I absolutely would have done the exact same thing Tesla did.
You clearly weren't watching the hearing, so let me help you out: during the hearing the NTSB railed against pretty much everyone under the sun for not responding to its letters, not just Tesla. Now they're once again demanding responses
wait (Score:4, Insightful)
Isn't this the guy who kept complaining to Tesla about the trouble spot around 101S at the left-side exit to 85, and finally smacked into the barrier with no hands on the wheel? The bad part was that the water barriers were missing because of a recent previous accident, so he hit the bare concrete, and it killed him. Lesson: don't take your hands off the wheel, especially at known problem spots. I thought this was already painted as 100% driver error.
Re:wait (Score:5, Informative)
I don't believe there have been water (or sand) barriers at that location for many years, if ever.
There has been a (somewhat) energy-absorbing barrier since at least 2011 [goo.gl] (and as I recall it was not unusual to see it in a state of disrepair), but it had been upgraded by November 2017 [goo.gl] and looks much the same in May 2019 [goo.gl].
You are correct that the barrier had been hit eleven days before the crash by a Prius going 70 MPH (the driver apparently walked away from that one) and had not yet been reset/repaired by Caltrans - probably in part because the CHP failed to notify Caltrans.
Re:wait (Score:4, Insightful)
Yes. The short version is this:
1. He already knew that Autopilot had an issue going through that section of road, because he had complained to Tesla Service about it, and then complained to friends about Tesla Service.
2. There was no "crash attenuator" on that concrete divider because someone else had already destroyed it in a crash, and Caltrans hadn't bothered to replace it.
3. He used Autopilot there anyway, even though he'd had issues with it before.
4. While using Autopilot in a place where it obviously didn't work properly, as evidenced by his own previous experience, he thought it prudent to break out his iPhone and fuck around instead of paying attention. I wouldn't be surprised if the EMTs found the phone still running some god damn video game at the scene.
Given that you have to agree to a big fat notice (one that says you will stay attentive and be ready to take control of the vehicle) before you can even turn on Autopilot, I'm finding it hard to see how this isn't 100% driver error. Had he driven any other car into that highway divider, there wouldn't be any story about it at all - for example, the crash that took out the crash attenuator before him: what make and model was that car? Nobody knows, because it wasn't a Tesla.
Re: (Score:2)
Given that you have to agree to a big fat notice (one that says you will stay attentive and be ready to take control of the vehicle) before you can even turn on Autopilot, I'm finding it hard to see how this isn't 100% driver error.
The accident was 100% avoidable by the driver, but that doesn't mean that the accident was 100% driver error. Similarly, just because an improved Tesla system could have prevented the accident doesn't mean it was 100% Tesla error. As accident investigation teams often correctly conclude, the accident was a combination of multiple factors, any one of which, had it been absent, would have mitigated or completely avoided the accident. Yet the fact that fixing any single factor could have avoided the accident never means that
Re: (Score:2)
Where is your evidence that LIDAR would have prevented this crash scenario?
The big problem for a computer is telling street furniture apart from a stationary obstacle such as a stopped vehicle. In this accident the barrier was deformed, which would make its identification hard for a computer.
The main failing is the poor standard of US freeway lane markings and barrier placement. The concrete barrier was supposed to have a crushable attenuator in front of it to help save lives when a car hits it.
In the UK, this crash
Truth hurts (Score:5, Insightful)
Not only was the guy an Apple software developer, not only did he previously complain to his family that his Tesla acted erratically on that stretch of road, not only did he notify Tesla of this erratic behavior, he then went and drove that same stretch of road while playing a video game on his phone, using "Autopilot" rather than driving it himself.
Tesla can certainly bear some of the blame for this crash, but the final responsibility falls on the driver, who couldn't be bothered to drive his vehicle and instead relied on software he suspected, or had reason to suspect, was not up to the task of driving for him. From another story [marketwatch.com]:
Re: (Score:2)
He totally meant to be paying attention and yank the steering wheel at the last second, but he was totally beating his high score on Temple Run.
Re: Truth hurts (Score:2)
Clearly this was all Apple's fault for not telling their employees to not play games while driving. /s
This whole incident is stupid. Distracted driving caused this crash.
The driver knew (and apparently regularly complained) about the limitations of Autopilot doing exactly what it did here. So as far as he should have been concerned, it behaved as it was designed to.
A corporate policy... do we need a corporate policy on not murdering people, too? "The FBI said that the Green River Killer's employer had no policy
Re: (Score:2)
In my state we don't have "energy absorbers" at all; if you hit a divider, you hit a divider.
Re: (Score:2)
As the report notes, part of the blame lies with the system not checking whether the driver is paying attention - torque on the wheel is a poor proxy for attention, and it didn't even detect any for the 7 seconds before the accident.
The report recommends that Tesla stop allowing that to happen. Maybe reduce the maximum no-torque detection time to 1 second to start with, but really they need to just abandon that idea and switch to using attention-monitoring cameras instead. Cadillac, Nissan and Lexus all use cameras. As a bonus you can go hands-free.
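To make the argument concrete, here's a minimal Python sketch of that kind of watchdog logic; the thresholds and escalation steps are purely illustrative assumptions, not Tesla's actual values, but it shows why a camera signal is much harder to game than torque alone:

    # Hypothetical attention watchdog; all thresholds are illustrative assumptions.
    class AttentionWatchdog:
        def __init__(self, max_no_torque_s=1.0):
            self.max_no_torque_s = max_no_torque_s  # the 1-second limit proposed above
            self.hands_off_time = 0.0

        def update(self, dt, torque_detected, eyes_on_road):
            # A camera signal (eyes on road) can clear the timer even with zero
            # torque, so fruit wedged into the wheel rim no longer fools the system.
            if torque_detected or eyes_on_road:
                self.hands_off_time = 0.0
                return "ok"
            self.hands_off_time += dt
            if self.hands_off_time > 3 * self.max_no_torque_s:
                return "disengage"  # escalate: slow the car, hand control back
            if self.hands_off_time > self.max_no_torque_s:
                return "warn"       # visual/audible nag
            return "ok"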
Agreed (Score:2)
Autopilot gives you multiple warnings about keeping your hands on the fucking wheel. This is a Darwin Award for sure.
Re:Truth hurts (Score:5, Insightful)
Caltrans failed to replace the crash barrier. Had any one of these failures not occurred, the accident would not have happened.
Er, the *accident* still would have happened regardless of Caltrans. The fatality might have been averted, though.
Re: (Score:3)
Not necessarily. The Autopilot failure occurred in part because Caltrans didn't mark the exit with crosshatches leading up to the gore zone, didn't maintain the lane markings, etc. So there's actually a good chance that the entire accident would not have happened if Caltrans had done its job properly.
Re: (Score:2)
Even if Caltrans had fixed the crash barrier before the accident, it's not the case that "the accident would not have happened" - however, the severity of the accident might have been much less, and perhaps the driver would have survived with few injuries.
The driver of a Prius that hit it eleven days earlier at 70 MPH supposedly walked away from it, but a Model X weighs about 70% more than a Prius, so the barrier would have had to absorb more energy, and that might affect how much it helps (if, for example, a Prius a
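The weight point is simple arithmetic: kinetic energy is 0.5*m*v^2, so at equal speed it scales linearly with mass. A back-of-envelope sketch (curb weights are rough, assumed figures, not exact specs):

    # Kinetic energy the attenuator must absorb at 70 MPH, Prius vs Model X.
    # Curb weights are approximate, assumed figures for illustration only.
    prius_kg, model_x_kg = 1400, 2400
    v = 70 * 0.44704                 # 70 MPH in m/s
    ke = lambda m: 0.5 * m * v**2    # kinetic energy in joules
    print(ke(model_x_kg) / ke(prius_kg))  # ~1.7, i.e. ~70% more energy to soak up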
Re: (Score:2)
I think Caltrans should take some of the blame for the frequency at which the barrier has been hit... and for the bad lane markings. It appears (on Street View) they have since added chevrons to the non-lane gore area, but it looks like people are still hitting the thing.
Re: (Score:2)
Worse, the intersection used to be considerably safer, until Caltrans made it markedly worse a couple of years before this accident by changing things so that the lane splits off much later and at a much steeper angle.
Re: (Score:2)
That thing probably gets hit at least once a week from idiots not paying attention.
Re: (Score:2)
The driver of a Prius that hit it eleven days earlier at 70 MPH supposedly walked away from it, but a Model X weighs about 70% more than a Prius, so the barrier would have had to absorb more energy, and that might affect how much it helps
From the charring and fire-extinguisher foam you can see that there was a massive bonfire in the passenger compartment. Perhaps that had something to do with it? The passenger compartment appears to have been snipped open with the jaws of life [wikipedia.org] rather than torn away. This suggests that the impact itself may not have been the cause of death.
Re: (Score:2)
It was reported at the time that the impact was so severe it compromised the battery pack. Lithium + water = fire.
I saw this on Oprah... (Score:2)
She was pointing at everybody in turn, saying "You get some blame! You get some blame! And you get some blame! Everybody gets some blame!"
No Distracted Driving Policy? (Score:5, Insightful)
Companies need to have policies saying their employees must follow the law? Or what, not having one implicitly condones the behaviour?
Re: (Score:2)
...and he was commuting in his personal car, not even driving on company business or in a company car.
I wonder if the NTSB thinks that Apple is at fault in all distracted-driving accidents where the driver is using an iPhone, because the idiot didn't follow the law with respect to not using the phone (hands-on) while driving. If he had killed someone else, the driver would have been responsible for those damages (of course, Apple, Tesla, the CHP, and Caltrans would be sued also -- sue everyone in sight is what
Re: (Score:2)
The NTSB also blames the phone manufacturer (I guess that's also Apple) for not somehow detecting that the person using the phone was driving:
Electronic device manufacturers have the capability to lock out highly distracting functions of portable electronic devices when being used by an operator while driving, and such a feature should be installed as a default setting on all devices.
Quite how phones are supposed to distinguish a driver from a passenger, I have no idea.
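For what it's worth, the usual answer is a heuristic plus an honor system. A purely illustrative sketch (the signals and threshold are assumptions; real features like iOS's Do Not Disturb While Driving infer motion and let a passenger self-attest with a tap):

    # Illustrative driver-lockout heuristic; signals and threshold are assumptions.
    def should_lock_out(speed_mps, paired_to_car_bluetooth, user_claims_passenger):
        driving_likely = speed_mps > 5 or paired_to_car_bluetooth
        # The device cannot verify the passenger claim, so the lockout is
        # ultimately advisory -- which is exactly the commenter's objection.
        return driving_likely and not user_claims_passenger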
Prizes for all (Score:2)
Huang was playing a mobile game on a company-issued iPhone.
I'd call "Darwin Award", but those are just the participation trophies now. The Apple employment, though, adds a nice Jungian touch.
Re: (Score:2)
Hmm... you know how Luke Skywalker and Han Solo keep coming back over and over? Sequel after sequel? And how actually they are coming back from King Arthur and Merlin, and a million "rakish rogue" characters from Westerns?
That's because they're archetypes of the Collective Unconscious, as Jung would put it.
At base, one could say you have no choice but to identify with it and find it interesting, and the next variation on "Han Solo", and the next, and the next... because to do that is intrinsic to the structure
Re: (Score:2)
Sounds like a long-winded philosopher who says many words with little meaning.
Level 5 (Score:4, Informative)
The problem with partial self-driving levels (Score:4, Informative)
1) If it seems to be self-driving fine, it will lull you into not really or fully paying attention to the situation (the road, the context).
2) A Google study found that it took multiple seconds (I can't remember exactly how many) for a not-really-paying-attention driver to figure out what's going on and execute an appropriate response, once warned that they should take over.
At highway speeds, those several seconds amounted to something like a football field of distance.
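The football-field claim checks out with trivial arithmetic; a quick sanity check (the 2-4 second range is assumed, since the study's exact figure isn't recalled above):

    # Distance covered during a delayed takeover at highway speed.
    # Reaction times here are assumed values, not the study's exact number.
    speed = 70 * 0.44704                  # 70 MPH ~= 31.3 m/s
    for reaction_s in (2, 3, 4):
        print(reaction_s, "s ->", round(speed * reaction_s), "m")
    # 3 s -> ~94 m, about the length of a 100-yard (91.4 m) football field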
Re: (Score:2)
Personal responsibility. Put the fucking phone down or face the consequences, just like in literally any other car.
Why is that hard?
Re: (Score:2)
You can't make things idiot proof. There will always be some casualties along the way.
Re: (Score:2)
Indeed, only time will tell. My concern is that there's always another edge-case situation Tesla can't think of in advance and code for. So instead of a general driving solution ("go from A to B without hitting anything"), there will always be some gotcha in the Tesla piecemeal plan. Then someone dies due to a computer glitch.
While not condoning Tesla in any way, it's important to keep in mind that automated driving doesn't need to be flawless; it just needs to make fewer mistakes than human drivers. It's not like we have a death rate of 0 today.
Re: (Score:2)
It just needs to make fewer mistakes than human drivers
The latest statistics on Tesla accidents with and without AP suggest it's already close to that.
Re: (Score:2)
I think you're misrepresenting a "computer glitch" as being a bug, when the fact is that the scenario was simply not supported. The AI will need to be taught what a damaged crash barrier looks like.
It is the same problem as a baby that has to learn about its surroundings. The AI has to learn, rather than be programmed with an algorithm by a human.
Just like a baby, Tesla's approach matures certain features before others.
As an adult, you have to react to edge cases in life, and you can fail. Similarly, AI can fail.
The
Re: (Score:2)
Waymo has an agenda...
Anyway, Autopilot works fine IF you know its limits and you monitor it properly.
It's a benefit. You don't have to micro-poll "are you centered?" and "are you at the right following distance?"; it takes care of that, and you just have to poll at 1/10 the rate or less, to watchdog the system, as it were.
It's a net gain, really. Those who have never experienced it: you need to get first-hand experience or your comments are worthless on this subject.
It's NOT an all-or-nothing thing. Each bit that
Re: (Score:2)
Let's say you had two types of Teslas: one with the current Autopilot, and one which only assisted with steering when you got close to drifting across lanes or into another car, and sounded an annoying alarm when it did (which you couldn't turn off). All else being equal, which do you think would cause fewer accidents and drive into fewer firetrucks?
Steering assist is fine; steering automation is fucking insane ...
Also, let's say a Tesla on Autopilot has a cop in front of the next stationary car it plows into. Do you think
Re: (Score:2)
The one where the computer does more will almost certainly have fewer accidents, statistically. When someone doesn't pay attention to the road or falls asleep at the wheel
Re: (Score:2)
The main issue here is finding who is responsible (read: who has to pay):
Tesla for overselling the "autopilot"?
Caltrans for negligent maintenance of the infrastructure?
The idiot inside the Tesla for being overconfident in the automated system?
All of the above?
Re: (Score:2)
Literally hundreds of thousands of people are using it just fine, but yours drove "scary af" - you don't think it's possibly operator error, do you?
Re: (Score:2)
Waymo has said in interviews that studies have shown people need full auto driving (Level 5), not assist, or they will simply not pay attention when they need to be.
The funny thing is, that's equally true at full manual driving (level 0) -- occasionally people stop paying attention when they need to be, and an accident results. Doesn't seem to stop anyone from driving, though.
Re: (Score:2)
Level 4 includes systems that are fine for everything but severe weather, and systems that are fine everywhere but outside a geofenced area.
Both of those would be perfectly safe, I think.
Re: (Score:2)
I'm utterly surprised that a company still developing a product is spreading FUD in order to throw shade at a competitor that has something you can buy and use right now. That's basically like saying that cruise control shouldn't exist because people will just not pay attention to what speed they are going.
Can we all just agree that the driver still has the responsibility to control the vehicle, regardless of what "level" of driver-assistance tech there is? And it's very ironic that a Google property would
Its either SELF driving or its not (Score:2)
Agreed (Score:5, Insightful)
In a car on a busy road, those several seconds are often fatal, or at least ensure some kind of accident.
Re: (Score:2)
The name autopilot is also used in boating, where you have to be alert most of the time. The problem is not the name. The problem is the drivers with rectal-cranial inversion. I don't mind if they kill themselves but they can kill others, and that's a problem.
Re: (Score:2)
...spoken as someone who has likely NEVER tried the assist techs.
do yourself a favor; stop writing about things you don't know and EDUCATE yourself.
take a test drive, at least.
Re: (Score:2)
More comfort means better ability to focus means safer driving.
I don't have anything that turns my wheel, and I actually like to manage my cruise speed rather than use the adaptive control, but I imagine the collision avoidance is safer than not having it (I sure hope I never find out, though).
This is some serious idiocy (Score:2)
Re: (Score:2)
Their autopilot page said full self driving in gigantic letters as the very first thing you see.
Nope.
The first thing you see is a section [tesla.com] titled "Autopilot and Full Self-Driving Capability." Right under that, they say:
Autopilot and Full Self-Driving Capability are intended for use with a fully attentive driver, who has their hands on the wheel and is prepared to take over at any moment. While these features are designed to become more capable over time, the currently enabled features do not make the vehicle autonomous.
Re: (Score:2)
Full Self-Driving Capability are intended for use with a fully attentive driver
That reads like Unlimited!* (*28800bps!!!)
If you must be literally as attentive, then what is the utility beyond resting your muscles? You can't be using a phone, because that might delay your reactions.
Re: (Score:2)
Autopilot doesn't mean the airplane flies itself. All it does is keep a heading and altitude.
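That "keep a heading and altitude" job is essentially one control loop per axis. A toy sketch of the heading half (the proportional law and gains are illustrative assumptions, nothing like real certified avionics):

    # Toy heading-hold loop; altitude hold is the same idea on a different variable.
    # The proportional gain and bank limit are illustrative assumptions.
    def heading_hold(current_deg, target_deg, kp=0.5):
        # Wrap the error to [-180, 180) so the autopilot turns the short way round.
        error = (target_deg - current_deg + 180) % 360 - 180
        bank_cmd = max(-25.0, min(25.0, kp * error))  # commanded bank angle, degrees
        return bank_cmd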
Re: (Score:2)
Legalese? It seems to me to be in pretty plain language. Is it doublespeak? Sure, but if it appears in their other marketing materials, the vehicle's owner's manual, and is disclosed at the time of sale, then I'd consider each owner sufficiently notified as to what is meant by "Full Self-Driving" vs "Autonomous".
Having said that, I doubt any disclosure occurs at the time of sale - so I would expect at least some liability imposed on Tesla for accidents involving failures of the system.
I don't own a Tesla
Re: (Score:2)
IMO, that's entirely the wrong takeaway here. Nobody(*) intends to fall asleep at the wheel. The correct takeaway is that you saw someone fall asleep while driving 75 miles per hour and live to tell about it.
Re: (Score:2)
Full Self-Driving Capability
This is like the furniture store's "up to 90 percent off" sales, where nothing is really on sale. If those words are used, then Tesla should be sued into oblivion.
Yep, the bald-faced lie is right there on the support page:
Autopilot and Full Self-Driving Capability Features
Self-driving: I don't think there is any confusion as to what that implies (unless you are a weasel lawyer). In fact, it says FULL self-driving.
Re: (Score:2)
Someone should stuff you into the trunk of a Tesla and then drive it off a cliff. Autonomously, of course. You are literally worth less than the bacterium floating in front of my nose at this moment.
Not Tesla's fault. (Score:4, Insightful)
Stop blaming the manufacturer for the driver's own stupidity.
Re: (Score:2)
The report notes that while he does take some of the blame, there are other factors:
- Autopilot lulls drivers into a false sense of security. The repeated warnings probably have the opposite of the intended effect, making the driver more eager to ignore and dismiss them. Much like security warnings on computers that users blindly click through.
- Smartphone games are addictive and encourage users to pay attention to them at inappropriate times. Apple has implemented a feature that disables some phone functionality when driving, but it is disabled by default.
Re: (Score:2)
- Autopilot lulls drivers into a false sense of security. The repeated warnings probably have the opposite of the intended effect, making the driver more eager to ignore and dismiss them. Much like security warnings on computers that users blindly click through.
This is fundamentally a problem with humans; I would say it is more pertinent to develop independently driving cars (there are more than 1 million traffic-related deaths yearly).
- Smartphone games are addictive and encourage users to pay attention to them at inappropriate times. Apple has implemented a feature that disables some phone functionality when driving, but it is disabled by default.
Tesla is not the correct entity to address addiction.
They completely missed the point (Score:3)
The whole point of autopilot *IS* to let the driver be distracted. It is to let the driver be on his phone. If it crashed while on autopilot, that's a fault in the autopilot system. Blaming the driver for being distracted misses the whole point of automated vehicles in the first place.
Re: (Score:2)
Technically it's a driver-assistance feature, like cruise control. You don't have to operate the steering wheel manually, but, like cruise control, you are supposed to monitor it constantly.
We seem to have found the point at which drivers stop paying attention. Cruise control still requires attention to keep the car in the lane, but take away the need to steer and people start playing with their phones, wedging fruit into the wheel to defeat the hands-on detection, taking naps, etc.
Re: (Score:2)
The dead guy apparently didn't understand this either.
Re: (Score:2)
> where is that EVER stated as a purpose?
How is that even relevant?
A nod's as good as a wink to a blind horse.
Re: (Score:2)
> where is that EVER stated as a purpose?
They called it "autopilot."
It's always the brake (Score:2)
And the driver not using it to stop the car before a crash.
Re: (Score:2)
Actually, it's the car keys.
you had one job: (Score:2)
To drive.
Mobile device distraction is a plague on society. In Australia, if you so much as touch or hold a device with a screen (phone, tablet, etc.), even whilst stationary, it's a $450 fine and 3 demerit points. Double that in NSW over the holidays.
Deadly Tesla Software (Score:2)
I test drove a red Model 3 last fall, with a Tesla salesman riding along. As I was driving, I noticed that other vehicles at about 135 degrees (that's my right-rear quadrant) were not appearing in the diagram on the display panel, and I mentioned that to the salesman. He looked to his back right, then at the dash-mounted display, and opened his mouth to speak. I could tell from his expression and body language that he was winding up to explain to me why I was wrong. What he had seen took a moment to register
Re: (Score:2)
From a legal standpoint, it only ever comes down to one thing.
You're the driver, you're responsible. If you rely on a reversing camera, reversing sounder, parking sensors, lane-deviation monitor, blind-spot indicator, cruise control, automatic emergency braking, signpost recognition, speed limiter, hell, even a mirror...
YOU are still responsible. Even if the thing fails, doesn't do its job, misses everything, doesn't show you something that's present, or even if it does show you.
The driver is responsible.
Only in America... (Score:2)
and Apple -- Huang's employer -- for not having a distracted driving policy.
WHAT? Am I alone in considering this the most ridiculous "contributing factor" to a crash EVER? That somehow his employer should have told him to pay attention when driving? Crueller people than me might ask how stupid Americans must be if they have to be told by ANYONE that they shouldn't become distracted while driving... let alone the fact that out of all the possible people who could tell you that - e.g. your parents, siblings, kids, friends, teachers, government, newspapers, entertainers, new
Autopilot wasn't needed for this crash (Score:2)
As evidenced by the number of cars WITHOUT autopilot that have crashed at this site.
Even in aircraft, there are "levels" of autopilots, from "keep it flying straight" to "follow a pre-programmed flight plan and land at a suitably-equipped airport and brake to a stop." But, even those auto-land-capable autopilots can't pick up ATC instructions and clearances. Yet.
People are confused about what "autopilot" means because no one seems to want to correct them, until a crash occurs.
Re: (Score:3)
Right, go get your pilot's license.
It will cost at least 10 grand and take around 60 hours of flight time plus countless hours of ground study. And you still won't touch an autopilot, because the 172 isn't gonna have one.
Cars are operated by any moron with a pulse. So if anything, these features have to be way more idiot-proof, and not oversell their capabilities.
Re: Dude was playing video game (Score:4, Insightful)
Autopilots are fairly common on C172s. A 152, on the other hand (which would be more economical to train in), likely won't have one. I think you're missing the GP's point, though. I don't believe he was suggesting that one should go out and literally get a PP-ASEL, but rather that if an ignorant idiot (I use "idiot" purposefully here, as the ignorance is willful) who insists that "autopilot" means "autonomous" were to educate himself on autopilot systems, he would find he is wrong.
Re: (Score:3)
Yeah, you're right about the 172/152. But I don't think I'm missing the OP's point, although I probably could've been clearer. What I'm saying is that pilots are highly trained to understand the capabilities, limitations, and proper usage of the autopilot. Tesla drivers just buy a new gadget, dismiss the warnings, and proceed to play Angry Birds and eat cheeseburgers while on the highway.
Re: (Score:2)
Quote from beyond the grave: "how about next time I drive the car and you play the games?"