MIT Study: Tesla Autopilot Drivers "Maintain Functional Vigilance" (mit.edu) 55
Long-time Slashdot reader Rei writes: Friday, the results of a study by the MIT Center for Transport and Logistics on autonomous system driver attentiveness were released, and the results were conclusive: "drivers do not appear to over-trust the system to a degree that results in significant functional vigilance degradation in their supervisory role of system operation".
The study involved 323,384 miles driven (34.8% on Autopilot) and identified 8,682 "tricky situations". Of the "tricky situations", 0% of incidents involved slow driver responses or missed detections; 4.5% rapid/timely responses; 90.6% anticipatory reactions (preventing the situation from occurring); and 4.9% "other". The study suggests that this is the result of two effects: 1) drivers effectively learn the limits of the system through usage; and 2) "tricky situations" are common enough to prevent excess trust by the driver in the system — creating the counterintuitive result that the better the systems become, the worse the driver may become.
While the study is limited by the age of the vehicles (under a quarter were even running HW2, vs HW3 which is being released now — and due to the length of the study, most of the miles were accumulated on older software versions), it offers positive conclusions — but also a precaution — about the integration of humans and driver assist systems.
In other news, Tesla has announced an April 22 Autonomy Investor Day to showcase the capability of its development versions of the software in city driving, and has started rolling out stoplight detection, no-confirmation automated lane changes and exits, and a limited rollout of advanced summon (navigates through parking lots without a driver).
It's not autonomous driving then (Score:5, Insightful)
When it comes to cruise control I know I'm driving and I have to make sure the cruise control doesn't run me into another car.
With "autopilot" I'm supposed to be fully aware at all times of what's going on while I'm not actually doing anything. Basically it is a neat toy that I can play with to see what it will do. No thanks.
Re: (Score:2)
Drove an automatic shift for the first time in years recently. Really weird experience. Car always downshifted or upshifted a few milliseconds before or after I would have (and a _lot_ of milliseconds in a couple of corners!). Didn't know what to do with my free hand (ya, ya ya!) [or my spare foot]. Felt like I was only partly in control - didn't like it.
Back to my 5-on-the-floor that keeps me awake and involved.
Mac
Re: (Score:1)
I read his point as being that the more automation is involved, the more humans tend to become mentally disengaged from the process at hand. That is in fact related to the Tesla Autopilot and other similar systems.
I am also an MT6 driver and I see Cutterman's point all the time. My friends who drive ATs are much less mentally engaged and more distracted than I am during driving, because the car is doing so much more for them automatically. I see the same in myself when I drive their AT cars: I find mysel
Re: (Score:3)
You obviously haven't tried any of the more advanced systems. They are much more than "toys".
I have two cars that both have adaptive cruise control and lane keep assist. While they are not full autonomy... they are _very_ useful. Adaptive cruise control in particular (where you set a desired speed - and a desired follow distance if you come up on another car) is a godsend on long trips. It removes all of the frustration involved with normal cruise control and people going slightly slower than you...
Lane
Re: (Score:2)
Adaptive cruise control in particular (where you set a desired speed - and a desired follow distance if you come up on another car) is a godsend on long trips.
I had a rental car with that feature. It was nice in some cases, but too often it got confused by slow cars in other lanes (especially in a curve) and would suddenly brake.
Re: (Score:2)
I had a 2011 Audi S5 that had a fantastic radar-based cruise control, never got confused. At all.
Traded it in last year for a Model X, and the autopilot is a godsend. Cruise control is even better than Audi’s, and the “lane keep” function (a misnomer, but that buckets it with other manufacturers’ products - autosteer) is amazing. You really can’t know how great it is until you’ve done a road trip or two with it.
Re: It's not autonomous driving then (Score:2)
I'm sure there are some systems that are better than others: but neither of mine have ever done that.
For reference I have a Ford from 2 years ago (radar and camera system) and a brand new Subaru (two camera system). Overall the Ford's system is better (quicker to acquire, etc.) but both are really good.
Re: (Score:2)
You obviously haven't tried any of the more advanced systems. They are much more than "toys".
I have two cars that both have adaptive cruise control and lane keep assist. While they are not full autonomy...
I've had both of these. They are nice features when they work. They get confused by many common situations, such as bad weather. Regardless, as you said, they are not "full autonomy". You must still be the driver at all times.
Re: (Score:2)
Adaptive cruise is what finally made cruise control useful again. A bummer that rain confuses it, but other than that, it works perfectly on all the Toyotas I've had it on. Until I got that, I couldn't remember the last time I bothered with normal cruise. These days you really need to be out in the boonies to have sufficiently light traffic that you're not constantly having to tweak your speed.
Bonus points now is that Toyota bundles this with pre-collision detection. That has kicked in a number of times
Re: (Score:2)
i hate thinking of a subject every god damn time (Score:1)
Re: (Score:2)
Re: (Score:2)
A couple of anecdotes do not change overall statistics.
Re: (Score:1)
You don't know what the word anecdote means. (Score:3)
Re: (Score:2)
I fully expect that we will one day have an amazing self driving system, but it will have to screw up a million times to get there. So those mistakes do happen and they will continue to happen and systems will continue to improve because of them.
Re: (Score:2)
It doesn't, but 'sample error' does.
Re: (Score:3)
I was Studied (Score:5, Insightful)
I am one of the drivers in the study. They equipped my car with three cameras, one for my face, one for my hands, and one out front. They also record data directly from the car and audio. When I look at the animation of the various trips, I can recognize my drive to work.
I generally turn on Autopilot (AP1) anytime the road has paint on both sides of the lane. I've learned that there are some situations it handles poorly, such as coming over the crest of a rise, so that accounts for a lot of the disconnects. On highways, we're on Autopilot most of the time, and it's really quite good, though I watch for stopped cars, construction zones, and exits (it used to be bad about following the right paint into the exit).
If you have any questions about my experience, I would be happy to answer. (I'm not seeing messages from Slashdot on post replies recently; I'm not sure if something broke, but I'll try to check back.)
I signed a release for video clips of disconnects for release with this paper, so there will probably be some videos of me somewhere.
Re: (Score:2)
Thx for sharing your experience.
Do you feel that knowledge of being watched tends to increase attentiveness in your case, or did you feel you drive as you normally do? Also, were you contacted for the study, or did you apply? I'm sort of curious about the selection method, and whether it might introduce any inherent bias.
To be honest, the study's conclusion probably goes against most people's expectations, so it's always worth looking at the study with a reasonably critical eye. It's a good thing that
Re: (Score:3)
There were flyers at the Tesla service center in Watertown, and I contacted them. We had the cameras installed in September of 2016. Occasionally I remember the cameras and pay extra attention, but that's rare, probably 1% of the time. We pretty much ignore them.
Re: (Score:2)
Re: (Score:2)
Hmm, having multiple cameras pointed at you and recording data and audio constantly is quite likely to affect your attention levels during driving. People behave differently when they are aware that they are being observed.
I wonder if there might be a kind of "uncanny valley" for level 2 autonomy, where it gets really good and lulls people into a false sense of security. AP is currently not reliable enough that people are going to trust it.
Re: I was Studied (Score:1)
What are the confidence intervals? (Score:2)
"0% of incidents involved slow driver responses or missed detections" is not a believable absolute statement, at least not without statistical confidence intervals. There are already recorded historical incidents that contradict this 0% figure, so the statement is obviously not true with absolute, infinite precision. Either the study needs to state confidence intervals, or explain how it differs from those past contradictory incidents. For example, how does the statement that
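The parent's point can be made concrete. For 0 observed events in n trials, the standard "rule of three" gives a 95% upper confidence bound of roughly 3/n, and the exact Clopper-Pearson bound for zero successes solves (1 - p)^n = alpha. A quick illustrative sketch using the study's 8,682 "tricky situations" (the study itself does not report these intervals; the numbers below are just what the math implies):

```python
# Illustrative sketch: 95% upper confidence bound on the true rate of
# slow-response/missed-detection events, given 0 observed out of n trials.
n = 8682  # "tricky situations" reported in the study

# Rule of three: for 0 events in n trials, 95% upper bound ~ 3/n.
rule_of_three = 3 / n

# Exact Clopper-Pearson upper bound for 0 successes:
# solve (1 - p)^n = alpha  ->  p = 1 - alpha**(1/n)
alpha = 0.05
exact = 1 - alpha ** (1 / n)

print(f"rule of three: {rule_of_three:.5%}")  # ~0.035%
print(f"exact bound:   {exact:.5%}")          # ~0.034%
```

So "0%" here is consistent with a true rate as high as about 1 in 2,900 tricky situations, which is the kind of caveat a confidence interval would make explicit.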
Selection bias (Score:2)
Since Tesla were too cheap (many meanings here) to build-in a camera pointed at the driver, this study installed one...
Not only does the driver know they are being watched, the type of driver that agrees to enroll in this study is comfortable being surveilled.
How were the results corrected for that? How *can* they be?
That's because all Tesla owners are wealthy (Score:2)
Rich people don't get rich by making bad decisions.
I guarantee you that less than 5 minutes after Tesla sells their first economy subcompact there will be a "ghost ride the whip" video on Youtube.