Transportation

Self-Driving Cars' Shortcomings Revealed in DMV Reports (mercurynews.com) 181

A demand from the California DMV of eight companies testing self-driving cars has highlighted a number of areas where the technology falls short of being safe to operate with no human backup. From a report: All companies testing autonomous vehicles on the state's public roads must provide annual reports to the DMV about "disengagements" that occur when a human backup driver has to take over from the robotic system. The DMV told eight companies with testing permits to provide clarification about their reports. More than 50 companies have permits to test autonomous vehicles with backup drivers on California roads but not all of them have deployed vehicles.

It turns out that a number of the issues reported are shared across technology from different companies. Some of the problems had to do with the way the cars sense the environment around them. Others had to do with how the vehicles maneuver on the road. And some had to do with what you might expect from systems made up of networked gadgets: hardware and software failures. The disengagement reports themselves identify other problems some self-driving vehicles struggle with, for example heavy pedestrian traffic or poorly marked lanes.


Comments Filter:
  • Holy shit ... (Score:5, Insightful)

    by Anonymous Coward on Thursday May 03, 2018 @04:19PM (#56550026)

    Baidu, a Chinese internet-search giant, reported a case in which the driver had to take over because of a faulty steering maneuver by the robot car; several cases of “misclassified” traffic lights; a failure to yield for cross traffic; delayed braking behind a car that cut quickly in front; drifting out of a lane; and delayed perception of a pedestrian walking into the street.

    Automotive supplier Delphi noted that its autonomous system “encountered difficulty identifying a particular traffic light,” and also said a GPS problem meant a vehicle didn't know where it was. Delphi's system also had issues with unexpected - usually illegal - behavior by other drivers, the company said in its report to the DMV.

    Drive.ai - which makes artificial intelligence software for self-driving vehicles - cited reasons for disengagement that included the swerving of a vehicle within a lane and "jerky or uncomfortable braking." The firm also noted a "localization error" that meant a vehicle was uncertain of its location, and a discrepancy in data from different sensors on a vehicle.

    Holy shit, they've just described, you know ... driving.

    This shit happens pretty much daily, and it's why the act of driving requires you to have a high degree of situational awareness.

    Just this morning as I was driving into work, some clown turning off a side street into the road I was on ... he hesitated, then apparently said "fuck it" and went anyway. Unfortunately he didn't seem aware enough or intelligent enough to have noticed me. The end result was a panic stop behind an idiot who unsafely pulled out into oncoming traffic and was suddenly in front of me, driving at half my speed (which was the posted limit).

    Why the hell are these companies acting like they have self-driving cars when they clearly can't handle driving in the real world?

    This shit is never going to work if it can't handle random, illegal, and stupid behaviour from the humans on the road. And if they think the world is going to replace all cars with autonomous vehicles, then clearly they expect the rest of us to pay for that future.

    This is just hubris from the tech industry who are pretending they're closer to solutions that work in the real world than they really are. The fact of the matter is, we're not all going to run out and buy new cars to allow this awesome future as envisioned by corporations to actually ever happen.

    Sorry, but all of the stuff in the above quote is pretty much mandatory for driving a car.

    • My wife was taking care of the neighbor's dog, and it bolted out the door, down the driveway, and out to the road. The dog could not be seen by the car because of snow banks by the side of the road. The accident was avoided only because the car saw my wife jumping up and down in front of the house and waving, so it stopped.
      • At legal speeds in residential neighborhoods, a self-driving car that detected the dog promptly would have a very high likelihood of stopping in time. Brands with crappy sensors would hit the dog, but with good sensors the car would stop: a dog can't dart out fast enough to beat the brakes when the vehicle is going that slow. Brakes are too good at stopping at low speed.
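        The low-speed braking point above can be sanity-checked with textbook kinematics. The reaction time and deceleration values below are illustrative assumptions (roughly a well-tuned autonomous system on dry pavement), not figures from the DMV reports:

```python
# Stopping distance = reaction distance + braking distance:
#   d = v * t_react + v^2 / (2 * a)
# Assumed values: 0.5 s system reaction time, 7.0 m/s^2 deceleration.
def stopping_distance_m(speed_kmh, reaction_s=0.5, decel_ms2=7.0):
    v = speed_kmh / 3.6  # convert km/h to m/s
    return v * reaction_s + v ** 2 / (2 * decel_ms2)

print(round(stopping_distance_m(40), 1))   # residential speed -> 14.4
print(round(stopping_distance_m(100), 1))  # highway speed     -> 69.0
```

        Under these assumptions the residential-speed stopping distance is under 15 meters, while the highway figure is nearly five times that, which is why prompt detection at neighborhood speeds usually means a clean stop.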

    • by be951 ( 772934 )

      This shit is never going to work if it can't handle random, illegal, and stupid behaviour from the humans on the road.

      Yeah, it's too bad they've stopped trying to improve them and are claiming they're ready for the road now. Oh wait, that's exactly the opposite of what's happening, as cited specifically in the article:

      As Telenav [representative of others] put it, “Our autonomous system is still being developed and we are working on improvement cycles. At this stage we expect that (the) driver will be taking over the car control from time to time due to the fact that it is new technology.”

      It's easy (also stupid and pointless) to say that if they never improve, they'll never be good enough, because none of them claim they're ready today and all are focused entirely on improving performance.

      This is just hubris from the tech industry who are pretending they're closer to solutions that work in the real world than they really are.

      I'm aware of at least one company for which that seems to be true. As for the more than 50 other companies

  • It is not like anyone knowing anything about AI could have told them it isn't ready for general roads, only for special ones. It is not like allowing them on special roads first is exactly the plan in Europe, where governments listen to experts.

    • In the US we define experts as those who are paid by industry leaders to say the things we like to hear.

      • In Europe, they define experts as those who are paid by government leaders to say the things the leaders like to hear.

        • In Europe, they define experts as those who are paid by government leaders to say the things the leaders like to hear.

          Possibly, but fortunately we then have a whole lot of governments with different ideas, leading to a lot of different experts in the debating field.

  • by smoothnorman ( 1670542 ) on Thursday May 03, 2018 @04:26PM (#56550084)
    Seldom (if ever) is there the rather obvious suggestion to limit autonomous vehicles to simple point-to-point 'highway' trips; but that's exactly where and how it should be done for the foreseeable future, if it happens at all. That is, the (literally) lethal mistake is to introduce autonomous vehicles into the complex and chaotic world of city driving. The next time you drive in the city, consider how many of your decisions are predicated on understanding subtleties (some might say "stupidities") of human nature: "Is that guy looking at the person he's talking to on the corner? If so, they aren't as likely to start across the street." "Is that a child's toy which just bumped into the road (to be chased by a child), or just a blown leaf?" "OK ... four-way stop: it's that guy's turn, but he's got a cell phone in his hand that he's consulting" ... etc. So, start out with truckloads from freeway exit 113 to 114, then if that works, exit 117...
    • That's exactly how GM does it: If the GPS sees your not on on a divided highway, no lane assist for you.

      • If the GPS sees your not

        Your knot?

        You're not?

        Something else entirely? Enquiring minds want to know...

        • I love to trigger grammer nazis. it good fun.

        • I was just trying to figure out what a "not on" was, and why it would be visible to the GPS. Maybe it is a type of jamming device.

          If it was really a "knot on" "on a divided highway," in my State you end up having to register your address on a list for the rest of your life if you get caught doing that.

    • There's one edge case in "city" driving where allowing autonomy on normal streets makes sense and still works: bumper-to-bumper stop & go gridlock where nobody is moving faster than 10mph anyway, and having AI pay attention to the traffic is probably a net improvement over the 90% of drivers who are only halfway paying attention and watching cat videos on their phone in their lap ANYWAY.

      One big area for improvement... two-way communication between highway networks and vehicles. It's common for an accide

    • by Kjella ( 173770 )

      The next time you drive in the city consider how many of your decisions are predicated on understanding subtleties (some might occasion "stupidities") of human nature

      Well, I can't say for certain how many accidents I've avoided through the human perspective. But if I tally the accidents that I and those I know have had while driving, they're almost exclusively caused by glitches in how we drive: things you almost always pay attention to, but then for some reason there was a lapse of concentration, or we were tired, distracted, or emotionally on tilt. If not that, then massive errors in judgement passing other cars that almost led to head-on collisions. In fact, I can't really say I

      • I don't think self-driving cars will be perfect, but I think they'll be a lot more consistent and always keep reasonable safety margins.

        Probably. However, on the flip-side, they will all consistently fail in the same way if they have the same bug (and bug-free software is impossible), unlike humans, whose errors are a lot more randomized. Sure, not all cars will run the same software, but the market in any given country is bound to be dominated by a single-digit number of manufacturers, compared to millions of human drivers (or a lot more). From a systemic perspective, human variation and imperfection is not a bug, it's a feature.

        The other

        • However, on the flip-side, they will all consistently fail in the same way if they have the same bug

          You say this like it's a bad thing. Once that bug is discovered and fixed, it's fixed in all of the cars. Yes, everyone is going to get OTA software updates, like it or not.

          Unless it's a hardware issue, of course. I suspect that self-driving cars will get regular updates of "don't do this or you'll crash" warnings that nobody will read. Next model year will have that fixed, recalls might have to happen to update the hardware, etc. But we already do that shit for all of the other parts of the car, so there are s

    • Most logical place to start is in a city.

      Specifically, bus and garbage trucks.

      They are:

      1) Low speed
      2) Set routes
      3) Perfect for early morning/late night shift. Restrict them to the 2 am to 5 am shift.
      4) No need to worry about the state complaining as the state would be the people setting up the rules. They can set everything up with a focus on safety rather than performance.

    • Uber is already doing it.

      https://www.youtube.com/watch?... [youtube.com]

      They're using human drivers to move freight within cities, and autonomous vehicles to haul it between cities to depots on the outskirts, where human drivers pick it up.

    • Seldom (if ever) is there the rather obvious suggestion to limit autonomous vehicles to simple point to point 'highway' trips; but that's exactly where and how it should be done for the foreseeable future, if it happens at all. That is, the (literally) lethal mistake is to introduce autonomous vehicles into the complex and chaotic world of city driving.

      Absolutely, self-driving cars should at first be limited only to freeways...not only for the very important reasons you mention, but also because freeways open up an easy (and relatively low-cost) avenue for installing autonomous-vehicle-specific infrastructure along them, which would make them much safer.

      However, the companies most pushing autonomous vehicle development don't want to do that, and it's clear why. Use on freeways only would just effectively make autonomous driving a fancier version of cruise

  • by Drunkulus ( 920976 ) on Thursday May 03, 2018 @04:41PM (#56550184)
    So the self driving cars act just like an inexperienced millennial who can't drive a stick or read a map, and texts constantly?
  • by Anonymous Coward on Thursday May 03, 2018 @04:51PM (#56550256)

    https://www.dmv.ca.gov/portal/dmv/detail/vr/autonomous/testing

  • I told you so.

  • GASP... who would have thought driving was actually hard for a computer to do? Maybe they can keep paying attention, except when it comes to concrete barriers, backing-up trucks, or pedestrians in the street. But millions of cycles a second mean nothing if the cars aren't successfully driving with them.
  • by RhettLivingston ( 544140 ) on Thursday May 03, 2018 @07:05PM (#56551150) Journal

    There are times that they can't see the faded lines in the road or the stop lights? Well, yeh. I agree. Same here. And I don't consider that safe either. There are many different inherently unsafe intersections near me. The only difference is that the self-driving vehicle can avoid taking the chance by turning it over to me to take the chance. I honestly don't know if it would be better if it kept control and tried it than handing control to me. But, the buck is passed.

    The biggest problem that autonomous vehicles have to overcome is that they expose the faults in a very bad system through extensive data collection. Exposing faults in something people depend on is never an easy road. The messenger often becomes the victim.

    Like most who have lived more than half a century, I've been in a number of accidents including:

    • During a light drizzling rain, an oncoming vehicle veered into my lane on a sharp turn (cutting the corner). I tried to turn even sharper to run into a yard and avoid the vehicle. My car broke loose and slid semi-sideways into the oncoming vehicle's driver side door. Contact was in my lane. I would not expect an autonomous vehicle to avoid this accident though if the other vehicle with a teenage driver (first year of driving) headed to school had been one of the new self-driving ones, I am confident it would have been avoided.
    • I was stopped in a two-lane highway with blinker on awaiting the passage of oncoming traffic before making a left turn. A vehicle rear ended my vehicle at full highway speed. The driver was a railroad engineer driving five hours to his home after having driven his train route. The collision woke him up. I am pretty sure that if he had been in a self-driving vehicle, that accident would have been avoided.
    • While driving into the sun through a parking lot, I drove down a lane that was not a "thru" lane. The only reason it wasn't was because eight inch high concrete stops had been placed across the lane to prevent thru traffic. I suspect a self-driving vehicle would have detected the barriers across the road and that single car accident would have been avoided. If it didn't, it would do no worse than what I did.
    • While driving through a construction zone at night on a two-lane road with the lanes separated by barrels, a driver from the opposing lanes suddenly attempted an illegal U-turn between the barrels in front of me. I barely had time to even move my foot from the gas to the brake, much less stop, before swiping across the driver's front end at about 40 mph. Both vehicles were totaled and the mark on my arm from the airbag deployment appears permanent. It is possible that if I had been in a self-driving vehicle it might have seen her insane turn before I did, but I doubt it. On the other hand, if the distraught grandmother on the way to see her grandchild in the hospital, who suddenly realized she had taken the wrong turn off the interstate, had been in a self-driving vehicle, she likely would not have even been on that road, much less making an illegal turn.

    Driving is dangerous. Four out of four of the accidents above were caused by driving while impaired in some fashion - a teenager unprepared to drive in rain, driving while tired, blinded by the sun while driving (I should have stopped or greatly slowed), and driving while distraught. And there are numerous other incidents that didn't rise to the level of what I would term an accident, but that would have been reported under these self-driving vehicle regulations.

    Road maintenance is also atrocious in this country and human drivers die because of it every day. Human drivers are also prone to complain that others shouldn't drive while impaired and then make exceptions for their own needs.

    Maybe we should start this conversation by just shining a very bright light on reality - require every new car to be equipped with the sensors to record reality and disclose every bit of it to the DMV - every solid line crossed, every rolling stop, every time the light is red and we're stil

    • The only difference is that the self-driving vehicle can avoid taking the chance by turning it over to me to take the chance. I honestly don't know if it would be better if it kept control and tried it than handing control to me. But, the buck is passed.

      Herein lies another important problem which is seldom discussed: "disengagement" only works as an effective safety backstop if you have an experienced driver at the wheel.

      In other words, it only makes sense for the first massively deployed generation of autonomous vehicles, where you can be sure that almost everyone behind the wheel has already spent years driving on their own. However, once you have a generation of people growing up used to being driven around rather than driving, and being driven by the

  • Driving into the sun, any kind of snow, hydroplaning, hard to identify objects in the road, road construction, pedestrians doing stupid stuff, other drivers not following predictable actions or laws, driving in the rain, sensors failing due to age and wear, and about 100 other items are unsolvable problems that our level of technology, especially AI, is nowhere near yet. This is like all those "before their time" inventions in the 50's. The technology just wasn't there to make it real no matter how convinci
  • Even if there are failure cases for autonomous cars, if their overall accident rate is significantly lower than human drivers', reason says we should deploy them post haste. But the reality is humans are emotional thinkers, and we likely won't accept that. The first time a self-driving car runs over a kid, we will be pulling them all off the road even though ten other kids were run over by humans the same day.
    • If there were ten million automated cars on the road and everyone saw them driving around successfully all the time, but then they hit one kid, it wouldn't be that much of a problem. But the fact of the matter is, there are only a handful of these on the road running in places that are carefully selected as being perfect conditions for them to drive in and they are still killing people. That's what the problem is, and in this case, emotional response is the correct one.
  • Trees block traffic lights and GPS signals. Pedestrians aren't always seen by the bot-cars. So unless you live in a desert, once in a while your car is going to try to kill you or a pedestrian.

    There is no way all traffic lights are going to be networked to talk to vehicles in the US. I predict "bot-car friendly roads" will be marked as such, and cars will have to shut down autopilot type driving in all other areas.

  • Let's start building the vacuum-powered hamster tubes from Futurama.

Garbage In -- Gospel Out.
