
Philosophical Differences In Autonomous Car Tech 247

An anonymous reader writes: The Guardian has an in-depth article on the status of self-driving car development at BMW. The technology can handle the autobahn just fine, for the most part. But the article highlights philosophical differences in how various companies are developing self-driving tech. European and Asian car manufacturers are fine working on it piece-by-piece. The car will drive itself when it can, but they expect drivers to always be monitoring the situation and ready to take control. Google's tests have taught it otherwise — even after being told it's a prototype, new drivers immediately place a lot more trust in the car than they should. They turn their attention away and stop looking at the road for much longer than is safe. This makes Google think autonomous cars need an all-or-nothing approach. Conversely, BMW feels that incremental progress is the only way to go. They also expect cars to start carrying "black boxes" that will help crash investigators figure out exactly what caused an accident. In related news, Google is bringing on John Krafcik as the CEO of its self-driving car project. He has worked in product development for Ford, he was the CEO of Hyundai North America, and most recently he was president of Truecar.
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • by vakuona ( 788200 ) on Monday September 14, 2015 @11:05AM (#50518785)

    Expecting a driver to take control in a failure scenario is not a solution.

    I would never trust a car that could require me to take control in an emergency. At the very least, the autonomous driver should get the car to a safe stop before requiring a human to take over.

    • by Wycliffe ( 116160 ) on Monday September 14, 2015 @11:13AM (#50518831) Homepage

      Expecting a driver to take control in a failure scenario is not a solution.

      I would never trust a car that could require me to take control in an emergency. At the very least, the autonomous driver should get the car to a safe stop before requiring a human to take over.

      I agree that "expecting a driver to take control in a failure scenario is not a solution" and that stopping is an acceptable solution, but it doesn't necessarily need to be all-or-nothing. A better piecemeal solution would be to have it engage only on known safe highways. It would still be extremely useful in trucks, RVs, and regular cars if it only engaged on predesignated roads or interstates. The trucking industry already has depots at both ends of Kansas where trucks double or triple up before taking the long straight stretch across Kansas, to minimize the number of drivers. I see no reason why driverless cars couldn't do the same, where you could only engage autopilot on certain known safe highways with good shoulders for emergency stops. You could also do the same with weather: if the car detects rain starting, it gives a 60-second warning and pulls over to the side of the road.

      • by TWX ( 665546 )
        I do think that over-the-road and other long-distance highway or freeway applications will come first, but even then, the cars will need to be able to handle rain and snow and other mild weather without requiring driver intervention.
      • If it detects rain starting then it gives a 60 second warning and pulls over to the side of the road.

        Imagine you have a road full of driverless cars, and it starts to rain. Suddenly the shoulders get jammed as every single car tries to pull over.

        I see no reason why driverless cars couldn't do the same where you could only engage autopilot on certain known safe highways with good shoulders to do emergency stops.

        There are none. Any road has the potential to have construction at any time, for example, without warning. This particular problem could be fixed with legislation requiring all construction projects to be entered into a database in advance, but do you really want your car's driving to be affected by a remote database? Seems like a security issue.

        • There are none. Any road has potential to have construction at any time, for example, without warning. This particular problem could be fixed with legislation requiring all construction projects to be entered into a database in advance, but do you really want your cars driving to be affected by a remote database? Seems like a security issue.

          I expect that the first adopters of this will be big rigs with plenty of money and as such, I could see them even constructing special pull off areas and yes, even requiring all construction on the road to be documented. Toll roads which are privately owned would be a reasonable place to start.

      • by raymorris ( 2726007 ) on Monday September 14, 2015 @11:48AM (#50519043) Journal

        > better piecemeal solution would be to have it only engage on known safe highways.

        Also, what we're already seeing is more and more driver assist. My 2012 Dodge has an option for "smart" cruise control which slows down if you're getting too close to the car in front. Most cars these days have traction control, where the computer automatically brakes the wheels independently in order to turn the car in the direction the steering wheel is pointed. Do we already have systems that will nudge the steering a bit when you start to drift out of your lane? If not, that could be added. Not overriding a clear steering input from the driver, just a slight torque so that the existing self-centering action of the steering wheel follows the lines which mark the lanes. In other words, with today's cars, if you let go of the steering wheel it'll tend to go straight ahead. Maybe with tomorrow's cars, if you let go of the wheel they'll TEND to follow the lane.

          On my 2012, the headlights automatically turn on and off as needed.

        I could see more and more of that stuff being added, stuff where the computer ensures that the car does what the driver wants/expects it to do. Eventually, you slowly get to the point where "what the driver expects" is defined by the destination they select in the GPS.
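[Ed.: the "slight torque" lane-keeping idea described above can be sketched in a few lines. This is a hypothetical illustration, not any manufacturer's control law; every name, gain, and threshold here is invented.]

```python
# Hypothetical lane-keeping "nudge": pull gently toward the lane centre,
# but never override a deliberate steering input from the driver.
# All constants are illustrative, not from any real system.

def assist_torque(lane_offset_m: float, driver_torque_nm: float,
                  gain: float = 0.8, max_assist_nm: float = 1.5,
                  override_threshold_nm: float = 3.0) -> float:
    """Return a small assist torque (Nm) steering back toward lane centre.

    lane_offset_m: lateral distance from lane centre (positive = right).
    driver_torque_nm: torque the driver is currently applying.
    """
    # A clear steering input from the driver disables the nudge entirely.
    if abs(driver_torque_nm) > override_threshold_nm:
        return 0.0
    # Proportional pull toward the centre line, clamped so it stays a
    # "tendency" rather than an override.
    torque = -gain * lane_offset_m
    return max(-max_assist_nm, min(max_assist_nm, torque))
```

With the invented gain above, drifting half a metre right yields a small leftward torque, while any strong driver input zeroes the assist.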

        • by tlhIngan ( 30335 )

          Also, what we're already seeing is more and more driver assist. My 2012 Dodge has an option for "smart" cruise control which slows down if you're getting too close to the car in front. Most cars these days have traction control, where the computer automatically brakes the wheels independently in order to turn the car in the direction the steering wheel is pointed. Do we already have systems that will nudge the steering a bit when you start to drift out of your lane? If not, that could be added. Not overr

    • Expecting a driver to take control in a failure scenario is not a solution.

      Yes it is. It's just a solution that can't be used in America, where no one feels any need whatsoever to take personal responsibility for the situation they are in. You're just demanding to be pampered and ignoring the fact that in a failure situation THE CAR CAN'T BE TRUSTED. That's what a failure means in this context. You'll die in a car accident even with Google's car, because you'll be bitching about how it's not working right when it drives off a cliff instead of just putting your foot on the damn

    • by fermion ( 181285 )
      In all these things, the question we should be asking is whether the cars are safer. With many modern cars resembling living rooms instead of cockpits, I would say that autonomous driving is not only going to happen, but it will be necessary for the future of what people want a car to be. As long as we don't see accident rates go up, or if the serious injury or death rate declines, then all will be well. Much of this is going to be driven by acquisition costs and insurance rates. Acquisition costs will be effe
    • by argStyopa ( 232550 ) on Monday September 14, 2015 @12:11PM (#50519199) Journal

      And I'd absolutely disagree.
      It would make sense for the first step in autonomous driving to be implemented ONLY on long-stretch highway drives, with strong signals many minutes before exit-destination arrivals and a "pull over and stop" system for drivers that don't respond/wake up.

      To suggest that driverless cars have to be able to cope with every conceivable situation is totally unreasonable. Hell, HUMANS can't cope with "every conceivable situation", really.

    • At the very least, the autonomous driver should get the car to a safe stop before requiring a human to take over.

      Depending on the emergency that may not be possible to do. Rather, if you're going to go that route it should be interactive with the driver to safely transfer control and all the while continuing to attempt the safest maneuvers it can.

  • Stuck in traffic (Score:5, Interesting)

    by Kohath ( 38547 ) on Monday September 14, 2015 @11:16AM (#50518845)

    At first, I want autonomous driving for when I'm stuck in traffic. It should be able to handle that situation safely. Let's have that and then move on from there.

    • by pz ( 113803 )

      That would be awesome. Especially when nearly all cars have some rudimentary autonomous capability as well as the ability to communicate with each other. Then, when the light turns green, the entire fleet can move forward as one, rather than starting up with a traveling wave and wasting gobs of time. Intelligent intersections and cars will make urban travel far more efficient than the horrorshow one finds in some cities.

      LA freeways are a good use-case for autonomous driving: generally slow-varying traffi
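[Ed.: the contrast the comment draws, between a travelling wave of reaction delays and a coordinated fleet start, can be made concrete with a toy calculation. The numbers are invented for illustration.]

```python
# Toy model of queue start-up at a green light. With human drivers, each
# car waits to see the car ahead move, so delays accumulate down the
# queue; networked autonomous cars could all accelerate together.

def last_car_start_time(n_cars: int, reaction_s: float = 1.5,
                        coordinated: bool = False) -> float:
    """Seconds after the light turns green that the last car starts moving."""
    if coordinated:
        return 0.0                       # whole queue moves as one
    return (n_cars - 1) * reaction_s     # reaction delays stack up
```

Even this crude model shows why intersection throughput improves so much: a ten-car queue starts immediately instead of tens of seconds later.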

    • There are philosophical differences in how people are developing self driving cars because there are philosophical differences in how & why people drive.

      Some people want to 'skip the boring parts'. Uber & Google are trying to replace cars for people that really don't want a car. They don't want maintenance, a car payment, to drive. They just want a magic transporter to get from A to B scheduled from their phone. When I'm stuck in traffic or need to get home from the bar, I want to press auto and fal

      • They don't want maintenance, a car payment, to drive.

        Sounds like a lot of computer users. But whenever a company makes a non-repairable cell phone, the internet has a meltdown. Two years ago this happened when a car company was going to produce a car with a sealed engine compartment.

    • At first, I want autonomous driving for when I'm stuck in traffic. It should be able to handle that situation safely.

      Agreed...this seems like a situation where many of the out-of-band variables are constrained, especially in bumper-to-bumper type traffic. Cruising "free form" down a widely varying array of roads and intersections with all sorts of random traffic situations seems much more difficult to manage safely and consistently.

    • by swb ( 14022 )

      Some may come pretty close. I drive a 2007 Volvo S80 with adaptive cruise control -- a radar panel keeps track of what's in front of you, and slows down to keep distance with the car in front and speeds up to the cruise set point if the car up front speeds up.

      My model works down to 20 mph, and at about 15 mph it will cut out. It will slow way down, but not quite come to a complete stop (I've tested this exiting a freeway ramp).

      The last time I was at the dealership reading the BS about newer models, it sounds like
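[Ed.: the adaptive-cruise behaviour described in this thread amounts to a simple rule. Here is a minimal sketch under invented names and thresholds; it is an illustration of the idea, not Volvo's actual control system.]

```python
# Minimal adaptive-cruise target-speed rule: hold the driver's set speed
# unless radar reports a slower car ahead within the minimum gap, in
# which case match the lead car's speed. All values are illustrative.
from typing import Optional

def target_speed(set_speed: float, lead_speed: Optional[float],
                 gap_m: Optional[float], min_gap_m: float = 30.0) -> float:
    if lead_speed is None or gap_m is None:
        return set_speed        # nothing detected ahead: cruise normally
    if gap_m < min_gap_m and lead_speed < set_speed:
        return lead_speed       # too close to a slower car: match it
    return set_speed            # lead car is fast or far: keep set point
```

A real system would also ramp speed smoothly and cut out below some floor (as the parent notes happens around 15 mph on their model).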

    • by Ichijo ( 607641 )
      Why wait for autonomous cars to handle freeway traffic congestion when freeway traffic congestion is already obsolete [wikipedia.org]? The technology already exists, we just have to get the wealthy people to support such a progressive tax.
  • by Chrisq ( 894406 ) on Monday September 14, 2015 @11:18AM (#50518855)

    I think that Google is correct that you cannot expect a driver to be fully alert on a long trip all the time. On the other hand, a car that could handle the autobahn but not other roads (which could have pedestrians, horses, or even marching bands!) would be fine, as long as it gave a "count down" of warnings as it approached the exit, and the driver knew that they would have to take over once they reached the slip road.

    Even here there should be some sort of fail-safe behaviour: if the driver does not acknowledge taking control, the car should park up, and possibly phone the emergency services (at some point someone will have a heart attack while being driven in an autonomous car).
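[Ed.: the fail-safe handover proposed above is essentially a small state machine. A hypothetical sketch, with all action names and time thresholds invented:]

```python
# Sketch of a countdown handover: warn the driver as the exit approaches,
# escalate, and if they never acknowledge, park and call for help rather
# than drive on. Thresholds are made up for the example.

def handover_action(seconds_to_exit: float, driver_acknowledged: bool) -> str:
    if driver_acknowledged:
        return "hand over control"       # driver is ready: give them the car
    if seconds_to_exit > 60:
        return "warn"                    # early, repeated countdown warnings
    if seconds_to_exit > 0:
        return "urgent warn"             # last chance before the slip road
    return "park and call for help"      # unresponsive driver: fail safe
```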

  • I recall seeing an old illustration of a father and son playing chess while the car drives them to their destination on the freeway. In particular, the father had his back to the windshield and was not paying attention to traffic. I guess today's technology still has a long way to go.
    • I recall seeing an old illustration of a father and son playing chess while the car drives them to their destination on the freeway.

      I recall seeing something similar in Popular Mechanics . . . except that the car was flying, not driving.

      I guess our technological development took a wrong turn at Albuquerque somewhere.

      • I recall seeing an old illustration of a father and son playing chess while the car drives them to their destination on the freeway.

        I recall seeing something similar in Popular Mechanics . . . except that the car was flying, not driving.

        I guess our technological development took a wrong turn at Albuquerque somewhere.

        Not a wrong turn. It was just bad prognostication. It's very hard to predict the difficulty of problems you don't yet thoroughly understand.

        • With some things (like flying cars), maybe. With other things (like space stations and moon bases), no.

          The future depicted in 2001: A Space Odyssey was almost completely plausible (even in 2001, much less 2015), except for the bit about HAL being self-aware and all that. Videophones, rotating space stations with gravity, and a Moon base could all have been done by 2001 if there had been the will and the budget for it. We even had videoconferencing by then, but thanks to poor regulation there was no stand

          • With some things (like flying cars), maybe. With other things (like space stations and moon bases), no.

            Sure, some problems are easier to understand. Well, unless you include the larger social context, which is why 2001 didn't happen :-)

    • by 0123456 ( 636235 )

      On Youtube, there's a video from the 50s about how you'll soon be able to get on the highway and your self-driving car will take you to your destination. Of course, they had 'road traffic control' towers controlling the cars, not computers.

      • That reminds me of a 1970s disaster movie where the LA freeway system got so terribly clogged up that orders were given from the control room to send in the military to get traffic moving again.
  • Self-driving cars won't happen for quite some time in my estimation. With today's roads there are simply too many factors that the car won't know how to handle.

    If roads were retro-fitted with some sort of standardized guide-wire or other tracking/placement beacons then I think it would be more likely to work, but between the many variations in roadways, weird intersections, roundabouts, ramps, etc etc, PLUS factoring in all the out-of-band stuff like pokey drivers, bicyclists, motorcycles, etc etc, I just s

    • by Jeremi ( 14640 ) on Monday September 14, 2015 @11:44AM (#50519005) Homepage

      If you want to see where the state of the art is at (at Google, anyway), have a look at this video [youtube.com]. The first part is high-level info about Google's approach to the problem; the actual demonstration of the software starts at the 7m50s mark, and an example of the car dealing with something unexpected is at the 11m00 mark.

      Note that in none of the examples has the roadway been pre-fitted with any kind of tracking or placement system, and that these aren't imaginary scenarios; rather these cars have been on the roadways, in real life, for years already. If the problems were really as unsolvable as you suggest, I'd expect we'd have read about a number of hilarious and/or tragic mishaps by now.

      • by 0123456 ( 636235 )

        If the problems were really as unsolvable as you suggest, I'd expect we'd have read about a number of hilarious and/or tragic mishaps by now.

        Yeah, and if overclocking our chips was dangerous, we wouldn't have been running them at 3GHz in the pre-release benchmarks (oops, we picked the chips that were capable of running at that speed, not the ones we'd sell to you for $200, but don't tell anyone).

        How many of those cars were being driven in sunny weather by Google employees who were paying full attention to what's going on around them? And how many of those cars were being 'driven' by soccer moms talking on their cellphones in between yelling at t

      • by rockmuelle ( 575982 ) on Monday September 14, 2015 @12:36PM (#50519379)

        That's not exactly correct... the Google cars have incredibly precise maps of the roads they're on, not just the route, but maps of the actual surface of the road (e.g., where the potholes are). That level of detail available to the onboard computers is pretty much the same as having sensors on the road. It requires an incredible amount of prep work. Of course, map updates could be handled by sensors on other cars constantly providing real time information. It's a cool approach, but only practical when you have that level of detail available.

        Google, et al, are showing very controlled research projects. Even though they're testing in the real world, they're still highly controlled experiments.

        Sure, many of the problems are resolvable using this approach, but what we don't know is what new problems will evolve once there are more than a handful of self driving cars on the road. More research will help identify these, but anyone who's done real science or engineering knows that what works at small scale rarely scales as you would hope/expect.

        -Chris

    • by Guspaz ( 556486 )

      Also, no current automated car (including Google's) works at all when the roads are covered by snow, and large numbers of people live in areas that have at least some snow during the year. Purely automated cars are not practical in most of the US and Canada until they can handle that scenario.

    • I hear you, and I agree with you, and my solution to these problems is for there to be a full set of manual controls, with drivers continuing to be educated, trained, tested, and licensed, just like always, and for the 'autonomous' system to be considered mainly a very sophisticated form of cruise control, mainly to relieve driver fatigue on long drives. In fact, based on certain revelations I've had recently about the current state of driver training, driver education and training needs some serious reform to a
  • by wired_parrot ( 768394 ) on Monday September 14, 2015 @11:40AM (#50518983)

    Business and practical considerations will mean that BMW's and other automakers' gradual approach to automation will prevail. A gradual approach allows automakers to recoup their investment immediately, and allows them to fine-tune their technology as each aspect of automation is rolled out. It is also important to note that regulatory agencies will react and set the rules for new autonomous vehicles based on the technology that is on the road, so the carmakers rolling out the technology first - those with a gradual approach - will have greater input on the regulatory nature of that technology. The risk for Google is that, as the other automakers end up defining that regulatory environment, Google's technology will be obsolete from a regulatory standpoint by the time it is rolled out.

    As much as I like Google's approach from an engineering standpoint, the truth is Google is already being left behind in autonomous car technology. Other automakers are already introducing various aspects of self-driving - from automatic emergency braking to lane assist to adaptive cruise control - so that by the time Google has its self-driving vehicles ready for the market, the major automakers will already have a road-tested, established, and entrenched set of technology they're working with.

    • As much as I like Google's approach from an engineering standpoint, the truth is Google is already being left behind in autonomous car technology. Other auto makers are already introducing various aspects of self driving - from automatic emergency braking, lane assist, adaptive cruise control

      You failed to describe anywhere Google is actually being left behind. Their systems obviously do all of those things.

  • by Moses48 ( 1849872 ) on Monday September 14, 2015 @11:43AM (#50518995)

    We need completely self-driving cars, but we don't need them to drive everywhere. Handling suburban, city, freeway, and highway driving all at once is hard to program. If we just focused on 100% freeway driving, I think that would be much easier. We could have a self-driving car that drives on the freeway autonomously but gives ample warning before an exit ramp, where it expects the user to take over at the first stop.

    • by necro81 ( 917438 )
      A step in the gradualist progression will be the phase-out of long-haul truckers. Instead, you'll have self-driving trucks that cover 95% of the route (the freeway miles) all by themselves (driving 24/7, as fuel permits), pull into a truck stop just off a prescribed exit, and have a conventional trucker drive them the rest of the way in. I could easily see Wal-Mart, for instance, going this way. The implications for the teamsters could be dire. Would those final-miles drivers be union, or would they be
      • Teamsters could always force a law stating that a human driver must be present in the vehicle at all times, ostensibly for safety (but not actually responsible for anything). Not only that, but they will deserve higher pay because of the additional training involved. Truck driver becomes autonomous-systems engineering manager.
  • It appears obvious that human nature will lead people to put too much trust in the car. So, let's not let all humans operate self-driving vehicles for now. Let's say we instead begin with a very limited license that can only be obtained by specially-trained drivers familiar with expectations for device operation and manual override. Find a fleet of taxi drivers in a municipality, for example, or perhaps some transport vehicles that just bus passengers between an airport and hotels. Beta test car operations to d

    • Expecting new infrastructure (at least in the USA) is even more unrealistic than waiting for autonomous cars to become fully self sufficient.

  • There's a fundamental difference in trust between the two industries. Technology companies place little trust in users. Good software requires thinking of all of the dumb things a user could do to break it. Good hardware requires thinking of all the dumb things a user could do to break it. Good technology infrastructure requires identifying lots of critical paths and either automating, simplifying or building redundancies because failure will happen. Car companies are the opposite. Sure, they engineer
  • Generally speaking, automation makes us stupid. Oh, sure, it helps free us from drudgery, and it won't get bored like us, but it presents new failure modes that aren't always obvious during the design and testing phase.

    Over the summer, 99% Invisible [99percentinvisible.org] and NPR's Planet Money [npr.org] put out several podcasts ([1 [npr.org]], [2 [99percentinvisible.org]], [3 [99percentinvisible.org]]) on the automation paradox, and the Google car is front and center. So is Air France Flight 447 [wikipedia.org], which shows what happens when automation fails and humans can't properly respond.
    • Automatic transmissions are an example of what is to come.

      Theoretically, automatic-transmission drivers should be safer. They have one less thing to worry about, and all the 'enthusiastic', Stig-ish drivers have real transmissions.

      In reality, if you want to identify a terrible driver, you ask the question 'can you drive stick?' If the answer is no, they are guaranteed to be rolling hazards. They simply pay less attention.

    • As the 99% Invisible article points out, the net effect of automation is better overall safety. The knowledge that is lost because people no longer know the details of how to operate a manual elevator, or airplane, or car, is more than made up for by the relative reliability of the automation systems that replaced the manual processes. Yes, it might be true that people don't know the finer points of making horseshoes any more, but who cares?

      Yes, in the case of airline or train accidents, people can di

  • And require constant monitoring from the human in charge.

  • ... that European/Asian car makers are trying to provide some convenience features to otherwise competent drivers. American (Google) makers are trying to build technology to keep people on the road who otherwise should be taking the shuttle van to the senior center.

    In Europe, they don't have any problem with telling incompetent drivers that it's time to hand over the license and start taking the bus. Not so in the USA, where punching the wrong pedal and ramming the Cadillac through the coffee shop is not s

    • No. My German Aunt is still, sort of, driving.

      She has been a hazard for at least 20 years. Stops in the middle of a left turn to argue with her sister about where they are going.

      Took years and many tries to get her license in the first place. As bad a driver as any American I've ever met.

    • ....American (Google) makers are trying to build technology to keep people on the road who otherwise should be taking the shuttle van to the senior centre....

        These are the kinds of people who still have 12:00 flashing on their VCR. That's how simple and foolproof the technology needs to be. Otherwise it's just an expensive gizmo.

  • "Google's tests have taught it otherwise — even after being told it's a prototype, new drivers immediately place a lot more trust in the car than they should. They turn their attention away and stop looking at the road for much longer than is safe."

    Statistically, someone will likely die today as a result of distracted driving, so I'm not sure how this isn't already true. Sadly, it is.

    On a side note, what exactly did Google expect? People want to text, email, surf, sleep, eat, put on make-up, do just about any damn thing except actually pay attention and drive when behind the wheel today. Of course the consumer is looking towards automation for them to be able to legally engage in just about any activity other than paying attention to the road,
