AI Transportation

Volvo Unveils Autonomous Concept Car, With Retracting Wheel, 25" Display (computerworld.com) 154

Lucas123 writes: Volvo has revealed what it sees as the future of self-driving vehicles: a car with three autonomous driving options, one of which includes a retracting steering wheel, reclining seats with footrests, and a tray table. Unveiled at the Los Angeles Auto Show this week, the Concept 26 also has a 25-in. interactive display. Volvo is also among the first to address the subject of self-driving cars and liability, saying, "we firmly believe that car makers should take full responsibility for the actions of the car when it is driving in full autonomous mode."
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • by erp_consultant ( 2614861 ) on Thursday November 19, 2015 @11:57AM (#50963147)

    the need for Swede!

  • Volvo says it will be liable for any accidents its cars cause.

    So will the CEO do hard time if there is a felony car accident? Due to a software fault / sensor error?

    • Re: (Score:2, Insightful)

      by jellomizer ( 103300 )

      Even though Volvo will be liable, that doesn't really excuse them for taking the steering wheel away from the driver.
      There is legal liability, but that isn't much consolation if you are in the car, about to get into an accident that you could have avoided, if only you could.

      "Well, I am going to lose my arm or die. However, I will take comfort in the fact that I will get a big payout for this, even though, if I had access to a wheel and brake, I could have saved myself."

      • Re: (Score:2, Insightful)

        by jellomizer ( 103300 )

        To expand further: it is like when you are driving and you could safely avoid someone rear-ending you, but you don't, because they are liable.

        • Re: (Score:3, Interesting)

          by Anonymous Coward

          To expand further: it is like when you are driving and you could safely avoid someone rear-ending you, but you don't, because they are liable.

          The way Americans love to tailgate (follow too closely) you picked a good example. I've actually had to slam on my brakes because someone in front of me suddenly stopped (deer, I think). Since I wasn't following too closely, I had a good margin. But the jerk-off behind me was following much too closely in his large SUV and I saw he was about to plow into me. So I then had to punch the gas, move up another meter or two, and slam the brakes again which gave him barely enough time to stop and avoid me.


          • by Anonymous Coward

            So many hypotheticals. Self-driving cars have eyes in the back of their heads! You don't. Self-driving cars aren't estimating distances; they know exactly how fast they are going, frequently know road conditions (wet, dry), and can calculate much more accurately and quickly how soon impacts will happen in all directions. It is really quite amusing to assume that most drivers can perform better in an emergency situation. Does anyone still believe that seat belts don't save lives? Antilock brakes? People over es

        • To expand further: it is like when you are driving and you could safely avoid someone rear-ending you, but you don't, because they are liable.

          Let's ignore the fact that you rarely are going to be able to see or respond to an accident that the computer doesn't also see and can respond to. Even ignoring this, this is no different than you sitting on a bus, a plane, or a subway and seeing an accident that can be avoided. As a passenger, even if you are in the 2nd row seat, are you really going to jump up and try to take the steering wheel from the bus driver? What are the odds that you can do this safely and prevent an accident where the bus driv

      • by Dutch Gun ( 899105 ) on Thursday November 19, 2015 @12:45PM (#50963635)

        Have you ever been in an accident? It's pretty rare that you can actually see them coming. Otherwise, you would have avoided it, right? Or put another way... even if you can see it coming, it's likely that had you seen it earlier, there would be no need for last second heroic swerving or braking maneuvers.

        Short of some horrible malfunction on multiple levels, a computer is going to start slowing down or braking long before a human is even aware of a potential problem. The autonomous car has the advantage of literally being able to see in all directions at once, and being able to react to that information in the blink of an eye.

        Typical future scenario in your autonomous vehicle: "Why the hell is the car slowing d... oh, I see..."

        • by dotancohen ( 1015143 ) on Thursday November 19, 2015 @02:23PM (#50964561) Homepage

          Typical future scenario in your autonomous vehicle: "Why the hell is the car slowing d... oh, I see..."

          Here is a video of that actually happening, with Tesla's autopilot:
          https://www.youtube.com/watch?... [youtube.com]

      • Even though Volvo will be liable, however that doesn't really excuse them for taking the steering wheel away from the driver.
        There is legal liability. However that isn't much condolence if you are in the car about to get into an accident which you could avoid if you could.

        Except you can't, because you stopped really paying attention to the traffic the second the car began driving itself. You won't even notice you're about to get into an accident, much less have any idea what to do about it. At best you mig

        • Forget magic. Any technology distinguishable from divine power is insufficiently advanced.

          So... Jesus is my autopilot?

      • I'd feel a lot better if most people on the roads DIDN'T have a wheel that would allow them to control a multi-ton vehicle at speed.

    • If someone is driving a brand new car right off the lot and the brakes fail, causing a fatal accident, does the CEO do hard time now? In that case, it's pretty clear that the defect is the responsibility of the manufacturer, but it would be far more likely that there would be a civil lawsuit. So when Volvo says they will be liable, they're talking about civil and not criminal liability.

      • by bondsbw ( 888959 )

        Precisely, the only times criminal liability would be a factor is if there is evidence that an employee tampered with the vehicle, management decided to ignore internal warnings that a design defect could cause loss of control, or if the manufacturer systematically cheated regulatory tests designed to find such problems.

        • by zlives ( 2009072 )

          so... someone from GM is going to jail?

        • So what if, due to a software error, a small kid gets misidentified as safe to run over, and the car does that and keeps on going? That is hit-and-run, a felony, plus felony manslaughter. Will Volvo pay for the owner's court costs + attorney + bail + job-loss support + jail fees + impound/towing fees + a new car / cab fees?

            So what if, due to a software error, a small kid gets misidentified as safe to run over, and the car does that and keeps on going? That is hit-and-run, a felony, plus felony manslaughter.

            No, it's called Natural Selection. The kid shouldn't have been in the road in the first place.
            So what if, due to a software error, a small kid gets misidentified as safe to run over, and the car does that and keeps on going? That is hit-and-run, a felony, plus felony manslaughter. Will Volvo pay for the owner's court costs + attorney + bail + job-loss support + jail fees + impound/towing fees + a new car / cab fees?

            What court costs? You weren't driving, but were a mere passenger in a car that had been approved by regulatory agencies to not need a driver. It's Volvo and said agencies who are responsible fo

            • What about when cops haul your ass to jail, after a cop sees your car run over that kid and keep on going, even trying to bypass the police roadblocks because it sees them as something big that it must move out of the way of?

              • What about when cops haul your ass to jail, after a cop sees your car run over that kid and keep on going, even trying to bypass the police roadblocks because it sees them as something big that it must move out of the way of?

                Why would the cop haul a person sitting in an automated car to jail for a hit and run? That's like sending the passengers of a bus to jail for a hit and run done by the bus driver. Sure, the passengers have an obligation to report it, but they aren't liable for anything themselves. They are just the passengers.

                • And if you get a dumb cop like Officer Barbrady or chief wiggum?

                  • And if you get a dumb cop like Officer Barbrady or chief wiggum?

                    Then you'll get hauled to jail for running over a kid while you were home sleeping in your bed.

          • by bondsbw ( 888959 )

            You are assuming that the criminal liability laws which govern human drivers would apply exactly the same to autonomous vehicles, but we have been arguing that they would not.

    • So will the CEO do hard time if there is a felony car accident? Due to a software fault / sensor error?

      Yeah, because that will so encourage corporations to accept liability and responsibility for their products, and not use a boilerplate "get out of jail free" card by putting all risk on the passengers (e.g. GPL section 15):

      THERE IS NO WARRANTY FOR THE PROGRAM, TO THE EXTENT PERMITTED BY APPLICABLE LAW. EXCEPT WHEN OTHERWISE STATED IN WRITING THE COPYRIGHT HOLDERS AND/OR OTHER PARTIES PROVIDE THE PROGRAM "AS IS" WITHOUT WARRANTY OF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE. THE ENTIRE RISK AS TO THE QUALITY AND PERFORMANCE OF THE PROGRAM IS WITH YOU. SHOULD THE PROGRAM PROVE DEFECTIVE, YOU ASSUME THE COST OF ALL NECESSARY SERVICING, REPAIR OR CORRECTION.

      That is exactly what we don't want from autonomous car manufacturers, so threatening to lock their CEOs up for trying to do the right thing isn't just counter-productive, it is beyond st

      • Agreed. Though I *would* love to see CEOs and Boards doing hard time if the fault is due to intentional fraud or corner-cutting. Even if they're not personally involved, they're the only ones in a position to impose the necessary oversight and/or avoid the creation of perverse incentives.

      • So will the CEO do hard time if there is a felony car accident? Due to a software fault / sensor error?

        Yeah, because that will so encourage corporations to accept liability and responsibility for their products, and not use a boilerplate "get out of jail free" card by putting all risk on the passengers (e.g. GPL section 15):

        THERE IS NO WARRANTY FOR THE PROGRAM, TO THE EXTENT PERMITTED BY APPLICABLE LAW. EXCEPT WHEN OTHERWISE STATED IN WRITING THE COPYRIGHT HOLDERS AND/OR OTHER PARTIES PROVIDE THE PROGRAM "AS IS" WITHOUT WARRANTY OF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE. THE ENTIRE RISK AS TO THE QUALITY AND PERFORMANCE OF THE PROGRAM IS WITH YOU. SHOULD THE PROGRAM PROVE DEFECTIVE, YOU ASSUME THE COST OF ALL NECESSARY SERVICING, REPAIR OR CORRECTION.

        That is exactly what we don't want from autonomous car manufacturers, so threatening to lock their CEOs up for trying to do the right thing isn't just counter-productive, it is beyond stupid.

        So what you are saying is that the GPL is incompatible with use in autonomous vehicles?

        • by Anonymous Coward

          It is compatible. You can release the source code as GPL (with no warranty) and provide customers with a warranty, be it for the code or car as a whole. There is no requirement in the GPL that no warranty should be provided, although the GPL by itself provides none.

        • by jiriw ( 444695 )

          Definitely NO. The answer is in the same piece of text:

          EXCEPT WHEN OTHERWISE STATED IN WRITING...

          It's very easy to add a little certificate of 'taking responsibility', or add it as an extra clause at the end of the license. It won't change the GPL. However... you must find a manufacturer of software for autonomously driving vehicles willing to provide their sources under a GPL license.

      • So what happens in a case where someone dies and a criminal court finds that Volvo's software is at fault? Let's say a Volkswagen-level failing, or code that would not pass FAA regs. Professional engineers can do prison time for signing off on unsafe stuff. Or what about evidence destruction, say there is code that, in the case of an unknown error, deletes logs so they can't be used against them?

        What if a judge holds Volvo in contempt of court for trying to NDA / DMCA / EULA their way out of giving up logs / source code?

    • by Viol8 ( 599362 )

      "So will the CEO do hard time if there is a felony car accident?"

      I think we know the answer to that - liability will be strictly limited to the company, and there will be so many get-out clauses in the purchasing contract that they'll never be successfully charged with anything short of a major failure of the vehicle.

      Also, I suspect "its cars cause" doesn't include the time when the car isn't driving itself - i.e., just after it's detected an unavoidable accident and hands over to the human!

      • Actually, I think handing control over to the "driver" in such a situation is the worst possible solution, since they probably aren't paying any attention at the time. Sure, the driver should always be able to demand control if they're paying attention, but even then the computer can probably do a better job of damage mitigation than most people.

        Besides, if it's truly an unavoidable accident then it wasn't caused by the car or driver in the first place. And if it was an accident that could have been avoid

      • I think we know the answer to that - liability will be strictly limited to the company

        There's no need for "get-out clauses" in the purchasing contract, because that's how the law works. Employees are not liable for the products or services of the company they work for. The only exception, from what I understand, is if they do something illegal themselves.

        Also, Volvo has clearly stated that they're accepting liability for accidents which their autonomous systems cause. From TFA: "Who will be responsible when an autonomous vehicle causes an accident?" Why would they be responsible when an

    • Comment removed based on user account deletion
    • by stdarg ( 456557 )

      I'm guessing that Volvo is banking on changes to laws if this becomes common. The concept of felony driving violations will simply go away, at least with respect to self-driving vehicles.

    • Corporations are people my friend.

      Just not people when it comes to criminal acts. I can't pay a fine to get away with manslaughter. Corporate "persons" can.

    • by jopsen ( 885607 )

      Volvo says it will be liable for any accidents its cars cause.

      So will the CEO do hard time if there is a felony car accident? Due to a software fault / sensor error?

      If you cause a car accident due to an unforeseen heart attack or medical condition that you had no control over or expectation of, you hopefully won't go to jail :)
      Similarly, if one of the wheels falls off the car while driving, and the car is well maintained and regularly serviced, you hopefully won't face criminal charges in the event of a car accident. Nor will your mechanic face charges.

      If however, it is proven that your mechanic knowingly didn't do his job and put bad wheels on your car, then yes, maybe

  • What if the car dumps out of full autonomous mode right before an accident, giving the people in the car no time to try to get out of it?

    • Once it's too late to do anything to avert an accident then responsibility (or lack of, in the case of unavoidable confluences of bad luck) has already been established. In purely human terms, if a pilot sends an airplane into a steep dive while the copilot is on the can, and then hands the controls over to the copilot moments before impact, he does not transfer any responsibility to the copilot, because he did not actually transfer any control over the events about to unfold.

      I agree that it is the sort of

    • What if the car dumps out of full autonomous mode right before an accident ...?

      Since self driving cars (SDCs) do not "dump out of full autonomous mode", that will not happen. The car may beep to get the human's attention, but the computer will continue to make a best effort to prevent an accident, or reduce its severity, until the human affirmatively takes control of the vehicle. There is no way in hell that the computer will just stop controlling a moving car.

      It is odd that when people try to point out the problems with SDCs, they often tend to focus on tasks where SDCs particularl
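      The handoff behavior described here - the computer keeps best-effort control until the human affirmatively takes over, never just disengaging - can be sketched as a tiny state machine. Everything below (class and method names, states) is a hypothetical illustration, not any manufacturer's actual API:

```python
from enum import Enum, auto

class Mode(Enum):
    AUTONOMOUS = auto()  # normal self-driving
    ALERTING = auto()    # beeping at the human, but still in full control
    HUMAN = auto()       # human has affirmatively taken the wheel

class HandoffController:
    """Hypothetical sketch: the car never 'dumps out' of autonomous mode."""

    def __init__(self):
        self.mode = Mode.AUTONOMOUS

    def hazard_detected(self):
        # Alert the human, but keep driving: continue braking/steering
        # to prevent the accident or reduce its severity.
        if self.mode is Mode.AUTONOMOUS:
            self.mode = Mode.ALERTING
        return "beep + continue best-effort control"

    def human_takes_wheel(self):
        # Control transfers only on an affirmative action by the human.
        self.mode = Mode.HUMAN
        return "control transferred to human"

    def in_computer_control(self):
        return self.mode is not Mode.HUMAN
```

      The key design choice in this sketch is that `hazard_detected` changes only the alerting state, never the controlling party; there is no transition out of computer control except `human_takes_wheel`.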

  • by Anonymous Coward

    At some point we'll also need inter-car communication protocols. I look forward to cars strategizing on the highway, at yields, or at traffic lights (which may eventually cease to exist altogether).

  • by The-Ixian ( 168184 ) on Thursday November 19, 2015 @12:00PM (#50963185)

    I haven't had a car for many years and don't foresee buying one any time soon.

    However, once self driving cars are a reality, I will certainly consider buying one.

    I suspect that I am not alone in this. It will be a huge selling point for these car companies and will perhaps turn non car owners into car owners.

    • by Viol8 ( 599362 )

      "However, once self driving cars are a reality, I will certainly consider buying one."

      Why, can't you drive? If you're such a poor driver, please stay away from autonomous cars too and continue taking public transport, because if the car needs you to take over suddenly, your fellow motorists won't want you having a panic in the middle of the highway.

      • by suutar ( 1860506 ) on Thursday November 19, 2015 @12:47PM (#50963655)

        if a car needs someone to take over _suddenly_ it doesn't matter. Nobody's going to be paying enough attention to what's going on every minute of every trip to be ready to take over on the one trip when the car can't handle it.

      • Face it, most people are poor drivers. Those who proclaim themselves to be exceptionally good are usually the worst.

        Most autonomous cars will be fully autonomous and there will be no "sudden taking over".

        It would completely spoil the point if it were otherwise; how do you think a person without a driving license would ever be able to use an autonomous car?

        Also, you and basically everyone here commenting against autonomous cars has obviously been out of the loop for decades.

        High-end cars already have everything

      • Any autonomous car that may need you to take over suddenly is completely unfit for purpose, because the erstwhile passenger will almost certainly not be paying any attention to the road, and require at least several seconds to survey the situation and decide on a course of action. In which time the car could have come to a complete stop and avoided the problem.

        There is perhaps some leeway for "highway autopilots" operating in low-complexity environments, but even then they need to be ready to deal with any

    • You're fucking joking, right? I wouldn't have a car programmed by someone else to do who-knows-what if you paid me to use it.

      Software glitches and shitty programming are fine on computers, hell, they keep me in a paycheck. But hurtling down the highway at high speeds with steel and glass all around me? No thanks.

      You must be one of those mentally ill "transhumanists" or something, that somehow think machines programmed by humans are somehow more capable than humans themselves.

      • Have you *looked* at humans lately? For a relatively straightforward task such as driving (stay on the road, obey traffic laws, don't hit anything) a machine programmed by a team of competent, well-informed humans and subjected to millions of miles of real-world testing has my vote over a lot of the oblivious idiots I see on the road. Sure, I'd still much prefer a competent, attentive human driver behind the wheel, but those are so rare they're practically mythical anyway.

    • Well don't you feel smug.
      Hey look at me, I can get by in life without a tool that other people need to live. My superiority is assured because I chose different life choices than other people!

    • by Thud457 ( 234763 )
      1. Self-driving car?
      Try self-driving RV .
      With the boomers retiring and slowing down, this will be a huge product category.

      2. 25" screen? How quaint.
      How about thinking futuristically, maybe a VR holosphere driven by methane micro-lasers?
      Gonna need some sort of privacy screen for the inevitable social maladjusted out there that feel the need for a FUFME session with their 'net-enabled fleshlight on the 405.
    • However, once self driving cars are a reality, I will certainly consider buying one.

      I think it will go exactly the opposite direction. People who now own cars will get rid of them, instead calling a self-driving car to pick them up when they need a ride. When you remove the overhead of paying a driver, the car service model of transportation becomes really compelling. We spend so much money on vehicles that are parked 95% of the time... pure waste.

      • Yeah, you are right, I should have thought of that.

        Thanks for the reply.

    • I haven't had a car in over 20 years. I will be happy when everyone has an autonomous car because hopefully then there will be a competent driver behind the wheel that doesn't try to kill me on my motorcycle.

  • But considering this is volvo,
    Does it run SteamOS?
  • Seeing the pic in the TFA of the hipster guy leaning back while the car drives itself made me think that if it was me, I would not be able to relax while the car just drove. I think I'd still be continually scanning the displays and surrounding area looking out for potential trouble even though I know that I am not in control.

    This makes me wonder how much autonomous driving it will take before people actually feel emotionally comfortable letting the car do its thing? Or if anyone who has grown up with manu

    • by suutar ( 1860506 )

      I suspect most folks who are comfortable being passengers and comfortable around computers will be capable of getting used to this. That is, admittedly, a limited subset of humanity, but growing over time.

  • I need dimmable windows for some...surfing privacy while I'm driven to work.

  • I can't find the link at the moment, but as I recall Google has articulated the same position, and said it some years ago. The maker of the autonomous car should be held liable for any errors made by the autonomous driving system. Really, who else could be liable?

    • This likely isn't a selfless move. If the automaker takes responsibility, they expect to make a profit off that liability.
      >Really, who else could be liable?
      Those who provide the map, the maintenance, the inspections, the tires, the route... So all of those will need to be provided by someone willing to take the liability. So basically Google or Volvo will likely require that they are paid to provide all of these processes. Of course, not all directly, but they will be the authority that certifies those al

  • Is Volvo also legally and financially responsible for the driver's side backseat passenger being crushed to severe injury or death immediately upon enabling self-driving mode? Source: the concept pic at the bottom of the article.
  • From the article:

    According to Tesla, more than 1 million cars have already installed a recently released over-the-air software upgrade to the Model S sedan's Autopilot feature

    That is quite some feat, considering that most analysts estimate the total number of Model S sedans delivered so far at under 100,000

  • More importantly (Score:2, Insightful)

    by Anonymous Coward

    More importantly, this is probably the first direct statement from a car manufacturer that THEY consider themselves 100% responsible for any accidents or problems when the car is in self drive mode.

    This statement alone is more newsworthy than the self-driving car itself!

    • That's certainly what caught my attention. And it could go a long way toward making these things workable.

      Let's face it, there's going to be huge push-back from people who don't want to get stuck behind a vehicle that drives under the speed limit most of the time, and there's bound to be wankers who try to get a pay-off by throwing themselves at the car or cutting in front of it in their scrap-worthy beater.

  • What's interesting about robotic cars is they will probably be vastly safer than human drivers overall. Say 15,000 human-caused accidents a year versus 100 automation-caused accidents a year. So, going with my made-up numbers, for every accident a robotic car causes, it would have prevented 150. Morally, it seems like they should *almost* get a free pass for the limited number of accidents they cause (as every human driver they replace will save lives).
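    Spelling out the made-up numbers above as back-of-envelope arithmetic (purely illustrative figures, not real accident statistics):

```python
# Hypothetical figures from the comment above: replace all human driving
# with automation, and compare accident counts per year.
human_caused = 15_000       # accidents/year caused by human drivers (made up)
automation_caused = 100     # accidents/year caused by automation (made up)

# For every accident automation causes, how many human-caused ones did it prevent?
prevented_per_caused = human_caused / automation_caused
print(prevented_per_caused)  # 150.0

# Net annual reduction in accidents under these assumptions.
net_reduction = human_caused - automation_caused
print(net_reduction)  # 14900
```

    The moral point in the comment rests entirely on that ratio: even if automation is blamed for every one of its 100 accidents, the net effect under these assumptions is 14,900 fewer accidents a year.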

    • The road toll in the US is about 50,000 per year.
  • Ladies and gentlemen, the greatest concept car of all time, the Buick Y-Job:

    https://en.wikipedia.org/wiki/... [wikipedia.org]

  • This ought to collapse insurance rates in the long-run, so I'm all for it. What a racket that is.
  • Here is where Volvo is with the tech.
  • I am still not sure how people think computers can deal with the real world when my 3-year-old can outdo most any computer in solving visual word captchas. We give them lidar and sensors galore, then painstakingly and manually map out the routes to inch resolution, marking every driveway, road sign, and stop light. Many of these sensors won't function in bad weather like rain and snow, and what happens when real-world things happen, like a blowing cardboard box that lidar would pose a far bigger threat to s
    • They will replace paid human drivers much faster than we can foresee. Capitalism and cost-cutting will not wait even 1 second longer than when they can fire a bunch of vehicle (truck, cab) drivers. I think the real market autonomous driving may be eyeing is the long-haul trucking business.
      • No doubt, but for each wrongful death you would have to fire an awful lot of employees to cover the cost that quarter.
  • How long before the insurance scammers figure out how to "bait" these new Volvi into crashing? This seems like the perfect neck-injury con-job just waiting to happen.
  • Yes, this should be interesting, since Volvo is now a Chinese car, with Chinese quality.
    And I am sure that it will be out next fall. Right?

Technology is dominated by those who manage what they do not understand.
