
Vans Drive Themselves Across the World

bossanovalithium writes "Four driverless electric vans have successfully completed a 13,000-kilometer test drive from Italy to China, mirroring the journey made by Marco Polo in the Middle Ages. The four vans, packed with navigation gear and other computing equipment, drove themselves across eastern Europe, Russia, Kazakhstan and the Gobi Desert without getting lost. They were equipped with four solar-powered laser scanners and seven video cameras that work together to detect and avoid obstacles."
  • by somersault ( 912633 ) on Friday October 29, 2010 @09:10AM (#34061524) Homepage Journal

    Who do you know that drives more like an idiot because their car has safety features? I drove like an idiot even when my car didn't have ABS, and these days even though all cars I drive have ABS, I drive like less of an idiot.

    Traction control is no use for driving like an idiot. I switch it off when I want to have some fun.

  • by caluml ( 551744 ) <slashdot&spamgoeshere,calum,org> on Friday October 29, 2010 @09:12AM (#34061528) Homepage

    Stop wasting your time and build a personalised rail network instead, one where I can get into a "pod" or something, enter my destination, and have it take me there on good, solid metal rails and a bit of signalling.

    Indeed: a packet-switched transport system. Broadcast your destination via Bluetooth, and "routers" along the way receive it and direct you the best way. The pods themselves would be unpowered, just pushed or blown along (possibly by compressed air?).
    If you had a system of tubes under the ground and some sort of decent bearings, you could make it work. You could also have large "trunk"/"backbone" roads that smaller roads join. Basically, model it on the Internet, but without the packet loss, the routing loops, or the collisions.
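
    To make the analogy concrete, here is a minimal, purely illustrative sketch of how such a network might route a pod: the tubes form a weighted graph, a pod announces its destination, and the junctions ("routers") forward it along a shortest path. All junction names and travel costs below are made up.

        import heapq

        # Hypothetical tube network: junction -> {neighbouring junction: travel cost}
        network = {
            "home":        {"local_loop": 2},
            "local_loop":  {"home": 2, "trunk_a": 5},
            "trunk_a":     {"local_loop": 5, "trunk_b": 3, "office_spur": 4},
            "trunk_b":     {"trunk_a": 3, "office_spur": 2},
            "office_spur": {"trunk_a": 4, "trunk_b": 2},
        }

        def route(graph, src, dst):
            # Dijkstra's shortest path: the sequence of junctions a pod would traverse.
            queue = [(0, src, [src])]
            seen = set()
            while queue:
                cost, node, path = heapq.heappop(queue)
                if node == dst:
                    return cost, path
                if node in seen:
                    continue
                seen.add(node)
                for nxt, weight in graph[node].items():
                    if nxt not in seen:
                        heapq.heappush(queue, (cost + weight, nxt, path + [nxt]))
            return float("inf"), []

        print(route(network, "home", "office_spur"))
        # -> (11, ['home', 'local_loop', 'trunk_a', 'office_spur'])

    Real personal-rapid-transit schemes are far more involved than this, of course, but the "direct you the best way" part really is just graph search.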

  • by rockNme2349 ( 1414329 ) on Friday October 29, 2010 @10:33AM (#34062304)

    So the moment one person has an accident in an autonomous vehicle, you're looking at major liability and lawsuits directed at the car manufacturer, whether or not it was their fault and whether or not a human driver could have prevented the accident in *any* car. That manufacturer now has to take responsibility for that car against every idiot on the road, every pedestrian who runs out, and everything that can confuse one of its sensors.

    I've thought about this problem for a while, and here is my guess as to how it will proceed. When cars started being made with cruise control, responsibility in an accident still belonged to the driver. There are cars being built today that automatically apply the brakes when they sense an oncoming collision, but in the event of a malfunction or accident, the human driver is still ultimately held responsible.

    I don't believe anyone is going to drop a fully autonomous car into the market all at once; instead there will simply be iteration after iteration of the computer taking more control. The human driver will always have a manual override, though, and will be held responsible for accidents, simply because that is the status quo. My guess is that by the time we do get autonomous cars, people won't really be paying attention to the road, since their cars will be driving themselves just fine, but they will have signed a disclaimer accepting responsibility anyway. I do think there will be uproars when accidents occur, as we saw with the Toyota problem, but it won't be until long after we have become comfortable with autonomous vehicles that any law regarding responsibility changes.

  • by ElectricTurtle ( 1171201 ) on Friday October 29, 2010 @11:06AM (#34062718)
    "Staggering"? Hyperbole. In the first place rates vary culturally. Iceland has 3.8 fatalities/year per 100000 people, less than a quarter of the rate of the US at 12.3, which in turn is about a quarter of the rate of the worst country for fatal car accidents: Eritrea at 48.4. Even that highest rate is still only 0.0484%/year, and at the risk of sounding cavalier about human life, I wouldn't call that staggering.

    Also, your assertion that the AI problem would not require a groundbreaking solution is founded on what knowledge? I think you vastly underestimate the problem. Example scenario: a vehicle is traveling in winter around a tight, blind turn on a rural mountain road. Suddenly, another vehicle appears in the middle of the road, heading toward the first. Does the AI in the first vehicle know it's winter and that black ice may interfere with braking? Does the AI know that turning out of the other vehicle's path, toward the mountainside, may flip the vehicle? Does the AI know that if it turns away from the mountain to avoid the other vehicle, it could plummet to its doom?

    Let's back this off a bit: instead of a mountain, it's a hilly region with the same scenario. Turning toward the hill would carry the same risk of flipping, but turning away would probably be rough yet survivable. The AI turns away, but the hill is too steep and icy to brake effectively; does it know how to steer under such conditions? Does it know where to steer? Let's say there's a body of water down there: does it recognize that as a hazard to avoid? What if the water is frozen? Does that appear as a solid surface to the AI? What about at night? On and on and on.
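
    Just to illustrate the shape of the decision (not how any real system works), here is a toy sketch of the risk-weighted choice the software would face in that scenario; every maneuver name and probability below is invented.

        # Toy hazard estimates for each option (all numbers invented for illustration).
        maneuvers = {
            "brake_straight":  {"collision": 0.60, "loss_of_control": 0.20},
            "swerve_to_hill":  {"collision": 0.10, "loss_of_control": 0.50},  # flipping risk
            "swerve_downhill": {"collision": 0.05, "loss_of_control": 0.70},  # icy slope, water below
        }

        def total_risk(hazards):
            # Chance that at least one hazard occurs, naively assuming independence.
            p_safe = 1.0
            for p in hazards.values():
                p_safe *= 1.0 - p
            return 1.0 - p_safe

        best = min(maneuvers, key=lambda m: total_risk(maneuvers[m]))
        print(best, round(total_risk(maneuvers[best]), 3))  # swerve_to_hill 0.55

    The hard part, of course, is exactly what the questions above point at: producing those probability estimates from raw sensor data, in real time, in the first place.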

    Human intuition and integration are so powerful that we don't think about most of these things consciously. We have the capacity to act with so many key factors understood naturally and relationally. AI will get there; that's inevitable. But it will be decades more before it happens, and when it does, it will be "groundbreaking".
  • Re:More Importantly (Score:3, Interesting)

    by CastrTroy ( 595695 ) on Friday October 29, 2010 @12:14PM (#34063754)
    This is exactly the problem with self-driving vehicles. Even if they are 10,000 times safer than human drivers, they still won't be used by the general public, because any kind of crash will mean a huge lawsuit against the company. With human drivers, you can always blame the problem on human error. With computerized drivers, however, the manufacturer is now at fault for every single problem. Manufacturers can't even get all the systems they have today (think Toyota) working properly; making cars that drive themselves is going to be an even bigger problem.
  • Re:More Importantly (Score:3, Interesting)

    by slick7 ( 1703596 ) on Friday October 29, 2010 @05:09PM (#34068064)

    I imagine one of these would be less effective at explosives delivery than a remote-controlled vehicle would be.

    Autonomous vs. remotely operated is different how?

"Gravitation cannot be held responsible for people falling in love." -- Albert Einstein

Working...