Cooperative Cars Battle It Out In Holland
An anonymous reader writes "The first cooperative platooning competition, where vehicles use radio communication in addition to sensors, was held in Helmond, Holland a week ago. By using wireless communication, the awareness range of each vehicle is extended, enabling vehicles to travel closer together, which increases road capacity while avoiding the shockwave effects responsible for traffic jams. The Grand Cooperative Driving Challenge distinguishes itself from earlier platooning demos (e.g. the PATH project) by having a completely heterogeneous mix of vehicles and systems built by multiple research and student teams. Using wireless communication to coordinate vehicles raises concerns about the safety of such systems; would you trust WiFi to drive your car?"
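The summary's core idea, extending each car's awareness beyond its own sensors by broadcasting the leader's acceleration over radio, can be sketched in a few lines. This is a hypothetical toy controller, not any GCDC team's actual system; all gains, limits and the vehicle model are made-up illustration values:

```python
# Toy cooperative adaptive cruise control (CACC) sketch.
# All constants and gains are hypothetical, for illustration only.

DT = 0.1          # simulation step, seconds
TIME_GAP = 0.6    # desired time gap; V2V feedforward is what lets this be
                  # shorter than radar-only adaptive cruise control
STANDSTILL = 2.0  # desired gap at zero speed, metres

def follower_accel(gap, v_follower, v_leader, a_leader_broadcast,
                   kp=0.45, kd=0.25, kff=1.0):
    """Blend local sensor readings (gap, relative speed) with the
    leader's radio-broadcast acceleration (the cooperative part)."""
    desired_gap = STANDSTILL + TIME_GAP * v_follower
    gap_error = gap - desired_gap
    rel_speed = v_leader - v_follower
    return kp * gap_error + kd * rel_speed + kff * a_leader_broadcast

def simulate(steps=600):
    """Leader brakes hard at t=10s; follower reacts via the broadcast.
    Returns the smallest gap seen during the run."""
    xl, vl = 20.0, 25.0   # leader position/speed
    xf, vf = 0.0, 25.0    # follower position/speed
    min_gap = xl - xf
    for i in range(steps):
        al = -4.0 if 100 <= i < 150 and vl > 0 else 0.0
        af = follower_accel(xl - xf, vf, vl, al)
        af = max(-6.0, min(2.0, af))   # actuator limits
        vl = max(0.0, vl + al * DT)
        vf = max(0.0, vf + af * DT)
        xl += vl * DT
        xf += vf * DT
        min_gap = min(min_gap, xl - xf)
    return min_gap

print("minimum gap during hard braking: %.1f m" % simulate())
```

Because the follower learns about the braking from the radio message rather than waiting to measure the gap shrinking, the gap stays positive throughout the manoeuvre in this sketch; drop the `kff` feedforward term and the same run closes the gap much further.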
Of course yes! (Score:5, Informative)
I would trust WiFi more than the tired trucker or the drunk driver in the other lane.
Re:so who do you blame? (Score:5, Informative)
Answer: Lots and lots of money spent on legal cases with uncertain outcomes.
This is part of the reason why people say we should have one road for human drivers and one for automated vehicles (which makes them so prohibitively expensive that it's not worth it). Basically, if there's an accident, the human "driver" of the vehicle is responsible, whether he was on cruise control or his ABS failed or whatever. You can still have that with automated cars, but I foresee an instant lawsuit as soon as something like that happens (in the style of the Toyota lawsuits) blaming the car.
And on an all-automated road, if you have an accident then it's *GOT* to be the automation's fault, right? So do you think the car companies and road companies are going to pick up the tab for the first 50-car pile-up? What about the associated traffic delays for the thousand people driving their automated cars just behind? Again, it gets prohibitively expensive and risky for the car/road companies to operate.
If you have an automated car on a "human" road, then the human has to be able to take over (seeing as he is the one responsible in case of a crash!), so it becomes a little bit like cruise control and also becomes 100% the driver's problem, even if the automation fails.
More interesting: can you get arrested for something like "driving without due care and attention" if you're the driver of an automated vehicle and do something else behind the wheel? If so (and current laws say "YES!"), you might as well just drive the damn thing yourself.
It's pretty much why these things are university projects and not actually on the road except in "tests" (and also things like the demonstration of two "crash-proof" automated Volvos a couple of months ago that, when aimed at each other head-on at 30mph, were supposed to stop before any possible collision; in front of the press they crashed about a dozen times and stopped once).
We've had the capability to remote-control and computer-control a car for YEARS. Hell, we do it with aeroplanes and oil tankers. But the fact of the matter is that we ALWAYS have a responsible human behind the wheel with the ability to take over, and if they take their eyes off the controls, they are deemed to be irresponsible (imagine if your airline pilot and his co-pilot both went to sleep and left it on auto). The problem is that the law, economics and common sense tell us it's a stupid thing to do.
You want an automated vehicle? Get on the London Docklands Light Railway. Entirely driver-less. But they had to put conductors on the trains to reassure passengers because occasionally the things get stuck and go wrong even though they are on rails. "a Passenger Service Agent (PSA), originally referred to as a "Train Captain", on each train is responsible for patrolling the train, checking tickets, making announcements and controlling the doors. PSAs can also take control of the train in certain circumstances including equipment failure and emergencies." Been in operation since 1987, can only travel on the rails, can't go past their stated safe speed, and you can have actual physical objects on the rails that activate brakes to avoid collisions and STILL they have a "driver".
Automated cars are like the "flying cars" of science fiction - yeah, it'd be cool, and we probably have the technology - but do you really want joy-riders flying over your house?
drivers (Score:4, Informative)
would you trust WiFi to drive your car?
Do I trust the drivers of the other cars?
Cars are these strange things that drive our minds crazy. I don't know how much of it is cultural (i.e. movies, etc.) and how much is psychological, but there are few areas in life where the disconnect between reality and subjective perception is so dramatic.
Everyone thinks he's an above-average driver. Of course, that can't be true of everyone.
Almost everyone overestimates his (or her) ability to handle a car in unusual circumstances.
Very few people can correctly judge road and weather conditions and their impacts on things like brake distance.
Most people do not have a correct sense of speed anymore if they've driven at speed for a few hours.
and so on and so forth. Car accidents are among the top causes of unnatural death in most western countries, yet most of us feel more uneasy going on a rollercoaster (which cause what, a dozen or so deaths a year, world-wide?) or on a plane (around 1,000 deaths per year, world-wide) than taking the car to work (1,200,000 deaths per year, world-wide). Yes, those are the real numbers; here [planecrashinfo.com] and here [autoblog.com] are some sources, or google your own. Plane crashes fall way below the rounding-error margin of car crashes.
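Using the commenter's own round numbers (order-of-magnitude figures, not precise statistics), the imbalance is easy to put in ratio form:

```python
# The comment's rough worldwide annual death figures (order of magnitude only).
deaths = {"rollercoasters": 12, "planes": 1_000, "cars": 1_200_000}

ratio = deaths["cars"] / deaths["planes"]
share = deaths["planes"] / deaths["cars"] * 100

print(f"car deaths are ~{ratio:.0f}x plane deaths per year")   # ~1200x
print(f"plane deaths are ~{share:.2f}% of car deaths")         # ~0.08%
```

So on these figures, flying would have to become over a thousand times deadlier before it matched driving, which is the "rounding error" point being made.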
Really, you would have to put truly bad engineers with prehistoric computer equipment and unstable wiring into those cars to make them worse than human drivers.
Re:so who do you blame? (Score:4, Informative)
Sell one car that lasts until breakdown but requires special roads, special taxation, special infrastructure, special laws, huge investment, extreme legal risk, engineering around even more patents, having every politician in your pocket, etc.
Or sell lots of cheaper cars that occasionally get dented/smashed up (but keep the driver intact, of course), profit from the spare-parts market (even if through patent licensing), require none of the above, and put almost all the risk on the driver.
The car market is already over-priced and struggling (i.e. the ENTIRE UK car market had to be bailed out by the government just a few years ago, and it's not the first time). Governments already spend billions on road infrastructure (where a road is a bit of tarmac with some paint on it, not an isolated, obstruction-free, electronically-enabled, few-travellers, risky multi-billion-dollar venture) and, believe it or not, serious road accidents are actually rare given the number of cars on the road (scale the number of air accidents up by the ratio of car journeys to plane journeys world-wide and see what happens!).
Additionally, human drivers speeding, parking in the wrong places, etc. are actually a HUGE source of income (not to mention driver's licenses, driving schools, insurance, etc.). Until the economics vastly change, it ain't gonna happen. If we see it in my lifetime, I will be hugely impressed at the amount of administrative and economic crap we've had to remove to get to that point. And to be honest, I don't particularly want it either.
Re:so who do you blame? (Score:4, Informative)
So probably no difference when it comes to responsibility towards other drivers.
Most interesting questions:
- Would it be OK to be drunk in a fully computer-driven car? (where the driver's seat is just occupied by a passenger)
- Would it be OK for someone without a driver's license to use one of these cars?
- In case of an accident, assuming the computer was driving, do car owners take a hit on their driving license if they have one?
- If the car is a rental or a loan, how is responsibility divided between the car owner, insurance company, driver, etc.?
- While we are at it, if cars are really able to drive themselves, do they actually need to have a human passenger at all? Can I send my car to my mom's to pick something up and come back?
Anyway, obviously self-driving cars would have a shitload of systems logging data from external sensors, so it would actually be easier to find out exactly what happened in case of an accident, particularly if more than one car is involved and you have two sets of data to compare.
About mixing human and computer drivers, I'm not worried. I have no reason to believe that if the guy in the next lane is driving drunk and suddenly steers towards me, I would have a better chance of handling it than a computer. I'd say the computer would actually react faster and with better control than I would. Some accidents are simply inevitable, by the way; under some external circumstances there's no way at all to prevent them (even if you could replay the thing over and over). If I'm involved in one, I prefer to make sure its effects are minimized by a computer that knows what it's doing.
Re:'bout time! (Score:4, Informative)
It's still used, but it's difficult to get right. The problem is that traffic lights are not just random obstructions on a road, as they are in the picture in the Wikipedia article, they are used for junctions or pedestrian crossings. In both cases, you can often avoid the light turning red at all if there are no people waiting to cross the road, but that breaks the wave at the next set of traffic lights. If it's a junction, then you have a bigger synchronisation problem, because there are multiple independent paths between two sets of lights, and defining a wave in one segment may decrease the overall efficiency.
When I was bored a few years ago, I wrote some code to try to find the optimal traffic light timings for a portion of Salt Lake City (where I was at the time; it has a very regular grid pattern, which makes it easy to model) to maximise total throughput. The results were quite counterintuitive (and very different from the traffic light timings they were using).
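The green-wave idea the parent describes can be illustrated on a single one-way arterial (this is a toy sketch with made-up numbers, not the parent's actual Salt Lake City code): if intersections are spaced a distance d apart and traffic progresses at speed v, starting each light's green d/v later than the previous one lets a platoon that catches one green catch them all.

```python
# Toy green-wave offsets for a one-way arterial.
# Hypothetical values: 400 m blocks (Salt Lake City's grid blocks are
# famously large), 50 km/h progression speed, 60 s signal cycle.

BLOCK = 400.0            # metres between intersections
SPEED = 50 / 3.6         # progression speed in m/s
CYCLE = 60.0             # signal cycle length in seconds

def offsets(n_lights):
    """Start-of-green offset (seconds, modulo the cycle) for each light,
    so a car leaving light 0 at the start of green arrives at every
    later light just as it turns green."""
    travel = BLOCK / SPEED               # 28.8 s to cover one block
    return [round((i * travel) % CYCLE, 1) for i in range(n_lights)]

print(offsets(5))   # prints: [0.0, 28.8, 57.6, 26.4, 55.2]
```

Even this trivial case hints at why the full problem is hard: the offsets wrap around the cycle, the wave only works in one direction at a time, and as soon as cross-streets at a junction need their own waves the offsets become a coupled optimisation over the whole grid rather than a per-street calculation.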