GM's Cruise so Far: A Crash, and 60 RoboTaxis 'Disabled' After Losing Server Contact (thedrive.com)
On June 2nd California approved General Motors' Cruise robotaxi service. The Drive describes an accident that happened the next day:
The autonomous car made an unprotected left turn and was hit by a Toyota Prius on June 3, though the accident wasn't reported until Wednesday. When reached for comment by The Drive, the San Francisco Police Department explained that the Cruise vehicle had three passengers, all in the backseat, while the Prius had two occupants in total.... According to the incident report Cruise filed with the California DMV, the Cruise taxi was making a green light left turn from Geary Boulevard onto Spruce Street in downtown San Francisco. It began the turn and stopped in the middle of the intersection, presumably noticing the Toyota headed for it. The Prius then hit the right rear of the Chevy Bolt.
Cruise explained that afterward, "occupants of both vehicles received medical treatment for allegedly minor injuries." GM's incident report points out the Prius was speeding at the time of the accident, and was in the right turn lane before heading straight and hitting the Bolt. SFPD told The Drive that "no arrest or citation was issued at the time of the initial investigation," which is still ongoing. The National Highway Traffic Safety Administration has opened up a special crash investigation into the accident, but there are no public results yet.
Wired reports: In response to that crash, Cruise temporarily reprogrammed its vehicles to make fewer unprotected left turns, according to internal messages seen by WIRED. At an internal meeting Jeff Bleich, Cruise's chief legal officer, said the company was investigating the incident, according to a recording reviewed by WIRED. He also warned employees not working on that investigation to try and tune out crashes or related news reports, saying they were unavoidable and would increase in frequency as the company scaled up its operations. "We just have to understand that at some point this is now going to be a part of the work that we do, and that means staying focused on the work ahead," he said.
Wikipedia's entry for Cruise notes a few other incidents: In April 2022, the San Francisco Police Department stopped an empty (operating without any human safety attendants) Cruise AV for driving at night without its headlights on.... Also in April 2022, an empty Cruise AV blocked the path of a San Francisco Fire Department truck responding to a fire.
But Wired also reports on a more troubling incident that happened "around midnight" on June 28th: Internal messages seen by WIRED show that nearly 60 vehicles were disabled across the city over a 90-minute period after they lost touch with a Cruise server. As many as 20 cars, some of them halted in crosswalks, created a jam in the city's downtown in an incident first reported by the San Francisco Examiner and detailed in photos posted to Reddit....
The June 28 outage wasn't Cruise's first. On the evening of May 18, the company lost touch with its entire fleet for 20 minutes as its cars sat stopped in the street, according to internal documentation viewed by WIRED. Company staff were unable to see where the vehicles were located or communicate with riders inside. Worst of all, the company was unable to access its system which allows remote operators to safely steer stopped vehicles to the side of the road.
A letter sent anonymously by a Cruise employee to the California Public Utilities Commission that month, which was reviewed by WIRED, alleged that the company loses contact with its driverless vehicles "with regularity," blocking traffic and potentially hindering emergency vehicles. The vehicles can sometimes only be recovered by tow truck, the letter said. Images and video posted on social media in May and June show Cruise vehicles stopped in San Francisco traffic lanes seemingly inexplicably, as the city's pedestrians and motorists navigate around them.
What we have here is a failure to communicate. (Score:5, Insightful)
If they stop dead when losing connection to a server, they're not autonomous, are they?
Re: (Score:2)
Can we make a perfect autonomous driving car?
"Sure!"
Are humans involved in creating it at any level?
"Uh. Yeah?"
Are humans infallible?
"Uh...."
Basically trying to build such a system is always going to be throwing dice.
Granted, we have some groups that know how to throw the dice quite well. But even they fuck up now and again.
I'd rather trust my own driving skills than hope to God a piece of fallible TECH isn't going to go ape-shit.
Re: (Score:2, Interesting)
I'd rather trust my own driving skills than hope to God a piece of fallible TECH isn't going to go ape-shit.
Unfortunately, your own driving skills are of limited use against somebody else's dangerous experiments. We no longer have a choice when it comes to sharing the road with Autonomous Vehicles. Within our lifetimes we may also lose the right to actually drive a motor vehicle; we'll get to choose between being chauffeured by an algorithm, and walking. I was going to add bicycling at the end there, but with AVs everywhere? Fuck no!
Re: (Score:2)
Re: (Score:2)
If there are enough incidents attributable to this highly flawed technology, they'll revoke their right to public roads.
Great! When are they going to do the same for the humans who get into far more accidents?
What's "Enough"? Apparently Florida alone has like 6 fatalities a day in motor vehicle accidents.
Re: (Score:2)
The problem as I see it is we
Re: (Score:2)
I'd rather trust my own driving skills
The problem is I don't trust you, and you (if you had any sense at all) would not trust me. The difference between you and a computer is that you make mistakes while computers follow their programming. Imagine if, after every crash investigation, we could push an update to every car to address the specific circumstances of that crash; which is to say, if you have an accident with someone else, I become a better driver. Crashes would be eliminated.
Your post is also inconsistent. You complain about humans, but in your la
Re: What we have here is a failure to communicate. (Score:2)
Re: (Score:2, Insightful)
No, the point is that autonomous drivers AREN'T BETTER than human drivers.
Humans have had hundreds of thousands of years to build up the coordination and skill to use their bodies to drive a car.
We've had over a century of cars.
And we have enough nanny systems already in place to make us more responsible drivers.
And people want to toss it out because "Look! We ran it a hundred times on an empty track and it didn't hit anything!"
And humans can make decisions in situations outside their experience.
Autonomous
Re: (Score:2)
To be clear, current autonomous driver systems are not better than humans who haven't been drinking. Waymo likes to compare their cars to humans including drunk drivers, to make themselves look better.
Re: (Score:2)
Oddly enough, drunk drivers are the ones I think would be forced into autonomous cars first.
Re: (Score:2)
I don't care about that. I just want Waymo and Tesla to start comparing like vs like instead of using their data as propaganda.
Drunk drivers are human drivers (Score:2)
To be fair, despite all our efforts we haven't been able to remove human drunk drivers from our roads, so I don't think it counts as propaganda to include them in the set of "human drivers".
Just like it wouldn't be fair to remove the bottom 20% of human drivers from the stats, even though it'd make human drivers look a lot better.
Re: (Score:2)
We can remove all drunk drivers from our roads right now by requiring a breathalyzer test to start a car.
Saying "an autonomous vehicle drives as well as humans" and including drunk drivers in that statistic without mentioning it is misleading. Given Google's propensity to publish statistics as propaganda, it is most likely intentionally misleading.
Re: What we have here is a failure to communicate (Score:2)
No, people want to toss it out because over a million people die in car crashes every year.
Re: (Score:2)
Clearly neither of us knows what the exact situation was, but as described, stopping might have been a good decision.
If the driver was in the right-hand turn lane and went straight unexpectedly, you have two choices: 1) stop your left-hand turn and hopefully let them pass in front of you, or 2) hit the accelerator and try to squeal through your turn so they pass behind you.
My high school driving instructor certainly would have recommended the first option.
Re: (Score:2)
You can SUE ME!
Good luck trying to successfully sue the maker of your autonomous driver.
Re: (Score:2)
This would be the main benefit to letting an AV drive me around, the limited liability. Of course, I clearly expect the AV maker to have a clause that somehow puts all the liability on me, the passenger, and not on them.
Shame we don't have a government for the people. If the AV is supposed to be a better driver, I shouldn't need insurance (since I'm not driving personally), and if the car gets in an accident, the AV maker should take on all the liability since it was driving.
Re: (Score:2)
So you want to be held accountable for a failure in technology.
Okaaay.
Re: (Score:2)
moving car needs network? and can die in unsafe sp (Score:2)
moving car needs network? and can die in unsafe space.
Just wait for one to stop on
railroad tracks
drawbridge
death valley
big uphill / downhill
blind curve
one lane road
etc..
Re: moving car needs network? and can die in unsaf (Score:2)
Seems like it should simply go into "police presence mode" if the connection is lost and pull over to await instructions instead of just stopping dead in its tracks, but maybe that's not possible? *shrugs*
Re: (Score:2)
Re: What we have here is a failure to communicate. (Score:2)
The crash seems fine but... (Score:4, Insightful)
The crash seems fine; I mean, unprotected left turns lead to crashes all the time with real drivers. In fact they would only stop happening if you didn't have any human drivers at all, so don't expect that to happen soon.
However, the vehicles just stopping in their tracks if they lose contact with a server? WTF design is that? Are they autonomous or not? If they are, surely they can safely find a spot on the right to stop and not block traffic? Why do they need remote access to drive them to such a spot? Loss of communication is inevitable; even if GM were more competent (they don't seem to be at all), you can't avoid it 100% of the time, especially with wireless, as we all know.
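To make the point above concrete, here is a minimal sketch of what a connectivity-loss fallback could look like if the vehicle degraded to a locally executed minimal-risk maneuver instead of halting in the travel lane. The class, method names, and timeout here are hypothetical illustrations, not Cruise's actual software.

```python
import time

# Hypothetical connectivity watchdog: if the fleet server goes silent, fall back
# to a locally executed minimal-risk maneuver rather than stopping dead in the lane.
class ConnectivityWatchdog:
    def __init__(self, vehicle, timeout_s: float = 5.0):
        self.vehicle = vehicle              # assumed to expose local driving primitives
        self.timeout_s = timeout_s
        self.last_heartbeat = time.monotonic()

    def on_server_heartbeat(self) -> None:
        # Called whenever a message from the fleet server arrives.
        self.last_heartbeat = time.monotonic()

    def tick(self) -> None:
        # Run periodically from the vehicle's control loop.
        if time.monotonic() - self.last_heartbeat > self.timeout_s:
            if self.vehicle.can_reach_shoulder():
                self.vehicle.pull_over_and_park()   # preferred: clear the travel lane
            else:
                self.vehicle.stop_with_hazards()    # last resort: stop in place, hazards on
```

Whether the real system lacks such a fallback or simply failed to execute it isn't stated in the reporting; the sketch only illustrates the fail-safe behavior the commenters are asking about.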
Re: (Score:2)
However, the vehicles just stopping in their tracks if they lose contact to a server?
How about vehicles braking on their own [screenrant.com] in the middle of highways when they don't even use contact with a server to drive?
Re: (Score:3)
lol who is the dumbshit who decided to call it phantom braking? That would be when it looks like it's going to stop, but doesn't. But yes, this is going to keep happening until Tesla uses more than cameras to determine depth, along with incidents where the vehicles don't stop when they should [fox35orlando.com]. Humans are bad at guessing at depth quickly. Teslas are even worse. They don't have the benefit of analog processing.
well GM will hard push the high cost data plan and (Score:2)
Well, GM will hard-push the high-cost data plan and say you don't want your car to stop when it picks up a Mexican or Canadian tower and you only have the basic US-only data plan.
Report as submitted by GM (Score:2)
A Cruise autonomous vehicle ("Cruise AV") operating in driverless autonomous mode, was traveling eastbound on Geary Boulevard toward the intersection with Spruce Street. As it approached the intersection, the Cruise AV entered the left hand turn lane, turned the left turn signal on, and initiated a left turn on a green light onto Spruce Street. At the same time, a Toyota Prius traveling westbound in the rightmost bus and turn lane of Geary Boulevard approached the intersection in the right turn lane. The Toyota Prius was traveling approximately 40 mph in a 25 mph speed zone. The Cruise AV came to a stop before fully completing its turn onto Spruce Street due to the oncoming Toyota Prius, and the Toyota Prius entered the intersection traveling straight from the turn lane instead of turning. Shortly thereafter, the Toyota Prius made contact with the rear passenger side of the Cruise AV. The impact caused damage to the right rear door, panel, and wheel of the Cruise AV. Police and Emergency Medical Services were called to the scene, and a police report was filed. The Cruise AV was towed from the scene. Occupants of both vehicles received medical treatment for allegedly minor injuries.
https://www.dmv.ca.gov/portal/file/cruise_060322-pdf/ [ca.gov]
Glad they're not experimenting in Canada (Score:2)
My first thought was about what would have happened had these pseudo-autonomous vehicles been on the road here yesterday when Rogers cellular and internet service disappeared. All those cars just stopped dead wherever they happened to be, waiting for the mother-ship connection to be restored.
Somebody needs to acquaint GM engineers with the concept of "fail-safe" - it seems they've never heard of it.
Re: (Score:3)
Re: (Score:2)
Thank goodness we are not migrating to something like 5G.
Re: (Score:2)
My first thought was about what would have happened had these pseudo-autonomous vehicles been on the road here yesterday when Rogers cellular and internet service disappeared.
Much hilarity, my friend. Much hilarity would have ensued.
The fix is in (Score:4, Interesting)
There's too much money to be made by making paid drivers unnecessary. Autonomous vehicles are coming, like it or not. Any inconvenient safety problems will be dealt with by having tame judges and legislators define them out of existence.
Hang onto your hats, it's going to be quite a ride.
Re: (Score:2)
I hope so. I hope judges and legislators ban human drivers who go straight from a turn lane and hit other cars (whoever, or whatever, may be driving them).
Humans suck at driving. The yearly death toll is clear evidence that we shouldn't be driving cars.
Re: (Score:2)
Sensors can be redundant and have failsafes. Human eyesight is more likely to fail suddenly from various things, anything from sun in the eyes to a health condition. Humans are unreliable, subject to health conditions, distraction, rage, and stupidity. There have been many fatal accidents caused by people having a heart attack or blacking out while driving.
Crash was not fully AI fault (Score:2)
If a car is in the right turn lane, I would probably try to make an unprotected left in front of them as well...
However, I don't think I would have got into an accident, as (a) if I saw a car going 40 mph in a right turn lane I would assume they are not turning, and (b) I would not have stopped once I had committed to the turn; I would floor it, assuming the car might not turn, but only if I felt sure I had plenty of time to make it.
Honestly though a lot of drivers would not have considered
Re: (Score:2)
Honestly though a lot of drivers would not have considered that so there I think we can say the AI was just behaving as a reasonable average driver would.
This is where AI could really shine though. Right now they pay attention to road markings and cars that exist, but they could go beyond that - they could learn to classify some cars as potentially greater hazards.
One big factor in favor of AI driving is that we can learn from the mistakes and take steps to prevent them from happening again. In contrast, we've got many years of evidence that there's no way to stop human drivers from making the same stupid mistakes over and over again.
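As a toy sketch of the "flag atypical vehicles as greater hazards" idea from the comment above: the fields, thresholds, and scoring below are invented for illustration and are not any vendor's actual logic.

```python
from dataclasses import dataclass

# Hypothetical tracked-object record; real perception stacks expose far more state.
@dataclass
class TrackedVehicle:
    speed_mph: float
    lane_type: str        # e.g. "through", "right_turn", "bus"
    signaling_turn: bool

def risk_score(v: TrackedVehicle, speed_limit_mph: float) -> float:
    """Crude heuristic: flag vehicles whose observed behavior contradicts their lane."""
    score = 0.0
    if v.speed_mph > speed_limit_mph * 1.3:
        score += 0.5                      # well over the limit
    if v.lane_type == "right_turn" and not v.signaling_turn and v.speed_mph > 25:
        score += 0.5                      # behaving as if it will go straight
    return score

# A car doing 40 in a 25 zone, sitting in a right-turn lane without signaling:
print(risk_score(TrackedVehicle(40, "right_turn", False), 25))  # -> 1.0
```

A planner that treated such a vehicle as likely to go straight, rather than trusting the lane marking, might have delayed the turn entirely instead of stopping mid-intersection.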
Re: (Score:2)
Tesla cars keep making the same stupid mistakes again and again because humans keep making the same stupid mistake again and again, and letting them leave the factory without a better way to determine depth to an object than estimating from visual data alone.
The software can't do anything about the problem if continually hampered by stupid humans.
Re: (Score:2)
I think the lesson here isn't about self-driving cars, but rather GM's quality control program.
It's time to lay some traps (Score:2)
It's time to lay some traps [techcrunch.com].
Not good, but also not a real problem. (Score:2)
dead zones and tunnels that may have no network (Score:2)
Dead zones and tunnels that may have no network, or a weak network, will be bad, and people could use a cell blocker to hijack auto-drive trucks.
Re: (Score:2)
dumb (Score:2)
I would have expected GM to roll this out someplace other than a crowded city like San Francisco. There are lots of smaller communities that would have welcomed it, and they could have worked out the bugs there before trying this.
accidents & severity per trip/distance (Score:5, Insightful)
There is no perfect system. I don't know how good this or any other autonomous system is; it's definitely early. What I really care about is how many accidents there are and how severe they are, per distance, per environment, i.e., a direct comparison in city or highway driving conditions vs. a human. If it's even on par, let it be. Humans aren't getting any better at driving; our algorithms are fixed. An autonomous car that matches humans, however, can improve with time.
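As a minimal sketch of the per-distance comparison the poster is asking for, the snippet below normalizes crash counts by miles driven in each environment; all input numbers are made-up placeholders, not real fleet data.

```python
# Normalize crash counts by exposure (miles driven) so fleets of very different
# sizes can be compared. All figures below are hypothetical placeholders.
def crashes_per_million_miles(crashes: int, miles: float) -> float:
    return crashes / (miles / 1_000_000)

fleets = {
    "human-driven, city": {"crashes": 120, "miles": 50_000_000},  # hypothetical
    "autonomous, city":   {"crashes": 3,   "miles": 1_200_000},   # hypothetical
}

for name, d in fleets.items():
    rate = crashes_per_million_miles(d["crashes"], d["miles"])
    print(f"{name}: {rate:.2f} crashes per million miles")
```

A severity weighting (fatality vs. injury vs. property damage) would go on top of this, but the exposure normalization is the part most headline comparisons skip.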
Why do autonomous vehicles need a server? (Score:2)
Why do autonomous vehicles need a server?
Tesla Vehicle Safety Report (Score:2)
Not GM Cruise autopilot but a (better) system:
https://www.tesla.com/VehicleS... [tesla.com]
https://cleantechnica.com/2021... [cleantechnica.com]
“In the 2nd quarter, we recorded one crash for every 4.41 million miles driven in which drivers were using Autopilot technology (Autosteer and active safety features). For drivers who were not using Autopilot technology (no Autosteer and active safety features), we recorded one crash for every 1.2 million miles driven. By comparison, NHTSA’s most recent data shows that in the United St
Re: (Score:2)
Chances are very good that Autopilot would not work where those drivers were not using it, because the car was in conditions that Autopilot simply cannot handle. I suspect the comparison between Autopilot roadways and human-only roadways is comparing apples and oranges.
Re: (Score:2)
We don't deny that these cars may be safer in the big picture, but the accidents they cause will be novel and unique to them.
Re: (Score:2)
"In the 2nd quarter, we recorded one crash for every 4.41 million miles driven in which drivers were using Autopilot technology (Autosteer and active safety features). For drivers who were not using Autopilot technology (no Autosteer and active safety features), we recorded one crash for every 1.2 million miles driven. By comparison, NHTSA's most recent data shows that in the United States there is an automobile crash every 484,000 miles."
Autopilot only works on the highway, so if you are comparing Autopilot use you had better use metrics that make sense; otherwise your numbers communicate nothing.
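For what it's worth, here is the quick arithmetic on the figures quoted above; it only restates the ratios implied by Tesla's published numbers and does nothing to correct for the highway-versus-surface-street mix the parent comment objects to.

```python
# Miles between crashes, as quoted in the thread (Tesla Q2 safety report figures).
miles_per_crash = {
    "Tesla, Autopilot engaged": 4_410_000,
    "Tesla, Autopilot off":     1_200_000,
    "NHTSA, all US driving":      484_000,
}

baseline = miles_per_crash["NHTSA, all US driving"]
for label, miles in miles_per_crash.items():
    # How many times farther than the US average each group goes between crashes.
    print(f"{label}: {miles / baseline:.1f}x the US-average miles between crashes")
```

The engaged/off gap (roughly 9x vs. 2.5x the national average) is exactly where the like-for-like objection bites, since Autopilot miles are overwhelmingly highway miles.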
Suicidal robocar? (Score:2, Insightful)
From TFA : It began the turn and stopped in the middle of the intersection, presumably noticing the Toyota headed for it
What? The robocar looked at the situation, judged that it was safe for it to make the left turn, and began the maneuver. Next, the robocar detects the oncoming Prius and immediately halts in the path of the Prius?
WTF?
Is the robocar too shortsighted to be driving because it cannot accurately judge oncoming traffic such as the prius?
Is the robocar able to sense the surroundings correctly b
Re: (Score:3)
Maybe you missed it, but the Prius was in a right turn lane; the Cruise waited for it to turn.
The rest of your post becomes pretty much off topic.
Re: Suicidal robocar? (Score:2)
From TFA : It began the turn and stopped in the middle of the intersection, presumably noticing the Toyota headed for it
How is that 'waiting for the prius' except in the sense of 'getting into the danger area and waiting to be hit'?
Re: (Score:3)
"was in the right turn lane before heading straight and hitting the Bolt"
Can you at least be grounded in reality?
Re: (Score:3)
"was in the right turn lane before heading straight and hitting the Bolt"
RTFsummary
Self driving is getting good, but has big problems (Score:2)
I am beta testing Tesla's FSD, and it is a good driving-assistance tool. However, it is a lot like driving with a teenager with a learner's permit: it can do most things well, but when there is a slightly odd condition the car will not react as one would expect, and that is where I take over.
network addicted. (Score:2)
Fun background fact: I grew up on a farm. When the calves were really hungry they tended to stick their tongue out as far as they could while making a dopey expression, as the only thing on their mind was getting the rubber cap of that bottle in their mouth so they could eat.
The longer it took to feed them, the more frantic and single-minded they'd become, and the more they would thrash and struggle and the tongue would flop around as they scrambled for food. "Need it NOW!" *mlalamlamluamlam*
That's how I see
Re: (Score:2)
Oldie but goodie (Score:3)
Bill Gates on Car Industry
I got this particularly good email in my inbox today.
For all of us who feel only the deepest love and affection for the way computers have enhanced our lives, read on.
At a recent computer expo (COMDEX), Bill Gates reportedly compared the computer industry with the auto industry and stated,
‘If GM had kept up with technology like the computer industry has, we would all be driving $25 cars that got 1,000 miles to the gallon..’
In response to Bill’s comments, General Motors issued a press release stating:
If GM had developed technology like Microsoft we would all be driving cars with the following characteristics (and I just love this part):
1. For no reason whatsoever, your car would crash twice a day.
2. Every time they repainted the lines in the road, you would have to buy a new car.
3. Occasionally your car would die on the freeway for no reason. You would have to pull to the side of the road, close all of the windows, shut off the car, restart it, and reopen the windows before you could continue. For some reason you would simply accept this.
4. Occasionally, executing a maneuver such as a left turn would cause your car to shut down and refuse to restart, in which case you would have to reinstall the engine.
5. Macintosh would make a car that was powered by the sun, was reliable, five times as fast and twice as easy to drive - but would run on only five percent of the roads.
6. The oil, water temperature, and alternator warning lights would all be replaced by a single 'This Car Has Performed An Illegal Operation’ warning light.
7. The airbag system would ask 'Are you sure?’ before deploying.
8. Occasionally, for no reason whatsoever, your car would lock you out and refuse to let you in until you simultaneously lifted the door handle, turned the key and grabbed hold of the radio antenna.
9. Every time a new car was introduced car buyers would have to learn how to drive all over again because none of the controls would operate in the same manner as the old car.
10. You’d have to press the 'Start’ button to turn the engine off.
Sourced from https://www.joelscanlon.com/po... [joelscanlon.com]
Re:Take them off the streets, now; right now. (Score:5, Insightful)
Last year 42,000 people were killed, just in the USA (one million worldwide), in HUMAN-driven vehicle accidents. And that's not counting the number of people horribly disabled in accidents. Why are we allowing humans to drive vehicles? Why aren't we questioning the logic of allowing humans to drive cars? Humans are too stupid, unpredictable, and distractible to drive vehicles. Sorry, but that's a fact.
The only reason humans are being allowed to drive is that the media conspires to not report as headlines every fatal human-driven car accident. If we sensationalize and put every fatal car accident as a headline on the major news outlets I guarantee people will be afraid to drive anywhere.
Autonomous vehicles are the only way to reduce the number of fatal, or maiming, car accidents. If we can reduce the number of fatal accidents from 42,000 to 41,999 by going fully autonomous, we should do that. And if it means experimenting, and failing, on the road to get there, yes, we should do that. If early rocket engineers and their funders had chickened out at the first rocket failure, we would never have gone into space or landed on the moon. By that standard, we should have banned human-driven vehicles after the first fatal accident, and we should have banned airplanes after the first fatal airplane accident in 1908, when Thomas Selfridge died in the Wright Model A.
Clarifying your point... (Score:5, Insightful)
If we can reduce the number of fatal accidents to 41,999 from 42,000 by fully autonomous we should do that.
I get your point, but I don't think others will when you say only one life would be saved by upending the entire automobile-industry paradigm. I think your point should be this: let's aim for a number at least a couple of orders of magnitude larger, shall we?
I think the big-picture problem doesn't have to do with the number of lives being saved, or what it will cost to get there, but rather who to blame when accidents involving autonomous vehicles happen, because they undoubtedly still will. The insurance industry will need concrete rules to follow for determining liability for claims, but so far, nobody's worked out that part of the equation.
criminal liability, can you get an DUI in one? (Score:2)
Criminal liability (can you get a DUI in one?) needs to be worked out as well.
Re: (Score:2)
It will not be the car manufacturers I tell you what. I am sure they will find a way to either not pay at all or make some other party take a fall.
Re: (Score:2)
Well, to be honest, an awful lot of those accidents are caused by humans who are NOT allowed to drive a vehicle: drunks (or otherwise impaired) and texters. And a very large part of the rest of the accidents are caused by inexperienced drivers who, frankly, drive a lot like autonomous vehicles. What do I mean by that? They pay too much attention to 'rules', and when they encounter a situation where they don't have a good rule they have no idea what to do, so they do something incredibly stupid. Like seeing th
Re: (Score:2)
Ok, so? How about this excuse for autonomous then: "Well, to be honest, an awful lot of those accidents are caused by inadequate sensors and programming that didn't account for the situation." The difference is the human distracted driver will never fix himself, whereas the computer code and hardware will improve in every iteration.
Re: (Score:2)
You completely missed the point. You asked why we allow humans to drive. And the fact is, a large group of accident causers are in fact NOT allowed to drive.
Which leaves us with the group of humans who ARE allowed to drive, but cause the most accidents - inexperienced drivers. And autonomous cars drive exactly like them. Fortunately, even though inexperienced drivers have the most accidents there are relatively few of them on the road.
SDC advocates have been saying for a least a decade how superior SDC
Re: (Score:2)
"SDC advocates" .. which ones? First off, they actually ARE superior already and have been getting better. Besides not all SDC advocates speak for everyone. Most sane experts knew it will take some time -- and even if they didn't .. so what? Besides, it has not been even a decade! Did Robert Goddard in 1913 know all the challenges that building a rocket that can go to Mars will have? It took nearly 200 years to go from the first long-range high-performance metal rockets of Mysore, India to landing a man on
Re: (Score:2)
Which SDC advocates claim that SDCs will be superior to human drivers? The ones like you. Unless you mean that the death rate for human-driven vehicles means those vehicles should just not be used.
Self-driving cars only drive themselves in the easiest, safest situations. That means that saying they're safer than human drivers on the whole is comparing apples to oranges.
Re: (Score:2)
In other words, you have no clue whether they are safer or not, yet you claim to have knowledge that a computer with a sensor array can never drive better than a human and preserve more lives.
Re: (Score:2)
This whole article is about ways that self-driving cars make mistakes that people would not. You are the one who needs to show that they're safer, and you can't do that with the current evidence.
Instead, you make straw men to argue against. I never claimed that a computer with a sensor array can never drive better than a human -- I think they eventually will, but do not today. Until we get to Level 5 automation, where the car never needs a human to take over, there will be cases where humans drive better
Re: (Score:2)
I did show that. The statistics and numbers are there and presented. You keep denying them like an anti-vaxxer; what can I do about that?
https://www.tesla.com/VehicleS... [tesla.com]
Re: (Score:2)
You're still comparing apples to oranges.
Re: (Score:2)
Repeating something over and over doesn't make it true. Autonomous vehicles are safer; a few cherry-picked accidents don't disprove that. Where are your numbers? One thing is sure: 40,000 deaths are caused by human drivers, and even you admit that someday computers can be capable of doing better than that. Well, how will such cars ever get deployed if you never allow them to get developed? You will keep claiming they haven't been proven safe, no matter what. You won't even allow them to be put on ro
Re: (Score:2)
Tesla provides Level 2 autonomy -- which is very limited driver assistance. It only handles the easiest kind of driving, so it damn well better have a lower accident rate than overall driving.
Developing safe autonomous driving technology is hard. It requires lots of simulation and controlled environments. What it doesn't require is putting unqualified software into the real world where it will drive people into trucks, or highway barriers, or the other kinds of things that autonomous vehicles tend to do.
Re: (Score:2)
Re: (Score:2)
If we can reduce the number of fatal accidents to 41,999 from 42,000 by fully autonomous we should do that. And if it means experimenting, and failing on the road to get there. Yes we should do that.
We can reduce the number of deaths immediately by 10,000 merely by requiring a breathalyzer test before starting the car.
Do you support that? If not, your position is illogical and you should rethink your ideas.
Re: (Score:2)
Yes, I do 100% support the breathalyzer for everyone. Except for myself, since I know I won't drive drunk, but I don't get to make the rules do I?
Re: (Score:2)
Last year 42,000 people were killed, just in the USA (one million worldwide), in HUMAN-driven vehicle accidents. And that's not reporting the number of people horribly disabled in accidents. Why are we allowing humans to drive vehicles?
Who else would drive them? Kangaroos?
Why aren't we questioning the logic of allowing humans to drive cars?
We are way too busy. Too many places to go.
The only reason humans are being allowed to drive is that the media conspires to not report as headlines every fatal human-driven car accident. If we sensationalize and put every fatal car accident as a headline on the major news outlets I guarantee people will be afraid to drive anywhere.
Perhaps you should be questioning the logic of venturing outside. But before you do, turn off the electricity to your building immediately. There are 50k electrical fires in the US alone every year. It's not safe.
Autonomous vehicles are the only way to reduce the number of fatal, or maiming, car accidents. If we can reduce the number of fatal accidents to 41,999 from 42,000 by fully autonomous we should do that.
Autonomous vehicles don't exist.
And if it means experimenting, and failing on the road to get there. Yes we should do that.
No we shouldn't. It's irresponsible and there is no reason for it.
If early rocket engineers and their funders chickened out at the first rocket failure we would never have gone into space or landed on the moon.
This unfalsifiable rhetoric is right up there with yelling YOLO before doing something insanely stupid.
That said, we should have banned human-driven vehicles after the first fatal accident. We should have banned airplanes after the first fatal airplane accident in 1908 when Thomas Selfridge died in the Wright Model A.
For the love of
Re:Take them off the streets, now; right now. (Score:4, Informative)
Re: (Score:2)
What human driver, on seeing that someone is about to t-bone them, would STOP in the middle of the intersection, all but guaranteeing an accident? The car was hit in the rear, which means if it kept going, or even sped up, it would have avoided the accident.
Re: (Score:3, Informative)
RTFA. The autonomous car started to make a left turn, saw an oncoming vehicle, and stopped to avoid being T-boned. Then a speeding human-driven car behind it rear-ended the stopped driverless car.
Re: (Score:3)
Re: (Score:3)
That is not what it says. It says it started to make the left turn, saw the Prius coming at it, stopped, and was hit by the Prius in the rear passenger side. If it had kept going (like the Prius driver no doubt expected it to do) the accident would not have occurred.
The Prius driver was wrong, no doubt about it. But a large part of driving is avoiding accidents, even when the other driver is at fault. And a large part of driving involves driving in a manner that other drivers expect (eg not stopping in a
Re: (Score:2)
The Prius driver was wrong, no doubt about it.
It is not that clear-cut. The Cruise could be cited for "failure to yield" because it entered but failed to clear somebody else's right-of-way or the Prius could be cited for "assured clear distance" because one is not allowed to just run into somebody that is in your intended path. Likely, this will come down to determining if the Cruise had good reason to stop and if the Prius had sufficient time to react.
Re: (Score:2)
It won't be long before we are comfortable with autonomous cars, and they will be far safer and more consistent than humans in short order.
One day someone will freak out that they cause 5,000 accidents per year when only 3 Billion Americans use them.
Anyway, that's not GM and not today. But it's funny how people want to give up on it because there are accidents. If it were the other way around, NOBODY would be giving humans keys to anything.
Asking robot overlords for the car keys; "Oh come on, I only fall asleep f
Re: (Score:2)
What human driver, on seeing that someone is about to t-bone them, would STOP in the middle of the intersection, all but guaranteeing an accident?
You're making assumptions. Lots of them:
a) you're assuming that the stop was the cause of the t-bone and that the car moving would have changed anything.
b) you're assuming that a human would identify that they are being t-boned. In nearly every case the first thing they notice is that they are spinning sideways.
c) you're assuming that a human is capable of thinking and making a split second emergency decision. We're not. Even in simulators where people know something is about to surprise them we have proven that human
Re: (Score:2)
Re: (Score:2)
Agree. It's like we turned a pack of drunk drivers loose on the streets. I can't believe it's actually legal for these "experiments" to be driving on our roads.
Re: (Score:2)
Drunk drivers are on the road all the time, and you don't give a shit, so stop this BS. Autonomous vehicles already work fine and better than humans; that's good enough to justify them being on the road. Even if they are 1% better, that justifies putting them on the road.
Re: (Score:2)
Autonomous vehicles work fine and better than humans already
There is no such thing as an autonomous vehicle. All self-driving cars require a human to babysit them and make sure they don't do something stupid, which, for anyone who has read Tesla forums, is all the time.
FFS, the cruise control barely works without random nuisance braking.
Re: Take them off the streets, now; right now. (Score:2)
Re: (Score:2)
I agree. A human here made a mistake (and broke the law) and hit a driverless car. We should ban humans from the streets.
Re: (Score:2)
Pff, hysterical much? I think that if you simply gave autonomous cars guns, everyone would be safer and it would reduce deaths.
Indeed (Score:2)
Take them off the streets!
Indeed, we need to get people-driven cars off the streets as soon as practical.
After all, it wasn't the AI driven car that caused the accident in this case - it was the human driven one that was both speeding and violating other rules of the road by going straight in a right turn lane.
Odds are, if the AI car had been human driven we'd have still seen an accident, but as part of like 6 Million [ddlawtampa.com] car accidents a year, it'd go unnoticed.
Re: (Score:2)
We should ban alcohol and weed also. Maybe caffeine and Tylenol as well. All can be considered mind-altering drugs. We should also ban sugar and probably salt as well. Oh and meat and dairy because those industries kill us with their food products also. Probably most produce as well since I just saw an article that mentions some 80% of USA waste-water samples show us all with cancer agents caused by roundup, a weed killer used on all our crops and animal feed.
Turns out, life is not a safe experience and all
Re: Winter NorthEast Driving (Score:2)
Re: (Score:2)
Re: (Score:2)
Re: Waste of time (Score:2)
You dumb ass, the big car companies such as GM will be the ones dominating the self-driving car market by 2020. Tesla will be bankrupt and consigned to the dustbin of history.