After Low-Speed Bus Crash, Cruise Recalled Software for Its Self-Driving Taxis in March (sfchronicle.com) 89
San Francisco autonomous vehicle company Cruise recalled and updated the software of its fleet of 300 cars, reports the San Francisco Chronicle, "after a Cruise taxi rear-ended a local bus when the car's software got confused by the articulated vehicle, according to a federal safety report and the company."
The voluntary report notes that Cruise updated its software on March 25th. After last month's low-speed crash, which resulted in no injuries, Cruise CEO Kyle Vogt said the company chose to conduct a voluntary recall, and that the software update ensured such a rare incident "would not recur...." As for the March bus collision, Vogt said the software fix was uploaded to Cruise's entire fleet of 300 cars within two days. He said the company's probe found the crash scenario "exceptionally rare," with no other similar collisions.
"Although we determined that the issue was rare, we felt the performance of this version of software in this situation was not good enough," Vogt wrote in a blog post. "We took the proactive step of notifying NHTSA that we would be filing a voluntary recall of previous versions of our software that were impacted by the issue." The CEO said such voluntary recalls will probably become "commonplace."
"We believe this is one of the great benefits of autonomous vehicles compared to human drivers; our entire fleet of AVs is able to rapidly improve, and we are able to carefully monitor that progress over time," he said.
The Cruise car was traveling about 10 miles per hour, and the collision caused only minor damage to its front fender, Vogt's blog post explained. San Francisco's buses have front and back coaches connected by articulated rubber, and when the Cruise taxi lost sight of the front half, it assumed the bus was still moving (rather than recognizing that the back coach had stopped). Or, as Cruise told the National Highway Traffic Safety Administration, its vehicle "inaccurately predicted the movement" of the bus. It was not the first San Francisco incident involving Cruise since June, when it became the first company in a major city to win the right to taxi passengers in driverless vehicles — in this case Chevrolet Bolts. The city's Municipal Transportation Agency and County Transportation Authority recorded at least 92 incidents from May to December 2022 in which autonomous ride-hailing vehicles caused problems on city streets, disrupting traffic, Muni transit and emergency responders, according to letters sent to the California Public Utilities Commission....
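The failure mode described here, dead-reckoning an occluded object forward at its last known velocity, can be sketched in a few lines. This is purely illustrative: the function and every number below are invented, not taken from Cruise's software.

```python
# Illustrative only: constant-velocity extrapolation of an occluded object.
# All names and numbers are invented; this is not Cruise's actual code.

def predict_position(last_pos_m: float, last_speed_mps: float, dt_s: float) -> float:
    """Dead-reckon a tracked object forward, assuming constant velocity."""
    return last_pos_m + last_speed_mps * dt_s

# Front coach last seen 20 m ahead at 4.5 m/s (~10 mph), then occluded.
# Two seconds later the rear coach is actually stopped 15 m ahead, but the
# extrapolation says the bus has pulled farther away.
predicted = predict_position(20.0, 4.5, 2.0)  # 29.0 m
actual_rear_m = 15.0                           # the rear coach is stationary

phantom_clearance_m = predicted - actual_rear_m
print(phantom_clearance_m)  # 14.0 m of gap that does not exist
```

If the planner trusts the extrapolated position instead of the measured position of the visible rear coach, it keeps rolling forward, which matches the "inaccurately predicted the movement" wording in the federal filing.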
Just two days before the Cruise crash in March, the company had more problems with Muni during one of San Francisco's intense spring storms. A falling tree brought down a Muni line near Clay and Jones streets on March 21, and a witness reported on social media that two Cruise cars drove through caution tape into the downed wire. A company representative said neither car had passengers and teams were immediately dispatched to remove the vehicles.
On Jan. 22, a driverless Cruise car entered an active firefighting scene and nearly ran over hoses. Fire crews broke a car window to try to stop it.
self driving unsafe at any speed! (Score:2, Troll)
self driving unsafe at any speed!
Re: (Score:3, Informative)
So are humans. Self-driving cars don't have to be perfect, though they will be damn close to it. They only need to be safer than human drivers to save lives. Human drivers hit buses on a daily basis.
Re:self driving unsafe at any speed! (Score:4, Interesting)
Let's add that each time an exception is identified, all cars in the fleet improve, and I hope the companies involved are sharing training data for safety. If so, all self-driving car companies will improve whenever one company faces an exception.
I wonder if the government has the option to regulate the self-driving car companies based on their response times to new anomalies.
For example, whenever a self-driving car company encounters an issue which isn't already in a government-regulated series of unit/integration tests, the company encountering the issue should be required, within 24 hours, to upload a new integration test and training data; then, as with CVEs, every self-driving company should be required to provide updates and publish response times for the new issue. Severity should be considered as well, with "crossing police tape" as high severity and "high risk of running over a politician or lawyer" as optional to fix.
Overall, I very much look forward to getting "good drivers" off the road. I think we should reach a point where possession of a driver's license is extremely expensive and metered for use. In other words, I look forward to a time when we can charge human drivers a considerable amount for each kilometer driven. I feel this way because people who feel strongly that they are excellent drivers and that driving is fun are precisely the people on the road who scare me most.
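The CVE-style process suggested above could look something like this minimal sketch; every class, field, and company name here is hypothetical:

```python
# Hypothetical sketch of a CVE-style registry for AV anomalies, as proposed
# above. All names here are invented for illustration.
from dataclasses import dataclass, field
from datetime import datetime, timedelta

@dataclass
class AnomalyReport:
    anomaly_id: str
    description: str
    severity: str                 # e.g. "high" for "crossed police tape"
    reported_at: datetime
    # company name -> timestamp of its published fix
    responses: dict = field(default_factory=dict)

    def overdue(self, company: str, deadline_hours: int = 24) -> bool:
        """True if the company has not published a fix within the deadline."""
        deadline = self.reported_at + timedelta(hours=deadline_hours)
        fixed_at = self.responses.get(company)
        if fixed_at is None:
            return datetime.now() > deadline
        return fixed_at > deadline

report = AnomalyReport("AV-2023-0001",
                       "drove through caution tape into a downed wire",
                       "high", datetime(2023, 3, 21, 12, 0))
report.responses["ExampleAV Co"] = datetime(2023, 3, 21, 20, 0)
print(report.overdue("ExampleAV Co"))  # False: fixed within 24 hours
print(report.overdue("Laggard Co"))    # True: no response on file
```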
Re: (Score:1)
Re:self driving unsafe at any speed! (Score:5, Informative)
Slashdot seems to feel they still have enough room to list the entire history of accidents of said cars from multiple companies on the blurb.
That's not true at all. The summary briefly mentions that San Francisco saw 92 incidents in eight months last year. All the specific incidents mentioned were Cruise cars from this year. There have been other accidents in other places, and even at least one attempt to blame humans [theverge.com] for mistakes made by automation. Self-driving cars, even with all the limits put on their deployment and use, still have about twice the accident rate (9.1 vs. 4.1 per million miles). They have a long way to go before they can replace human drivers.
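For what it's worth, the arithmetic on those figures, taking the quoted 9.1 and 4.1 accidents per million miles at face value:

```python
# Taking the post's quoted figures at face value; not independently verified.
av_rate = 9.1     # self-driving accidents per million miles
human_rate = 4.1  # human-driver accidents per million miles

ratio = av_rate / human_rate
print(round(ratio, 2))  # 2.22, i.e. "about twice" the human rate
```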
Re: self driving unsafe at any speed! (Score:2)
Re: (Score:2)
Re: self driving unsafe at any speed! (Score:2)
Re: (Score:2)
Re: (Score:2)
TFS said "incidents", but that was the only place that mentioned any other companies, so I assumed that was what the GP comment was referring to. Whether we count "accidents" or "incidents", there continue to be many beyond what TFS mentioned, so the GP comment is wrong either way.
Re: (Score:2)
And that is just it: as soon as self-driving cars are significantly safer and work well (not far off), insurers in Europe will force adoption through massively higher rates for human-driven cars, and lawsuits in the US will do the same. The only reason human drivers (except for professional expert drivers) are even allowed on the streets is the lack of a better alternative. That is about to go away.
Re: self driving unsafe at any speed! (Score:2)
Re:self driving unsafe at any speed! (Score:4, Informative)
So are humans. Self-driving cars don't have to be perfect, though they will be damn close to it. They only need to be safer than human drivers to save lives. Human drivers hit buses on a daily basis.
They even run down children [cbsnews.com] when school bus lights are flashing to let the kids off, just like humans do. Here's another case where the vehicle doesn't stop for vehicles [cbsnews.com] with their flashing lights on.
At least Cruise is willing to do the right thing and figure out why this happened whereas Tesla appears to think killing people is the price to pay for "autonomous" driving.
Re: (Score:2)
We are in the middle of a twice-a-century trend where we think technology is reaching a point where its ability to automate will be too much for us to deal with, take jobs, and destroy civilization as we know it.
The cotton gin was invented in the late 1700s in the hope it would reduce the need for slaves in the American South, as it eliminated a time-consuming job. However, it ended up creating more slavery into the 1800s, because the increased capacity for cotton production grew the plantations, so they still purch
Re: (Score:2)
With each iteration quality of life for most people improved .. but it became VERY bad for people who couldn't adapt. That was/is the price. That's why government has to provide a safety net. It can't be a ridiculous safety net though. I feel like there should be a lifetime limit on how much direct cash you can get from the federal government. If you have medical issues, the treatment costs should be directly reimbursed to the clinic/hospital providing the care.
Re: (Score:2)
The safety net issue isn't that it is too easy to stay in, but that it is too hard to get out.
Cutting the cord on someone who needs assistance will not push them to pull up their bootstraps and get over themselves. No, they will just be in poverty, and turn to crime to survive. You are probably not going to teach a 55-year-old coal miner to write software and leave their 6-figure job for a 5-figure entry-level job, if they can even get a job.
Re: (Score:2)
Nope. This particular self-driving presents a low risk in some situations. No need to be an ass.
Re: (Score:2)
self driving unsafe at any speed!
"Thank you for using Johnny Cab!!"
Re: self driving unsafe at any speed! (Score:2)
Re: (Score:2)
It comes down to a stupid driver who is paying full attention to their driving vs. a smart driver who will get distracted, tired, and tunnel-focused.
I have a Tesla with FSD Beta. I am actually safer driving with it on than off. However, it will do stupid things, and it has difficulty with particular maneuvers, which may piss off other drivers. With a long commute that can take over an hour, it is really easy for me to zone out, and having the car take over while I monitor it is actually much easier
We're not there yet. (Score:4, Interesting)
Re: (Score:3, Insightful)
It's clear the technology isn't to the point where we can have fully self-driving vehicles on the road
Wait 'til you find out how many crashes were caused by humans today. You'll be down at the DMV protesting within five minutes.
Re: (Score:3)
What if you take into account that vehicles in full self-driving mode have covered only a minuscule fraction of the total car mileage traveled today, and have done so only in the simplest of conditions?
Re:We're not there yet. (Score:4, Funny)
What if you take into account that with a simple software update we can make an entire fleet of "drivers" magically better while humans have shown to be useless and incapable of improving sometimes even in the most basic conditions.
Re: (Score:2)
Self-driving car companies and their shills have been claiming that's possible for years, and they still have higher accident rates per mile than humans, even when limited to largely low-speed, good-visibility driving conditions on well-known roads.
https://injuryfacts.nsc.org/mo... [nsc.org] shows how much safer human drivers have gotten over time. Your claim that humans are "useless and incapable of improving" is a total lie.
Re: (Score:2)
Self-driving car companies and their shills have been claiming that's possible for years
That's because it *is* possible. Have you not heard of the concept of a software update?
If you think that a single software update is all that is needed to address every bug and edge case, may I suggest you try using this thing called a "computer". It may give you some much-needed insight as to how technology, especially software updates, works.
shows how much safer human drivers have gotten over time
Nope. It shows how much safer cars have gotten over time. Quite specifically, some of the greatest advancements we've made in vehicle safety have been through means of ha
Re: (Score:3)
A simple software update can also make an entire fleet of "drivers" suddenly lethally dangerous. Software update quality is dreadful for just about everything these days.
Re: (Score:2)
Software update quality is dreadful for just about everything these days.
Don't confuse your Google Assistant or Windows Update with a car. There's an order of magnitude more testing involved in the latter. Perfect? No, but far from "dreadful".
Re: (Score:2)
> What if you take into account that with a simple software update we can make an entire fleet of "drivers" magically better while humans have shown to be useless and incapable of improving sometimes even in the most basic conditions.
I'm still not going to go down and protest at the DMV.
Re: (Score:2)
Tesla Autopilot is installed in millions of cars that have collectively driven tens of billions of miles.
Re: (Score:2)
Tesla software is not currently allowed to take full (unsupervised) control of the vehicle, even in "Full Self Driving" mode, and it still causes [nypost.com] lots of accidents.
Re: (Score:2)
Tesla Autopilot is installed in millions of cars that have collectively driven tens of billions of miles.
Tesla is driving on highways, which have far fewer collisions on a per-mile basis. And they still manage to crash into semis and ambulances.
Re: (Score:2)
Tesla Autopilot is installed in millions of cars that have collectively driven tens of billions of miles.
Yes, and according to the Department of Transportation [dot.gov], there were close to 300 million personal and commercial vehicles registered to drivers in the United States in 2020. Tesla's roughly 3 million vehicles (worldwide, not just in the U.S.) is a very small fraction of those. And I'd like to know where your claim of "tens of billions of miles" driven by Tesla cars came from, although I have a good idea you're sitting on it. Just 1 billion miles divided between 3 million vehicles is well over 300 million m
Re: (Score:2)
Tesla's roughly 3 million vehicles (worldwide, not just in the U.S.) is a very small fraction of those. And I'd like to know where your claim of "tens of billions of miles" driven by Tesla cars came from, although I have a good idea you're sitting on it. Just 1 billion miles divided between 3 million vehicles is well over 300 million miles per vehicle, and you're claiming tens of billions of miles.
Missed the sarcasm tag, so just in case: 1 million cars doing 1,000 miles per month is 1 billion miles per month. Worldwide, Tesla has 4 million-plus sales; in the US, between 1 and 2 million. So yes, billions of miles. Tesla publishes safety data with and without Autopilot, and FSD beta improves with each iteration.
In G-d We trust. Everyone else - bring data.
Re: (Score:2)
Re: (Score:2)
Just 1 billion miles divided between 3 million vehicles is well over 300 million miles per vehicle
Is this supposed to be a joke?
I can't believe anyone is really that bad at math.
Re: (Score:2)
Just 1 billion miles divided between 3 million vehicles is well over 300 million miles per vehicle
Is this supposed to be a joke?
I can't believe anyone is really that bad at math.
Yeah, mistakes happen. I'd just woken up and hadn't had my first cup of coffee yet, so I divided by 3 instead of 3 million. The rest of the post is valid, however: the number of miles driven by autonomous vehicles is far, far less than those driven by humans.
Re: (Score:3)
Tesla Autopilot is installed in millions of cars that have collectively driven tens of billions of miles
In much the same way Internet Explorer is installed on millions of PCs
Re: (Score:2)
> Tesla Autopilot is installed in millions of cars that have collectively driven tens of billions of miles.
True. My example was about mileage driven today. The total miles "autopilot" has driven is an even smaller percent of the total miles driven by cars.
Re: (Score:2)
Wait 'til you find out how many crashes were caused by humans today. You'll be down at the DMV protesting within five minutes.
Sorry your child got run over by our self-driving truck. Just imagine how much worse it could've been with a human driver!
Re: (Score:1)
It's only a misdemeanor to beat the crap out of a robot.
Re: (Score:2)
It's clear the technology isn't to the point where we can have fully self-driving vehicles on the road
Wait 'til you find out how many crashes were caused by humans today. You'll be down at the DMV protesting within five minutes.
Wait 'til you discover the number of autonomous vehicles on the road vs. the number of human-driven vehicles, and then compare the crash rates between the two. You'll find the autonomous-vehicle accident rate far, far higher than that of human drivers.
We're past there (Score:3)
Re: (Score:2)
There are still zero cars available with level 4 or 5 autonomy, and zero cars sold with level 3 autonomy in most countries. Level 3 still requires an attentive human driver at all times, so your claim is bollocks.
Re: (Score:2)
Re: (Score:3)
It's clear the technology isn't to the point where we can have fully self-driving vehicles on the road, especially mixed with non-self-driving vehicles.
If your goal is perfection, then we won't ever have fully self-driving vehicles on the road, then. Realistically, the only way to test the technology is to put it out on the roads and see how it behaves in the real world. You'll never catch all the corner cases in simulated drives, nor even close.
That said, the self-driving software ignored pretty much all of its sensor data, including brake light data and positioning, in favor of believing that the front part of an articulated bus must still be moving be
Re: (Score:2)
Re: (Score:2)
As long as the owner of the vehicle is financially and legally responsible for the damage it causes, I certainly do need it to be perfect. If insurance and DMV don't hold me responsible for the mistakes the driver makes when I'm not driving then I don't care as much.
In this case, the owner of the vehicle is GM. GM's Cruise self-driving tech isn't available for sale. All the vehicles are fleet vehicles (robotaxis).
Re: (Score:2)
Re: (Score:3, Insightful)
Bullshit. Self-driving only has to be significantly better than humans. It very likely already is. Claims like yours that indicate self-driving has to be perfect are just insightless nonsense. The very reason this basically non-story made the news is that accidents like this are extremely rare for self-driving. Compare that to the level of damage a human driver has to do to make the news.
Re: (Score:3)
Bullshit. Self-driving only has to be significantly better than humans.
And we're nowhere near that point yet. The percentage of autonomous accidents per vehicle is a lot higher than those vs human-driver, per vehicle type on the road.
Claims like yours that indicate self-driving has to be perfect are just insightless nonsense.
Please tell me where in my post I claim self-driving vehicles has to be perfect. Imagine the carnage on the road if the number of autonomous vehicles on the road were comparable to the number of human-driven vehicles, and the percentage of accidents for the autonomous vehicles were the same as it is now.
I stand by my statement that we are not at
Re: (Score:2)
And we're nowhere near that point yet. The percentage of autonomous accidents per vehicle is a lot higher than those vs human-driver, per vehicle type on the road.
Really depends on the metric. Yours is bullshit.
Re: (Score:2)
And we're nowhere near that point yet. The percentage of autonomous accidents per vehicle is a lot higher than those vs human-driver, per vehicle type on the road.
Really depends on the metric. Yours is bullshit.
Care to explain yourself, or do you often spew out Faux newsworthy crap?
Re: (Score:2)
Nope. Just this: "Percentage of autonomous accidents per vehicle" is pure nonsense and has no meaning.
Re: (Score:2)
Let me clear it up a bit. The percentage of accidents per vehicle type is higher for autonomous vehicles than for human driven vehicles, given the number of each on the road.
I'm sorry it was so confusing to you, but the last part of that sentence should've cleared it up.
Re: We're not there yet. (Score:2)
Better question (Score:3)
The real question is: How many human drivers rear-ended buses in that same time period?
Re: (Score:3)
The real question is: How many human drivers rear-ended buses in that same time period?
That is the wrong metric. A better metric is the number and severity of accidents per kilometer driven for SDVs and HDVs.
But it is not reasonable to lump Waymo, Tesla, Cruise, and others together. They should be evaluated separately. Tesla appears to be far ahead of Cruise, with Waymo somewhere in between. Of course, Tesla got where it is by putting a lot of cars on the road and collecting a lot of data.
Re: (Score:2)
Re: (Score:2)
Re: (Score:1)
Shouldn't the AI recognize the snippet as a "bus" and assume it's larger than a car? Or did they skip using neural nets due to their unpredictability?
Re: (Score:1)
My grandson bought one of these electric self driving EVs and the gas tank exploded and burnt it to the ground. Stick with a BBC!
Huh? An EV with no gas tank, and the gas tank exploded???
Re the article: wouldn't the brake lights of the bus be on if it was stopped? How come it didn't see them...
Re: (Score:2)
Maybe it was just following too close. 99.9% of human drivers follow a turning vehicle too closely while going straight, maybe the programmers/training made the self driving vehicle behave the same.
It needs moar Lidars! (Score:5, Funny)
I am sure this is Elon Musk's fault somehow. It just has to be.
Compared to human drivers (Score:3)
Re: (Score:3)
I completely agree. And I am pretty sure that insurers (Europe) and lawsuits (US) will force adoption in the not-too-distant future. Human drivers are, on average, _bad_. Most also think they are excellent, a nice application of the Dunning-Kruger effect.
Re: (Score:2)
Re: (Score:2)
I completely agree. And I am pretty sure that insurances (Europe) and lawsuits (US) will force adoption in the not too distant future.
Maybe. If you've spent much time driving in the same area with self-driving cars, you'll quickly get annoyed with them. If you've spent much time being a pedestrian in an area with self-driving cars, you'll also quickly get annoyed with them (why do Waymo cars stop in the middle of crosswalks?)
Re: (Score:2)
Human drivers (Score:5, Interesting)
1. Driving on the M25 motorway, a car in front of me in the middle lane suddenly turns right. 90 degrees, straight across the left lane, right into the fields along the motorway. I was just shocked. Didn't react in any way. Had I been in the left lane at the right position I'd have driven straight into him instead of braking, because this was just too unexpected.
2. On the motorway I was following a large van. Couldn't see past it. Suddenly I see the back of the van going up and its front wheels going down. So it looked like he was braking very very hard. Noticed it before I saw it slowing down. I braked hard, aiming to stop right behind the van, praying nobody. I'd be curious if a self driving car would have figured out the van was braking hard that quickly.
3. On a dual lane road, slowly overtaking a truck, a motorbike not far ahead. Suddenly the truck moves into my lane. I didn't know what was behind me, and there wasn't enough time to look, so I accelerated, past him, right up to the motorbike, then braked hard. (Wasn't the trucker's fault. I saw in the rear mirror that another parked truck had gone into _his_ lane.)
I wonder if self driving cars are at all prepared for that kind of thing. For example, do they keep track of vehicles and pedestrians around them? Do they know what manoeuvres are possible without risk?
Re: (Score:2)
Re: (Score:3)
On a dual lane road, slowly overtaking a truck, a motorbike not far ahead. Suddenly the truck moves into my lane. I didn't know what was behind me, and there wasn't enough time to look
You shouldn't have to look to know. You should already have looked, so you'd already know. No time? Too congested? Then it's not safe to pass.
AVs have the potential to be safer mostly because they don't have to do these things to begin with.
Re: (Score:2)
For #1 it seems like a self driving car would not have the "shock" or "too unexpected" human reactions, and would just do the best according to the algorithm. So likely the same as you if in the middle lane, and braking if in the left lane.
For #2, self-driving cars with LIDAR should be pretty good at seeing sudden changes of speed in a vehicle they are following; they wouldn't need the other visual cues.
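To make that concrete with a toy calculation (the numbers are made up): differencing two range returns a tenth of a second apart already yields the closing speed directly, with no visual pitch cue needed.

```python
# Toy example: closing speed from two successive range measurements.
# All numbers are invented for illustration.

def closing_speed_mps(range_t0_m: float, range_t1_m: float, dt_s: float) -> float:
    """Positive result means the gap to the lead vehicle is shrinking."""
    return (range_t0_m - range_t1_m) / dt_s

# Lead van 30 m ahead; 0.1 s later the gap is 29.2 m.
v_close = closing_speed_mps(30.0, 29.2, 0.1)
print(round(v_close, 1))  # 8.0 m/s of closure, flagged within 100 ms
```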
Re: (Score:2)
1. Driving on the M25 motorway a car in front of me in the middle lane suddenly turns right. 90 degrees, straight across the left lane, right into the fields along the motorway. I was just shocked. Didn't react in any way. Had I been in the left lane at the right position I'd have driven straight into him instead of braking because this was just too unexpected.
A machine would not have been shocked and could have reacted much faster than a human could. Though it may still have been impossible to do anything useful.
2. On the motorway I was following a large van. Couldn't see past it. Suddenly I see the back of the van going up and its front wheels going down. So it looked like he was braking very very hard. Noticed it before I saw it slowing down. I braked hard, aiming to stop right behind the van, praying nobody. I'd be curious if a self driving car would have figured out the van was braking hard that quickly.
You were wrong when you said that the drop in the front/rise in the back happened before deceleration. You just didn't notice the deceleration, because human depth perception isn't very good. A self-driving car using RADAR or LIDAR would have noticed the change instantly, before the change in vehicle angle, and reacted well before you could have. It also w
Exploitable. (Score:2)
Even if the system was working perfectly, and the car properly obeys every rule every time -- this is exploitable. Have someone perform a swoop-and-squat in front of it, box it in among other vehicles, and a human driver might recognize they're under attack and be willing to drive through other vehicles or humans. The car is going to sit there and let the occupants get murdered, or at best it's going to allow itself to be stolen rather than attempting an escape. The complete lack of situational awareness me
the secret service will not let an self drive syst (Score:2)
The Secret Service will not let a self-driving system trap someone like that.
So they may have a real driver, or have their AI in kill mode.
Re: (Score:2)
There are a lot more vulnerable targets than just government officials with Secret Service protection. Yes, driving through someone is hard, especially if they know how to set up a proper roadblock, but it may be better than facing hijackers with AKs at close range.
I personally had an incident where I was driving a friend's car because he had a headache. The first task was to drop his brother off at the brother's girlfriend's house, and in doing so, we started to be followed by a car we didn't recognize. Wh
Re: (Score:2)
Re: (Score:2)
It's not about "the last time you saw it happen" because the road hasn't been clogged with significant numbers of autonomous vehicles. This is a new threat, there is no history to refer to. As thieves optimize to corner and corral autonomous vehicles, those that are still being driven by real humans may find themselves in this situation.
Home invasion robberies developed almost "out of nowhere" too. It only takes one to come up with a plan that others can copy.
Re: (Score:2)
Sounds like a pretty silly algorithm (Score:2)
The article makes it sound like the predictive and kinematic parts of its self-driving algorithm trump any sort of fundamental logic, such as "do not proceed if there is an object in front of you."
If, on the other hand, the car's sensors did not even see the stationary bus, I have to wonder a bit...
Re: (Score:2)
Same old arguments (Score:2)
Those claiming that the human driver is terrible and must be terminated from driving, because self-driving eliminates the cause of accidents; and those who don't believe that to be the case, pointing to every accident as proof.
Digging below the easy arguments, we find a few things.
Driving a vehicle is very complicated. Driving through taped-off areas onto power lines might be an edge case, but in most/all cases, a functioning human would st
Wile E. Coyote (Score:2)
Self-driving cars have far less intelligence than Wile E. Coyote, who almost invariably crashes into a granite cliff with a road painted on it.
low-speed bus == "9600 serial" ? (Score:2)
talk about confusing terms!
This is one of the scariest things I have read (Score:3)
"As for the March bus collision, Vogt said the software fix was uploaded to Cruise's entire fleet of 300 cars within two days. He said the company's probe found the crash scenario "exceptionally rare" with no other similar collisions."
Seriously: a safety-critical piece of software was designed, coded, and QA tested in two days?
really?
Uhm, I get "move fast," but I would hope the simple automated tests alone would take two days, and focused testing on just the fix would take more, then integration testing and regression testing...