Transportation AI

After Low-Speed Bus Crash, Cruise Recalled Software for Its Self-Driving Taxis in March (sfchronicle.com)

San Francisco autonomous vehicle company Cruise recalled and updated the software of its fleet of 300 cars, reports the San Francisco Chronicle, after a Cruise taxi rear-ended a local bus "when the car's software got confused by the articulated vehicle," according to a federal safety report and the company.

The voluntary report notes that Cruise updated its software on March 25th. After last month's low-speed crash, which resulted in no injuries, Cruise CEO Kyle Vogt said the company chose to conduct a voluntary recall, with a software update to ensure such a rare incident "would not recur...." Vogt said the fix was uploaded to Cruise's entire fleet of 300 cars within two days, and that the company's probe found the crash scenario "exceptionally rare," with no other similar collisions.

"Although we determined that the issue was rare, we felt the performance of this version of software in this situation was not good enough," Vogt wrote in a blog post. "We took the proactive step of notifying NHTSA that we would be filing a voluntary recall of previous versions of our software that were impacted by the issue." The CEO said such voluntary recalls will probably become "commonplace."

"We believe this is one of the great benefits of autonomous vehicles compared to human drivers; our entire fleet of AVs is able to rapidly improve, and we are able to carefully monitor that progress over time," he said.

The Cruise car was traveling about 10 miles per hour, and the collision caused only minor damage to its front fender, Vogt's blog post explained. San Francisco's buses have front and back coaches connected by an articulated rubber joint, and when the Cruise taxi lost sight of the front coach, it assumed the bus was still moving rather than recognizing that the back coach had stopped. Or, as Cruise told the National Highway Traffic Safety Administration, its vehicle "inaccurately predicted the movement" of the bus.

It was not the first San Francisco incident involving Cruise since June, when it became the first company to win the right to taxi passengers in driverless vehicles (in this case, Chevrolet Bolts) in a major city. The city's Municipal Transportation Agency and County Transportation Authority recorded at least 92 incidents from May to December 2022 in which autonomous ride-hailing vehicles caused problems on city streets, disrupting traffic, Muni transit and emergency responders, according to letters sent to the California Public Utilities Commission....

Just two days before the Cruise crash in March, the company had more problems with Muni during one of San Francisco's intense spring storms. A falling tree brought down a Muni line near Clay and Jones streets on March 21, and a witness reported on social media that two Cruise cars drove through caution tape into the downed wire. A company representative said neither car had passengers and teams were immediately dispatched to remove the vehicles.

On Jan. 22, a driverless Cruise car entered an active firefighting scene and nearly ran over hoses. Fire crews broke a car window to try to stop it.

  • self driving unsafe at any speed!

    • Re: (Score:3, Informative)

      by backslashdot ( 95548 )

So are humans. Self-driving cars don't have to be perfect, though they will be damn close to it. They only need to be safer than human drivers to save lives. Human drivers hit buses on a daily basis.

      • by LostMyBeaver ( 1226054 ) on Monday April 10, 2023 @12:08AM (#63437656)
I think the main point is that, in such a short time, there are genuine self-driving taxis in service with driving records good enough that Slashdot feels it still has room to list the entire accident history of said cars, from multiple companies, in the blurb.

Let's add that each time an exception is identified, all cars in the fleet improve, and I hope the companies involved are sharing training data for safety. If so, all self-driving car companies will improve whenever one company encounters an exception.

I wonder if the government has the option to regulate the self-driving car companies based on their response times to new anomalies.

For example, whenever a self-driving car company encounters an issue that isn't already in a government-regulated suite of unit/integration tests, the company encountering the issue should be required to upload a new integration test and training data within 24 hours. Then, as with CVEs, every self-driving company should be required to provide updates and publish response times for the new issue. Severity should be considered as well: "crossing police tape" as high severity, and "high risk of running over a politician or lawyer" as optional to fix.
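The CVE-style scheme sketched above could be modeled roughly like this. This is a purely hypothetical sketch: the `Severity` tiers, identifiers, and the 24-hour deadline are the proposal from the comment, not any real regulation or API.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta
from enum import Enum

class Severity(Enum):
    """Proposed severity tiers, each mapped to a hypothetical response deadline."""
    HIGH = timedelta(hours=24)    # e.g. "crossing police tape"
    MEDIUM = timedelta(days=7)
    LOW = timedelta(days=30)

@dataclass
class Anomaly:
    """A newly observed failure mode, shared CVE-style across vendors."""
    anomaly_id: str
    severity: Severity
    reported_at: datetime

    def response_deadline(self) -> datetime:
        # The deadline is the report time plus the severity's allowed window.
        return self.reported_at + self.severity.value

    def is_overdue(self, now: datetime) -> bool:
        return now > self.response_deadline()

# Usage: a high-severity anomaly reported at noon must be answered within 24 hours.
a = Anomaly("AV-2023-0001", Severity.HIGH, datetime(2023, 3, 23, 12, 0))
print(a.response_deadline())                       # 2023-03-24 12:00:00
print(a.is_overdue(datetime(2023, 3, 24, 13, 0)))  # True
```

The point of the structure is only that response times become mechanically auditable, which is what would make publishing them meaningful.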

Overall, I very much look forward to getting "good drivers" off the road. I think we should reach a point where possession of a driver's license is extremely expensive and metered for use. In other words, I look forward to a time when we can charge human drivers a considerable amount for each kilometer driven. I feel this way because people who feel strongly that they are excellent drivers and that driving is fun are precisely the people on the road who scare me most.
        • by Anonymous Coward
          I've tried driving "by the book" and can attest to the difficulty of doing so. It will be interesting to see what actually driving by the book might look like if a majority of vehicles on the road are automated. I'd love to see the orchestra running at max speed. That would be cool. Zoom zoom!
        • by Entrope ( 68843 ) on Monday April 10, 2023 @06:49AM (#63437938) Homepage

Slashdot seems to feel they still have enough room to list the entire accident history of said cars from multiple companies in the blurb.

That's not true at all. The summary briefly mentions that San Francisco saw 92 incidents in eight months last year; all the specific incidents mentioned were Cruise cars from this year. There have been other accidents in other places, and even at least one attempt to blame humans [theverge.com] for mistakes made by the automation. Self-driving cars, even with all the limits put on their deployment and use, still have about twice the accident rate of human drivers (9.1 vs. 4.1 per million miles). They have a long way to go before they can replace human drivers.
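For what it's worth, the per-million-mile figures quoted above imply roughly a 2.2x gap. A back-of-envelope check (taking the 9.1 and 4.1 rates from the comment at face value; they are not independently verified here):

```python
# Crash rate per million vehicle miles traveled (VMT):
# rate = crashes / miles * 1,000,000
def crashes_per_million_miles(crashes: int, miles: float) -> float:
    return crashes / miles * 1_000_000

# Sanity check: 41 crashes over 10 million miles is a rate of 4.1.
print(crashes_per_million_miles(41, 10_000_000))  # 4.1

av_rate = 9.1     # self-driving crashes per million miles (as claimed)
human_rate = 4.1  # human-driver crashes per million miles (as claimed)

ratio = av_rate / human_rate
print(f"Self-driving rate is {ratio:.1f}x the human rate")  # 2.2x
```

Normalizing by miles driven (rather than by vehicle count or raw incident totals) is what makes the two populations comparable at all, which is the crux of the disagreement in this thread.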

          • Accident or incident? Stopping in the middle of a lane is not an accident, but is recorded as an incident.
            • Any stupid thing that a human doesn't do is an accident if it causes a collision.
Except every single stupid thing, like stopping in the middle of the lane, IS exactly what humans also do. I have seen it so many times. I even got in an accident due to a moron like that stopping in the middle of the road: while I had plenty of time to react and hit the brakes, my car didn't stop due to an oily/slippery road, and of course the f-ing moron just drove on and left the people behind him with his mess.
            • by Entrope ( 68843 )

              TFS said "incidents", but that was the only place that mentioned any other companies, so I assumed that was what the GP comment was referring to. Whether we count "accidents" or "incidents", there continue to be many beyond what TFS mentioned, so the GP comment is wrong either way.

      • by gweihir ( 88907 )

And that is just it: as soon as self-driving cars are significantly safer and work well (not far off), insurers in Europe will force adoption through massively higher rates for human-driven cars, and in the US lawsuits will do the same. The only reason human drivers (except for professional expert drivers) are even allowed on the streets is the lack of a better alternative. That is about to go away.

Insurance rates are based on the risk that the insurance company will have to pay out, not on relative risk. Unless human drivers suddenly become worse than they currently are, their insurance costs should not change. Indeed, if self-driving cars behave more safely, then human drivers are likely to have fewer accidents too, in terms of accidents per human driver.
      • by quonset ( 4839537 ) on Monday April 10, 2023 @06:47AM (#63437936)

So are humans. Self-driving cars don't have to be perfect, though they will be damn close to it. They only need to be safer than human drivers to save lives. Human drivers hit buses on a daily basis.

They even run down children [cbsnews.com] when bus lights are flashing to let the kids off, just like humans do. Another case where the vehicle doesn't stop for vehicles [cbsnews.com] with their flashing lights on.

        At least Cruise is willing to do the right thing and figure out why this happened whereas Tesla appears to think killing people is the price to pay for "autonomous" driving.

We are in the middle of a twice-a-century trend where we think technology is reaching a point where its ability to automate will be too much for us to deal with, take our jobs, and destroy civilization as we know it.

The cotton gin was invented in the late 1700s with the hope that it would reduce the need for slaves in the American South, since it eliminated a time-consuming job. However, it ended up increasing slavery into the 1800s, because the increased capacity for cotton production grew the plantations, so they still purch

        • With each iteration quality of life for most people improved .. but it became VERY bad for people who couldn't adapt. That was/is the price. That's why government has to provide a safety net. It can't be a ridiculous safety net though. I feel like there should be a lifetime limit on how much direct cash you can get from the federal government. If you have medical issues, the treatment costs should be directly reimbursed to the clinic/hospital providing the care.

The safety net issue isn't that it is too easy to stay in; it's that it is too hard to get out.
Cutting the cord on someone who needs assistance will not push them to pull themselves up by their bootstraps and get over it. No, they will just be in poverty and turn to crime to survive. You are probably not going to teach a 55-year-old coal miner to write software, leaving a six-figure job for a five-figure entry-level job, if they can even get a job.

    • by gweihir ( 88907 )

Nope. This particular self-driving system presents a low risk in some situations. No need to be an ass.

    • self driving unsafe at any speed!

      "Thank you for using Johnny Cab!!"

No it isn't, but it's still early days. And even now it's already impressive. Just as they say in the article, these self-driving vehicles can learn so much faster, on the real road, than any human can. The article lists a lot of incidents involving these vehicles in a rather short period of time, but compared to how many incidents are caused by real humans, it's a fraction of that. Real human drivers are way, WAY more responsible for crashes even with the state of the self driving software o
It comes down to a stupid driver who pays full attention to their driving vs. a smart driver who gets distracted, tired, and tunnel-focused.

I have a Tesla with FSD Beta. I am actually safer driving with it on than off. However, it will do stupid things and have difficulty with particular maneuvers, which may piss off other drivers. With a long commute that can take over an hour to drive, it is really easy for me to zone out, and having the car take over while I monitor it is actually much easier

  • We're not there yet. (Score:4, Interesting)

    by bjwest ( 14070 ) on Sunday April 09, 2023 @10:26PM (#63437598)
    It's clear the technology isn't to the point where we can have fully self-driving vehicles on the road, especially mixed with non-self-driving vehicles.
    • Re: (Score:3, Insightful)

      by Joce640k ( 829181 )

      It's clear the technology isn't to the point where we can have fully self-driving vehicles on the road

Wait 'til you find out how many crashes were caused by humans today. You'll be down at the DMV protesting within five minutes.

What if you take into account that vehicles in full self-driving mode have covered only a minuscule fraction of the total car mileage traveled today, and have done so only in the simplest of conditions?

        • by thegarbz ( 1787294 ) on Monday April 10, 2023 @03:40AM (#63437780)

What if you take into account that with a simple software update we can make an entire fleet of "drivers" magically better, while humans have shown themselves to be useless and incapable of improving, sometimes even in the most basic conditions?

          • by Entrope ( 68843 )

Self-driving car companies and their shills have been claiming that's possible for years, and they still have higher accident rates per mile than humans, even when largely limited to low-speed, good-visibility driving conditions on well-known roads.

            https://injuryfacts.nsc.org/mo... [nsc.org] shows how much safer human drivers have gotten over time. Your claim that humans are "useless and incapable of improving" is a total lie.

            • Self-driving car companies and their shills have been claiming that's possible for years

              That's because it *is* possible. Have you not heard of the concept of a software update?

If you think that a single software update is all that is needed to address every bug and edge case, may I suggest you try using this thing called a "computer". It may give you some much-needed insight into how technology, and especially software updates, work.

              shows how much safer human drivers have gotten over time

Nope. It shows how much safer cars have gotten over time. Quite specifically, some of the greatest advancements we've made in vehicle safety have been through means of ha

          • by _merlin ( 160982 )

            A simple software update can also make an entire fleet of "drivers" suddenly lethally dangerous. Software update quality is dreadful for just about everything these days.

            • Software update quality is dreadful for just about everything these days.

              Don't confuse your Google Assistant or Windows Update with a car. There's an order of magnitude more testing involved in the latter. Perfect? No, but far from "dreadful".

          • > What if you take into account that with a simple software update we can make an entire fleet of "drivers" magically better while humans have shown to be useless and incapable of improving sometimes even in the most basic conditions.

            I'm still not going to go down and protest at the DMV.

        • Tesla Autopilot is installed in millions of cars that have collectively driven tens of billions of miles.

          • by Entrope ( 68843 )

            Tesla software is not currently allowed to take full (unsupervised) control of the vehicle, even in "Full Self Driving" mode, and it still causes [nypost.com] lots of accidents.

          • Tesla Autopilot is installed in millions of cars that have collectively driven tens of billions of miles.

Teslas are driven on highways, which have far fewer collisions on a per-mile basis. And they still manage to crash into semis and ambulances.

          • by bjwest ( 14070 )

            Tesla Autopilot is installed in millions of cars that have collectively driven tens of billions of miles.

Yes, and according to the Department of Transportation [dot.gov], there were close to 300 million personal and commercial vehicles registered to drivers in the United States in 2020. Tesla's roughly 3 million vehicles (worldwide, not just in the U.S.) is a very small fraction of those. And I'd like to know where your claim of "tens of billions of miles" driven by Tesla cars came from, although I have a good idea you're sitting on it. Just 1 billion miles divided between 3 million vehicles is well over 300 million m

            • Tesla's roughly 3 million vehicles (worldwide, not just in the U.S.) is a very small fraction of those. And I'd like to know where your claim of "tens of billions of miles" driven by Tesla cars came from, although I have a good idea you're sitting on it. Just 1 billion miles divided between 3 million vehicles is well over 300 million miles per vehicle, and you're claiming tens of billions of miles.

Missed the sarcasm tag, so just in case: 1 million cars doing 1,000 miles per month is 1 billion miles. Worldwide, Tesla has 4 million plus sales; in the US, between 1 and 2 million. So yes, billions of miles. Tesla publishes its safety data with and without Autopilot. FSD Beta improves with each iteration.

              In G-d We trust. Everyone else - bring data.
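Plugging in the round numbers above (1 million cars at 1,000 miles per month; these are the commenter's estimates, not official Tesla figures), the arithmetic checks out, and the earlier division slip in the thread is easy to see:

```python
# Fleet mileage: cars times miles per car.
cars = 1_000_000
miles_per_car_per_month = 1_000

fleet_miles_per_month = cars * miles_per_car_per_month
print(fleet_miles_per_month)  # 1000000000 -> one billion miles per month

# The earlier error: 1 billion miles spread over 3 million vehicles
# is about 333 miles per vehicle, not "well over 300 million".
per_vehicle = 1_000_000_000 / 3_000_000
print(round(per_vehicle))  # 333
```

So "tens of billions of miles" is plausible for a multi-million-car fleet over several years, even though each individual car contributes only thousands of miles per month.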

              • by bjwest ( 14070 )
                Sorry, it was early and mathematical mistakes happen. The rest of the post is still valid.
            • Just 1 billion miles divided between 3 million vehicles is well over 300 million miles per vehicle

              Is this supposed to be a joke?

              I can't believe anyone is really that bad at math.

              • by bjwest ( 14070 )

                Just 1 billion miles divided between 3 million vehicles is well over 300 million miles per vehicle

                Is this supposed to be a joke?

                I can't believe anyone is really that bad at math.

Yeah, mistakes happen. I'd just woken up and hadn't had my first cup of coffee yet, so I divided by 3 instead of 3 million. The rest of the post is valid, however: the number of miles driven by autonomous vehicles is far, far less than the number driven by humans.

          • by thomn8r ( 635504 )

            Tesla Autopilot is installed in millions of cars that have collectively driven tens of billions of miles

            In much the same way Internet Explorer is installed on millions of PCs

          • > Tesla Autopilot is installed in millions of cars that have collectively driven tens of billions of miles.

            True. My example was about mileage driven today. The total miles "autopilot" has driven is an even smaller percent of the total miles driven by cars.

Wait 'til you find out how many crashes were caused by humans today. You'll be down at the DMV protesting within five minutes.

Sorry your child got run over by our self-driving truck. Just imagine how much worse it could've been with a human driver!

      • by bjwest ( 14070 )

        It's clear the technology isn't to the point where we can have fully self-driving vehicles on the road

Wait 'til you find out how many crashes were caused by humans today. You'll be down at the DMV protesting within five minutes.

Wait till you discover the number of autonomous vehicles on the road vs. the number of human-driven vehicles, and then compare the number of crashes, percentage-wise, between the two. You'll find the autonomous vehicle accident rate far, far higher than that of human drivers.

    • It's clear humans are unsafe to the point we can't have fully human driven vehicles on the road, especially mixed with self-driving vehicles.
      • by Entrope ( 68843 )

        There are still zero cars available with level 4 or 5 autonomy, and zero cars sold with level 3 autonomy in most countries. Level 3 still requires an attentive human driver at all times, so your claim is bollocks.

I agree with you. Fortunately, most new vehicles today aren't fully human-driven. Humans aren't very good at applying the brakes in an emergency, so many vehicles have a system to deploy them if a crash is imminent (although those systems aren't perfect), and once they are applied, there are anti-lock braking systems that prevent the tires from locking up and reducing the coefficient of friction. Also, there are now backup cameras and sonar to warn a driver if there is a small child or other unseen obstacle
    • by dgatwood ( 11270 )

      It's clear the technology isn't to the point where we can have fully self-driving vehicles on the road, especially mixed with non-self-driving vehicles.

If your goal is perfection, then we won't ever have fully self-driving vehicles on the road. Realistically, the only way to test the technology is to put it out on the roads and see how it behaves in the real world. You'll never catch all the corner cases in simulated drives, nor even close.

      That said, the self-driving software ignored pretty much all of its sensor data, including brake light data and positioning, in favor of believing that the front part of an articulated bus must still be moving be

      • As long as the owner of the vehicle is financially and legally responsible for the damage it causes, I certainly do need it to be perfect. If insurance and DMV don't hold me responsible for the mistakes the driver makes when I'm not driving then I don't care as much.
        • by dgatwood ( 11270 )

          As long as the owner of the vehicle is financially and legally responsible for the damage it causes, I certainly do need it to be perfect. If insurance and DMV don't hold me responsible for the mistakes the driver makes when I'm not driving then I don't care as much.

          In this case, the owner of the vehicle is GM. GM's Cruise self-driving tech isn't available for sale. All the vehicles are fleet vehicles (robotaxis).

    • Re: (Score:3, Insightful)

      by gweihir ( 88907 )

      Bullshit. Self-driving only has to be significantly better than humans. It very likely already is. Claims like yours that indicate self-driving has to be perfect are just insightless nonsense. The very reason this basically non-story made the news is that accidents like this are extremely rare for self-driving. Compare that to the level of damage a human driver has to do to make the news.

      • by bjwest ( 14070 )

        Bullshit. Self-driving only has to be significantly better than humans.

        And we're nowhere near that point yet. The percentage of autonomous accidents per vehicle is a lot higher than those vs human-driver, per vehicle type on the road.

        Claims like yours that indicate self-driving has to be perfect are just insightless nonsense.

Please tell me where in my post I claim self-driving vehicles have to be perfect. Imagine the carnage on the road if the number of autonomous vehicles were comparable to the number of human-driven vehicles, and the percentage of accidents for the autonomous vehicles were the same as it is now.

I stand by my statement that we are not at

        • by gweihir ( 88907 )

          And we're nowhere near that point yet. The percentage of autonomous accidents per vehicle is a lot higher than those vs human-driver, per vehicle type on the road.

          Really depends on the metric. Yours is bullshit.

          • by bjwest ( 14070 )

            And we're nowhere near that point yet. The percentage of autonomous accidents per vehicle is a lot higher than those vs human-driver, per vehicle type on the road.

            Really depends on the metric. Yours is bullshit.

            Care to explain yourself, or do you often spew out Faux newsworthy crap?

            • by gweihir ( 88907 )

              Nope. Just this: "Percentage of autonomous accidents per vehicle" is pure nonsense and has no meaning.

              • by bjwest ( 14070 )

Let me clear it up a bit. The percentage of accidents per vehicle type is higher for autonomous vehicles than for human-driven vehicles, given the number of each on the road.

                I'm sorry it was so confusing to you, but the last part of that sentence should've cleared it up.

We're already at the point where it is, especially with all the crappy human drivers. But yeah, there are still a lot of improvements to be made to make them even safer, and those can only happen by having these vehicles on the actual road, so that real-life incidents happen and the system can learn from them. Of course, IMHO it should also be mandatory for these companies to learn from the daily reports by insurance companies and police reports of actual human incidents.
  • by Joce640k ( 829181 ) on Sunday April 09, 2023 @10:26PM (#63437600) Homepage

    The real question is: How many human drivers rear-ended buses in that same time period?

    • The real question is: How many human drivers rear-ended buses in that same time period?

      That is the wrong metric. A better metric is the number and severity of accidents per kilometer driven for SDVs and HDVs.

      But it is not reasonable to lump Waymo, Tesla, Cruise, and others together. They should be evaluated separately. Tesla appears to be far ahead of Cruise, with Waymo somewhere in between. Of course, Tesla got where it is by putting a lot of cars on the road and collecting a lot of data.

How the heck would Tesla be ahead of Cruise? Unless by "ahead" you mean killing the most people. A Tesla could not serve as a driverless taxi like a Cruise vehicle can. I don't think even Musk makes such a ridiculous claim. How did this get modded up when it puts Tesla in the same category as Waymo?
    • I don't think many human drivers would assume that the back half of the bus doesn't exist because the front half isn't visible.
      • by Tablizer ( 95088 )

        Shouldn't the AI recognize the snippet as a "bus" and assume it's larger than a car? Or did they skip using neural nets due to their unpredictability?

  • by AlanObject ( 3603453 ) on Monday April 10, 2023 @12:30AM (#63437670)

    I am sure this is Elon Musk's fault somehow. It just has to be.

  • by battingly ( 5065477 ) on Monday April 10, 2023 @12:30AM (#63437672)
    These aren't perfect, but I'll take them over the insane human drivers I see on the road every day, who are oblivious to the consequences of their stupidity. Autonomous drivers can't come fast enough for my taste.
    • by gweihir ( 88907 )

I completely agree. And I am pretty sure that insurers (Europe) and lawsuits (US) will force adoption in the not-too-distant future. Human drivers are, on average, _bad_. Most also think they are excellent, in a nice application of the Dunning-Kruger effect.

      • Human drivers are better, on average, than the self-driving Cruise cars. There are many technologies that have been adopted via a combination of market and regulation such as anti-lock brakes and backup cameras. The idea that there will be a sudden switch from human drivers assisted by automation to full automation is silly. It will be a gradual transition as the automation does more and more.
      • I completely agree. And I am pretty sure that insurances (Europe) and lawsuits (US) will force adoption in the not too distant future.

        Maybe. If you've spent much time driving in the same area with self-driving cars, you'll quickly get annoyed with them. If you've spent much time being a pedestrian in an area with self-driving cars, you'll also quickly get annoyed with them (why do Waymo cars stop in the middle of crosswalks?)

For taxi service, Uber/Lyft drivers are quite good, and I'm pretty sure much better than the average human driver. Since this article is about self-driving taxis, even if they beat the *average* human driver, they wouldn't necessarily beat the average Uber/Lyft driver. Not because those drivers have special training or skills, but because they are in their own vehicles and unsafe driving would get them a poor rating, so they have extra incentive to be safe. Compared to the average *medallion* taxi ride. I'v
  • Human drivers (Score:5, Interesting)

    by gnasher719 ( 869701 ) on Monday April 10, 2023 @05:19AM (#63437856)
I had a few dangerous situations in the last few years, and was wondering how a self-driving car would have done.

1. Driving on the M25 motorway a car in front of me in the middle lane suddenly turns right. 90 degrees, straight across the left lane, right into the fields along the motorway. I was just shocked. Didn't react in any way. Had I been in the left lane at the right position I'd have driven straight into him instead of braking because this was just too unexpected.

2. On the motorway I was following a large van. Couldn't see past it. Suddenly I see the back of the van going up and its front wheels going down. So it looked like he was braking very very hard. Noticed it before I saw it slowing down. I braked hard, aiming to stop right behind the van, praying nobody was close behind me. I'd be curious if a self driving car would have figured out the van was braking hard that quickly.

3. On a dual lane road, slowly overtaking a truck, a motorbike not far ahead. Suddenly the truck moves into my lane. I didn't know what was behind me, and there wasn't enough time to look, so I accelerated, past him, right up to the motorbike, then braked hard. (Wasn't the trucker's fault. I saw in the rear mirror that another parked truck had gone into _his_ lane.)

I wonder if self-driving cars are at all prepared for that kind of thing. For example, do they keep track of the vehicles and pedestrians around them? Do they know what manoeuvres are possible without risk?
I am not an expert on self-driving cars, but case #2 is relatively easy: any radar/lidar system can detect that you are approaching the other object quickly. Even adaptive cruise control can handle that.
On a dual lane road, slowly overtaking a truck, a motorbike not far ahead. Suddenly the truck moves into my lane. I didn't know what was behind me, and there wasn't enough time to look

      You shouldn't have to look to know. You should already have looked, so you'd already know. No time? Too congested? Then it's not safe to pass.

      AVs have the potential to be safer mostly because they don't have to do these things to begin with.

    • For #1 it seems like a self driving car would not have the "shock" or "too unexpected" human reactions, and would just do the best according to the algorithm. So likely the same as you if in the middle lane, and braking if in the left lane.

For #2, self-driving cars with LIDAR should be pretty good at seeing sudden changes of speed in a vehicle they are following; they wouldn't need the other visual cues.

1. Driving on the M25 motorway a car in front of me in the middle lane suddenly turns right. 90 degrees, straight across the left lane, right into the fields along the motorway. I was just shocked. Didn't react in any way. Had I been in the left lane at the right position I'd have driven straight into him instead of braking because this was just too unexpected.

      A machine would not have been shocked and could have reacted much faster than a human could. Though it may still have been impossible to do anything useful.

2. On the motorway I was following a large van. Couldn't see past it. Suddenly I see the back of the van going up and its front wheels going down. So it looked like he was braking very very hard. Noticed it before I saw it slowing down. I braked hard, aiming to stop right behind the van, praying nobody. I'd be curious if a self driving car would have figured out the van was braking hard that quickly.

You were wrong when you said that the drop in the front/rise in the back happened before deceleration. You just didn't notice the deceleration, because human depth perception isn't very good. A self-driving car using RADAR or LIDAR would have noticed the change instantly, before the change in vehicle angle, and reacted well before you could have. It also w

  • Even if the system were working perfectly, and the car properly obeyed every rule every time -- this is exploitable. Have someone perform a swoop-and-squat in front of it, box it in among other vehicles, and a human driver might recognize they're under attack and be willing to drive through other vehicles or humans. The car is going to sit there and let the occupants get murdered, or at best it's going to allow itself to be stolen rather than attempting an escape. The complete lack of situational awareness makes this a real vulnerability.

      The Secret Service will not let a self-driving system trap someone like that.
      So they may have a real driver, or have their AI in kill mode.

      • by Mal-2 ( 675116 )

        There are a lot more vulnerable targets than just government officials with Secret Service protection. Yes, driving through someone is hard, especially if they know how to set up a proper roadblock, but it may be better than facing hijackers with AKs at close range.

        I personally had an incident where I was driving a friend's car because he had a headache. The first task was to drop his brother off at the brother's girlfriend's house, and in doing so, we started to be followed by a car we didn't recognize.

    • It's very hard to drive *through* a vehicle in that situation. Fortunately, here in Florida, you can keep a loaded gun in the passenger compartment and just shoot all of the other drivers. If it turns out it wasn't an assault, just somebody braking too hard, well you stood your ground. When is the last time you've seen such an assault? If that were a real risk, paying for things like OnStar would make sense as the crash report would summon law enforcement.
      • by Mal-2 ( 675116 )

        It's not about "the last time you saw it happen" because the road hasn't been clogged with significant numbers of autonomous vehicles. This is a new threat, there is no history to refer to. As thieves optimize to corner and corral autonomous vehicles, those that are still being driven by real humans may find themselves in this situation.

        Home invasion robberies developed almost "out of nowhere" too. It only takes one to come up with a plan that others can copy.

    • There is a fifty-foot-long bridge on the way into my area that is one lane, and drivers are expected to negotiate that themselves. I wonder how self-driving cars will handle that.
  • The article makes it sound like the predictive and kinematic parts of its self-driving algorithm trump any sort of fundamental logic, such as "do not proceed if there is an object in front of you."

    If, on the other hand, the car's sensors did not even see the stationary bus, I have to wonder a bit…

    • Regardless of whether they "see" the bus, I would think that radar/lidar would detect that the car is rapidly approaching another object. Heck, at 10 mph the parking sensors should have kicked in to avoid a collision!
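      For reference, the check the parent describes is essentially a time-to-collision threshold. A minimal sketch, with illustrative thresholds and hypothetical function names (not any vendor's actual logic):

```python
# Simple time-to-collision (TTC) braking check.
# Thresholds and function names are illustrative assumptions only.

def time_to_collision(range_m, closing_speed_mps):
    """Seconds until impact if nothing changes; infinity if not closing."""
    if closing_speed_mps <= 0:
        return float("inf")
    return range_m / closing_speed_mps

def should_brake(range_m, closing_speed_mps, ttc_threshold_s=2.0):
    """Brake when projected impact is closer than the safety threshold."""
    return time_to_collision(range_m, closing_speed_mps) < ttc_threshold_s

# At ~10 mph (about 4.5 m/s) closing on a stopped bus 5 m ahead:
print(should_brake(5.0, 4.5))   # True: TTC is about 1.1 s, below the 2 s threshold
```

      At parking-sensor speeds and distances, even this trivial check fires well before impact, which is what makes a 10 mph rear-end collision so puzzling.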
  • These things always boil down to the same old same old.

    Those claiming that the human driver is terrible and must be removed from driving, because self-driving eliminates the cause of accidents; and those who don't believe that to be the case, pointing to every accident as proof.

    Digging below the easy arguments, we find a few things.

    Driving a vehicle is very complicated. Driving through taped-off areas onto downed power lines might be an edge case, but in most or all cases a functioning human would stop.

  • Self-driving cars have far less intelligence than Wile E. Coyote, who almost invariably crashes into a granite cliff with a road painted on it.

  • talk about confusing terms!

    "As for the March bus collision, Vogt said the software fix was uploaded to Cruise's entire fleet of 300 cars within two days. He said the company's probe found the crash scenario "exceptionally rare" with no other similar collisions."

    Seriously - a safety critical piece of software was designed, coded, and QA tested in 2 days?

    really?

    Uhm, I get "move fast," but I would hope the simple automated tests alone would take 2 days, and focused testing on just the fix would take more, then integration testing and regression testing...
