Transportation

Cruise Robotaxi Shutdown Expands, Pressing Pause On Supervised and Manual Trips Too (theverge.com) 38

An anonymous reader quotes a report from The Verge: Cruise's robotaxis are front and center of the industry's trust issue after losing their California permit following an incident where a pedestrian ended up stuck underneath one of its cars. It already halted service nationwide and said it's installing new updates. Now Cruise has announced it's taking its cars off of public roads while it undergoes a full safety review. Meanwhile, Cruise board member and GM legal executive VP Craig Glidden is "expanding" his role to lead Cruise's Legal, Communications, and Finance teams: "In the coming days, we are also pausing our supervised and manual AV operations in the U.S., affecting roughly 70 vehicles. This orderly pause is a further step to rebuild public trust while we undergo a full safety review. We will continue to operate our vehicles in closed course training environments and maintain an active simulation program in order to stay focused on advancing AV technology."

Comments Filter:
  • Human drivers (Score:4, Interesting)

    by backslashdot ( 95548 ) on Wednesday November 15, 2023 @09:36AM (#64006943)

    Human drivers kill 40,000 people just in the USA (1 million worldwide). That 40,000 includes many innocent passengers and pedestrians. That's more than double the number of murders in the US. That's triple the number of gun murders in the US. We need to ban human-driven vehicles. Why are people being allowed to drive cars? It's a fucking murder weapon, humans can't be trusted with it. We need autonomous vehicles ASAP. Government and health/life insurance companies should fund some of the R&D.

    • Human drivers kill 40,000 people just in the USA

      Deaths happen all the time for a variety of reasons due to humans.

      That 40,000 includes many innocent passengers and pedestrians.

      See above.

      That's more than double the number of murders in the US. That's triple the amount of gun murders in the US.

      Humans are the cause of these problems. We should develop autonomous beings so this won't happen.

      We need autonomous vehicles ASAP.

      No we don't. Autonomous vehicles will and are bein
    • by 2TecTom ( 311314 )

      self-driving transport will happen but it will take a long time for widespread adoption

      it would be far faster and more effective if our governments adopted best design and management practices from those countries with the best safety records

      if communities were integrated, mixed use, mixed class and more organic as opposed to bureaucratic and commercialized, we wouldn't have traffic as we know it

      some places don't have traffic

      so long as we let developers develop developments based on developers profits, soci

      • Flying a plane is easier than driving a car or riding a bicycle in a city environment under ideal conditions. We'll see safe, fully autonomous airliners working passenger flights in scheduled revenue service a few decades before we see safe autonomous cars, if it's even possible.
        • Flying a plane is easier than driving a car or riding a bicycle in a city environment under ideal conditions. We'll see safe, fully autonomous airliners working passenger flights in scheduled revenue service a few decades before we see safe autonomous cars, if it's even possible.

          The economics are completely different.

          The pilot is only a small fraction of the cost for a passenger plane, but a huge portion of the cost for a transport truck and an even bigger portion of a taxi.

          • And? You have to program a computer for every variable with a predefined set of rules. Not really possible with vehicles not set on tracks in a fenced-off right of way. The closest you're gonna get to autonomous cars is the PRT system at West Virginia University. The closest to autonomous cars you're gonna get that actually scales is the Vancouver SkyTrain.
            • And? You have to program a computer for every variable with a predefined set of rules. Not really possible with vehicles not set on tracks in a fenced-off right of way. The closest you're gonna get to autonomous cars is the PRT system at West Virginia University. The closest to autonomous cars you're gonna get that actually scales is the Vancouver SkyTrain.

              It's not about the rule sets; it's about economics.

              Airplane autopilots are already capable of everything but maybe takeoff [usatoday.com], and the exception of takeoff is probably just because there's no current need. It wouldn't be that hard to make those planes fully automated.

              The persistence of the pilot is due to the fact that the cost of the pilot is quite small compared to the added safety (at least perceived safety).

              With SkyTrain the software comes on a 3.5-inch floppy disk [wikipedia.org], which tells me the restriction from further automati

              • ...can I buy some pot from you? Like seriously do you just not understand what variables are or something? The SkyTrain works because they did everything they could to keep the codebase small and easy to debug, and eliminate as many variables as possible, by running lines elevated, below ground or in a fenced-off right-of-way with no crossings. And even then, in icy or snowy weather, they still run the trains with a driver on board. Labor is the largest component of any transit system, and not one that'
                • ...can I buy some pot from you? Like seriously do you just not understand what variables are or something?

                  Yeah... I've worked on mission critical systems so I'm not going to give your snide comments much weight.

                  The SkyTrain works because they did everything they could to keep the codebase small and easy to debug, and eliminate as many variables as possible, by running lines elevated, below ground or in a fenced-off right-of-way with no crossings. And even then, in icy or snowy weather, they still run the trains with a driver on board.

                  And they could do that with other train systems as well. The reason they don't is the economic incentives aren't there.

                  I'm not arguing that self-driving cars aren't a much tougher problem than auto-pilots for trains or planes. But it's not a meaningful comparison because the economics are so different. Planes are already self-flying, but there's not much cost savings in removing the pilot so they stay.

              • The way Tesla is doing it is purely deep learning with no predefined rules. They feed it videos of driving, and the neural nets make the rules based on what they see in the training data. Reference: https://www.cnbc.com/amp/2023/... [cnbc.com]

                • The way Tesla is doing it is purely deep learning with no predefined rules. They feed it videos of driving, and the neural nets make the rules based on what they see in the training data. Reference: https://www.cnbc.com/amp/2023/... [cnbc.com]

                  Meaning that its behaviour is always fundamentally random.

                  They're not following the evidence, they're following Elon Musk's non-expert intuition, which has repeatedly been proven wrong.

            • False. The way Tesla is doing it in FSD version 12 onwards is purely deep learning with no predefined rules. They feed it videos of driving, and the neural nets make the rules based on what they see in the training data. Reference: https://www.cnbc.com/amp/2023/... [cnbc.com]

              In FSD v12 they have removed all hand-written code that, for example, tells the system what a traffic light is; the AI figured it out based on the training videos fed to it.
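
              For concreteness, here is a minimal sketch of what "end-to-end" learning means in the comments above: a network maps camera frames directly to a control output, and the "rules" come from imitating recorded human driving rather than hand-written code. This is not Tesla's actual architecture or code; the DrivingPolicy class, the layer sizes, and the synthetic tensors below are illustrative assumptions, written against PyTorch.

import torch
import torch.nn as nn

class DrivingPolicy(nn.Module):
    """Toy end-to-end policy: camera frame in, [steering, throttle] out."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=5, stride=2), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=5, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),      # global average pool
        )
        self.head = nn.Linear(32, 2)      # predicts [steering, throttle]

    def forward(self, frames):
        return self.head(self.features(frames).flatten(1))

# Stand-ins for logged human driving: frames plus the controls the human
# applied at the same moment (the supervision signal).
frames = torch.randn(64, 3, 120, 160)     # batch of RGB camera frames
human_controls = torch.randn(64, 2)       # recorded steering/throttle

policy = DrivingPolicy()
optimizer = torch.optim.Adam(policy.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for step in range(10):                    # imitation-learning loop
    loss = loss_fn(policy(frames), human_controls)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

              Note that no hand-written rule ever mentions traffic lights; anything the policy "knows" has to be inferred from the training data, which is both the appeal and the criticism raised in the replies.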

    • You make a better argument for quality public transportation than you make for the claim that self-driving cars aren't algorithmically impossible to operate, given the nature of their operating environment.
    • Re: (Score:3, Informative)

      by nash ( 8306 )

      The problem Cruise had was that they hid the details of the accident, not the accident itself. A cover-up to the regulator is inexcusable.

      The vehicle hit a person... bad. But then it drove to the side of the road, dragging the person under the vehicle. Cruise "neglected" to pass that footage to the regulator. The ONLY reason it was discovered was that a passer-by filmed it and passed it on.

      • Or it's probable that they didn't have a camera in the correct place to see the pedestrian. They have the mindset that they only need sensors for regular driving events, not for a pedestrian being dragged by the vehicle. If that camera had been there, would the AI even have come close to understanding what was happening?
        • by nash ( 8306 )

          Oh no, they had the video.

          When the regulator asked for it specifically, they had it.

          https://www.dmv.ca.gov/portal/... [ca.gov]
          https://www.theregister.com/20... [theregister.com]

          • Ok, so the question remains: is there any understanding of something like a pedestrian being dragged by the vehicle?
            • by nash ( 8306 )

              Who knows? It's not the issue in this case, however.

              Practically speaking, it's going to happen. Self-driving cars are going to kill people. Even if they're a million times better than human drivers, it will happen. Lots of debate about how dangerous they are at the moment - some say safer, some not so sure. More data is definitely needed. In the long run it's almost certainly going to be better.

              The car in this case detected it had hit something (good). Then it decided to drive to the side of the road (in

              • Ok well until someone can definitively prove they are safer, let's not ask members of the public to risk themselves for the sake of corporate profits.
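
                To make concrete the decision point discussed a couple of replies up (the car detects an impact, then chooses whether to pull over), here is a hypothetical sketch of that branch. This is not Cruise's actual logic; the State enum, the next_state function, and its flags are invented purely for illustration.

from enum import Enum, auto

class State(Enum):
    DRIVING = auto()
    PULLING_OVER = auto()
    STOPPED_AWAITING_REVIEW = auto()

def next_state(impact_detected: bool, area_confirmed_clear: bool) -> State:
    """Decide what to do immediately after a collision is registered."""
    if impact_detected and not area_confirmed_clear:
        # Conservative branch: stay put until a human (remote operator or
        # first responder) confirms nothing is trapped under the vehicle.
        return State.STOPPED_AWAITING_REVIEW
    if impact_detected:
        # The behaviour described in the incident: treat it as a routine
        # event and execute the normal pull-over maneuver.
        return State.PULLING_OVER
    return State.DRIVING

# Impact registered, no confirmation that the area under the car is clear:
print(next_state(impact_detected=True, area_confirmed_clear=False))
# -> State.STOPPED_AWAITING_REVIEW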
    • by mjwx ( 966435 )

      Human drivers kill 40,000 people just in the USA (1 million worldwide). That 40,000 includes many innocent passengers and pedestrians. That's more than double the number of murders in the US. That's triple the number of gun murders in the US. We need to ban human-driven vehicles. Why are people being allowed to drive cars? It's a fucking murder weapon, humans can't be trusted with it. We need autonomous vehicles ASAP. Government and health/life insurance companies should fund some of the R&D.

      You do realise, all you are saying is that you're as daft with a car as you are with firearms.

      Other developed nations do not have the same issues. Road fatalities per 100,000 pop (data from 2019 unless otherwise indicated):
      Norway: 2.0
      Sweden: 2.2
      Switzerland: 2.2
      United Kingdom: 2.9
      Denmark: 3.4
      Germany: 3.7
      Spain: 3.7
      Finland: 3.8
      Japan: 4.1
      Australia: 4.5
      France: 5.0
      Italy: 5.2
      Austria: 5.4
      Belgium: 5.4
      Canada: 5.8



      United States: 12.9

      That sandwiches you between Uruguay and Egypt (a place where
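
      To make the gap in those per-100,000 rates concrete, a rough back-of-the-envelope calculation using only figures from this thread (the ~40,000 annual US road deaths cited above and the rates listed here) shows what the US toll would look like at other countries' rates; the snippet below is purely illustrative.

# Scale the ~40,000 annual US road deaths by the ratio of each country's
# per-100,000 rate to the US rate (12.9). Rates are the 2019 figures
# quoted in the parent comment; this is only an illustration.
US_DEATHS = 40_000
US_RATE = 12.9

rates = {"Norway": 2.0, "United Kingdom": 2.9, "Germany": 3.7, "Canada": 5.8}

for country, rate in rates.items():
    implied = US_DEATHS * rate / US_RATE
    print(f"At {country}'s rate (~{rate}/100k): ~{implied:,.0f} US deaths per year")

      At Norway's per-capita rate, for example, the US would see roughly 6,200 road deaths a year instead of about 40,000.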

    • by rossdee ( 243626 )

      "That 40,000 includes many innocent passengers and pedestrians."

      and presumably the rest are guilty passengers and pedestrians.

      • I don't believe in that, but a previous time I posted these stats people were saying “who cares about traffic deaths — bad drivers deserve it!” Darwin award lololololol etc.

  • by JBMcB ( 73720 ) on Wednesday November 15, 2023 @10:36AM (#64007109)

    I assume that, when a taxi service is involved in a fatal accident in California, that taxi service loses its license as well, pending a full safety review.

    Oh wait, I haven't found a single instance of that happening. CA taxi drivers must be some of the safest drivers in the world!

      I assume that, when a taxi service is involved in a fatal accident in California, that taxi service loses its license as well, pending a full safety review.

      Oh wait, I haven't found a single instance of that happening. CA taxi drivers must be some of the safest drivers in the world!

      No, but when a driver is particularly unsafe they can lose their license (taxi or not), and since all Cruise vehicles basically have the same software driver, I see this as being essentially the same thing.

      • In this case, it was a human driver that first hit the pedestrian, who then tumbled under the self-driving car!
        The self-driving car promptly stopped as soon as it registered something wrong, as you are supposed to. I guess very few human drivers would have the presence of mind to stop that fast when another car sends an accident their way...

        • Most humans would "register something wrong" the instant they witnessed a pedestrian bouncing off the hood of the car next to them. Yet it took the AV several seconds longer to "register something wrong" apparently.
  • All are exciting technologies, but none are living up to the hype or projected adoption curves. I don't expect VR or robotaxis to disappear as badly as 3D TVs did, but I also expect the breathless optimism and hype to settle way down ("Individual car ownership will be BANNED within a decade!!!"). Bold promises will give way to niche applications that, while valuable, will fall well short of hopes.

    Automate long-haul trucking? Likely; the cost motivation is huge and the environment more controllable.

    Replace

    • We will get self-driving vehicles anywhere that someone is willing to update the infrastructure such that it can be done successfully. There's a town near me that operates a self-driving shuttle service (can't really call it a bus since the vehicles only hold a few passengers). It runs between four points in town. It supposedly works great when it's working (it has been shut down whenever I've been there).
  • GM wins again!

  • It is good this happened.

    We needed to show regulators that companies like GM, who actually have to recall their network-connected cars to perform software updates, shouldn't be allowed to make self-driving or any other kind of cars.
    • Hrm, I don't think that's the issue here, and I believe they can update over the wire. The issue is the whole algorithm and sensors. GM is behaving sensibly here. It's not fighting and claiming they are perfect; it's very un-Tesla-like, which is awesome. Teslas are in no way safe to be run without a human driver. GM, well, they don't think theirs are anymore, so I'm not going to argue with them, but this kind of safety reaction over a single incident means I'll be more likely to trust them in the future.
  • Personally, I won't consider "self-driving vehicles" safe until they can do all the things a human can do (and do them better). For instance, if they break down, they should be able to take a warning triangle out of their boot and place it 45 metres behind the car, as is legally required in many European countries. I haven't seen or read about any self-driving vehicle that can do this, so I really don't want them on roads anywhere near me just yet...
