AI Transportation

After Suspending Its Self-Driving Cars, Cruise Takes Steps to Win Back Trust (nytimes.com) 76

Cruise stopped its driverless operations nationwide last week. But the New York Times reports on the company's moves since then...

- Cruise hired the law firm Quinn Emanuel to investigate its response to a San Francisco incident involving a pedestrian, "including its interactions with regulators, law enforcement and the media."
- A separate review of the incident is being doncuted by Exponent, a consulting firm that evaluates complex software systems.
- The company's rivals "fear Cruise's issues could lead to tougher driverless car rules for all of them."
- "Cruise employees worry that there is no easy way to fix the company's problems, said five former and current employees and business partners."

Company insiders are putting the blame for what went wrong on a tech industry culture — led by 38-year-old [Chief Executive Kyle] Vogt — that put a priority on the speed of the program over safety. In the competition between Cruise and its top driverless car rival, Waymo, Mr. Vogt wanted to dominate in the same way Uber dominated its smaller ride-hailing competitor, Lyft. "Kyle is a guy who is willing to take risks, and he is willing to move quickly. He is very Silicon Valley," said Matthew Wansley, a professor at the Cardozo School of Law in New York who specializes in emerging automotive technologies. "That both explains the success of Cruise and its mistakes."

When Mr. Vogt spoke to the company about its suspended operations on Monday, he said that he did not know when they could start again and that layoffs could be coming, according to two employees who attended the companywide meeting. He acknowledged that Cruise had lost the public's trust, the employees said, and outlined a plan to win it back by being more transparent and putting more emphasis on safety. He named Louise Zhang, vice president of safety, as the company's interim chief safety officer and said she would report directly to him...

With its business frozen, there are concerns that Cruise is becoming too much of a financial burden on G.M. and is hurting the auto giant's reputation... The shutdown complicates Cruise's ambition of hitting its goal of $1 billion of revenue in 2025. G.M. has spent an average of $588 million a quarter on Cruise over the past year, a 42 percent increase from a year ago. Each Chevrolet Bolt that Cruise operates costs $150,000 to $200,000, according to a person familiar with its operations.

This discussion has been archived. No new comments can be posted.


Comments Filter:
  • > review of the incident is being doncuted by Exponent

    Do you mean "documented"?

  • As a corporation, Cruise is capitalistic, and capitalism is a voluntary system, so this "accident" was probably staged by Soros.

    • by Viol8 ( 599362 )

      " this "accident" was probably staged by Soros."

      Yes, that's right, the 93-year-old got together his cabal in the Bilderberg group and had a brainstorming session on how to screw up a self-driving car for [reasons].

      God knows who modded you up but they must be smoking the same bad weed as you.

    • by a5y ( 938871 )

      Your post history is a wild ride.

      • Damn he isn't even joking about Soros.

        • here's a secret. i'm always joking in this subintellectual cesspit.

          if i actually point out the obvious fact that praising the creativity of CEOs willing to take risks with self-driving cars is literally the same thing as praising them for risking the lives of others without their consent, some chud will want to "debate" that, so i don't bother with it and just agree 1000% with my cynical interpretation of their position instead.

          it's a little game i play.

  • by Required Snark ( 1702878 ) on Monday November 06, 2023 @06:02AM (#63983768)
    It's against the natural order for corporations to be held accountable for the damage they cause. They have all the rights of actual humans PLUS immunity and potentially longer existence, and they can't be put in jail for criminal actions. And don't forget the complete lack of morality or guilt or any even vaguely emotional state except vast diffuse greed. They truly are better than mere people, if by better you mean able to out-compete anything else no matter what wretched mess ensues.

    So they really deserve our trust, which can be trivially restored after they screw up. All they need to do is lie about changing and everything will be forgiven and forgotten and we can all comfortably go back to things like autonomous vehicles clogging the roads, blocking emergency vehicles and making lots of money!

    Long live our new AI overlords, on land, sea, and air, and in our financial institutions, health care, social media, government, agriculture, retail sector, ..........

    • [Corporations] have all the rights of actual humans PLUS immunity and potentially longer existence and can't be put in jail for criminal actions.

      Did the supreme court rule that corporations had *all* the rights of humans? Or was it just the 1st amendment rights of free speech?

      I thought the supreme court ruling was about political donations and advertising; the explicit examples being, a corporation can put up a billboard showing a political message or make a donation to a specific candidate (ie - has the right to do that).

      Additionally, IIRC, the ruling also stated if corporations *didn't* have free speech rights it would be impossible to regulate, being impossible to determine which actions taken by a corporation would violate the [no free speech] law.

      • Did the supreme court rule that corporations had *all* the rights of humans? Or was it just the 1st amendment rights of free speech?

        I thought the supreme court ruling was about political donations and advertising; the explicit examples being, a corporation can put up a billboard showing a political message or make a donation to a specific candidate (ie - has the right to do that).

        Additionally, IIRC, the ruling also stated if corporations *didn't* have free speech rights it would be impossible to regulate, being impossible to determine which actions taken by a corporation would violate the [no free speech] law.

        Also, corporations can't be put into jail, but their human managers certainly can. Google Sam Bankman-Fried for a clear example.

        Given that rights can't be selectively applied against people, this would seemingly apply to corporations as well.

      • by jvkjvk ( 102057 )

        >Did the supreme court rule that corporations had *all* the rights of humans? Or was it just the 1st amendment rights of free speech?
        >I thought the supreme court ruling was about political donations and advertising

        I thought one of the reasons given for corporations having Free Speech rights was essentially that the individuals in the corporations had those rights and by infringing on the rights of the Corporation to do that you are infringing on the rights of the individuals within that corp to do it.

      • Your selective interpretation of one of the ironic phrases I used is impressively and deliberately obtuse and stupid. I was pointing out how corporate money and power can subvert individual well being, and you headed off in the completely opposite direction. Spoken like a true believer in authoritarian corporate control.

        Those who control corporate resources effectively have no limits on how much they can spend on buying influence. They hire lobbyist, give vast amounts of anonymous money to elected officia

      • I forgot to mention that Sam Bankman-Fried was convicted because he stole money from rich people. If he had engaged in normal corporate theft, like when Wells Fargo opened fake accounts for tens of thousands of ordinary users, there would have been no personal accountability. No one in upper management lost their job or suffered any economic consequences when that happened, and for the most part stockholders were not much affected in the long run.

        Another example was the demise of Washington Mutual, which

    • just good old-fashioned human overlords. After thousands of years, about 42% of the population somehow still hasn't learned that being ruled over isn't a good thing, and they still long for a strong man to take charge of them...
  • "The company's rivals "fear Cruise's issues could lead to tougher driverless car rules for all of them."

    Right now today there isn't a single state or federal regulation mandating driverless cars.

    And quite frankly, I don't care if a 3-ton manslaughtering piece of machinery DOES have the strongest regulations. We passengers are going to need them to combat the Greed that hardly gives a shit.

    If you can't stand the heat, get the fuck out of the kitchen.

    • by Viol8 ( 599362 ) on Monday November 06, 2023 @06:15AM (#63983780) Homepage

      Nobody asked for self-driving cars; there were no campaigns or petitions for them. They won't make the roads less busy or reduce fuel use, and they probably won't make a huge difference to casualty rates either once they're driving the same number of miles in all the varied conditions that human drivers manage without incident.

      The only reason these things exist is because these companies see a big payout at the end of the day, so too bad if they get hobbled with very strict rules. These tech bro toys could do serious damage if not properly regulated, so frankly fuck 'em if they don't like it.

      • After the first fatal Tesla accidents, public opinion of driverless cars dropped like a rock. I haven't seen any recent survey results, but I don't know why public acceptance would improve much. The public seems to have no problem with driver assist features based on similar technology, but most of them don't want to turn control over to the machine. This is a technology well ahead of its time.
      • by RobinH ( 124750 )
        The average person doesn't care that much about a self-driving car, but if you're a company that employs drivers, you definitely want self-driving vehicles. This is particularly true now with the steadily declining labour pool (simply because more people retire per year than graduate right now) and the fact that we've effectively reached full employment (let's face it, if you're at 3% unemployment, the remaining people are mostly unemployable). Anything we can automate, we should, simply because driving i
        • However, I've long felt like self-drive is one of those engineering projects where it's easy to get to 90% and do a demo to get investment, but practically impossible to get to 99.99%, which you'll need for commercial acceptance.

          Alcohol-infused drivers kill thousands every year. Drugged driving certainly doesn't help the statistics, but we kill tens of thousands with pill bottles too, no car necessary. We barely even give a shit about preventing drunk/drugged driving from harming others. And we're certainly not looking to put an "assault" label on alcohol or drugs no matter how much harm happens. And we do far less about the new growing epidemic, since a distracted-driving device (smartphone) sits in damn near every driver's

      • They won't reduce traffic: google "induced demand" or look up the YouTuber "Adam Something".

        Walkable cities and public transit really are the only solution to traffic jams. It's just a question whether you love cars enough that you'll put up with being stuck in traffic for 60-90 minutes. Older folks definitely will. Not sure about anyone under 30.
      • by dstwins ( 167742 )
        Actually, except for a few people, most EVERYONE is asking for self-driving cars (especially those who commute or do long-haul driving).
        Many people are looking at self-driving cars as a way to mitigate traffic (most road delays have more to do with humans gawking, or not being privy to a problem until it is in their face, which means slowdowns).
        Most people are looking at self-driving cars to give individuals who wouldn't otherwise be able to drive with confidence (elderly, infirm, medical condit
        • There are some things that AI-based driving could excel at. Highways could be replaced by four-way intersections where the cars time themselves to go through at specific times (a toy sketch of that kind of slot reservation follows below). Traffic signals could be tossed because vehicles could speed up or slow down to avoid one another.

          Heck, even vehicle deliveries and long trips become painless, especially if a vehicle can stop and get fuel by itself. One could just crash out, let the vehicle drive itself for a couple days, only really stopping when one gets bored and needs a break. T
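
Purely as an illustration of the "time themselves to go through" idea above, here is a toy slot-reservation sketch in Python. The slot length, vehicle IDs and API are all made up for this example and have nothing to do with any real V2X or autonomous-intersection protocol.

```python
# Toy reservation-based intersection manager: each approaching vehicle asks
# for a crossing slot; conflicting requests get pushed to the next free slot.
class IntersectionManager:
    def __init__(self, slot_seconds: float = 2.0) -> None:
        self.slot_seconds = slot_seconds
        self.reserved: dict[int, str] = {}  # slot index -> vehicle id

    def request_crossing(self, vehicle_id: str, earliest_arrival_s: float) -> float:
        """Reserve the first free slot at or after the vehicle's earliest
        possible arrival, and return that slot's start time."""
        slot = int(earliest_arrival_s // self.slot_seconds)
        while slot in self.reserved:  # slot already taken: try the next one
            slot += 1
        self.reserved[slot] = vehicle_id
        return slot * self.slot_seconds

manager = IntersectionManager()
for vid, eta in [("car-A", 4.1), ("car-B", 4.8), ("car-C", 9.5)]:
    start = manager.request_crossing(vid, eta)
    print(f"{vid}: cross during the slot starting at {start:.1f}s")
```

The catch, as the reply below points out, is that this only works if every vehicle at the intersection participates.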

          • by Viol8 ( 599362 )

            LOL :)

            I'd lay off the weed if I were you. Sure, in THEORY self-driving cars could do all of these things, just like in FACT a modern airliner can take off, land and even fly in formation with other airliners all on its own. But they don't, because no technology is 100% reliable, plus for your idea to happen all human-driven vehicles would have to be banned from the roads. And if you think "so what's the problem with that?" it simply means you've never heard of motorbikes, scooters or classic cars.

            • I've been working on a short story for a while about human drivers trolling AI-based autonomous vehicle systems.
              It's hard to write - not because the premise is implausible, but because it becomes more and more plausible, and making it believable enough to satisfy a reader is hard.
              And we all know how that trolling will manifest - tiny events that set off unstable oscillations within traffic systems.

  • And yet... (Score:3, Interesting)

    by SuperDre ( 982372 ) on Monday November 06, 2023 @06:27AM (#63983798) Homepage

    And yet these self-driving vehicles are already driving more safely than most humans. Is there room for improvement? Hell yes, but without these cars on the road, improvement will be harder, as real-life problems only present themselves on actual roads.
    But I'm all for setting up a public library of situations that occur on the road and that these cars should be able to handle (a rough sketch of such a registry follows below). That library grows with daily accident reports and actual situations the self-driving cars come across. And every situation should be tagged with whether each system (per brand) can handle it. This makes it easier for the developers of a system to check whether they have covered all the (extreme) situations, and it also gives governments a checklist for approving new systems.
    I'm still all for self-driving cars, as they will make the roads much MUCH safer, even though I love driving myself. Some systems might even let the user think they are driving themselves, but in extreme situations will just automatically take over.
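
A minimal sketch of what such a shared scenario library could look like, assuming a simple tagged registry keyed by scenario ID. The class names, fields and example entries are all hypothetical illustrations, not any real regulator's or vendor's schema.

```python
from dataclasses import dataclass, field

@dataclass
class Scenario:
    """One real-world situation a driving system should be checked against."""
    scenario_id: str
    description: str
    source: str                                   # e.g. "accident report", "fleet log"
    handled_by: set[str] = field(default_factory=set)  # brands verified against it

class ScenarioLibrary:
    def __init__(self) -> None:
        self._scenarios: dict[str, Scenario] = {}

    def add(self, scenario: Scenario) -> None:
        self._scenarios[scenario.scenario_id] = scenario

    def mark_handled(self, scenario_id: str, brand: str) -> None:
        self._scenarios[scenario_id].handled_by.add(brand)

    def coverage(self, brand: str) -> float:
        """Fraction of known scenarios this brand has been verified against."""
        if not self._scenarios:
            return 0.0
        handled = sum(1 for s in self._scenarios.values() if brand in s.handled_by)
        return handled / len(self._scenarios)

# Hypothetical entries, purely for illustration.
lib = ScenarioLibrary()
lib.add(Scenario("ped-knockdown-01", "Pedestrian thrown into lane by another vehicle", "accident report"))
lib.add(Scenario("emerg-vehicle-02", "Fire truck approaching in the oncoming lane", "fleet log"))
lib.mark_handled("emerg-vehicle-02", "BrandA")
print(f"BrandA coverage: {lib.coverage('BrandA'):.0%}")
```

A regulator could then require a minimum coverage score per brand before approval, which is essentially the checklist the comment above describes.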

    • by AmiMoJo ( 196126 )

      The problem is that everyone is chasing Waymo, trying to catch up as quickly as possible. Whoever gets there first with viable technology is going to make a fortune as taxi firms and shipping companies rush to install it.

      Waymo has a long head start and has been able to take things slowly and carefully, resulting in zero serious accidents. Just some low speed bumps with other vehicles.

      It's a very hard problem to crack. You can't just feed some cameras to an AI like Musk seems to think. You have to blend data

      • Well, to be honest, Cruise also hadn't been involved in any serious accident until that freak one, and according to the first responders it was actually better for the victim that the car stopped instead of driving off of her. You can't blame Cruise for that serious accident, as it could also have happened to Waymo. But Waymo did kill a dog, so it has also had a serious accident. Also, I do wonder how many holdups of traffic are due to Waymo cars and not Cruise, as Waymo hasn't been driving for ver

      • It's a very hard problem to crack. You can't just feed some cameras to an AI like Musk seems to think.

        Technically that is how we drive, so if you had a human-level AI, you could do that. Still a hard problem, but Musk is right.

        • by AmiMoJo ( 196126 )

          True, but we are a very very very long way from human level AI.

          We can't even do vision well enough to recognize solid objects that the car is about to crash into yet.

      • There's a bit of an irony here, with the article quoting someone describing the Cruise CEO as "very Silicon Valley". Waymo is an actual Silicon Valley company (started by Google), but they've taken things slowly and carefully, almost to a fault. Safety really has been their top priority. Cruise is part of a traditional car company (GM) that ought to have safety deeply embedded in their culture. Instead they sacrificed safety in the rush to get a product out as fast as possible. The roles seem to have s

    • If I run someone down by mistake I can go to jail for vehicular manslaughter.

      When a company's robot does the exact same thing, nothing happens.

      Let us know when the CEO is at the same risk for jail as me when his robot kills someone. After that we can discuss the rest of it. Before that, I don't care.

      • That's not true, the company is responsible according to current laws.
        • That's not true, the company is responsible according to current laws.

          The company being responsible doesn't really mean anything. Companies are, at worst, fined heavily if they break a law. Well, it would be fined heavily if the fine was levied at an ordinary citizen, but for most large companies the fines seem like crumbs falling off a giant cookie. It has zero long-term impact other than becoming a part of the cost calculation going forward. "So if we kill someone, we get fined this much. Better calculate how many people we will kill per year, and bake that into the cost."

          • by uncqual ( 836337 )

            Companies get sued by victims and/or their families.

            I don't know if this is a clear-cut case, as I've not seen all the evidence, so I can't determine whether the driverless car's apparent inappropriate action caused significant additional injury.

            However in a clear cut case, the victim of an ordinary motorist's actions will often have a hard time finding a lawyer to take the case on contingency because there's just not enough money to make it worthwhile. Most motorists don't have much liability insurance (often ju

            • If I were seriously injured in a car accident that was clearly the fault of either the other driver or a Cruise self driving taxi, I'd much prefer the self driving taxi case as I'm likely to be far better compensated for my injuries in that case.

              Thank you. Somebody FINALLY articulated a decent reason for why driverless cars *MAY* have a leg to stand on against human drivers. Granted, that only lasts until enough lobbying money gets thrown at congress to make driverless injuries the fault of literally anybody BUT the corporation that owns the tech, but for now? Better argument than any other I've heard.

            • >The manufacturer can be grilled on why the line of code or the branch in the AI decision tree was so stupid

              Imagine trying to program these things? The number of boundary conditions, all the possible scenarios and exceptions, etc. Brutal.

    • Re:And yet... (Score:4, Insightful)

      by phantomfive ( 622387 ) on Monday November 06, 2023 @08:43AM (#63984100) Journal

      And yet these self-driving vehicles are already driving more safely than most humans.

      False, that is what Cruise and Waymo have told you, but Cruise has been caught hiding information, and Waymo doesn't release all their data.

      You don't have enough information to determine whether they are safer.

    • without these cars on the road, improvement will be harder, as real-life problems only present themselves on actual roads.

      They can be on the road, they just need a safety driver until they are actually safer than humans.

    • Sure, in certain controlled conditions like wide, straight US roads where the traffic mostly behaves predictably and follows the rules.

      Now put them in Delhi or Rome, where the rules are seen as guidance rather than legally binding, or in some tiny European village with two-way roads barely wider than a car, where there has to be some kind of negotiation between drivers as to who goes first. Good luck!

  • by thesjaakspoiler ( 4782965 ) on Monday November 06, 2023 @06:34AM (#63983810)

    With robots handing out ice creams for free?

  • And yes, I did google it ... and still don't know.

    Some sort of undocumented synonym for "evaluated"?

  • by chas.williams ( 6256556 ) on Monday November 06, 2023 @07:05AM (#63983856)
    Please no. If anything, we need fewer rules for driverless cars. We need to get these things to market to something, something, and profit!
    • Exactly.

      1: run over random pedestrians with our beta test level devices
      2: ...
      3: profit!

    • Please no. If anything, we need fewer rules for driverless cars. We need to get these things to market to something, something, and profit!

      There seems to be no end to the prophecies of perfect road records when these stories come up. All sorts of volunteers sing their praises. People just flat out believe that they absolutely 100% are safer than human drivers, with zero to minuscule data to back up that assertion. Whatever these self-driving companies are doing, they've done a fantastic job of getting a certain subset of the more technology-focused folks in the public to worship them as if they were the second coming.

      And, sadly, your joke here

  • It's all fun 'n' games for a startup asshole's ego to apply software-startup bullshit to some dumb social media app, but this moron just learned that when you use the public as involuntary beta testers for your shitty real-world devices, real people are going to get hurt, and you can't just do an emergency fix over the weekend and move forward when you've run over some woman, dragged her 20 feet under your robot, and parked on her.

    • "Move fast and break things" is fine for social networks, but in Fintech it gets you arrested, and in biotech and self-driving cars it kills people. You need to fix your bugs and keep them fixed.
  • ...doesn't work for bot-cars? Who knew?

  • In the competition between Cruise and its top driverless car rival, Waymo, Mr. Vogt wanted to dominate in the same way Uber dominated its smaller ride-hailing competitor, Lyft. "Kyle is a guy who is willing to take risks, and he is willing to move quickly. He is very Silicon Valley,"

    Obligatory XKCD [xkcd.com]

  • I doubt there are any coding standards that could prevent fatal accidents in self-driving cars, but there have been plenty of abnormal behaviours reported for such cars that are very clearly programming errors. Are companies writing self-driving vehicle software legally obliged to follow any set standard, beyond AUTOSAR and MISRA for the vehicular computers?

    As for testing, ok, we know they do manual testing by driving on test circuits and on actual roads, but what other testing do they do? Do they have a li

  • ...on the manager and hiring lawyers
    The problem is not the managers or the culture. The problem is that it's insanely hard to make self-driving cars perfect.
    Getting 99% reliability is easy, but the last 1% gets exponentially harder (see the rough illustration below).
    I have no doubt that the problems will eventually be solved, but many more years of testing are needed before widespread deployment.
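
A back-of-the-envelope sketch of why that last 1% matters so much, using made-up per-mile failure rates purely to show the scale, and assuming (unrealistically) that failures are independent from mile to mile.

```python
# Chance of finishing a trip with no failure/disengagement, assuming
# independent per-mile failure probabilities (a big simplification).
def trip_success_probability(per_mile_failure_rate: float, miles: int) -> float:
    return (1.0 - per_mile_failure_rate) ** miles

TRIP_MILES = 100  # a modest day of robotaxi driving (made-up figure)
for label, rate in [("99%", 1e-2), ("99.99%", 1e-4), ("99.9999%", 1e-6)]:
    p = trip_success_probability(rate, TRIP_MILES)
    print(f"{label} per-mile reliability -> {p:.2%} chance of a clean {TRIP_MILES}-mile day")
```

A system that handles 99% of miles perfectly still botches most 100-mile days; that is roughly the gap between a demo and a product.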

  • Alternatively we could put on our big boy pants and go back to being the land of the free and the home of the brave. People die, that's life. Fix the software and move on.
  • I had, indeed, lost trust in self-driving cars. I was worried that there wasn't enough attention paid to sensor resolution and refresh rate, combined with the training of the neural nets to interpret that sensory data into a 3D model of the surroundings. I even suspected that the ever-altering but persistent model of the world on which the software was basing life-and-death decisions was flawed or imprecise. It all seemed like an enormously complex physics/computation problem.
    Happily, my trust in this t

  • Cruise is becoming too much of a financial burden on G.M. and is hurting the auto giant's reputation

    This literally made me spit out the food I was eating. What reputation? G.M. already has arguably the worst reputation of all the big automakers. Bob Lutz drove that company into the ground. Maybe I'm just a bit jaded because the G.M. vehicle I once had spent way too much time in the shop for non-routine maintenance work, whereas the two vehicles I've had for the last 10+ years (a Honda Civic and a Toyota Tundra) just work.

  • I see articles with headlines like this every day:
    "Wrong way driver on freeway kills family"
    "15-year-old in stolen car runs red light and kills pregnant woman and her baby" (this was a couple miles from my house)
    "DUI accident kills 3"

    So, tell me again how good humans are at driving.

    Even the Cruise accident in SF... would the self-driving car have hit the pedestrian if a *human* driver hadn't hit her first and knocked her into the path of the self-driving car? If that first car had also been a Cruise or Way

  • The problem with applying AI to the real world is that humans can use ALL of their knowledge to deal even with the most trivial task, and human adults typically have A LOT of knowledge.

    Meanwhile AI systems based on statistical training are going to have trouble when rare events occur. That trouble is likely to be compounded when rule-based systems are also in play, using rules made in advance by humans who did not foresee some rare event.

  • Roads are extremely dynamic environments. Even with a human brain, we have to pause and analyze what is going on in some unexpected circumstance like a detour, people jaywalking, missing road markings, traffic violations, double parkers blocking you, someone you suspect is going to pull out suddenly in front of you, damaged roads, etc. The logic in these vehicles is pretty basic: stop if you see something there, go if you don't, stay to the right, etc. I never thought these things were ready to be unleash
