Transportation

GM-Owned Cruise Has Lost Interest In Cars Without Steering Wheels (yahoo.com) 72

Yesterday, GM announced it was delaying production of the Cruise Origin indefinitely, opting to use the Chevy Bolt as the main vehicle for its self-driving efforts. Introduced four years ago, the Cruise Origin embodied a futuristic vision: no steering wheel or pedals, "campfire" seating for six passengers, and wireless internet throughout. However, as Fortune's Jessica Mathews writes, the company appears to have lost interest in that vision (source paywalled; alternative source) -- at least for now. From the report: To hear GM CEO and Cruise Chair Mary Barra tell it, the demise of the Origin comes down to costs and regulation. GM's "per-unit costs will be much lower" by focusing on Bolts instead of Origin vehicles, Barra wrote in a quarterly letter to shareholders Tuesday. Barra discussed the regulatory challenges during the quarterly earnings call, explaining the company's view that deploying the Origin was going to require "legislative change." "As we looked at this, we thought it was better to get rid of that risk," Barra said.

All robo-taxi companies have been waiting on the green light from regulators for the approvals needed to add these futuristic pedal-less cars to their commercial fleets. While the National Highway Traffic Safety Administration adjusted its rules so that carmakers could manufacture and deploy cars without pedals or steering wheels, state DMVs still have many restrictions in place when it comes to people riding in them. GM isn't completely swearing off the concept of steering-wheel-free cars -- Barra noted that there could be an opportunity for a "vehicle like the Origin in the future."

  • If you willingly get in some so-called 'self-driving car' that has no steering wheel, then I'd have to say you're suicidal, so it's just as well they're not going to produce a death machine like that.
    • People who say things like that have probably never crossed paths (LOL) on the street with a driver like, for example, me.

    • by Tony Isaac ( 1301187 ) on Wednesday July 24, 2024 @08:12PM (#64653424) Homepage

      I trust self-driving technology a lot more than I trust the average 16-year-old behind the wheel. Or for that matter, the average 40-year-old behind the wheel.

      • You have no scientific data to support that claim. It's just emotion, it's what you want to be true. It's also what I want to be true.
        • You have no scientific data to support that claim. It's just emotion, it's what you want to be true. It's also what I want to be true.

          Claims are not just either science-backed or pure emotion, nor do they necessarily have to be what we want to be true. There's no reason to use such underhanded tactics to discredit someone else.

          We have our own experiences and actuarial data to inform us. And a claim like "I trust" is not a claim of absolute fact.

        • by Tony Isaac ( 1301187 ) on Wednesday July 24, 2024 @10:15PM (#64653604) Homepage

          There is actually plenty of data, though it's not apples to apples.

          According to NHTSA https://www.forbes.com/advisor... [forbes.com]
          The accident rate for self-driving cars is 9.1 crashes per million miles.
          The accident rate for human drivers is 4.2 crashes per million miles.

          However, most minor human-caused crashes are never reported, while all driverless crashes are reported.
          AND 69% of all driverless crashes are Teslas, which isn't exactly the best example of self-driving technology.

          So perhaps the numbers aren't fully representative, but on the whole, driverless cars are approaching the same level of safety as human drivers. And I have no doubt they will soon surpass human driver safety.

          Just because I didn't cite "scientific data" in my post, doesn't mean it doesn't exist.
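
          For anyone who wants to poke at those numbers, here's a rough back-of-envelope sketch in Python. The 9.1, 4.2, and 69% figures are from the links above; the Tesla share of driverless *miles* is not in them, so it's a made-up knob here:

          ```python
          # Back-of-envelope: how the reported driverless crash rate moves
          # if Tesla crashes (69% of the total, per the figures above) are
          # excluded. The Tesla share of driverless MILES is an assumption,
          # not a sourced number -- vary it and watch the answer move.

          DRIVERLESS_RATE = 9.1    # crashes per million miles (reported)
          HUMAN_RATE = 4.2         # crashes per million miles (reported)
          TESLA_CRASH_SHARE = 0.69

          for tesla_mile_share in (0.5, 0.7, 0.9):   # assumed, not sourced
              non_tesla_rate = (DRIVERLESS_RATE * (1 - TESLA_CRASH_SHARE)
                                / (1 - tesla_mile_share))
              print(f"Tesla miles {tesla_mile_share:.0%}: non-Tesla rate "
                    f"{non_tesla_rate:.1f} vs. human {HUMAN_RATE}")
          ```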

          • Is it not illegal to have no driver in a Tesla?

          • The car companies lied to the regulators about their numbers.
          • With that statistic for human drivers vs. self-driving cars, it makes me wonder - there are known bad drivers out there.
            For example:
            drunk drivers [nhtsa.gov] are 4 times more likely to crash at .08 BAC, and at .15 it's 12 times. 32% of fatal accidents involve a drunk driver. So let's multiply: 68% of fatal accidents do not involve drunk driving, so the rate with drunk driving removed would be about 2.9. Multiply that by the 4-times-as-likely-to-crash figure at .08, and we're at 11.4, which is higher than 9.1. At
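
            The arithmetic above, spelled out as a sketch (with the caveat that it applies a fatal-accident share to an all-crash rate, which is as rough as it sounds):

            ```python
            # Reproducing the parent's back-of-envelope numbers.
            HUMAN_RATE = 4.2          # crashes per million miles, all human drivers
            FATAL_DRUNK_SHARE = 0.32  # share of fatal accidents involving a drunk driver
            BAC_08_MULTIPLIER = 4     # crash-risk multiplier at .08 BAC (NHTSA)

            sober_rate = HUMAN_RATE * (1 - FATAL_DRUNK_SHARE)    # ~2.9
            drunk_rate_at_08 = sober_rate * BAC_08_MULTIPLIER    # ~11.4

            # Caveat: mixing a fatal-accident share into an all-crash rate is
            # illustrative only, not a rigorous adjustment.
            print(f"rate without drunk driving: {sober_rate:.1f}")
            print(f"implied rate at .08 BAC: {drunk_rate_at_08:.1f} (driverless: 9.1)")
            ```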

            • We don't have good data for self driving cars. Good job looking up the numbers for humans though, you deserve credit for that.
              • Tony Isaac, the poster above me, posted a citation on the accident rate for self-driving cars.

                That said, given that they're still under active, fairly intense development, they're the equivalent of that 16-year-old. Most of them are still at the learning-permit stage (Tesla), with only one having gotten its restricted license (Waymo).

                I expect the accident rates to drop rapidly for now, and, as with human drivers, different companies' products probably have different rates.

                I forgot to make my point clearer

                • You don't know how safe Waymo is.
                  • https://www.theverge.com/2023/... [theverge.com]

                    6.7 times less likely to be in an injury crash, 2.3 times less likely to be in a police-reported crash.

                    But I think that you misunderstood what I meant by "Restricted license".

                    Learning permit: Requires an adult with a full license in the car.
                    Restricted license: They no longer require the adult, but are still under a number of restrictions, like how many passengers they can have, what hours they can drive, etc...
                    In Waymo's case, they've won approval to not have the safety drive

          • The accident rate for self-driving cars is 9.1 crashes per million miles. The accident rate for human drivers is 4.2 crashes per million miles.

            However, most minor human-caused crashes are never reported, while all driverless crashes are reported. AND 69% of all driverless crashes are Teslas, which isn't exactly the best example of self-driving technology.

            So perhaps the numbers aren't fully representative, but on the whole, driverless cars are approaching the same level of safety as human drivers. And I have no doubt they will soon surpass human driver safety.

            The bias is likely to be the other way round -- i.e., SD crashes are under-reported and under-represented.
            For one thing, people are more likely to engage SD on long, boring open freeways and country roads, where crash rates are lower anyway.
            Then there is the fact that in an SD-initiated crash, the human driver may take control at the last moment to try to avert the crash, unsuccessfully, and it gets recorded as a human crash.
            Then there is the fact that a company like Tesla, which has a record of making false statements and

            • Most self-driving miles are in the form of robotaxi miles in places like San Francisco.

              The Tesla crashes I don't consider representative, because their self-driving software is immature and half-baked. That removes two-thirds of self-driving crashes from the mix.

              The robotaxi crashes are meticulously recorded, unlike, as you suggested, Tesla.

              • Most self-driving miles are in the form of robotaxi miles in places like San Francisco.

                This is precisely why the crash data comparison is meaningless. Human beings drive much worse roads in much worse conditions and at faster rates.

                • While the data may not match precisely, that doesn't make it meaningless. While country roads may be "worse" in some respects, on city streets traffic is more of a problem, and in general interacting with traffic is a bigger component of accidents than country roads are. Further, 80% of the US population lives in cities https://www.census.gov/newsroo... [census.gov], and 75% of accidents happen close to home https://www.farristhomas.law/b... [farristhomas.law]. These two facts make the country miles less important.

                • > Human beings drive much worse roads in much
                  > worse conditions and at faster rates.

                  Clearly you've never driven in San Francisco. Between the hills, and the narrow roads in residential areas, and the blind corners, and the poorly marked lanes on some roads, and the uneven pavement and un-repaired potholes on others, and the madhouse that is rush hour in SOMA and the FiDi, and the fog that can roll in and keep you from seeing past your hood, and the junkies who wander out in traffic at stop lights to

            • Finally, I'd rather die from my own mistake or shortcoming than from a mistake or shortcoming in software written by some outsourced contractor

              BINGO!!

              I could not have expressed my sentiment on reading this any better.

          • > 69% of all driverless crashes are Teslas

            Look, I think the muskrat is as much a tool as you do... probably a lot more, actually. But if you're going to hate on him, you should hate on him for the nasty things he really has done, not made-up straw men. Teslas are not fully self-driving, even with FSD enabled. Tesla themselves specify as much in their documentation:

            "The currently enabled Autopilot and Full Self-Driving features require active driver supervision and do not make the vehicle autonomous." [tesla.com]

            A

            • I said nothing about my feelings about Musk. Those are irrelevant.

              What is relevant is that, in the "self-driving accident statistics," the majority were Teslas, which aren't truly self-driving vehicles, as you pointed out. My point is that including Teslas in this number skews the accident rate artificially high for automated vehicles.

              My feelings about Musk have nothing to do with this.

      • by drinkypoo ( 153816 )

        The average 16 year old has never got a pedestrian stuck under their car and then tried to drive away with them still down there. But every single car like the one that did that to the woman in SF would have made the same mistake.

        • Whole fleets of self-driving cars run the same code.

          They will make the same mistakes in the same situations.

          If you don't understand that, you don't understand the risk of self-driving cars.
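
          A toy illustration of the correlated-failure point (Python; the scenario and the 10% human error rate are invented for the sake of the example):

          ```python
          import random

          def fleet_policy(scenario):
              # Every car runs the same logic, so a blind spot in the logic
              # is a blind spot for the entire fleet at once.
              return "pull over" if scenario == "pedestrian_under_car" else "stop"

          def human_driver(scenario, rng):
              # Humans err individually, not identically (rate invented here).
              return "pull over" if rng.random() < 0.1 else "stop"

          edge_case = "pedestrian_under_car"   # "pull over" is the wrong call here
          rng = random.Random(42)
          fleet_wrong = sum(fleet_policy(edge_case) == "pull over" for _ in range(1000))
          human_wrong = sum(human_driver(edge_case, rng) == "pull over" for _ in range(1000))
          print(f"fleet wrong: {fleet_wrong}/1000, humans wrong: {human_wrong}/1000")
          ```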

          • The premise behind the fictional HAL 9000 (HAL being an acronym for Heuristically programmed ALgorithmic computer) is that it would be based on some kind of self-learning, self-organizing principle rather than reasoning based on programmed rules -- think Asimov's Laws of Robotics.

            If the self-driving car makes the same mistakes in the same situations, improving it could be based on well-known software development processes: supply the same situation to see if the same mistake occurs, understand the rules programmed into the system th

              • They need a test track, not the real streets. Also:
                needing super-mapped roads = fail
                needing remote, non-local data = fail
                lacking a local emergency mode = fail
                coming to a stop in the middle of the road when it can't deal with what is going on = fail
                lacking some kind of e-stop button = fail

          • I suspect you've never tried to pin down non-deterministic behavior. It's hard, even with completely identical inputs. Given real-world sensor data, inputs won't be identical.
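
            A toy version of that -- identical code, inputs that differ only by sensor noise (the threshold and noise level here are invented for illustration):

            ```python
            import random

            THRESHOLD = 0.5   # invented decision boundary

            def decide(obstacle_confidence):
                # The exact same code runs in every car...
                return "brake" if obstacle_confidence > THRESHOLD else "continue"

            # ...but each car's sensors see a borderline scene slightly differently.
            scene = 0.49
            rng = random.Random(0)
            decisions = [decide(scene + rng.gauss(0, 0.02)) for _ in range(10)]
            print(decisions)   # a mix of 'brake' and 'continue' from one scenario
            ```
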
            • On one hand, you're right, they won't behave exactly the same.

              On the other hand, they may well behave very similarly, to the point of being functionally identical. The people at Cruise certainly thought so, because they pulled all of their cars temporarily. The cars are making decisions. Even if they are retraining in real time, which I hope they are not, they still have the bulk of the model involved and their behavior will [hopefully] not be changing very rapidly without a human involved deciding whether

        • Under the same circumstances, few human drivers would have done better than the Cruise car did.

          https://www.latimes.com/califo... [latimes.com].

          The woman, a pedestrian, was struck by a hit-and-run vehicle at 5th and Market streets and thrown into the path of Cruise’s self-driving car, which pinned her underneath, according to Cruise and authorities. The car dragged her about 20 feet as it tried to pull out of the roadway before coming to a stop.

          Now, tell me how that self-driving car was at fault!

        • The "average" self driving car hasn't either, it took some pretty unusual circumstances for it to happen. That said, I've seen a number of reports where human drivers did just that - drove away after hitting somebody.

          It's something to add to the test scenarios to not happen again.

        • The average 16 year old has never got a pedestrian stuck under their car and then tried to drive away with them still down there.

          Erm the "average" self driving car hasn't either. As to whether people have hit others and then driven away, not only do humans do this all the time unintendedly (I have first hand experience being gently hit by a car only to have the fucking moron behind the wheel drive over my foot when they decided to move the car before seeing what happened), but humans also have a nice long history of intentionally leaving the scene of accidents. We have whole laws dedicated to just that despicable behaviour.

          But every single car like the one that did that to the woman in SF would have made the same mistake.

          A signific

        • by unrtst ( 777550 )

          The average 16 year old has never got a pedestrian stuck under their car and then tried to drive away with them still down there.

          If you want to be pedantic, the EXACT same thing can be said of self-driving cars: "The average self-driving car has never got a pedestrian stuck under their car and then tried to drive away with them still down there." Note that this is because of the "average" qualifier. So you may be correct that your average driver of any age hasn't done that, but then you're wrong to say or imply that the average self-driving car has.

          If we drop the "average" qualifier, then you'd be wrong that a driver has never done that:

      • I trust self-driving technology a lot more than I trust the average 16-year-old behind the wheel. Or for that matter, the average 40-year-old behind the wheel.

        Over-hyperbolize much? I can guarantee that if every single vehicle were made self-driving with the absolute BEST examples of self-driving available today, we would be in a global gridlock situation. And yet that doesn't seem to happen when people are in control, despite some drivers being demonstrably worse than even mediocre self-driving tech.

        Look, we all get it, some drivers are worse than merely 'bad'. Many drivers do not pay enough attention to navigate safely despite being perfectly capable

        • You can guarantee gridlock would happen if all cars were self-driving? That's amazing! You must have done some intensive research to come up with that conclusion.

          In Houston where I live, we have gridlock already.

          The difference is, self-driving technology is continually improving. Human driving skills are not. It's a matter of time before self-driving technology far exceeds that of human drivers. This will take some time, yes, but so will adoption of self-driving technology. By the time it's widespread, I'm

    • *responding to my own comment*

      I see the SDC fanbois are still around, parroting the same tired old rhetoric about how human drivers aren't safe and the Machine Gods Will Save Us from ourselves. BULLSHIT. It's another shit excuse for technology. I hope all of you fanbois get in a crash in one and are emotionally scarred for life when you realize that you have NO CONTROL over the situation and could have DIED. SDCs are garbage tech, just like AI, and I can't wait for the day they give up on it and admit th

    • Umm, the issue isn't the lack of a steering wheel. The issue is getting into a GM car.
      Frankly, I'll gladly trust a driverless car without a steering wheel before a person who considers themselves a good driver.
  • by phantomfive ( 622387 ) on Wednesday July 24, 2024 @08:28PM (#64653448) Journal
    Their technology isn't ready for a car without steering wheels, so they're not going to build it.
    • You're not wrong, but the reality is it was idealistic to begin with. Even sci-fi movies have realised this: characters riding in self-driving cars often find themselves in situations where they need to assume manual control and are given a steering wheel.

      The idea was pie-in-the-sky aspirational. Just look at the idea of the campfire seating. What a great way of rolling back 4 decades of passenger safety in one fell swoop. There's a reason we don't make cars like that anymore. It was a dumb idea fro

      • I could imagine seats facing backwards would be safer tho
        • by unrtst ( 777550 )

          I could imagine seats facing backwards would be safer tho

          Our old VW microbus had two bench seats in the back. The front one slid in on rails. You could take it out, turn it around, and put it back in. My family drove it around like that for a while - it was kinda fun, though some people got more carsick in the backward bench. Safety... eh, I'm not sure that microbus was the best example to begin with, lol.

          (tried to find a pic or video of those type of seats, but came up empty. Rails ran side to side so you could slide the bench in through the sliding side door. M

          • That isn't at all uncommon for cars of that era; it's just not common anymore with modern safety standards. (Bonus points: those flippable rear-facing seats usually didn't have headrests either, in many cases.) We had an old Ford Maverick which included a flippable rear bench.

            • by unrtst ( 777550 )

              ... Bonus points those flippable rear facing seats usually didn't have headrests either in many cases ...

              Headrest? There weren't even any shoulder harnesses/belts :-) Good times, lol

        • I could imagine seats facing backwards would be safer tho

          Absent all other safety mechanisms, yes, seats facing backwards are safer in frontal collisions, and seats facing forwards are for rear-enders. But we aren't absent all other safety mechanisms. With the advent of half your car turning into balloons -- passenger airbags, pillar airbags, side airbags -- they are all designed to address momentum in one direction. It's something you can't easily do (or at least would take a fuckton of R&D to get right) if two groups of passengers are facing each other (though in

  • ". . . explaining the company's view that deploying the Origin was going to require "legislative change."

    That sounds, to me, like an admission that the technology does not, and will not any time soon, meet Level 5 requirements. That there is no viable path to doing so.

    Ergo, the only way they can ever deploy such vehicles is if the law changes to reclassify them.

  • Really? Cause that's the only kind of car I'm interested in.

  • It is well known that driving standards vary enormously across the world. Surprisingly perhaps, those in the USA are not much better than in many third-world nations. So it is from the USA that much of the SD initiative comes. Standards in Western Europe are much higher, and Americans themselves who come to live in Western Europe are taken aback by the high standard required by the driving tests. One American said that his test in the USA consisted of driving once round a parking lot with the tester
    • We in the US take driver education the way we take all education: as an afterthought. Some folks luck out and have parents or family who care enough to teach them the real way to drive, or you can take driver's ed, which teaches you the rules of the road, then throws you out there on the road with a frightened teacher.

      I was driving tractors and farm implements by the time I was seven or eight. I had relatives into racing so learned performance driving long before I had a license. I've only had one ac

    • Standards in Western Europe are much higher, and Americans themselves who come to live in Western Europe are taken aback by the high standard required by the driving tests.

      Granted it's been nearly 40 years since my driving test in New York, but the only "hard" thing I recall was parallel parking, which you had to nail on the first try in order to pass.

      Out of curiosity, what are some of the "higher standards?" Examples of what you must do to pass?

      .. set some decent driving test standards and apply it retros

  • Comment removed based on user account deletion
  • I just feel like...like I'm just a token being pushed around a giant map. I want to live and laugh and love. A steering wheel will make all the difference. You'll see. One day, my robotaxi's coming in. And it won't be filled with puking vandals, no. It will smell of new car and maybe a hint of peppermint. And I'm gonna turn that wheel right off the rails and straight into the harbor all on my own. My own free will. Ah, freedom! Steering Wheel Freedom.

  • Apparently, the idea of campfire seating was so that the central area could be used for roasting wieners and marshmallows when the battery caught fire. However, the cost of providing marshmallows and hot dogs was deemed excessive, so the project was scrapped.
