
Waymo Says Austin, Texas, Will Be Its Next Robotaxi City (theverge.com)

An anonymous reader quotes a report from The Verge: Waymo's fourth robotaxi city will be Austin, Texas. It will be a bit of a homecoming for the Alphabet-owned self-driving company. Waymo said that it will kick off the process for a commercial robotaxi service in the city later this year. But that doesn't mean passengers can hail one of the company's driverless vehicles quite yet; Waymo's playbook is to start with manual testing, followed by supervised testing, fully autonomous driving, and then, eventually, passenger services. The company has been testing its vehicles on the streets of Austin since March, laying the groundwork for the eventual launch of a commercial ridehailing service.

Waymo says its driverless taxis will traverse "a large portion of the city night and day," covering spots like "the heart of downtown, Barton Hills, Riverside, East Austin, Hyde Park and more." The company makes no mention of the Austin-Bergstrom International Airport, where the taxi business is typically the most lucrative. The company also noted that "autonomous vehicles help improve road safety," a claim that sounds true on the surface but is hard to prove. Waymo has released several datasets that show its vehicles to be adept at avoiding certain collisions. But humans drive billions of miles every year -- orders of magnitude more real-world driving than the comparatively tiny fleet of AVs on the road today. And while there are an unacceptable number of fatalities every year, humans are actually good drivers -- for the most part.


Comments:
  • "avoiding certain collisions" -- so not 100%? But 99? 95? 80? 70? 60? 50?

    And what happens when a collision it can't avoid kills someone?

    • Re: (Score:2, Informative)

      by geekmux ( 1040042 )

      "avoiding certain collisions" -- so not 100%? But 99? 95? 80? 70? 60? 50?

      What in this world do you demand and expect perfection from? Just curious.

      And what happens when a collision it can't avoid kills someone?

      Politics. That's what happens.

      And while there are an unacceptable number of fatalities every year, humans are actually good drivers -- for the most part.

      If humans were such good drivers, we wouldn't be watching so many scramble to replace shitty humans who drive distracted, drunk, and drugged to kill 40,000 people every year in America. Even if autonomous solutions reduced that number by only half, it would still be a considerable improvement.

      • The named (Score:4, Interesting)

        by Iamthecheese ( 1264298 ) on Friday August 04, 2023 @10:05AM (#63740096)
        The people not killed because an autonomous car was driving instead of a human are nameless. The people who will be killed by autonomous cars have names. That fact alone means the evidence has to be overwhelming and backed by political will for us to admit autonomous cars are better. (When they actually become better)
        • The people not killed because an autonomous car was driving instead of a human are nameless. The people who will be killed by autonomous cars have names. That fact alone means the evidence has to be overwhelming and backed by political will for us to admit autonomous cars are better. (When they actually become better)

          There are two distinct problems with automating driving.

          One, designing and training a system to be near-perfect and fully accepted by society, proven to be FAR safer than any human driver.

          Two, preventing that near-perfect always-safe system from murdering countless people by getting hacked.

          Needless to say, it's that last part that's the real bitch. Humans slowly killing each other with cars is perfectly acceptable in society today. Creating real safety in autonomous systems means you're not even going to

      • That's a good way to get a test pool - mandate driverless cars for anyone who's got a DUI conviction. Statistically, that should be an easier set of human drivers against which to demonstrate driverless superiority.
        • That's a good way to get a test pool - mandate driverless cars for anyone who's got a DUI conviction. Statistically, that should be an easier set of human drivers against which to demonstrate driverless superiority.

          That's not a bad suggestion, and one I might be able to get behind once we have built up some real statistics about self-driving cars. Right now the comparison is equivalent to saying that the mound of sand you fiddle with in your desk zen garden is a much cleaner environment than the Rocky Mountains. While it may be true, there's a scope issue that makes the comparison a bit iffy at the moment. But, if, instead of just continuing to let drunk drivers drive after DUI convictions, we force them to take self-driv

          • The reason penalties for DUI are cash-fine and legal-fee based when no injuries happened, with mostly no prison time if the harm is paid for by insurance, is that the defendant has to drive to work to pay their legal fees and support the pool of insurance around DUI convicts. If the driver is rural, there is no choice but release back to some means of transport. DUI is so rare among the non-repeater, non-poor-life-choices crowd that the court system would fold without the frequent fliers. It also becomes t
          • The idea that we need to replace ALL drivers with self-driving is one that just grates on somebody that's been driving for over thirty-five years and has a total of three accidents.

            I hope you understand that after an equally long driving record, I consider my safety record of three accidents luck, not skill.

            It's hardly you I fear. It's the other 99% of smartphone addicts who find driving distracting, even when behind the wheel. And you should know and understand that after 35 years you've been damn lucky. Not just a safe driver.

            Where's the proof that I need self-driving to improve my safety?

            Well, let's start with the obvious given your admitted age... vision and reflexes not quite being what they used to be, especially at night. Then we'll mov

        • But can you get a DUI in a car with no controls? Or will the law see an app on the phone = in control, and in control = DUI?

        • That's a good way to get a test pool - mandate driverless cars for anyone that's got a DUI conviction.

          Speaking of those barely conscious behind the wheel, it's going to be quite interesting when it's no longer the 30-something drunk driver who statistically represents the real danger for the rest of us, but instead morphs into the bubbly 19-year-old social media addict representing the growing justification for driverless cars to be mandated for those who harm via distracted driving.

          Not everyone who drinks is an addict that drives dangerously behind the wheel. Damn near every young adult/adult owns a pers

      • I can see two piles of bodies: people killed by autonomous cars, and people killed by other people. I would probably say that the body count from self-driving cars will be a lot lower over time, especially because cities can place beacons or other items that a self-driving car will heed, such as stopping before a crosswalk or knowing that a green light up ahead will stay green for "x" more seconds, so the car knows that it either needs to speed up or get prepared to stop without slowing down too much t

    • "avoiding certain collisions" -- so not 100%? But 99? 95? 80? 70? 60? 50?

      And what happens when a collision it can't avoid kills someone?

      What's a scenario you're thinking of?

      Remember, it's speed that kills.

      The great thing about city driving is that you start with a lower speed, so passenger fatalities are already rare.

      And most city fatalities are due to hitting pedestrians, and that's something a LIDAR-equipped self-driving car with instantaneous reactions can actually be really good at avoiding.

      The only scenarios I'd be a little worried about are things like a too-conservative self-driving car getting stuck in an intersection and getting t-boned.

  • It's bad enough when a regular driver hits you, but now, well, now nobody will be responsible for your injuries and/or death.

"The following is not for the weak of heart or Fundamentalists." -- Dave Barry
