Transportation AI

Man Trapped in Circling Waymo on Way to Airport (cbsnews.com)

It "felt like a Disneyland ride," reports CBS News. A man took a Waymo takes to the airport — only to discover the car "wouldn't stop driving around a parking lot in circles." And because the car was in motion, he also couldn't get out.

Still stuck in the car, Michael Johns — a tech-industry worker — then phoned Waymo for help. ("Has this been hacked? What's going on? I feel like I'm in the movies. Is somebody playing a joke on me?") But he also filmed the incident... "Why is this thing going in a circle? I'm getting dizzy," Johns said in a video posted on social media that has since gone viral, garnering more than two million views and interactions....

The Waymo representative was finally able to get the car under control after a few minutes, allowing him to get to the airport just in time to catch his flight back to LA. He says that the lack of empathy from the representative who attempted to help him, on top of the fact that he's unsure whether he was talking to a human or an AI, are major concerns. "Where's the empathy? Where's the human connection to this?" Johns said while speaking with CBS News Los Angeles. "It's just, again, a case of today's digital world. A half-baked product and nobody meeting the customer, the consumers, in the middle."

Johns, who ironically works in the tech industry himself, says he would love to see services like Waymo succeed, but he has no plans to hop in for a ride until he's sure that the kinks have been fixed. In the meantime, he's still waiting for someone from Waymo to contact him regarding his concerns, which hasn't yet happened despite how much attention his video has attracted since last week.

"My Monday was fine till i got into one of Waymo 's 'humanless' cars," he posted on LinkedIn . "I get in, buckle up ( safety first) and the saga begins.... [T]he car just went around in circles, eight circles at that..."

A Waymo spokesperson admitted they'd added about five minutes to his travel time, but then "said the software glitch had since been resolved," reports the Los Angeles Times, "and that Johns was not charged for the ride."

One final irony? According to his LinkedIn profile, Johns is a CES Innovations Awards judge.


Comments Filter:
  • "Trapped" (Score:3, Informative)

    by timeOday ( 582209 ) on Sunday January 05, 2025 @08:52PM (#65065259)
    Oh how scary. Except actually: "If at any time you want to end your ride early, tap the Pull over button in your app or on the passenger screen, and the car will find a safe spot to stop."

    https://support.google.com/way... [google.com]

    • Re:"Trapped" (Score:5, Insightful)

      by timeOday ( 582209 ) on Sunday January 05, 2025 @08:53PM (#65065265)
      ... unless "not only was he unable to stop the car" means he tried the button and it didn't work.
      • Re:"Trapped" (Score:5, Informative)

        by ArmoredDragon ( 3450605 ) on Sunday January 05, 2025 @11:49PM (#65065495)

        I think the bigger miracle is that he was able to reach somebody at google. It should probably occur to him that this...

        Johns, who ironically works in the tech industry himself, says he would love to see services like Waymo succeed, but he has no plans to hop in for a ride until he's sure that the kinks have been fixed. In the meantime, he's still waiting for someone from Waymo to contact him regarding his concerns, which hasn't yet happened despite how much attention his video has attracted since last week.

        ...aint happenin'.

        • Anybody who works in tech knows that Google "tech support" doesn't respond to inquiries. Which is why they went to the press instead, in order to publicly shame them into a response.

          It makes me wonder if I should do something similar with my GMail support issues. Instead of deleting the dozens of random Google "account recovery" e-mails I get every week, maybe I should be screenshotting them and sending them to Wired?

      • by vbdasc ( 146051 )

        It's just that the car was unable to find a safe spot to stop. He should've waited an hour or two until such a place became available.

    • Re:"Trapped" (Score:5, Insightful)

      by Zurk ( 37028 ) <zurktech@NOspAm.gmail.com> on Sunday January 05, 2025 @09:18PM (#65065311) Journal

      " the car will find a safe spot to stop." not "its an emergency stop button". the car obviously couldnt find a safe spot to stop which is why it kept circling....

      • Re:"Trapped" (Score:4, Insightful)

        by angel'o'sphere ( 80593 ) <angelo.schneiderNO@SPAMoomentor.de> on Monday January 06, 2025 @04:42AM (#65065719) Journal

        Unbelievable that the car has no big - GIGANTIC BIG - emergency stop button like any escalator has.

        • by Luckyo ( 1726890 )

          Stopping in the middle of a highway is an extreme safety risk. Hence the lack of escalator style emergency stop button.

          • Stopping in the middle of a highway is an extreme safety risk.

            And yet this HAPPENS! In Real Life! A tire blows out and you cannot continue driving for more than a few meters. Drivers are taught to turn on their flashing lights and pull over, then call for a tow. Even if you're on the freeway! It is unsafe to continue your trip! Cars have broken down in the middle lane!

            I've had it happen to me; the controls broke, I was stuck in neutral on the freeway. I turned on the emergency lights and nearly instantly all the traffic behind me slowed down, moved out of t

            • by Luckyo ( 1726890 )

              The ask was for escalator style emergency button. That typically engages emergency brakes and halts the system. You do that on a highway and the only thing on your mind is going to be the grille of the truck behind you going through said mind. Or more accurately, brain.
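
              Rough arithmetic on that point (my own back-of-the-envelope numbers, not from the article or from Waymo): at highway speed even hard braking eats a lot of road and time, all of it spent decelerating in a live lane.

                  # back-of-the-envelope stopping numbers at highway speed (illustrative)
                  v = 30.0                  # m/s, roughly 67 mph
                  a = 8.0                   # m/s^2, hard braking on dry pavement
                  print(v**2 / (2 * a))     # ~56 m of stopping distance
                  print(v / a)              # ~3.75 s of deceleration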

      • " the car will find a safe spot to stop." not "its an emergency stop button". the car obviously couldnt find a safe spot to stop which is why it kept circling....

        Let's not be overly kind here for pre-IPO stock's sake. The car was literally circling a parking lot. The car obviously didn't have a damn clue as to where it actually was, safe spot be damned. If the car can park, then it was actually surrounded by safe spots.

        We should give autonomous solutions credit when appropriate, but we should also chastise the shit out of them when appropriate too. An autonomous car stuck in a literal parking lot has NO business being on public roadways without a licensed d

        • by mysidia ( 191772 )

          Define an “emergency” stop for a car moving at freeway speeds.

          Um, the basic process for performing an emergency stop on the freeway is well-explained to every driver in driver's ed.

          And by the way: you still stop when necessary in an emergency stop even if there is no safe spot to pull over. A tree can still have fallen in the road in front of you, even if there is no shoulder and zero space to divert.

          Presumably if the autonomous vehicle is working properly there should never be a cause t

          • Now maybe we should require autonomous vehicles to pass a driver's ed course, take a driving exam with the DMV, and then fill out the written test? Optionally the autonomous vehicle can wait in line to get an unflattering picture taken as well.

      • What if the car is on fire? Do you still need to wait for it to find a safe spot to exit?
    • Re:"Trapped" (Score:5, Interesting)

      by Mr. Dollar Ton ( 5495648 ) on Sunday January 05, 2025 @10:58PM (#65065431)

      Could not have happened to a more suitable candidate, too.

      Mike Johns is a tactician who sits at the intersection of tech, entertainment, media, and politics — providing creative direction, message development, content strategy, and a 360 approach to business development and growth.
      Operating in Los Angeles, Johns serves clients globally with London as his second home. Johns leverages his diverse experience and connection to enhance the image and reputation of public figures, artists, businesses, organizations, and agendas that move the world forward. Goal, impact 1 billion people within the next ten years. He has been featured on Fox News, Wireless Weekly, Los Angeles Times, Variety, Vibe, Black Enterprise, Mobile Entertainment and is among Hollywood’s who’s who decision makers.

      ASK ME ABOUT:

      DATA | BLOCKCHAIN TECHNOLOGY | AUTONOMOUS CARS | SMART CITIES | INTERNET OF EVERYTHING | ARTIFICIAL INTELLIGENCE | CONSUMER ELECTRONICS | DIGITAL LITERACY | ROBOTICS | ENTERTAINMENT | POP CULTURE | INFLUENCER MARKETING | FUTURE OF WORK | SPACE EXPLORATION | EDUCATION

      The tactician and renowned expert on autonomous cars, who could not find his way out of a robotaxi... LOL.

      • by geekmux ( 1040042 ) on Monday January 06, 2025 @07:39AM (#65065925)

        Could not have happened to a more suitable candidate, too.

        Mike Johns is a tactician who sits at the intersection of tech, entertainment, media, and politics — providing creative direction, message development, content strategy, and a 360 approach to business development and growth. Operating in Los Angeles, Johns serves clients globally with London as his second home. Johns leverages his diverse experience and connection to enhance the image and reputation of public figures, artists, businesses, organizations, and agendas that move the world forward. Goal, impact 1 billion people within the next ten years. He has been featured on Fox News, Wireless Weekly, Los Angeles Times, Variety, Vibe, Black Enterprise, Mobile Entertainment and is among Hollywood’s who’s who decision makers.

        ASK ME ABOUT:

        DATA | BLOCKCHAIN TECHNOLOGY | AUTONOMOUS CARS | SMART CITIES | INTERNET OF EVERYTHING | ARTIFICIAL INTELLIGENCE | CONSUMER ELECTRONICS | DIGITAL LITERACY | ROBOTICS | ENTERTAINMENT | POP CULTURE | INFLUENCER MARKETING | FUTURE OF WORK | SPACE EXPLORATION | EDUCATION

        The tactician and renowned expert on autonomous cars, who could not find his way out of a robotaxi... LOL.

        Is that LIST supposed to be taken seriously? Ask him about “DATA”? I almost feel like biting, just to see what this john has to say about the resume whoring.

        • Is that LIST supposed to be taken seriously?

          Is this a serious question? :)

          I mean, I have no idea what that guy supposes, but I'm sure you're taking it about as seriously as I do.

      • by Luckyo ( 1726890 )

        This list suggests he created a hoax for attention. Though "this has been fixed within a couple of minutes" vs his claim of "I'm not doing this again until they fix it" already suggested that he was creating a hoax for attention.

        • Exactly. Who the fuck gets in a robotaxi and starts bitching about "human connection" and "empathy" when things go wrong? That's like going to a Starbucks and being mad that the barista wouldn't take the time to talk extensively about the newest developments in coffee bean harvesting.

          Fuck this attention seeking asshole.

    • Hardware Stop (Score:5, Insightful)

      by Roger W Moore ( 538166 ) on Sunday January 05, 2025 @11:15PM (#65065447) Journal
      Most machines in contact with humans need to have an emergency off button for safety. Having a button in an app that tells a computer to please stop is fine, but when something goes wrong you need a "disconnect engine from power" hardware button that a computer cannot override. While that opens the possibility of the car stopping somewhere unsafe, it's the human rider's responsibility not to push it on, e.g., a motorway.

      Having no way to physically disconnect the power is not safe - humans can recognize far more emergency situations than an AI can currently - and it should be a basic requirement for all AI systems that they all have a physical shutdown button that stops things at a hardware, not software level.
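
      A minimal toy sketch of that distinction, with hypothetical names and nothing to do with Waymo's actual architecture: software can request motion, but a latched hardware interlock in the power path always wins.

          # Toy model of a hardware E-stop that software cannot override (illustrative only)
          class Drivetrain:
              def __init__(self):
                  self.estop_latched = False    # physical latching button in the power path
                  self.software_enable = True   # what the driving software requests

              def press_estop(self):
                  self.estop_latched = True     # stays latched until reset at the hardware

              def power_delivered(self):
                  # power flows only if the latch is open AND software enables it;
                  # software alone can never re-energize a latched drivetrain
                  return self.software_enable and not self.estop_latched

          car = Drivetrain()
          car.press_estop()
          car.software_enable = True            # software "override" attempt
          print(car.power_delivered())          # False: the hardware latch wins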
      • A hard (actual switch, not a touchscreen) "pull over now" button should be easily accessible to the passengers, plus a hardware "Emergency Stop" button behind a glassbreak that cuts power to the drivetrain immediately and unlocks the doors.
      • While that opens the possibility for the car to stop somewhere unsafe that's the human rider's responsibility not to push it on e.g. a motorway.

        Of course when all cars are autonomous, tailgating and aggressive driving will no longer happen and so hitting the emergency brake button will be completely safe, even on motorways/freeways.

        • Perhaps but by their very nature, emergencies generally happen when something has gone badly wrong and if that is the thing intended to keep you safe it's going to cause problems. For example, there is much discussion about autonomous vehicles communicating with each other to make driving at speed safer. If that's the case and your brake-warning system is part of what fails then stopping on a motorway may still cause an accident. Plus, once stopped you still have to get out of the vehicle and onto the shoul
    • Next time pack a hammer.
      • by stooo ( 2202012 )

        Next time, don't use a half-finished robot driver.
        As a guy working in the field, he sure has learned his lesson.

    • by mspohr ( 589790 )

      RTFA
      He did contact support but they were unable to stop the car.

  • by Valgrus Thunderaxe ( 8769977 ) on Sunday January 05, 2025 @08:52PM (#65065261)
    Do these cars lock the passengers in? If there was a human behind the wheel, this would be considered a hostage situation.
  • Can you open the door? I bet the car goes into an emergency stop if the doors are opened.

  • Just the beginning (Score:5, Interesting)

    by Berkyjay ( 1225604 ) on Sunday January 05, 2025 @09:32PM (#65065343)

    The more ubiquitous these get, the more of these issues will start occurring and the less companies like Google will care. Right now every small issue gets blown up in the media. But pretty soon these cars will be running over people, people will be robbed, and more traffic accidents will occur, and we will never hear about them.

    • Just wait for a self-driving truck to wipe out a school bus full of kids.

    • I'm not positive we are not there already. As an example, Tesla's FSD has had 51 fatalities. https://en.wikipedia.org/wiki/... [wikipedia.org]. Now you might say that is not very many, but contrast that with defective air bag fatalities (https://apnews.com/article/takata-air-bag-explosion-shrapnel-death-007e4bfaf08fbebffaefb369abeb591c 28 fatalities). The air bag problem resulted in a massive recall with millions (billions?) spent to do the recall. FSD not so much. So waymo may well have many more events that have happened
  • by quonset ( 4839537 ) on Sunday January 05, 2025 @09:42PM (#65065349)

    Back in December a Waymo was recorded repeatedly looping around a roundabout [reddit.com]. Supposedly the company issued a software fix for this glitch.

    Apparently not.

    • They fixed it for traffic circles, not for parking lots.
      Obligatory commitstrip: https://www.commitstrip.com/en... [commitstrip.com]?

    • If the same code navigates a traffic circle as a parking lot then I think Waymo has bigger problems than this bug. It's like saying there's only one possible buffer overflow bug in the entire Linux kernel and once that is fixed there can't be any more buffer overflows.

    • by mjwx ( 966435 )

      Back in December a Waymo was recorded repeatedly looping around a roundabout [reddit.com]. Supposedly the company issued a software fix for this glitch.

      Apparently not.

      They probably fixed that glitch... this is likely to be an entirely new and exciting glitch.

      If computers can think faster than humans, they can also fuck things up faster than humans.

  • A properly empathetic underpaid phone support worker would have at least played him some nice music on a tiny violin, to mitigate the absolute horror of slowly circling in the parking lot 8 times.

    (My guess: there was a drain, and the AI was circling it)

    • by rta ( 559125 )

      i agree with the first part of your statement, that this is much ado about nothing, so i don't understand where the 2nd part comes from.

      From a "taking a cab to the airport" transportation POV how is his overall experience materially different from being stuck in a traffic jam for 10 minutes?

      My guess? Tempest in a tea cup.

      • Well, if a cabbie is driving in circles, it's because he thinks you're a stupid tourist who doesn't know the area, so he is trying to rip you off by running up the meter... the impossibility of which is one of the selling points of the likes of Uber and Waymo in the first place. If a Waymo gets itself stuck in a loop you at least know it's an error, not deliberate malice.

      • Just making a shitty "circling" joke. Though I imagine whatever was causing the double rerouting (visible in the video where the car shows its intended path) is going to be excised from the code.

        Also, I would consider this much worse than being stuck in traffic for a while. More comparable to eg the cabbie falling asleep at the wheel without causing an accident -- no one likes a reminder that the AI controlling the vehicle can fail in unexpected non-human ways.

      • Where's the personal touch, the human contact, says the man who uses an app to hail a robotaxi :/

  • by Cyberax ( 705495 ) on Sunday January 05, 2025 @11:00PM (#65065433)
    Well, duh. Every technology has these kinds of teething issues. It's more interesting that the issues reported are so small. Waymo has done hundreds of thousands of rides, and so far the complaints are mostly of this kind.

    That being said, a way to do an emergency stop that can't be overridden is probably a good idea.
    • Re: (Score:2, Insightful)

      by rsilvergun ( 571051 )
      I think the problem is that their teething problems are taking place on roads human beings drive on and these are multi-ton vehicles moving at high speeds.

      Basically everyone in the city is being forced to take part in an extended alpha test whether they agreed to or not.
      • by Cyberax ( 705495 )

        I think the problem is that their teething problems are taking place on roads human beings drive on and these are multi-ton vehicles moving at high speeds.

        Waymo so far has been far safer than human drivers. Your point would stand if there were no cars, and Waymo suddenly decided to introduce them into society.

      • I think the problem is that their teething problems are taking place on roads human beings drive on and these are multi-ton vehicles moving at high speeds.

        Your analogies are a bit messed up. Teething issues by definition are production issues. A baby's teeth don't come in in the womb, and in projects you don't resolve teething issues before release; by definition they come after release. Given that none of these issues resulted in safety hazards and the most hazardous thing on the road are the human drivers (drivers, really? Not pedestrians, or people actually vulnerable, but you're concerned about other road users behind the comfort blanket of their multiple t

  • I wonder what would happen if the vehicle malfunctioned in an area without cell networks. Even satellite networks can be obstructed by tall trees, precisely in the same areas that lack cell coverage.

    There needs to be some sort of button the passenger can press to force the vehicle to make an emergency stop.

    • Apparently, there is such a button, but I suspect the real answer is that the cars are set up to avoid such areas.
      Plus, the dead areas can be smaller than expected. A cell phone is typically restricted to around 300 mW, while the maximum allowed power is actually 4-5 watts. So if the car has a bigger antenna than a cell phone, mounted higher up, and can transmit at around 10 times the power, well, it can get a good signal in areas that the typical phone can't.
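
      Rough numbers behind that claim (my own arithmetic, assuming free-space path loss; the 3 W vehicle radio is hypothetical): 10x transmit power is +10 dB, and since range scales with the square root of power in free space, that alone buys roughly 3x the reach before antenna height and gain are even counted.

          import math
          power_ratio = 10                          # ~300 mW handset vs ~3 W vehicle radio
          print(10 * math.log10(power_ratio))       # +10.0 dB
          print(math.sqrt(power_ratio))             # ~3.16x range under free-space path loss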

  • Where's the human connection to this?

    Certainly not in a driverless car, duh!

  • ... to an AI, its response being "what a loser, kill yourself lol." We hope this has been helpful, goodbye.
  • It's a dilemma with any sort of automated tech: On the one hand, there ought to be an "override" button that absolutely interrupts the automation. On the other hand, if there is such a button, it will be abused by idiots, which is why it doesn't exist.

    On the gripping hand, this guy had a phone number to call, reached someone, and they were able to solve his problem. "Lack of empathy"? WTF? Call-center workers don't have any empathy left, after all the abuse they take.

  • by MrKaos ( 858439 ) on Monday January 06, 2025 @05:53AM (#65065793) Journal

    Empathy is a cognitive construct requiring a reasonably high IQ: simulating another person's experiences, ascertaining how they feel about the situation, and then constructing a solution that will release the other person from said situation.

    Empathy will be out of reach of AI and stupid people for a very long time.

  • by geekmux ( 1040042 ) on Monday January 06, 2025 @07:09AM (#65065883)

    ”Where's the human connection to this?"

    This? Coming from the one who voluntarily chose to pass up every human-powered taxi cab and Uber/Lyft-badged car to specifically use the autonomous car without a human driver?

    I truly have no fucking idea how society managed to warp the concept of expectations this badly. People are absolutely delusional for failing to understand a single timeless concept; Be careful what you ask for. You just might get it.

    • By your logic we should never have abandoned horses because it made lots of people unemployed. The remarkable feature of the capitalist economies is that they DO generate more jobs despite getting rid of so many. The problems mostly come where a local industry is destroyed - be that coal mines or car factories.

    • Yeap. Over-commodification and over-automation. Screw people who skip common sense because an app promises them perpetual motion or self-driving taxis.
  • Is it better to be ripped off by a human driver than by a machine?

  • Unless you are Alanis Morissette.
  • "Where's the empathy? Where's the human connection to this?"

    Kind of an odd complaint from someone who got into a driverless car instead of a taxi or Uber. Choosing a robotic car over a human driver and then complaining about a lack of human connection seems to indicate a lack of awareness.
  • Every piece of industrial controls I work on has to have a hardware-based "emergency stop" button on it. Why do Waymo machines not have that basic safety measure?
  • ... the good old days when it took a human cabbie to pad the fare by driving around.
