AI Transportation

Should Waymo Robotaxis Always Stop For Pedestrians In Crosswalks? (yahoo.com)

"My feet are already in the crosswalk," says Geoffrey A. Fowler, a San Francisco-based tech columnist for the Washington Post. In a video he takes one step from the curb, then stops to see if Waymo robotaxis will stop for him. And they often didn't.

Waymo's position? Their cars consider "signals of pedestrian intent" including forward motion when deciding whether to stop — as well as other vehicles' speed and proximity. ("Do they seem like they're about to cross or are they just sort of milling around waiting for someone?") And Waymo "also said its car might decide not to stop if adjacent cars don't yield."

Fowler counters that California law says cars must always stop for pedestrians in a crosswalk. ("It's classic Silicon Valley hubris to assume Waymo's ability to predict my behavior supersedes a law designed to protect me.") And Phil Koopman, a Carnegie Mellon University professor who conducts research on autonomous-vehicle safety, agrees that the Waymos should be stopping. "Instead of arguing that they shouldn't stop if human drivers are not going to stop, they could conspicuously stop for pedestrians who are standing on road pavement on a marked crosswalk. That might improve things for everyone by encouraging other drivers to do the same."
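
For the programmers in the audience, here is a minimal, purely illustrative sketch (in Python) of the two policies being argued over: the intent-weighing heuristic Waymo describes above versus the always-stop rule Fowler and Koopman say the law requires. Every class, field, threshold, and function name below is invented for illustration; nothing here reflects Waymo's actual software.

```python
# Hypothetical sketch only -- not Waymo's code. It contrasts an intent-weighing
# heuristic (like the one Waymo describes) with a strict always-stop rule.
from dataclasses import dataclass


@dataclass
class Pedestrian:
    in_crosswalk: bool        # any part of the person is on the marked crosswalk
    moving_toward_road: bool  # forward motion, one stated "signal of intent"


@dataclass
class TrafficContext:
    adjacent_cars_yielding: bool   # are neighboring lanes giving way?
    adjacent_car_speed_mps: float  # how fast is nearby traffic moving?


def intent_based_yield(ped: Pedestrian, ctx: TrafficContext) -> bool:
    """Yield only when the heuristics say the person is actually crossing."""
    if not ped.in_crosswalk:
        return False
    if not ped.moving_toward_road:
        return False  # "milling around" -> keep driving
    if not ctx.adjacent_cars_yielding and ctx.adjacent_car_speed_mps > 5.0:
        return False  # adjacent cars aren't stopping, so neither does this one
    return True


def law_based_yield(ped: Pedestrian, ctx: TrafficContext) -> bool:
    """Always stop for anyone in a marked crosswalk, as the column argues."""
    return ped.in_crosswalk


if __name__ == "__main__":
    # Fowler's test: one foot off the curb, not visibly moving, traffic not yielding.
    ped = Pedestrian(in_crosswalk=True, moving_toward_road=False)
    ctx = TrafficContext(adjacent_cars_yielding=False, adjacent_car_speed_mps=12.0)
    print(intent_based_yield(ped, ctx))  # False: the heuristic drives past him
    print(law_based_yield(ped, ctx))     # True: the strict rule stops
```

The contrast is the whole dispute: under the heuristic, someone standing in the crosswalk but not visibly moving can be classified as "milling around" and driven past; under the strict rule, the car stops regardless of what other drivers do.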

From Fowler's video: I tried crossing in front of Waymos here more than 20 times. About three in ten times the Waymo would stop for me, but I couldn't figure out what made it change its mind. Heavy traffic vs light, crossing with two people, sticking one foot out — all would cause it to stop only sometimes. I could make it stop by darting out into the street — but that's not how my mama taught me to use a crosswalk...

Look, I know many human drivers don't stop for pedestrians either. But isn't the whole point of having artificial intelligence robot drivers that they're safer because they actually follow the laws?

Waymo would not admit to breaking any laws, but acknowledged an "opportunity for continued improvement in how it interacts with pedestrians."

In an article accompanying the video, Fowler calls it "a cautionary tale about how AI, intended to make us more safe, also needs to learn how to coexist with us." Waymo cars don't behave this way at all intersections. Some friends report that the cars are too careful on quiet streets, while others say the vehicles are too aggressive around schools... No Waymo car has hit me, or any other person walking in a San Francisco crosswalk — at least so far. (It did strike a cyclist earlier this year.) The company touts that, as of October, its cars have 57 percent fewer police-reported crashes compared with a human driving the same distance in the cities where it operates.
Other interesting details from the article:
  • Fowler suggests a way his crosswalk could be made safer: "a flashing light beacon there could let me flag my intent to both humans and robots."
  • The article points out that Waymo is also under investigation by the National Highway Traffic Safety Administration "for driving in an unexpected and disruptive manner, including around traffic control devices (which includes road markings)."

At the same time, Fowler also acknowledges that "I generally find riding in a Waymo to be smooth and relaxing, and I have long assumed its self-driving technology is a net benefit for the city." His conclusion? "The experience has taught my family that the safest place around an autonomous vehicle is inside it, not walking around it."

And he says living in San Francisco lately puts him "in a game of chicken with cars driven by nothing but artificial intelligence."



Comments Filter:
  • by Valgrus Thunderaxe ( 8769977 ) on Saturday January 04, 2025 @06:43PM (#65062403)
    What does the law say? That's what they should be required to do.
    • The CA DMV driver manual states that a car must stop/yield when a pedestrian is on any part of the sidewalk (with an exception for divided streets), but I've never found any part of the CVC (California Vehicle Code) that states this.

      Waymo lets people book pickups and drop-offs in areas that are marked "No Stopping Any Time", and there are lots of other ways that Waymo's driving doesn't set a good example for other drivers to reference.

  • 1) The car's code should obey the law. Full stop. Anything less is a deliberate violation of the law, not an accident. And since it's code and not a human... it'll be consistently violating the law and can't be made to comply with threat of fines or jail time.

    2) There are assholes who will enjoy dipping their toes into the crosswalk area to stop the self-driving vehicles. This is, however, a separate problem and does not excuse coding the system to ignore a clear traffic law.

    • And since it's code and not a human... it'll be consistently violating the law and can't be made to comply with threat of fines or jail time.

      Something we humans can only dream of.

    • What if obeying the law would cause immediate harm to humans or even a greater likelihood of harm? The common example is the one person deciding to drive the speed limit on a road full of people going 5-10 mph (8-16 km/h for those not using vastly superior freedom units) over the posted limit, which leads to an increase in the accident rate.

      Personally I think that with careful tuning and adjustments to the algorithm the Bay Area can have self-driving cars along with a solution to the homeless and drug addi
      • >What if obeying the law would cause immediate harm to humans or even a greater likelihood of harm?

        The breaking of pretty much any law can be successfully defended on those grounds in court if necessary - and if your action was obviously necessary, it's never even going to see a courtroom unless you're dealing with a particularly corrupt system.

        I see no reason why you wouldn't code THAT into your traffic rules... it's an uber-rule, kind of like the Hippocratic Oath or the First Law of Robotics.

        • by djinn6 ( 1868030 )

          The only long term option is to fix the inconsistency. Either change the law, change the speed limit, or fine everyone until they stop speeding.

          Also, if you know where the First Law of Robotics came from, you should agree that there are significant dangers in loosely interpreting laws and coming up with unwritten uber-laws like the Zeroth Law.

      • What if obeying the law would cause immediate harm to humans or even a greater likelihood of harm? The common example is the one person deciding to drive the speed limit on a road full of people going...over the posted limit, which leads to an increase in the accident rate.

        First, the law in your state says that slow-moving vehicles should be driven in the right-hand lane for traffic. [ca.gov] Slower vehicles staying out of the passing lanes helps keep everyone safer.

        Second, the type of crashes you're describing (rear end

      • by shilly ( 142940 )

        That’s a really terrible example.

        If it’s a single lane, then all the cars behind will now be travelling at the speed limit too. As they brake, their brake lights will come on, and everyone will slow down.

        If it’s more than one lane, then the driver should drive at the speed limit (or below) in the slowest lane, and other drivers can overtake if they want to break the law, without having to be impeded by the driver who is obeying the law.

        There are no circumstances in which this is dangerous.

    • If the car always stops, it's also a way for street robbers to trap people or hijack the cars for parts.

    • 1) The car's code should obey the law. Full stop. Anything less is a deliberate violation of the law, not an accident. And since it's code and not a human... it'll be consistently violating the law and can't be made to comply with threat of fines or jail time.

      That's definitely a supportable position, but it's not the only supportable one. There's definitely room for abuse on the part of pedestrians. Some goofball might get their kicks by intentionally blocking robocars, which is something a real human would not put up with. Is that really what we want?

      How about speed limits? I see very few cars actually driving at or below the speed limit unless limited by traffic. The common advice for human drivers is you're safer travelling at the speed of other cars rather tha

    • Traffic laws aren't really like computer code; they wouldn't actually work as expected if followed rigidly, because they've never been put to that test. Automating things always brings these "between the lines" issues into sharp relief.

      If you put a Big Brother computer program into everybody's car that could truly and strictly detect every violation of law and immediately rack up the fines, driving wouldn't even be feasible. (I know some people would think otherwise, that they 'never break the law'... b

  • In almost all states, cars are required to stop for pedestrians in crosswalks. In California, cars must stop for people on the sidewalk who are in the process of framing the thought to step into a crosswalk. This is an unusually high bar for automated cars.

    • by PPH ( 736903 )

      In California, ...

      cars must stop for people on the sidewalk who are in the process of framing the thought

      That's clearly the empty set. Drive on.

  • If he stepped into the crosswalk and the Google car didn't stop, why wasn't anyone from Google ticketed and held to account?
  • by Bandraginus ( 901166 ) on Saturday January 04, 2025 @07:10PM (#65062471)

    More than once I have stopped at a pedestrian crossing, waiting for a pedestrian who is standing right on the curb. They're facing the crosswalk, but they're just standing there. How long is it practical for me to yield? The pedestrian doesn't seem interested in communicating their intent.

    Or worse yet, sometimes pedestrians standing there wave me through, but legally I'm liable if they change their mind, so I always yield. Then we end up gesticulating at each other like jerks: you go, no you go, no you go.

    My point is, even though the law is clear, in practice it's sometimes not black & white. I get why the Waymo devs have built in a "pedestrian intent" mechanism.

  • by HyperQuantum ( 1032422 ) on Saturday January 04, 2025 @07:51PM (#65062545) Homepage

    Yes, they should stop Waymo often.

  • For electric vehicles, stopping or braking is a lot less costly than for gas-powered vehicles. Not only can they recover some of the braking energy, but they are also quicker to accelerate.

    As such, yes, Waymo should stop for every pedestrian. Especially for a company like Waymo, which will need a lot of public support and goodwill if it wants to become mainstream.

    For distracted or undecided pedestrians, it could use lights or sounds to alert them of the fact that it is safe for them to cross. Like nowadays, the li
    • by PPH ( 736903 )

      For distracted or undecided pedestrians, it could use lights or sounds to alert them of the fact that it is safe for them to cross.

      By "it", are you referring to the Waymo? Because if a car (robotic or otherwise) flashes lights or uses a sound (which might be interpreted as a horn) on any pedestrians, there will be a riot. Conducted by The Urbanists and Walkable Cities crowds.

      If you are referring to crossing signals, we have these in many places already. But they depend on pedestrians understanding the meaning of the Big Red Hand. Few do.

  • Counterpoint (Score:4, Interesting)

    by edi_guy ( 2225738 ) on Saturday January 04, 2025 @08:34PM (#65062605)

    I've crossed in front of Waymos at least 60 times, probably closer to 100 at this point. They are ubiquitous in my area. Never had an issue; the car always behaved reasonably and predictably.

    Ditto for when I'm a bicyclist sharing the road, and as an auto driver. Waymos at this stage are more consistent, predictable, and generally law-abiding than the humans on the road. There's always room for improvement, but they are ready for prime time now.

    It is of course popular to dump on them, but as more folks experience robot taxis both as occupants and while sharing the road, this sort of hit piece will go out of style. When they come to your area they will be instantly popular, I have no doubt.

  • From reading this article, you might assume that human drivers have never hit people in crosswalks, and that somehow Waymo vehicles are creating some new danger that no one has ever anticipated. But looking at the article, you find this:

    No Waymo car has hit me, or any other person walking in a San Francisco crosswalk - at least so far.

    Given the way that pedestrians behave and humans drive in most major cities, I find this particular statement very reassuring with respect to self-driving technology. I've personally been hit by a car in a crosswalk (the driver was busy chatting on her cell phone), knocked

    • by shilly ( 142940 )

      I don’t think that’s a fair interpretation of the article. It’s really not an unreasonable standard of perfection to expect Waymo cars to obey the law, which includes stopping at crosswalks for pedestrians.

  • So I say "No" for Waymo drivers too.

    For one, it's an obvious "denial of service" method. Just put a foot in a crosswalk and it'll never move forward again. Like that cone in front of them (or was it in front and behind?). Could just sit on the curb and mess with it while using your phone.

    Doesn't matter what that one person 'feels'. Also doesn't matter what the law is, because again... why should robotic taxis be more 'honest' than humans?

    Nobody allows the expected following distance while driving on fr

    • For one, it's an obvious "denial of service" method. Just put a foot in a crosswalk and it'll never move forward again.

      A pedestrian can already stand in the middle of the road to "Denial of Service" normal cars with human drivers. If large numbers of people turn into massive dickheads prepared to do incredibly tedious things for vast amounts of time just to fuck with other people, then society will break down with or without robocars.

  • Should Waymo Robotaxis Always Stop For Pedestrians In Crosswalks?

    Ah... the automated-car "prioritize occupant" vs. "prioritize pedestrian" passenger-settable toggle -- from the Amazon TV show Upload [wikipedia.org]. Quite the conundrum.

    (The show is funny and poignant -- and Andy Allo is adorable -- I recommend it.)

  • Betteridge (Score:4, Funny)

    by PPH ( 736903 ) on Saturday January 04, 2025 @09:47PM (#65062725)

    Anyone?

  • But I always stop for pedestrians if I can see them. Even if they are jaywalking (which is now legal in California).

  • by robinjo ( 15698 ) on Sunday January 05, 2025 @02:00AM (#65063103)

    It depends on what leads to a better outcome for people in general.

    Laws are strict. People don't follow them strictly. That's because the aim of the law is to improve life and guide us. We're not meant to spend our lives completely obeying every law. So as a large group of humans, we tend to gravitate towards partially following the purpose of the law instead of strictly following the letter of the law. Once you add AI, your aim should be for the machine to also imitate that - which is not a simple thing to do.

    If you think that the machine should strictly follow every law and "teach" us humans to also do that, you're missing the whole point of being human. Humans are imperfect and will never ever be perfect. Life where you strive to be perfect and not break a single law or rule is unbearable. It's a utopia for unbearable control freaks.

    And last, laws are also written by imperfect human beings. And some are written by unbearable control freaks. We're not supposed to blindly follow them to the letter.

  • by Misagon ( 1135 ) on Sunday January 05, 2025 @03:39AM (#65063191)

    If self-driving cars don't obey the traffic laws, then the argument of them being safer than human drivers goes out the window.
