Transportation

Robotaxi Haters In San Francisco Are Disabling the AVs With Traffic Cones (techcrunch.com)

An anonymous reader quotes a report from TechCrunch: A decentralized group of safe streets activists in San Francisco realized they can disable Cruise and Waymo robotaxis by placing a traffic cone on a vehicle's hood, and they're encouraging others to do it, too. The "Week of Cone," as the group is calling the now-viral prank on Twitter and TikTok, is a form of protest against the spread of robotaxi services in the city, and it appears to be gaining traction with residents who are sick of the vehicles malfunctioning and blocking traffic. The protest comes in the lead-up to a hearing that will likely see Waymo and Cruise expand their robotaxi services in San Francisco. The California Public Utilities Commission (CPUC) is set to approve the expansion of both Cruise's and Waymo's autonomous vehicle passenger service deployments in San Francisco on July 13. The agency doesn't give companies permission to operate their AVs on public roads -- that's the Department of Motor Vehicles' domain. But it does grant companies the authority to charge passengers a fare for that service, which is an essential ingredient to scaling robotaxi and autonomous delivery operations sustainably. In May, the CPUC posted draft resolutions approving the expansion, despite mounting opposition from city agencies and residents.

Opponents called out the string of AVs that have impeded traffic, public transit and emergency responders, and asked that the CPUC move cautiously, set up workshops, collect more data, prohibit robotaxi deployment downtown and during peak hours, and limit the expansion of fleet sizes. Other opponents like the San Francisco Taxi Workers Alliance and the Alliance for Independent Workers have protested the spread of robotaxis, which they say will eliminate the need for taxi and ride-hail drivers. Safe Street Rebel's cone campaign is a bid to raise awareness and invite more pissed-off San Franciscans to submit public comments to the CPUC before next week's hearing. "These companies promise their cars will reduce traffic and collisions, but instead they block buses, emergency vehicles and everyday traffic," reads one video posted on social media. "They even un-alived a person and a dog. And they're partnering with the police to record everyone all the time without anyone's consent. And most importantly they require streets that are designed for cars, not people or transit. They exist only for profit-driven car companies to stay dominant and make it harder for transit to stay afloat."

While the above statement is a bit hyperbolic, there are nuggets of truth. [...] Nonetheless, the group brings up a common concern about unleashing autonomous vehicles onto public roads -- the lack of input from everyday people who have to deal with the vehicles on the ground. Congressional efforts to regulate self-driving cars have lagged for several years, so most regulation comes from state departments of transportation and departments of motor vehicles. The group is inviting others to follow its lead and disable the vehicles by "gently placing" cones on a driverless -- meaning, empty -- car's hood. Some people are apparently sending in submissions, but it's unclear how many people have sent images to Safe Street Rebel.
"Not only is this understanding of how AVs operate incorrect, but this is vandalism and encourages unsafe and disrespectful behavior on our roadways," Waymo said in a statement. "We will notify law enforcement of any unwanted or unsafe interference of our vehicles on public roadways."

"Cruise's fleet provides free rides to late-night service workers without more reliable transportation options, has delivered over 2 million meals to food insecure San Franciscans, and recovers food waste from local businesses," said Cruise in a statement. "Intentionally obstructing vehicles gets in the way of those efforts and risks creating traffic congestion for local residents."

Comments Filter:
  • By creating malfunctions?
    Do I get that correctly?
    I think folks like this used to be called Luddites... and penalties were attached to the behavior they engaged in.
    Let's hope these won't be as significant... it took 12,000 troops to suppress them in the past.

    • by XXongo ( 3986865 )

      By creating malfunctions? Do I get that correctly.

      Well, they place the cones when the taxi is stopped, I assume. And they can choose a place where it won't block traffic, rather than leaving it to chance, like when the taxi gets confused in the middle of an intersection.

    • by phantomfive ( 622387 ) on Friday July 07, 2023 @06:03PM (#63666699) Journal
      They're protesting malfunctions by preventing operation. At least get your story straight.
      • Tossing wooden shoes into the machinery prevented operation... a malfunction.
        Stopping the automated cars from operating IS, by that definition, a malfunction.
        I had the story straight.

    • Like all those luddites who are against anti-personnel mines? Can't stop the march of progress! All hail progress!

    • Instigating a failure in a safe scenario, e.g. while the automated car is stopped by the curb, to prevent the failure from happening in an unsafe scenario, e.g. while the automated car is in the middle of traffic flowing at speed, is pretty sound engineering.

  • by postbigbang ( 761081 ) on Friday July 07, 2023 @04:44PM (#63666445)

    They're still unproven to many. They stop randomly, make stupid decisions, and even kill people.

    Do their coders and AI trainers get the ticket when these vehicles do stupid tricks? No. They should.

    Moreover, the insurance they must carry should reflect the fact that they're no better than student drivers on a good day. If they're going to be "unleashed" on the public, they have to bear the consequences.

    • by DaFallus ( 805248 ) on Friday July 07, 2023 @04:54PM (#63666487)

      They're still unproven to many. They stop randomly, make stupid decisions, and even kill people.

      Do their coders and AI trainers get the ticket when these vehicles do stupid tricks? No. They should.

      Moreover, the insurance they must carry should reflect the fact that they're no better than student drivers on a good day. If they're going to be "unleashed" on the public, they have to bear the consequences.

      California already has over 4 million uninsured drivers on their roads, and on a good day most people on the road are no better than student drivers either. The key is that the machines will improve whereas the human drivers only appear to be getting worse.

      • Source of your data?

        • by DaFallus ( 805248 ) on Friday July 07, 2023 @05:32PM (#63666591)

          Source of your data?

          Number of licensed drivers in California: here [statista.com]
          Percentage of uninsured drivers in California: here [iii.org]

          16% of 27,000,000 is about 4,000,000
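
          A quick back-of-the-envelope check of that arithmetic (purely illustrative; the two figures are taken from the links above, and the variable names are just for this sketch):

            # rough sanity check of the ~4 million estimate
            licensed_drivers = 27_000_000   # licensed drivers in California (statista link)
            uninsured_rate = 0.16           # share of uninsured drivers (iii link)
            print(f"{licensed_drivers * uninsured_rate:,.0f}")  # 4,320,000 -> "about 4,000,000"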

          • That isn't counting unlicensed and uninsured drivers.
          • Accessing iii.org over both HTTP and HTTPS produces an Apache server test page, and access to the info is forbidden.

            Another source, please?

          • by vux984 ( 928602 )

            The iii link methodology raises a lot of questions.

            It's based on insurance injury claims, and the ratio of claims where the uninsured-motorist clause of the policy was invoked versus claims billed to the other driver's insurance.

            Is that a truly good proxy for the number of uninsured people on the road?

            Anyone in an accident with a vehicle committing a crime, joyriding, etc. would likely end up using the uninsured-motorist component. And the likelihood of an accident by someone in a stolen car seems like it might be highe

      • by zmollusc ( 763634 ) on Saturday July 08, 2023 @06:30AM (#63667749)

        The key is that the machines will improve whereas the human drivers only appear to be getting worse.

        This is easily proven by the way all software systems improve over time. Windows, Google Search, Twitter, Video Game Consoles, Youtube, Microsoft Office... all the software gets more efficient, more compact, more reliable and easier to use each and every update.

      • California already has over 4 million uninsured drivers on their roads, and on a good day most people on the road are no better than student drivers either.

        That is a crappy justification:

        "Look, they do it worse mom, why can't I do it since I am as good as them and will get better?!?!"

        "Well son, you are an AI and have no intrinsic need to go anywhere and they are humans that need to go somewhere and other humans dictated that they needed to use a car (or walk 20 miles to the doctor appointment and 20 miles back all while taking the afternoon off from work)."

    • and even kill people.

      Really? Who has been killed by a Robotaxi in San Francisco? I mean they ran over a dog that bolted into the street, but it really sounds like you're being hysterical.

      Moreover, the insurance they must carry should reflect the fact that they're no better than student drivers on a good day.

      Firstly, [Citation Required]. Secondly, as a taxi company they already carry insurance that reflects a far higher risk than you or any of the millions of other people on the road (to say nothing of the uninsured).

      Please relax, you're being hysterical.

      • Didn't say it was in SF: https://www.nytimes.com/2018/0... [nytimes.com]

        There's not enough training. There won't be enough training for years. No one wants to pay for the training, but it NEEDS TRAINING for safety assurance.

        There's a reason there's a public protest. I'm not saying it's legal, but the sentiments are there and the VCs only look at return, not public safety. It will take years to gain public confidence, let alone a way to tax the living hell out of the taxi companies.

        • To be pedantic right back at you... you did say "people", which is plural. As far as I know, that is the only one. (Unless you try to shoehorn Tesla deaths into the same category, which is even more disingenuous.) And that company is no longer in autonomous cars, so it seems like the consequences were pretty serious.

          Do you have any data to back up your assertions? Like comparison of accident statistics between human-driven hired cars and autonomous hired cars in the same area? I don't know what the data s

        • So your argument is that since a company known for reckless indifference to rules and regulations made a self driving car under an irresponsible timeline that predictably killed one person 5 years ago and ceased operations in the sector, this fact should be held against completely different companies who have not followed that path and have not killed anyone. Interesting take.

          Does it cause traffic screwups occasionally? Absolutely. Are they as bad as the traffic screwups humans cause? Not even close. Yester

    • The future of humans vs AI. Put a picture of a kid chasing a ball on the hood of self driving cars until the AI learns to ignore kids chasing balls.
    • When was the last time it killed someone? What about all the people (40,000 just in the USA) killed in traffic accidents?

      • *40,000 per YEAR

        • Sadly, these people were killed by others, or themselves.

          One was killed by a programming crew. None should have happened.

          Most onerous are the numbers. Death by proxy is as terrible as death caused by engineers who couldn't design working automotive functionality, or by bartenders who kept serving a drunk.

          tl;dr is that none are good, but the programming team let it happen through error of omission. The omission was training, like letting a 12 year old drive the car.

          • What does it matter to a dead person if their accident was caused by bad programming or a drunk driver? And btw that one death was YEARS ago -- you can't look at airplane accidents in the 1910s to determine whether airplanes today are safe. You can't paint every autonomous vehicle company the same as that one that caused a death. If autonomous vehicles can reduce the number of deaths from 40,000 to 39,999 then it's worth having. Though all indications are that autonomous can be 10 to 100 times safer than a

    • They're still unproven to many. They stop randomly, make stupid decisions, and even kill people.

      This is a way more accurate description of human drivers than it is a Waymo or Cruise taxi. I see the first two multiple times a day, and the last is borne out by the statistics.

      • Nothing I've said implies that humans have no faults. Even with years of experience, they do things badly. They learn.

        AI becomes trained, too.

        The analogy of a 12 year old driver is appropriate here. We need to train the AI black boxes more thoroughly, and let them mature through that process.

        The statistics are lopsided? No, the sample size for humanity is huge. This is why we've adapted training, autos, signs/communications, roadways, etc for human drivers. Now we're making AI training sets adapt to what wo

        • I neither stated nor implied that you claimed humans are faultless. You stated several reasons why robotaxis aren't ready which humans are much, much more guilty of in both absolute numbers and at the rate in which they occur. This does not make any sense whatsoever, and I called you out on it.

          Even counting Uber's idiotic and predictable self-driving car death against Waymo, this would make Waymo 7 times safer than the average human driver RIGHT NOW. Waymo has driven over 10 million miles. Statistically, a

          • You make the error of imposing statistical analysis of humans on data-trained AI black boxes. Such an analysis will fail until the two are indistinguishable, which is likely decades away.

            I don't dispute that training can be rotten for both humans and AI. Humans have other faculties that AI does not have. The domain of human reasoning only has a small intersection with the Venn diagram of AI "reasoning".

            Worse, there are eleven current monoliths of AI reasoning with few intersections, as they're built from both different h

            • You seem to be incapable of reading what I write without injecting a dizzying array of your own thoughts into them, then treating those thoughts as if I said them.

              I am applying nothing to AI training algorithms, etc., that is a very weird twisting of your own mind which you are applying to me. I am looking at nothing but results, and the results contradict your claims directly. It is literally impossible for me to make a logical fallacy with claims I did not make nor imply in any way. You're starting with t

          • Are we sure that we are comparing apples and apples? Are there even any stats available for human driver accidents per mile whilst only driving in the areas which robocars operate AND in vehicles of similar age and mechanical condition to the self-driving vehicles?

            Also, is there any proof that these self-driving vehicles are actually doing all the decision making themselves? How often are they falling back upon some minimum wage jerk remotely monitoring the car's cameras and giving the car the 'go for it' s

  • by dynamo ( 6127 ) on Friday July 07, 2023 @04:46PM (#63666455) Journal

    I don't know the story behind the sentence "They even un-alived a person and a dog.", or whether the story behind it is true, but I do know that the choice to use "un-alived" instead of "killed" or a similar real word makes them seem less serious about discussing this issue, and kinda ridiculous, at least from a linguistic standpoint.

    I don't know whose feelings are supposed to be being protected by avoiding using the word killed, but I don't see how it could be working, or how that tiny extra unconscious step of mental calculation could possibly prevent anyone from feeling the pain of contemplating death or the infliction of it by a robotic car. This is just stupid and changing language to make sad things more vague or hard to grasp is just annoying and makes the person saying it a less effective person. This whole trend of rewriting perfectly good words that are not offensive, and are necessary for normal civilized human communication, to soften them, should itself be killed.

    • by DaFallus ( 805248 ) on Friday July 07, 2023 @04:57PM (#63666497)
      I've seen people using that phrase more often on social media sites like TikTok because using words like kill, murder, and suicide is supposedly grounds to get the content removed.
      • by dynamo ( 6127 )

        Thank you for explaining, that at least makes some rational sense for how it could have started. Of course soon those sites will start censoring un-alive and then the cycle will go on.

        • Then people will move to other terms, possibly drawing from other languages, like 'убить' (Russian), 'Occidere' (Latin), 'Töten' (German), 'Drepa' (Icelandic), 'Bulala' (Xhosa), or 'Fasioti' (Samoan), or wander off into other types of euphemisms like 'discorporate', 'embalmprep', or some other term, until the social media sites do something clever like employ AI text processing to determine context to distinguish between a conversation about a self-driving vehicle hitting and killing a pedestrian and someon
    • by SvnLyrBrto ( 62138 ) on Friday July 07, 2023 @05:33PM (#63666595)

      Yeah... I'm not normally one to whinge about "snowflakes." But there's something about these dipshitted euphemisms like "unalived" and "died by suicide" that jumps off the page and irks the hell out of me too. "Killed" and "committed" are supposed to be "triggers" and offensive now? Ugh. And to see it in a serious publication like TechCrunch is just gross. Professionals need to do better and follow a consistent, professional style guide. Grown adults, especially those who write for a living and should know better, trying to be cool by aping every little teen neologism are pathetic.

      • by dynamo ( 6127 )

        For what it's worth, I am not a typical "snowflake" whiner either. I'm too far left to support most democrats, and I'd be thrilled to see trump spend the rest of his life in prison. It also offends me when democrats who refuse to say the words single payer healthcare in public, let alone try to do something to get it for us, call themselves progressives. I like when words mean something.

  • by Baron_Yam ( 643147 ) on Friday July 07, 2023 @04:47PM (#63666461)

    These vehicles are not ready yet. They can and do indeed fail in ways they cannot recover from and will block traffic, and while any car can fail mechanically and a human driver can have a heart attack or a mental breakdown... it's got to be frustrating to be stuck behind a driverless car that won't move. There needs to be a requirement for these vehicles to be returned to manual control so someone can get them off the road, just like someone might move my car out of the street if I had a health emergency or abandoned it in the middle of the street... and if that results in stolen cars, oh well. Good thing you have a video feed to use to catch the carjacker, right?

    As for the fatalities... I suspect they're safer than human drivers. If that's true, the fact that they've killed at least one person is not a particularly good reason to try and ban them.

    They are taking one of the few bottom end exploitative jobs left available to the desperate. If you're driving a taxi or trying to make money with Uber or Lyft, it's not like you have a lot of other options. That's another issue we should be trying to solve.

    • If you drive for Uber or Lyft as your primary income source, you most certainly could go work retail instead. Probably be better off. If you are doing it as a part time gig then sure, you are likely to have fewer options given how flexible that work is.

      As you said though, Uber and Lyft are exploitive jobs that prey on the desperate. We shouldn't even allow them to exist but they have friends in high places.

      • If you drive for Uber or Lyft as your primary income source, you most certainly could go work retail instead.

        Do you know of any retail jobs hiring for a 9-11am, 1-2pm, 9-12pm split shift on Mondays, Wednesdays, and Thursdays and 8pm to 2am on Fridays and Saturdays? This probably looks like a normal schedule for an Uber driver with kids and a working spouse.

        How common are these retail jobs that let you have a flexible schedule so you can work around child care, your spouse's schedule, the care of a family member needing help, and any of the hundreds of other life situations that preclude a 9-5 schedule? There's par

        • I get what you are saying. This super restricted person is just trying to add a few dollars to the income pool between child and senior care. Those people likely make up an incredibly small portion of possible Uber drivers. I also wonder what your super restricted person would have done 15 years ago, before Uber existed.

          They could maybe go work at a shoe store. My ex had a job at Payless Shoes years ago while she was in school that would give her three hour shifts about 3 to 4 times a week. She didn't stay the

    • As for the fatalities... I suspect they're safer than human drivers. If that's true, the fact that they've killed at least one person is not a particularly good reason to try and ban them.

      Uber killed a person, where the system was clearly at fault and should have avoided the collision. But Uber isn't really a credible AV company.

      The company that has killed far more people is Tesla. They're the "wink-wink, I'm marketing and charging you for a Level-5 car even though I can only legally say it's a Level-2 car" company. Tesla has killed far more people than Uber.

      Meanwhile Cruise and Waymo have killed no one and likely will in the future kill no one. They will indeed annoy a bunch of people, like they

    • it's got to be frustrating to be stuck behind a driverless car that won't move

      That hasn't happened to me yet (I don't drive much in SF), but I've had these driverless cars stop right in front of me in a crosswalk, blocking the crosswalk for no reason. "Don't block a crosswalk" seems like a pretty simple use case.

      As for the fatalities... I suspect they're safer than human drivers.

      They need to measure safety against non-drunk drivers. Because the drunk drivers are already not allowed to drive, and these cars have to be better than that.

    • They are taking one of the few bottom end exploitative jobs left available to the desperate. If you're driving a taxi ...

      The Uber-Lyft comment might be valid, but I am not sure it applies to conventional taxi drivers. I am not any of those things, but one of the reasons that taxi services have fought against the new tech services is that it intrudes on meaningful income as a taxi driver.

      What should be relatable to Slashdot readers is that I had a neighbor who was a well employed electrical engineer for a giant Fortune 500 company, enjoying his job, and making a nice middle class suburban living. Then, he was laid off. Need

  • Good for them! (Score:3, Interesting)

    by marcle ( 1575627 ) on Friday July 07, 2023 @04:49PM (#63666469)

    The residents of San Francisco, as well as first responders, bus drivers, etc., are being plagued with an experimental technology that's clogging their streets and endangering people. Not just the residents, but the City government, fire department, etc. are adamantly against this. Unfortunately, the tech corporations appear to have bought out the state government, which is where the decision will be made. I applaud any grass roots effort to harass these corporations. Simply putting a cone on top of the hood doesn't injure anybody, doesn't destroy any property, but it does send a loud and clear message.

    • Simply putting a cone on top of the hood doesn't injure anybody

      How does it not block the road, potentially injuring people by blocking emergency services? That may be true even if parked if it's near a fire hydrant.

      • blocking emergency services? That may be true even if parked if it's near a fire hydrant.

        Emergency services in SF already have aggressive methods for removing cars parked in front of fire hydrants. That is a solved problem.

      • If the emergency services are still staffed by humans rather than robots, one of the humans can remove the cone from the hood, and the robot will continue on its merry way. If there is a human passenger, and such person is not locked inside the robot, they too can remove the cone.

      • "How does it not block the road,"

        Do you think that people are running out in the middle of the street to put cones on moving cars?

  • right alongside the e-scooters :)

    • right alongside the e-scooters :)

      Could not agree more. Almost hit one those scooter assholes with my car last Wednesday. The bastard ran a red light without looking left or right over four lanes of traffic in front of three cars, including mine, that all had to slam on the brakes. At least the cyclists go slower and, for the most part, follow the damn rules.

    • by PPH ( 736903 )

      Who's "them"? The robotaxi haters? In that case, I'm 100% with you.

  • At a minimum, disturbing someone else's private property - the unmanned vehicle - with the intent to cause problems strikes me as criminal mischief.

    Also, unless they are just plain stupid or actually proud enough to say "yes, I did this and I'm proud of doing so," they will be wearing disguises. These cars have cameras, after all.

    • Probably considered a low-level crime that won't be dealt with anyway, much like so many of the city's other problems.

    • SF won't even take the time or effort to go after people that smash the windows of cars to steal stuff. You think they will bat an eye at a cone on a hood?
    • Putting an object on another person's property without damaging it isn't illegal anywhere in the US, AFAIK (normal citizen, not attorney). It may be rude. But rudeness is protected speech in the US.

      If the act results in damages it's incumbent on the entity to sue the perpetrator in civil court to recover damages after demonstrating they were in fact the result of the act and foreseeable by the average person. If there's no way to know if an empty stationary self driving vehicle is hired by a customer, then it's arguable th

      • Putting an object on another person's property without damaging it isn't illegal anywhere in the US, AFAIK

        This action would technically be considered vandalism (you don't know what the cone is doing to the paint, or if it's bending the metal) and if you just put a hand on another person without permission that is the definition of assault.

  • Vandalism? (Score:5, Insightful)

    by 93 Escort Wagon ( 326346 ) on Friday July 07, 2023 @04:52PM (#63666481)

    I don't see how placing a traffic cone on a car constitutes vandalism.

    • Depends on intent. Dropping a banana peel is just littering - unless you drop it on steep stairs with the intent of causing someone an injury.
    • If someone did that to my car, I'd certainly vandal their dumb ass.
    • Placing a traffic cone on a car and thereby causing it to stop in traffic, creating an obstacle in the road is vandalism.

      • Placing a traffic cone on a moving car in the middle of the street is also a thing that isn't happening.

  • by John Smith 2319 ( 10449688 ) on Friday July 07, 2023 @04:56PM (#63666493)
    I am quite sure that "food insecure San Franciscans" are not ordering food for delivery.
  • Safety (Score:1, Redundant)

    by glum64 ( 8102266 )
    Did anyone count how many dogs were 'un-alived' under the wheels of cars driven by human drivers during the same period? How many were 'un-alived' deliberately? On the roads where I drive, I see 'roadkill' animals every other week. There are no robotaxis here.
  • Elected Leaders and Public Servants abandon Public Safety and the Rule of Law. And none of them saw this coming? Guess they thought only businesses and citizens they don't care about were getting hurt by their actions.
  • by Dwedit ( 232252 )

    So when does the article explain what an "AV" is?

  • I also discovered that if you tailgate these cars they will now pull over and let you pass. That's been helpful.

  • Can't that issue get solved by a software update? The bumper sensor makes sure the front of the car has 10 yards of clearance; a quick high-torque electric speed-up and slow-down, and no more cone. Of course, the vandals then evolve to step 2, and it becomes a cat-and-mouse game.
  • So it's like turning a cat off for a while by putting an object on her head?
  • Standard advice to pedestrians in driver manuals: make eye contact with driver before attempting to cross in front of vehicle.
