AI Transportation

Driverless Taxis are Causing More 'Disruptions', San Francisco Officials Complain (sfchronicle.com)

After a severe rainstorm, two Cruise robotaxis drove past several downed trees and power lines, and then through caution tape, reports the San Francisco Chronicle. And then one of the Cruise vehicles caught on a low-hanging power wire for the city's bus system, "dragging it upward the rest of the block."

The article notes that the transit agency "had already de-energized the lines by the time the Cruise taxi hit them." But the cars only stopped "after driving through another set of caution tape and sandwich boards." Cruise personnel who retrieved the entangled car had to manually back it up a half block "to release the tension on the wire," according to a San Francisco Fire Department report. No one was inside the cars at the time, and no one was hurt...

But for city officials who oppose the rapid expansion of driverless taxi companies Cruise and Waymo, the episode reflects a recent and troubling trend. As driverless taxis ramp up operations in San Francisco, their disruption and close calls have increased in frequency and severity as well, officials say. "It really, really concerns me that something is going to go horribly wrong," Fire Chief Jeanine Nicholson said.

Cruise and Waymo say city officials have mischaracterized their safety track records. Their driverless taxis, the companies say, have lower collision rates than human drivers and public transit. Their self-driving cars, they argue, help improve traffic safety in San Francisco because their cars are programmed to follow posted speed limits.

The Fire Department has tallied 44 incidents so far this year in which robotaxis entered active fire scenes, ran over fire hoses or blocked fire trucks from responding to emergency calls. That count is double the figure from last year; Nicholson said the informal count does not include all incidents.

Meanwhile the city's transit agency tallied 96 incidents just in March "where driverless cars disrupt traffic, transit and emergency responders," according to the article — and then another 91 in April.

But the issue is drawing more attention now because next month California's state regulatory agency and DMV "will vote on whether to allow Cruise and Waymo to charge for rides at all hours with no restrictions."
  • Duh (Score:5, Interesting)

    by zenlessyank ( 748553 ) on Saturday July 15, 2023 @11:39AM (#63688327)

This is beyond stupid. Ever heard of putting the cart before the horse? We have a ways to go and more changes to make before driverless vehicles are a good idea.

    • Re:Duh (Score:5, Funny)

      by fahrbot-bot ( 874524 ) on Saturday July 15, 2023 @01:11PM (#63688535)

      Ever heard of putting the cart before the horse?

      Ya, and a driverless taxi ran into both of them. :-)

At the very least, there needs to be an algorithmic way to determine that a situation is new and that the car doesn't know the appropriate response. At that point, the car either needs to turn around and calculate a new route or pull over safely. This sounds bad, but at least it wasn't an occupied vehicle driving directly into flood water.

The fix can't be to train it on every possible scenario, because that is infinite. We don't have the kind of AI that can guess safely.

Third option: have available a fleet of 'remote operators' who can assist cars in unprecedented situations. Think Predator pilot.
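A minimal sketch of the fallback logic these two comments describe: detect that the scene doesn't match anything the car was trained on, then reroute, pull over, or escalate to a remote operator. The confidence score, thresholds, and all names are hypothetical, not any vendor's API.

```python
# Hypothetical fallback policy: if the perception stack's confidence in
# its understanding of the scene drops, stop driving on and degrade
# gracefully. All names and thresholds are illustrative assumptions.
from enum import Enum, auto

class Action(Enum):
    CONTINUE = auto()
    REROUTE = auto()
    PULL_OVER = auto()
    REQUEST_REMOTE_OPERATOR = auto()

def choose_action(scene_confidence: float, reroute_available: bool,
                  can_stop_safely: bool) -> Action:
    """Decide what to do when the car may not understand the scene.

    scene_confidence: 0.0-1.0 score from a (hypothetical) perception
    stack indicating how well the scene matches trained situations.
    """
    if scene_confidence >= 0.95:      # scene looks like training data
        return Action.CONTINUE
    if reroute_available:             # avoid the novel situation entirely
        return Action.REROUTE
    if can_stop_safely:               # degrade gracefully at the curb
        return Action.PULL_OVER
    # Last resort: hand control to a human, like a Predator pilot.
    return Action.REQUEST_REMOTE_OPERATOR
```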

    • Re: (Score:3, Insightful)

      by Joce640k ( 829181 )

      It's obvious we need a broadcast system that allows a street to be temporarily blocked off to driverless cars.

      This would be useful for all sorts of things.
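One way the broadcast idea could work, sketched under the assumption that an authority publishes closure zones as (lat, lon, radius) tuples; a real system would need signed messages and proper road-segment geometry rather than point-radius circles.

```python
# Sketch of a broadcast "street temporarily closed to driverless cars"
# check. Closure zones are assumed to arrive as (lat, lon, radius_m).
from math import radians, sin, cos, asin, sqrt

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points."""
    R = 6_371_000  # mean Earth radius, meters
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * R * asin(sqrt(a))

def is_route_blocked(route, closure_zones):
    """True if any route waypoint falls inside any broadcast closure zone."""
    return any(
        haversine_m(lat, lon, zlat, zlon) <= zr
        for lat, lon in route
        for zlat, zlon, zr in closure_zones
    )

# Example: a 400 m closure around a downed-wire scene.
closures = [(37.7749, -122.4194, 400.0)]
route = [(37.7740, -122.4190), (37.7800, -122.4300)]
print(is_route_blocked(route, closures))  # True -> replan before entering
```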

  • by MpVpRb ( 1423381 ) on Saturday July 15, 2023 @11:46AM (#63688349)

    This is the most dangerous thing they do
    Drivers will do all sorts of unsafe things to get around them
    Adapting to the natural flow of traffic is safer

    • by quonset ( 4839537 ) on Saturday July 15, 2023 @11:59AM (#63688381)

      This is the most dangerous thing they do

      You do realize there are a multitude of vehicles on the road now whose speed is monitored by GPS to keep them at the posted speed limit, don't you?

      Drivers will do all sorts of unsafe things to get around them

      Then these unsafe drivers should learn to properly drive so they're not doing unsafe things.

      Adapting to the natural flow of traffic is safer

Sure. Who doesn't want a fifteen-ton tractor trailer doing 80+ on a highway filled with cars?

      • You do realize there are a multitude of vehicles on the road now whose speed is monitored by GPS to keep them at the posted speed limit, don't you?

And they still regularly get plowed into for going too slow and causing impatient drivers to do stupid things to get around them. Hell, I was barely below the speed limit due to a traffic jam a couple of days ago and had a car blaze by me on the left-hand median.

    • by Calydor ( 739835 ) on Saturday July 15, 2023 @12:16PM (#63688409)

      If the natural flow is to ignore the posted rules there is a problem, and that specific problem is not the driverless car following the speed limit. There are plenty of other problems with them, but that one is not an AI problem; it's a human problem.

Underposted limits need to go, as do work-zone limits that can be way too low - like 45 where there are walls and traffic goes about 70 all the time.

If limits were realistically set, then speeding would not be a big thing.

      • Mostly correct. The speed limits are set very conservatively so your nearly bald tires in the rain still hold you on the road with your top heavy panel truck.

If a person can figure out the actual safe driving speed for the actual road conditions, then the AI should be able to manage it too (a rough sketch of that calculation follows this comment).

As for the 'storm made a mess of things' situation, the simple fix is to shut down the self-driving cars until the authorities have identified the unsafe streets and sent the data to the car company.

It's not just an urban...
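For what it's worth, the "figure out the actual safe speed" step is mostly stopping-distance physics. A rough sketch, using textbook-style friction coefficients and a reaction-time figure rather than measured values:

```python
# Condition-adjusted safe speed: the highest speed at which the car can
# stop within its sight distance. Friction values and the 1.5 s reaction
# time are illustrative assumptions.
from math import sqrt

G = 9.81  # m/s^2

def safe_speed_ms(sight_distance_m: float, mu: float,
                  reaction_time_s: float = 1.5) -> float:
    """Solve sight_distance = v*t_react + v^2/(2*mu*g) for v."""
    a = 1.0 / (2.0 * mu * G)   # coefficient on v^2
    b = reaction_time_s        # coefficient on v
    c = -sight_distance_m
    return (-b + sqrt(b * b - 4 * a * c)) / (2 * a)

for label, mu in [("dry", 0.7), ("wet", 0.4), ("icy", 0.1)]:
    v = safe_speed_ms(sight_distance_m=100.0, mu=mu)
    print(f"{label}: {v * 3.6:.0f} km/h")  # dry ~101, wet ~82, icy ~45
```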

How does that compare to cars with drivers? I've seen drivers do a lot more stupid things than that.
    • I have long said that "computers fail in non-human ways." That makes it harder to predict what a failed computer system will do than what a human failure will look like. That said, both can be responsible for unavoidable disaster.
    • by 93 Escort Wagon ( 326346 ) on Saturday July 15, 2023 @01:10PM (#63688527)

      The issue with that argument is that most human drivers do *not* do those things. But if one automated vehicle does something stupid in a particular circumstance, then the rest will also have that problem when faced with the same situation.

      • By the same argument they can all simultaneously stop doing certain things, too. Something that's impossible with humans.

        This is a learning process just like everything else. Maybe what's needed is an emergency system that can block individual streets to driverless cars.

        (cue the "but it will be hacked!" nannies with their knee-jerk nirvana fallacies)

Human drivers do stupid things. Driverless cars do stupid things. However, the overlap between the two is quite small. When a computer does a stupid thing, it is usually something that a human would have to be really drunk (or stoned) to do.
      Example from TFS:

      After a severe rainstorm, two Cruise robotaxis drove past several downed trees and power lines, and then through caution tape, reports the San Francisco Chronicle. And then one of the Cruise vehicles caught on a low-hanging power wire for the city's bus system, "dragging it upward the rest of the block."

      A human driver would have to be really drunk to not only hit the wire, but especially to drag it for some distance. This was in a city, so they could not have been going very fast.

  • had already de-energized the lines

    What twaddle. You mean they cut the power?

    programmed to follow posted speed limits.

    No wonder people hate them.

    • had already de-energized the lines

      What twaddle. You mean they cut the power?

      If they already cut the power then they could have put up the special beacons that block the street to self-driving cars.

      (you know, the ones that would be a good idea to invent and deploy after incidents like this one)

    • You mean they cut the power?

      What do you mean they cut the power?

Power line trucks will need to have manual drive for a long time.

  • by Joe_Dragon ( 2206452 ) on Saturday July 15, 2023 @11:58AM (#63688379)

The city needs to bill Cruise for the damage or take them to court. A real court, not arbitration.

  • Unfair (Score:4, Insightful)

    by backslashdot ( 95548 ) on Saturday July 15, 2023 @12:05PM (#63688391)

    What about the around 35 people killed every year in traffic accidents in San Francisco, and hundreds severely injured, by human driven vehicles? That's not an inconvenience? ZERO people have been injured, let alone killed in self driving cars in San Francisco. In fact, no self driving car anywhere in the world has killed anyone since that incident with Uber when a woman blindly walked in front of a vehicle 5 years ago. We keep hearing about it over and over like it happened yesterday and every day prior -- like Uber hadn't reduced the number of cameras in that vehicle and technology hasn't advanced since then.

    I've never seen even one story about a human driven vehicle traffic death in San Francisco make national headlines. In fact, it often won't even get a footnote in any Bay Area newspaper.

    • like Uber hadn't reduced the number of cameras in that vehicle and technology hasn't advanced since then.

      Improvements to technology are not even required in this scenario. Collision detection / avoidance was not enabled so there was no mechanism to avoid a jaywalking pedestrian. So it was not that the system could not detect the pedestrian, it just was not programmed to avoid them. That is what the human driver was for. Too bad they were watching TV on their phone - perhaps they were inadequately informed on what their job entailed.

      • Too bad they were watching TV on their phone - perhaps they were inadequately informed on what their job entailed.

        Even if informed, there is a limit on how long someone can stare at nothing and still be attentive. People get bored.
For example, when I am driving my car on a highway, I have to constantly monitor my speed (and the speed limit sometimes changes), watch the road, and input slight corrections. This provides enough engagement for me to stay attentive. However, let's say I get a new car that has the self-driving feature and I use said feature. This would mean that the car would be able to keep the speed (according to the...

    • Re: (Score:2, Interesting)

      by laddiebuck ( 868690 )

She didn't just blindly step out in front of the car. There was enough distance that a human driver would have caught it. Part of the issue is that the self-driving was disengaged and the driver paid to pay attention wasn't paying attention. Based on the car's software, it's doubtful whether it would have braked correctly, but it's also academic: it was never given the chance.

In other words, it was not the pedestrian's fault, it was the driver's fault, and the self-driving mechanism was completely disengaged.

      • by Anonymous Coward

        It wasn't the pedestrian's fault for crossing the street when a car was approaching? Since when do people cross the street outside of a designated crosswalk with the expectation that approaching cars will slam the brakes? Of course any driver should be required to press the brakes, but no pedestrian should be jaywalking with the expectation that approaching cars hit the brakes.

        • Herzberg stepped into the street 378 feet (115 m) out from the car. The stopping distance at the posted speed limit was 89 feet (27 m). As a human driver in that situation, you are required to stop. The safety driver, who should have been attentive, was in fact charged with negligent homicide and is set for trial this year. Uber settled with Herzberg's daughter, which they wouldn't have done had there not been sufficient grounds for legal action.

The car's emergency braking and driver alerting software were...
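A back-of-the-envelope check of those figures, assuming the 45 mph posted limit at the site and a dry-pavement deceleration of about 0.75 g (both assumptions for illustration, not case-file numbers):

```python
# Stopping distance at the posted limit vs the distance available.
MPH_TO_MS = 0.44704
FT_PER_M = 3.28084
G = 9.81

v = 45 * MPH_TO_MS                    # ~20.1 m/s at an assumed 45 mph limit
braking_m = v ** 2 / (2 * 0.75 * G)   # ~27.5 m (~90 ft; cf. the 89 ft cited)
reaction_m = v * 1.5                  # ~30.2 m for a 1.5 s human reaction
total_ft = (braking_m + reaction_m) * FT_PER_M
print(f"~{total_ft:.0f} ft needed vs 378 ft available")  # ~189 ft
# Even with generous reaction time, an attentive driver (or working
# emergency braking) had roughly twice the distance needed to stop.
```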

    • by m00sh ( 2538182 )

      What about the around 35 people killed every year in traffic accidents in San Francisco, and hundreds severely injured, by human driven vehicles? That's not an inconvenience? ZERO people have been injured, let alone killed in self driving cars in San Francisco. In fact, no self driving car anywhere in the world has killed anyone since that incident with Uber when a woman blindly walked in front of a vehicle 5 years ago. We keep hearing about it over and over like it happened yesterday and every day prior -- like Uber hadn't reduced the number of cameras in that vehicle and technology hasn't advanced since then.

      I've never seen even one story about a human driven vehicle traffic death in San Francisco make national headlines. In fact, it often won't even get a footnote in any Bay Area newspaper.

      This may sound harsh but a lot of the people who get into car accidents deserve it.

I curse a large number of cars I encounter on the road. They are car-accident deaths waiting to happen.

      The only sad thing is that they will also kill others who probably don't deserve it. Some of them with their oversized vehicles will kill others but escape without harm themselves.

    • by marcle ( 1575627 )

      The point isn't whether or not the cars have killed somebody, it's about disrupting traffic and hampering emergency services.

      • The point isn't whether or not the cars have killed somebody, it's about disrupting traffic and hampering emergency services.

        So.... what the emergency services need is some sort of beacon they can deploy to block self-driving cars from entering a street?

        Seems like a good idea to me.

    • The absolute number of people killed is a really poor metric. You should be more interested in accidents/deaths/property damage amounts per mile driven or per thousand miles driven.

      • You should be more interested in accidents/deaths/property damage amounts per mile driven or per thousand miles driven.

        Also in the mechanisms that can be used to reduce that:

Humans: Individual drivers might learn, but it's impossible to get everybody in the country to change their habits. Distractions are increasing daily; there will always be a percentage who are driving drunk, driving distracted, are too stupid to think about weather conditions, haven't had more than a few years' driving experience, etc., etc.

Computers: Changes in behavior can be broadcast nightly; every single car is safer the next day. They never get tired/dr...

    • Given how many cars there are on the road -- literally millions -- I'm surprised the number of people killed every year isn't higher.

      Geeks don't like to admit it, but seriously, human drivers are actually much, much better at driving than we like to admit. Almost all accidents are caused by a very small number of mentally or physically troubled people, DUIs, badly maintained roads, or mechanical failures.

I have little faith that self-driving cars will change things much. Of course, that won't stop some as...

    • by noodler ( 724788 )

      What about the around 35 people killed every year in traffic accidents in San Francisco, and hundreds severely injured, by human driven vehicles? That's not an inconvenience?

That's a pretty stupid argument.
There are about 500 autonomous cars in SF. There are over 400,000 registered cars in SF, and many more travel into the city from outside.
You can't directly compare the two groups or their effects.
And consider this: if we take the number of incidents in May, as the article describes, and scale that up to all cars being driverless, then there would have been about 100,000 incidents in just one month. That's a terrible prospect.
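That scaling step written out, using the story's March figure of 96 incidents and the parent's fleet estimates; linear scaling with fleet size is a strong simplifying assumption.

```python
# Back-of-the-envelope projection from the parent's numbers.
robotaxi_fleet = 500
incidents_per_month = 96          # the transit agency's March tally
citywide_cars = 400_000           # registered in SF; excludes inbound commuters

rate = incidents_per_month / robotaxi_fleet   # ~0.19 incidents per car-month
projected = rate * citywide_cars
print(f"{projected:,.0f} projected incidents/month")  # 76,800
# Counting cars commuting in from outside the city pushes the projection
# toward the parent's ~100,000 figure.
```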

  • This all has to be Elon Musk's fault. I just know it.

  • Useless data... (Score:4, Informative)

    by sarren1901 ( 5415506 ) on Saturday July 15, 2023 @12:20PM (#63688411)

    Telling me about the number of incidents is fairly empty without anything to compare that against. How many incidents occurred with cars with drivers? What's the per mile incident ratio between the two?

Maybe these robotaxis are just horrible, but these numbers don't really help us determine that in the way they are presented. They may very well be better than humans on a per-mile basis, but without the comparison numbers we can't be sure.
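The comparison being asked for is a simple normalization; every number below is a placeholder, since neither the story nor the companies publish matched denominators.

```python
# Incidents per million miles for two fleets; all inputs are placeholders.
def per_million_miles(incidents: int, miles_driven: float) -> float:
    return incidents / (miles_driven / 1_000_000)

robotaxi = per_million_miles(incidents=96, miles_driven=450_000)
human = per_million_miles(incidents=4_200, miles_driven=30_000_000)
print(f"robotaxi: {robotaxi:.0f} vs human: {human:.0f} per million miles")
# Without both denominators, raw incident counts say little either way.
```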

I think the nature of the incidents is what is worrying. It clearly shows that the software has no understanding of the physical world around it, just a set of stored situations and actions. That way, every time a new situation happens, the software will malfunction. What if a bridge collapses on a highway and somebody frantically signals the cars to stop? It's easy to think that, in a world of driverless cars, the software will ignore the jumping man and drive on. The falling may then never stop, unt...

    • Telling me about the number of incidents is fairly empty without anything to compare that against. How many incidents occurred with cars with drivers? What's the per mile incident ratio between the two?

Maybe these robotaxis are just horrible, but these numbers don't really help us determine that in the way they are presented. They may very well be better than humans on a per-mile basis, but without the comparison numbers we can't be sure.

There are three problems with that approach:

1) Accidents are rare enough events that the error bars are pretty large.
2) It's an apples-to-oranges comparison, since the driverless taxis are used by different kinds of clients, going different kinds of routes, than regular taxis.
3) Even if you corrected for routes and driving scenarios, there's a huge reporting bias in the "incident" rate. Say a human taxi driver blows through a stop sign; is that getting reported? What if the driverless taxi does the same?

  • by frdmfghtr ( 603968 ) on Saturday July 15, 2023 @12:21PM (#63688413)

    "Cruise and Waymo say city officials have mischaracterized their safety track records. Their driverless taxis, the companies say, have lower collision rates than human drivers and public transit. Their self-driving cars, they argue, help improve traffic safety in San Francisco because their cars are programmed to follow posted speed limits."

OK, that's wonderful, but you're missing the obvious here: your car ran through a street barricade. There was an object in the street intentionally placed to stop traffic, and you took it out. The simplest of things to trigger a stop, and it didn't happen. Just own the failure and be honest about it: "Hey, our car didn't perform its job correctly, so we're going to look at the data, figure out what happened, and fix it."

    Of course insurance companies and lawyers will advise against that.

I agree that a response that admits culpability is much more encouraging than one that avoids blame. But there's probably more to the quote than what is listed in this debrief.
    • by Baron_Yam ( 643147 ) on Saturday July 15, 2023 @12:31PM (#63688443)

      Honestly, this is absolutely the one kind of thing these cars should NEVER fail at - if there is a physical obstruction, they should never choose to collide with it.

It's not an edge case where something really unusual didn't register - it was a road obstruction. It doesn't matter if it's a barricade, a kid on a bike, or the Ghost of Christmas Past - if there's something in the way, do not hit it. It's a specialized case of the First Law of Robotics for self-driving vehicles. People have been thinking about this for longer than I've been alive. There is no way the team programming this car didn't understand this.

      So why did it hit an obstruction? I would say this failure is significant enough that the entire fleet should be banned until they have explained it and corrected it to the satisfaction of the government, the public, and a team of independent software auditors.

If the barricade was something non-standard or other operating conditions were outside the norm, the car may not have registered it that way. The driverless AI programs work pretty well if the situation is close to how they were trained, but they completely fail in the most stupid ways when conditions deviate from what's expected. Sure, you can put in some manually coded overrides to try to prevent these, but it's possible those existed and bugs introduced by a human programmer led to the failure...
      • by canavan ( 14778 )
It's not an edge case where something really unusual didn't register - it was a road obstruction. It doesn't matter if it's a barricade, a kid on a bike, or the Ghost of Christmas Past - if there's something in the way, do not hit it.

It's not as clear cut as you present it. An autonomous car should not hit the brakes hard for a variety of soft, light objects that the wind might blow into its path, such as leaves or garbage like plastic bags. I think there may be enough overlap with the char...
  • by Residentcur ( 1189677 ) on Saturday July 15, 2023 @12:22PM (#63688415)
    There are more "corner cases" in normal driving conditions than there are ordinary situations, one realizes after thinking about it for a moment. These systems in well-mapped areas, and Tesla's broader approach, all do well in the normal case, and where other drivers are not behaving too badly. But the number of things that remain to be handled before we'll truly have unpiloted transport is quite large. It is certainly possible that an AI breakthrough will allow many of these situations to be handled in a short amount of time -- for instance, reading and heeding more signs, and knowing more rules like the flashing school bus lights one -- but the truly edge cases will take quite a while. The rules for setting up road blocks, which people recognize pretty easily, are not sufficiently standard to have any confidence that robots will recognize them even most of the time. Recognizing accident scenes that have not yet been controlled will take even longer. I hope I am proven wrong.
    • by ZectronPositron ( 9807174 ) on Saturday July 15, 2023 @12:28PM (#63688431)
Good point - set up rules for how to block an accident scene in a way that a self-driving car knows to avoid it. For example, a GPS beacon synced to all autonomous-car systems telling self-driving cars to avoid the area, e.g. a quarter-mile radius around an emergency. Really, the first responders should be given special access to the self-driving cab system.
Good point - set up rules for how to block an accident scene in a way that a self-driving car knows to avoid it.

        But doing that in a standard way will require regulation, which all these companies hate.

I'd suggest, in addition to the GPS beacon you mention, requiring that all autonomous vehicles include a proximity kill switch which can be triggered by emergency personnel and first responders. The GPS beacon can also broadcast the kill signal - then if a car starts to go into a problem zone, it'll just stop.
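A sketch of how the beacon-plus-kill-switch could look on the car's side. The message format, the flat-earth distance shortcut, and the command names are all invented for illustration; authentication, which would be essential in practice, is omitted.

```python
# Hypothetical emergency beacon handler: stop inside a hard-stop zone,
# keep out of a soft exclusion zone, otherwise proceed.
from dataclasses import dataclass
from math import radians, cos, sqrt

@dataclass
class EmergencyBeacon:
    lat: float
    lon: float
    radius_m: float
    kill: bool  # True: stop immediately; False: just keep out of the zone

def approx_distance_m(lat1, lon1, lat2, lon2):
    """Equirectangular approximation; adequate at city scales."""
    x = radians(lon2 - lon1) * cos(radians((lat1 + lat2) / 2))
    y = radians(lat2 - lat1)
    return 6_371_000 * sqrt(x * x + y * y)

def command_for(car_lat, car_lon, beacon):
    """Command a compliant car should obey on receiving this beacon."""
    if approx_distance_m(car_lat, car_lon, beacon.lat, beacon.lon) > beacon.radius_m:
        return "proceed"
    return "stop_now" if beacon.kill else "reroute_away"

# A downed-wire scene could pair a 150 m hard-stop zone with a wider
# 400 m keep-out zone using two beacons.
print(command_for(37.7749, -122.4194, EmergencyBeacon(37.7749, -122.4194, 150, True)))
```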

But a fake cop, or maybe a rent-a-cop, could use that GPS to do bad things.
Also, that GPS system needs to be FREE TO USE for emergency personnel. No sub fees / No update fees / No DRM fees / No per-user fees.

A standard way will require regulation, which will come only after, say, a self-driving car/truck kills a school bus full of kids.

        • by PPH ( 736903 )

          The GPS beacon can also broadcast the kill signal

          That's not the way GPS works. The satellite system broadcasts timing signals which receivers use to triangulate their positions. The satellites have footprints of hundreds of miles, so piggy-backing all of the possible extra messages onto the system that might originate within this area would cripple the system.

There are other systems more appropriate for this use: cellular networks, FM sub-carrier data broadcasts, etc. My in-car GPS receivers already have FM data receivers built in with area traffic infor...

      • There is already a set of rules for blocking an accident scene and they were followed. The Robocars need to recognise it in the same way that a human driver would. This is part of replacing a human driver.

      • Great. Hope the first responders arrive before the self driving cars, then.

      • by ugen ( 93902 )

        Yes, if AI can't handle the real world - we just modify the real world so the limited AI can handle it. Why didn't we think of it before?
And if we simplify the world sufficiently and limit the options for the human participants enough - we may not even need an AI.

      • Or just drop a traffic cone in the middle of the street. Apparently they are programmed to recognize traffic cones. Let's make a law that every human driver is required to carry a traffic cone and mark hazards for self-driving cars.
      • A better point would be - if there is an obstacle on the road - don't hit it.
        In my country (and probably the rest of the EU), if your car breaks down on the road, you have to turn on the hazard lights or if those do not work, set up a warning sign (a red retroreflector triangle) 25-50 meters away from the car.
OK, but what to do if you do not have that sign? You are supposed to have it in your car, but you don't. What now? Different people may think of different things to do, maybe stand with the flashlight...

    • There are more "corner cases" in normal driving conditions than there are ordinary situations, one realizes after thinking about it for a moment. These systems in well-mapped areas, and Tesla's broader approach, all do well in the normal case, and where other drivers are not behaving too badly. But the number of things that remain to be handled before we'll truly have unpiloted transport is quite large. It is certainly possible that an AI breakthrough will allow many of these situations to be handled in a short amount of time -- for instance, reading and heeding more signs, and knowing more rules like the flashing school bus lights one -- but the truly edge cases will take quite a while. The rules for setting up road blocks, which people recognize pretty easily, are not sufficiently standard to have any confidence that robots will recognize them even most of the time. Recognizing accident scenes that have not yet been controlled will take even longer. I hope I am proven wrong.

I think the argument is that while there are many more corner cases in driving than normal conditions, there are vastly more miles driven in those normal conditions than in corner cases. Further, while human drivers seem good at recognizing corner cases, they're less reliable in normal conditions.

Sure, these things are running over fire hoses - and that's bad - but human drivers are tailgating, performing unsafe lane changes, speeding, and getting distracted, causing loss of life. At least that's...

Also, spatial modeling isn't like text modeling: multimodal LLMs depend on the availability of data tagging, and the amount of tagged data for all the corner cases in the world is tiny relative to the availability of text.

      Not that you really want to use large data neural models for self driving either until they are AGI level, but then you have the problem of trying to enslave an AGI to drive your car.

  • Similar to how first responders have remote control over traffic lights and elevators, they should give first responders controllers for the autonomous cars. Would not solve this exact problem but possibly a good idea.
  • After a severe rainstorm, two Cruise robotaxis drove past several downed trees and power lines, and then through caution tape, reports the San Francisco Chronicle. And then one of the Cruise vehicles caught on a low-hanging power wire for the city's bus system, "dragging it upward the rest of the block."

    Just put an old guy behind that wheel, add a perpetually-blinking turn-signal, and people will just say, "Well, of course." :-)

    [Note to Cruise (et al): I'm not suggesting this as a PR workaround.]

    • by m00sh ( 2538182 )

      After a severe rainstorm, two Cruise robotaxis drove past several downed trees and power lines, and then through caution tape, reports the San Francisco Chronicle. And then one of the Cruise vehicles caught on a low-hanging power wire for the city's bus system, "dragging it upward the rest of the block."

      Just put an old guy behind that wheel, add a perpetually-blinking turn-signal, and people will just say, "Well, of course." :-)

      [Note to Cruise (et al): I'm not suggesting this as a PR workaround.]

      Or an over-confident middle-ager.

      Or a teenager or a young driver.

      Of course, Of course.

      Every age group has horrible drivers.

  • by backslashdot ( 95548 ) on Saturday July 15, 2023 @01:07PM (#63688511)

    Why isn't there a remote control ability on these? If it gets honked at or tapped, that should alert the remote operator to pause Netflix and watch the screen.

  • Then they'll keep their distance.

  • Their driverless taxis, the companies say, have lower collision rates than human drivers and public transit.

Which human drivers are we talking about? One-third of all traffic fatalities are due to drunk driving. One-third are due to nighttime driving. Rural drivers tend to get in more accidents than city drivers. Most drivers don't drive drunk or at nighttime, or in the countryside. Subtract out those drivers, and the "average" human driver has much better collision rates. How do Cruise and Waymo cars perform compared to these "average" drivers?

Driving a huge bus is more challenging than a passenger car. Cruise and Waymo need to show how well their self-driving buses perform for a more relevant comparison.

    • by m00sh ( 2538182 )

      Their driverless taxis, the companies say, have lower collision rates than human drivers and public transit.

Which human drivers are we talking about? One-third of all traffic fatalities are due to drunk driving. One-third are due to nighttime driving. Rural drivers tend to get in more accidents than city drivers. Most drivers don't drive drunk or at nighttime, or in the countryside. Subtract out those drivers, and the "average" human driver has much better collision rates. How do Cruise and Waymo cars perform compared to these "average" drivers?

      Driving a huge bus is more challenging than a passenger car. Cruise and Waymo need to show how well their self-driving buses perform for a more relevant comparison.

Human-driver accident rates are lower because of a lot of other careful drivers. There are so many horrible drivers who speed, floor the gas pedal on yellow lights, blast through intersections, etc. Every time I go driving, I almost always see a close call and numerous cases of bad driving. Bad accidents were avoided only by other careful drivers' good reactions and anticipation.

      Autonomous vehicles don't come close to that kind of recklessness.

      • Autonomous vehicles don't come close to that kind of recklessness.

They are reckless in a different way - for example, driving through caution tape, hitting a wire and dragging it for a while without noticing that there is a problem.
Or driving right into a concrete lane divider during the day in good weather.

  • Luddites gonna Ludd.

    https://catdir.loc.gov/catdir/samples/random045/2002090323.html

    "The automobile, so sleekly efficient on paper, was in practice a civic menace, belching out exhaust, kicking up storms of dust, becoming hopelessly mired in the most innocuous-looking puddles, tying up horse traffic, and raising an earsplitting cacophony that sent buggy horses fleeing. Incensed local lawmakers responded with monuments to legislative creativity. The laws of at least one town required automobile drivers to sto
  • (posting here to undo mistaken moderation)
  • Driverless cars are 99% better than driverful cars. Also, driverless cars 1% of the time cause 10,000% more headaches than driverful cars.

As Tesla are discovering, it is the last 5-10% of autonomy that is tough. I have a Model Y LR with AP/FSD here in the UK. It's super wonderful on highways, but shows young-human-like behaviour once the vehicle is on a smaller road with random situations possible, from parked cars to people stepping into and then away from the road, and so on. All things that humans can handle with aplomb due to our unique individual experience that, unlike Tesla's averaged collected group-think method, is ONLY based on our i...
