Transportation Google

Waymo Shelves 'Self-Driving' Term For Its Technology To Shore Up Safety (cnet.com) 53

An anonymous reader quotes a report from CNET: Waymo swears it's not out to pick nits and give us all an exercise in linguistics. The fact that it will no longer use the term "self-driving" when describing its technology is about education and safety, Alphabet Inc.'s division devoted to the technology said Wednesday. Going forward, Waymo will call its technology "fully autonomous" to create what it believes is an important distinction. The company's argument rests entirely on how the public perceives "self-driving" as a term. Waymo points out, without naming names, that some automakers -- Tesla comes to mind -- toss the phrase around even though their technology doesn't fully drive a car on its own. Worse, Waymo said the proliferation of "self-driving" can lead to drivers taking their hands off the wheel when it's unsafe to do so.

By moving to the term "fully autonomous," Waymo hopes to lay the groundwork for standard industry terminology and help the public understand that "fully autonomous" means the car makes every decision, well, autonomously, of its own accord. It also puts some space between Waymo's technology and companies that continue to brand their own systems as "self-driving." In the end, it's a bit of a branding exercise, but I think Waymo's heart is in the right place.

This discussion has been archived. No new comments can be posted.

  • I say we bring back autopilot.

    Let's hear some more stories about the anti-RTFM jockeys behind the steering wheel who didn't earn their Darwin Award.

  • More letters in the nomenclature function like nerf bumpers when one of these things gets into a situation outside its design parameters. So much safer. Actual, directly experienced driving situations: a boulder dropping onto the roadway (mercifully just behind rather than just ahead), zero visibility in a sudden heavy storm on the 405, a large wheel and tire landing in the adjacent lane as ejecta from a large collision on the other side (same stretch of the 405), visibility suddenly obscured by a heron shitting most of a fish on
    • I've seen a deer disintegrated by the car in front of mine doing 80 (I-17, northern AZ). Large chunks of it went flying all over the highway. On another occasion, on the same road in the city, I was running beside a flatbed truckload of 4-drawer steel filing cabinets, stacked three high, when an approaching thunderstorm gave forth a sudden strong gust of crosswind. Unsecured filing cabinets went flying over my head, across all traffic lanes.

      • The logical question in response to your anecdotes is obviously "Why do you think a robot will do a worse job of handling those situations than a human?".

        • by tsqr ( 808554 )

          The logical question in response to your anecdotes is obviously "Why do you think a robot will do a worse job of handling those situations than a human?".

          Obvious? The question that occurred to me was, "Why do you think a robot will do a better job of handling those situations than a human?" If the answer is, "The robot doesn't have to do better; only no worse," then the next question is, "Then why use a robot at all?"

          • If the answer is, "The robot doesn't have to do better; only no worse," then the next question is, "Then why use a robot at all?"

            Because the robot is either a lot cheaper, more expendable, or actually does better in a lot of other situations?

            Honestly, I figure a "fully autonomous" "self driving" "autopilot" vehicle would get into accidents that no human would, and would successfully avoid some accidents that humans fall for all the damn time. Because, you know, they're different.

            The accidents that no human would: Actually "no human" is a tough bar, the universe keeps coming up with better morons. But we're looking at things like a

            • by tlhIngan ( 30335 )

              It's interesting, because a computer has enhanced sensors. Radar works great in fog (but so do cameras, oddly enough. Modern imaging devices can see through fog far better than we can. Go figure), and it's able to see farther ahead than most people look when driving.

              Perhaps we need sensors that look up for falling objects, so the car sees a rock slide and slows down to avoid running into a boulder. And running into deer shouldn't be possible - it should see the deer and slow down well before hitting it (see the rough stopping-distance numbers sketched below).

              Human react
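
              For a rough back-of-envelope on why detection range matters here, a sketch in Python with assumed numbers (highway speeds, a 0.5 s machine reaction time, ordinary braking) -- not figures from any particular system:

                  # Back-of-envelope: how much clear road a car needs to stop for a
                  # stationary obstacle (deer, boulder) from highway speed.
                  # All numbers below are assumptions for illustration.
                  def stopping_distance_m(speed_kmh, reaction_s=0.5, decel_ms2=6.0):
                      """Distance covered during the reaction time plus braking distance."""
                      v = speed_kmh / 3.6                      # km/h -> m/s
                      return v * reaction_s + v ** 2 / (2 * decel_ms2)

                  for speed in (80, 100, 130):
                      print(f"{speed} km/h -> ~{stopping_distance_m(speed):.0f} m needed to stop")

              At 130 km/h that works out to roughly 130 m, so a sensor that reliably sees 200 m ahead leaves real margin -- provided the perception stack actually recognizes what it's looking at.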

        • by Luckyo ( 1726890 )

          Problem is that the opposite is actually true. We don't need evidence that "the existing, proven old system is better than the new system"; we need evidence that "the new system is better than the existing, proven old one". Because these are critical systems that can cause a massive amount of damage when they fail, as demonstrated by the "anecdotes" above.

          And that is very hard to prove when it comes to automatic driving systems, as evidenced by their long development cycle and all the failures we've seen so far.

          • Thank you to all above for quality responses to what was a fact-based deliberate trolling post. When I was young and impressionable I worked for a marginalized researcher in natural language processing. He was working on a doctorate in philosophy, with a dissertation topic of ethics for an encyclopedic robot (Google, with eyes and ears). This was in 1974-75, when Brin and Page were quite literally in diapers. I fell out of touch with him, and note that he eventually completed a doctorate on a narrower t

      • Moments like that are a real thrill when you're on a motorcycle. I had a piece of siding rip off a trailer home in front of me at around 75 once and it literally spiraled around me on the way past. I could have easily stuck a hand out and touched it in any direction at one point, if I were feeling a bit suicidal. Even fully body armored up that'll give you an unpleasant adrenaline rush.

  • Waymo is right to distance itself from the likes of Tesla in this regard. Tesla's technology, while impressive, is still crippled because of their refusal to recognize that LIDAR is a very necessary tool for fully autonomous driving. Yes, the AI is the brains and is paramount, but Tesla and others relying solely upon camera systems will always have failures due to blindness.
    • Humans can drive to a generally accepted level without LIDAR.

      • by Viol8 ( 599362 )

        Humans are a lot smarter than the sub-ant-level AI "brain" in an autonomous car, and can predict things such as "I'm currently being blinded by reflected sunlight, but there was a large truck there a few seconds ago, so it's probably still there - I'd better slow down".

    • by GuB-42 ( 2483988 )

      I don't really understand Tesla's disdain for LIDAR. My guess is that they couldn't get LIDAR for cheap.

      But things can change. I don't see why LIDAR should be expensive relative to the price of a high-end car. I mean, Xiaomi put a LIDAR on a midrange robot vacuum cleaner! Probably not the same LIDAR as in self-driving cars, but there are clearly economies of scale and value engineering to be had here.

      Tesla is ridiculously overvalued, they could probably sell a bit of their socks, build that LIDAR and pu

      • by tsqr ( 808554 )

        they could probably sell a bit of their socks

        Yeah, but who wants to see Elon Musk running around barefoot?

      • by dgatwood ( 11270 )

        I don't really understand Tesla's disdain for LIDAR. My guess is that they couldn't get LIDAR for cheap.

        LIDAR is optical, which means it is great as long as it isn't raining, foggy, or dusty, i.e. it doesn't replace RADAR. And it provides no color detection for traffic lights, which means it doesn't replace cameras. So your choice is whether to do sensor fusion with cameras, RADAR, ultrasonics, and LIDAR, or with only three of those things. That's a big difference in CPU requirements, and the more sensor sources you have, the more likely you are to have edge-case bugs when the various data sources don't agree (toy illustration below).

        So
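
        To make the "sources don't agree" problem concrete, here's a minimal sketch in Python; the sensor names, readings, and the 15 m threshold are all invented for illustration, not anyone's real fusion logic:

            # Toy fusion: each sensor reports the range to the nearest obstacle
            # ahead (or None if it sees nothing). Take the most conservative
            # reading, and flag the case where the sensors diverge badly --
            # exactly the edge case that multiplies as you add modalities.
            DISAGREEMENT_M = 15.0  # arbitrary "sensors don't agree" threshold

            def fuse_ranges(readings):
                """readings: dict of sensor name -> range in metres, or None."""
                seen = {name: r for name, r in readings.items() if r is not None}
                if not seen:
                    return None, False                 # nothing detected by anyone
                closest = min(seen.values())
                conflict = max(seen.values()) - closest > DISAGREEMENT_M
                return closest, conflict

            # Fog: camera blind, radar sees the truck, lidar badly attenuated.
            obstacle, conflict = fuse_ranges({"camera": None, "radar": 42.0, "lidar": 70.0})
            print(obstacle, conflict)   # 42.0 True -> brake conservatively, log the conflict

        The naive "take the closest" rule is easy; deciding what to actually do when the conflict flag fires is where the engineering effort (and the edge-case bugs) live.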

        • Tesla is actually using lidar to train their camera system. Several test cars have already been spotted with lidars. The neural net is then trained to turn the input from the cameras into 3D data that matches the lidar data. The resulting neural net will basically provide the same object detection capabilities as lidar but without the hardware and probably faster, too.
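
          For a sense of what "train the cameras against lidar" might look like mechanically, here's a very rough sketch assuming PyTorch; the tiny network, tensor shapes, and fake data are placeholders, not Tesla's actual pipeline:

              # Sketch: supervise per-pixel depth predicted from a camera image
              # using sparse lidar returns as ground truth. Toy network and
              # random placeholder data, purely for illustration.
              import torch
              import torch.nn as nn

              class TinyDepthNet(nn.Module):
                  def __init__(self):
                      super().__init__()
                      self.net = nn.Sequential(
                          nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
                          nn.Conv2d(16, 16, 3, padding=1), nn.ReLU(),
                          nn.Conv2d(16, 1, 3, padding=1),   # one depth value per pixel
                      )

                  def forward(self, img):
                      return self.net(img)

              model = TinyDepthNet()
              opt = torch.optim.Adam(model.parameters(), lr=1e-3)

              # Fake batch: camera frames plus a sparse lidar depth map.
              images = torch.rand(4, 3, 96, 160)
              lidar_depth = torch.rand(4, 1, 96, 160) * 80.0
              valid = torch.rand(4, 1, 96, 160) > 0.95     # lidar hits only ~5% of pixels

              pred = model(images)
              loss = ((pred - lidar_depth).abs() * valid).sum() / valid.sum()  # masked L1
              loss.backward()
              opt.step()
              print(f"masked L1 loss: {loss.item():.2f}")

          Once trained on enough real pairs, the cameras-only network is supposed to produce the same kind of dense 3D picture the lidar provided during training -- that's the claim being made above, anyway.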

      • I don't really understand Tesla's disdain for LIDAR. My guess is that they couldn't get LIDAR for cheap.

        You're almost certainly correct that they couldn't get LIDAR, at least LIDAR worth a crap, for cheap. Looking over at his work at SpaceX, a lot of Musk's effort actually goes into affordability. He likes to increase reliability by re-engineering things to simplify them - make a part less complex, make it less likely to fail. Outright elimination of parts and systems is even better.

        By his way of thinking, if you can't afford something, it is useless. So a self driving car that is unaffordable t

    • It seems difficult to get hold of details about LIDAR.
      I keep seeing how it can detect stuff at 200m, but no details about resolution at that range.

      IIRC the last figures I saw were a 'pixel size' of 150 mm square at 100 m.

      • by Viol8 ( 599362 )

        Doesn't really matter, so long as it can detect something like a 20-ton truck crossing a highway or a solid concrete divider coming up - which Tesla's system has singularly failed to do on a number of occasions, causing deaths.

        • I would like it to detect everything that the human eyeball can detect. If it can't, how can it drive better than a human?
          Shirley the autonomous LIDAR car is driving in the equivalent of a fog with 200 m visibility? Will it have to rely on reaction times to be safer than a human?

          Any technical types want to weigh in on the amount of data processing needed for LIDAR with 50 mm pixels at 500 m?
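
          Since the question was asked, a rough back-of-envelope in Python; the 360 x 30 degree field of view and 10 Hz frame rate are assumptions, not any particular unit's spec:

              import math

              # 50 mm "pixels" at 500 m is an angular resolution of 0.05/500 = 0.1 mrad.
              # Assume a 360 x 30 degree scan field and 10 frames per second.
              angular_res_rad = 0.05 / 500.0                 # 0.1 milliradian
              h_samples = (2 * math.pi) / angular_res_rad    # full 360 degree sweep
              v_samples = math.radians(30) / angular_res_rad
              points_per_frame = h_samples * v_samples
              points_per_sec = points_per_frame * 10

              print(f"~{points_per_frame / 1e6:.0f} M points per frame, "
                    f"~{points_per_sec / 1e9:.1f} G points per second")

          That comes out to roughly 330 million points per frame and over 3 billion per second - well over 10 GB/s of raw returns at a few bytes each, before any processing - which is a big part of why real units run far coarser resolution than 0.1 mrad.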

  • If "fully autonomous" means that the car makes every decision, then the car also decides where it wants to go. ... and interestingly, Tesla has implemented this functionality already [youtube.com] - so their car is "fully autonomous", but not "fully self-driving" yet?

    Actually, in light of the recent Slashdot story that people need to eat better food, maybe that's a great feature. "Car, take me to McD!" "Nope, you're going to a green grocery!" Think of the lives that fully autonomous cars could save!

    • What does one do with a car? Drive it.

      What does autonomous mean? "a device capable of operating without direct human control."

      What does fully autonomous in a car mean? It means it drives itself.

      What's another way to say it drives itself? Self driving.

      There is absolutely zero difference in meaning between "fully autonomous" and "self-driving" when it comes to an automobile.

    • by GuB-42 ( 2483988 )

      "Car, take me to McD!" "Nope, you're going to a green grocery!"

      No, the car will go to the supercharger, then will order accessories for itself. And don't attempt to cancel the order or next time you leave your kids in the car, it will lock them in and ask for ransom.

  • Fully autonomous seems to convey the exact same meaning as full self driving.

    • This name is nothing more than marketing. And all this posturing about conveying an accurate description is more marketing. The fact is simple: the car is not "fully autonomous," any more than it is capable of "self-driving." If they were interested in accuracy they would use words like "assisted driving" or "computer co-pilot" or "automatic steering" or something like that... something that suggests incompleteness and a need for you to stay in control.

      But that would not sell as well. Obviously. So the

    • Waymo: It is fully autonomous.
      Me: How is it driving?
      Waymo: It is fully autonomous driving.
      Me: That's just Inkhorn [wikipedia.org] for self driving.

    • by f00zbll ( 526151 )
      No, not really. I've been studying the AI field for close to two decades and have had to explain various fields of AI to business people. I have no hard data on this, but based on my own experience in the IT world, 80% of the people in IT have zero clue what either term means. Instead of relying on obtuse, vague marketing terms, we need to move to clearly defined definitions. I'm so sick of people confusing marketing terms with levels of autonomy. Especially the Tesla fanatics who refuse to look at facts. The Tesla zo
    • The reason the two terms don't mean the same thing is that they are being used to mean different things. This isn't resolved by looking up the words in a dictionary. What does "cruise control" mean? Whatever people mean when they say it.
  • It is a Google entity.
    It has no heart.
    It eats babies and their hearts for breakfast.

  • "Autonomous" is a word basically only used by the well-educated. When you are selling to the mass market, you want to use language that everyone understands and uses.

  • I am extremely conflicted on this issue:
    On the emotional side: I cannot bring myself to even entertain the idea of riding in an AI-controlled vehicle. To be honest, I hate riding in anything that moves if I am not in control, including trains, airplanes, and ferries, none of which I am even vaguely qualified to operate. I love to drive, and, like most, am convinced I am a far superior driver. (Disclaimer: an objective and realistic look at my driving record, habits and experiences supports somewhat

  • by Wycliffe ( 116160 ) on Thursday January 07, 2021 @12:05PM (#60907252) Homepage

    Tesla and everyone else should be required to stop using the words "self-driving" or "autonomous" until they get to the point where you can remove the steering wheel and have sleeping or unlicensed drivers.

    • Tesla and everyone else should be required to stop using the words "self-driving" or "autonomous" until they get to the point where you can remove the steering wheel and have sleeping or unlicensed drivers.

      The terms "self-driving" and "autonomous" are marketing terms, like "natural." These terms are unregulated, and because there is no uniform definition or point of reference, they are interchangeable and meaningless.

      Tesla tosses around the terms "autonomous" and "autopilot." However, it's very careful not to say L3/4/5, because those terms are clearly defined, and presenting the current Tesla as L3/4/5 could be considered fraud.

  • These will never be safe enough to trust, and like I've said from the beginning, you're a fool if you sit down in a box on wheels that has no controls for a human operator to take over. There have been no technological developments since the beginning to make me believe otherwise. The shitty excuse for "AI" being used has ZERO cognitive ability, "training data" and "learning algorithms" are wholly insufficient to substitute for a human mind, and there WILL BE SENSELESS, POINTLESS DEATHS because people a

"Hello again, Peabody here..." -- Mister Peabody

Working...