AI Transportation

Autonomous Cars Aren't As Smart as They're Cracked Up To Be (computerworld.com) 258

Gill Pratt, executive technical adviser at Toyota, offers a note of caution, even as more car companies start putting AI elements into their cars. Speaking in Tokyo at the announcement of a Silicon Valley AI research center that Toyota is to open in early 2016, Pratt pointed out the big shortcoming in AI systems as applied to automobiles: Autonomous cars might look great in controlled tests or on pristine highways, "but soon fail when faced with tasks that human drivers find simple." From the article: Drivers, for example, can pretty much get behind the wheel of a car and drive it wherever it may be, he said. Autonomous vehicles use GPS and laser imaging sensors to figure out where they are by matching data against a complex map that goes beyond simple roads and includes details down to lane markings. The cars rely on all that data to drive, so they quickly hit problems in areas that haven't been mapped in advance. ... A truly intelligent self-driving car needs artificial intelligence that can figure out where it is even if it has no map or GPS, and manage to navigate highways and follow routes even if there are diversions or changes in lane markings, he said. I regularly drive a stretch of road that's just a few miles long, but between construction, accidents, poor marking, bicycles, and heavy traffic I'd be nervous about letting an AI system navigate. In what real-world driving scenarios would you most want humans to take over?
  • You'll look up where you'll be using the car to see if it's covered. I think for 85% of the folks it would work fine: commuting the same paths to work, to the stores, to friends' and family's houses, etc. There would also be plenty of cases where an autonomous car wouldn't work well. [sarcasm]What?! A solution doesn't work for everyone?![/sarcasm] :P
    • I think drivers in really jammed-up, extremely non-adventurous city traffic would appreciate auto driving IF every car was equipped with it. Because then that invisible bottleneck where car A moves first, then car B, then car C wouldn't happen. Cars A, B, and C would stop and move at the same time. Other than that I don't want one. In rural VA it's really fun to drive, plus or minus a few insane drivers.
  • by LetterRip ( 30937 ) on Saturday November 07, 2015 @08:39AM (#50882655)

    You don't have to have the car drive everywhere, 95% of the places you drive will probably have all of the factors needed for the car to navigate easily. Just don't have the car drive in areas where it can readily get in trouble.

    You don't start teens off in ambiguous hard to drive conditions, but rather low traffic side streets or empty parking lots, etc.

    We don't need self-driving cars that are perfect from the start, merely good enough to drive us most places most of the time, without accidents in the areas that are suitable for them to drive.

    • Re: (Score:2, Insightful)

      by Anonymous Coward

      Don't know where you live, but here they change lanes around, close lanes, and reroute traffic on a nearly daily basis. It's been non-stop for over 10 years straight. If self-driving cars depend on GPS, that means I can't even use one just to get to work, never mind going somewhere else.

      What happens when they close part of the city for a parade and a lot of streets are closed? Your car just stops and waits the 4 hours for it to pass before continuing? Sure you could take over, but if your dru

      • by TWX ( 665546 ) on Saturday November 07, 2015 @10:54AM (#50883209)

        Don't know where you live, but here they change lanes around, close lanes, and reroute traffic on a nearly daily basis. It's been non-stop for over 10 years straight. If self-driving cars depend on GPS, that means I can't even use one just to get to work, never mind going somewhere else.

        What happens when they close part of the city for a parade and a lot of streets are closed? Your car just stops and waits the 4 hours for it to pass before continuing? Sure, you could take over, but if you're drunk and you do that, you've committed a DUI, and avoiding that is the reason you bought a self-driving car in the first place.

        I expect that in addition to cars being able to self-determine routes and find barriers there will need to be intelligent barriers that the cars can detect and follow the instructions of. These kinds of barriers would be used by construction crews, emergency responders, and perhaps even as a function of the four-way hazards when a car is stopped on the side of the road. Call it a more precise means for the autonomous car to determine what it should do or what the expectation is in a complex situation.

        Just as an example, in long-term highway construction projects it's not uncommon to take a two-lane, single-direction stretch of Interstate and route both directions on it, one going the natural way, the other driving what would normally be opposed, while the other two-lane stretch is being worked on. In cases like this there needs to be a way for the construction barriers themselves to notify the vehicles both that something has overridden the expected behavior, and that this particular path is the override. The car will in turn have to account for this deviation in the path and know that it's not actually trying to go the wrong way, even though its default programming would say that it is, and it would have to understand that while one lane is no longer the wrong way, the other lane still is, and not try to use it.

        Other construction-related examples include the ability to follow a pilot car and the ability to pay attention to flag-men. The flag-man method is a variation of the one-lane bridge in many cases, with the addition of a very spontaneous control (i.e., the switch from slow to stop and stop to slow comes without warning from the flag-man himself, so the vehicle must pay attention to the flow of traffic in addition to somehow reading the sign or receiving a signal from it), and the nature of pilot cars means that there has to be some means for cars to be subordinate to other vehicles, which leads into the next example...

        ...emergency responders. Cars will need to respect things like fire trucks blocking the road, or police cars blocking the road, or tow trucks blocking the road, or any other sort of obstruction that will be present for a while and indicates that it isn't safe to be within a certain area. Cars may also have to react to barriers placed by these responders, and it may make sense for those barriers to have some kind of component that lets them broadcast more intelligently, so that the cars don't have to figure out what they are visually. Obviously if the police are attempting to close a stretch of road due to an accident investigation, they want to keep cars out of that area so that the evidence is not disturbed. If firefighters are working on a structure fire, they need to keep cars out of the immediate staging area and from driving down the road that the firemen may be crossing regularly without notice. They also need to keep a wide berth when a tow truck driver is working with a disabled vehicle, wherever that vehicle is disabled and whatever is wrong (i.e., the difference between an overturned vehicle on the highway, a stalled vehicle on the highway, and a stalled vehicle on the median or shoulder). These are all complex situations that happen all of the time, and cars need to be able to handle them.

        I think the first application for autonomous cars will be open-highway dr
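The "intelligent barrier" idea above might broadcast something as simple as an expiring override message. A minimal illustration, where every field name is hypothetical and no real protocol or standard is implied:

```python
# Hypothetical beacon message a "smart barrier" might broadcast to nearby
# cars in a contraflow construction zone. Field names are illustrative;
# no real protocol or standard is implied.
import json

beacon = {
    "source": "construction_barrier",       # could also be police, fire, tow
    "action": "contraflow",                 # normal lane direction overridden
    "permitted_lane": "left",               # the one lane cars may use
    "forbidden_lane": "right",              # still the wrong way; do not use
    "valid_until": "2015-11-08T06:00:00Z",  # an override should expire
}

# Broadcast as a compact payload; a receiving car decodes and sanity-checks it.
payload = json.dumps(beacon)
decoded = json.loads(payload)
assert decoded["permitted_lane"] != decoded["forbidden_lane"]
print(decoded["action"])  # contraflow
```

The expiry field matters: a barrier that keeps broadcasting an override after the crew packs up would be worse than no barrier at all.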

      • Don't know where you live, but here they change lanes around, close lanes, and reroute traffic on a nearly daily basis. It's been non-stop for over 10 years straight.

        Hi Seattle!

    • The safe bet is to only use an autonomous car in the laboratory where it was designed. Just like a lot of software, it only really works in clean room, optimized conditions with hand picked data.
    • You don't start teens off in ambiguous hard to drive conditions, but rather low traffic side streets or empty parking lots, etc.

      We don't need self driving cars that are perfect from the start, merely good enough to drive us most places most of the time, and do not have accidents in the areas that are suitable for it to drive.

      This might be true if we lived in a "rational" society. We don't. We live in a society whose concerns are driven by media hype. These days, the media seems to be on the side of self-driving vehicles because they seem really cool and awesome as a concept.

      But the media is fickle and could change its mind the moment something more sensational happens.

      We've already seen issues with idiots using the Tesla "autopilot" feature in ways it wasn't intended, and the company is starting to rein in its use [csmonitor.com] to pre

  • Two camps (Score:2, Insightful)

    There seem to be two camps of people. Those that think we will be living on mars and have fully autonomous cars in a couple of years, and those that actually look into it and see how hard it is going to be. For some reason, the media seems to prefer the first one. Reality prefers the second one.

    • There seem to be two camps of people. Those that think we will be living on mars and have fully autonomous cars in a couple of years, and those that actually look into it and see how hard it is going to be. For some reason, the media seems to prefer the first one. Reality prefers the second one.

      I'm guessing you're either a taxi driver or a martian. Reality prefers the first one, but boy would life be more exciting if it was the second.

    • Bullshit. No one I know thinks we will be living on Mars in 5, 10, or even 15 years.

      You are a cynical pessimist that is unaware of the state of current technology and the deep need for it.

      Yes, if the choice was between a sober, experienced, 30 year old driver familiar with the roads and a self-driving car, no self-driving cars would ever be created. That is NOT the market for them.

      The market for self driving cars will start out being wealthy parents with kids that have a history of drinking alcohol,

  • by paradxum ( 67051 ) on Saturday November 07, 2015 @08:59AM (#50882737)
    "A truly intelligent self-driving car needs artificial intelligence that can figure out where it is even if it has no map or GPS, and manage to navigate highways and follow routes even if there are diversions or changing in lane markings, he said." - from tfa

    Frankly, this is BS... I drive a large portion of my day for work (not a trucker; an IT guy going to clients). I run into "diversions or changes in lane markings" and have to stop and think about what to do at times too! Why should an AI have to understand the intentions of a road worker/civil engineer better than we do before it can be accepted as intelligent?

    " that can figure out where it is even if it has no map or GPS" ... OK, I'm going to drop you off in the middle of Kentucky mountain area with no GPS and no map, leave you stranded with noone to talk to and you should just magically know where you are.... sorry but NO. Unless I had been there before (i.e. prior knowledge or.... mapping) I will have no clue where I am and will have to basically start driving in one direction (which these cars can do) until I figure out where I am.

    I know the media hypes this up, but he's going the other way and just being all doom and gloom.
    • Frankly, this is BS... I drive a large portion of my day for work (not a trucker; an IT guy going to clients). I run into "diversions or changes in lane markings" and have to stop and think about what to do at times too! Why should an AI have to understand the intentions of a road worker/civil engineer better than we do before it can be accepted as intelligent?

      I'm not seeing anyone in TFA saying it'd have to be better at it than us; just being able to do it at all would be a good start. As things stand, autonomous cars are nowhere near capable of doing that on their own.

      I will have no clue where I am and will have to basically start driving in one direction (which these cars can do) until I figure out where I am.

      No, they can't. That's the whole point here: as long as they rely on GPS and very detailed mappings for navigation they won't be able to do that -- they need to know where they are to be able to start driving at all. The author wasn't saying the car should be able to magically, instantly know where it is even when no mappings or GPS were available, just that the car should still be able to try and figure it out -- quite possibly doing the exact thing you suggested and trying to find a road sign or two. The issue here is that these cars won't even know how to get off the god damn parking lot without GPS and mappings, let alone go out and figure out their surroundings on their own without some very extensive AI.

      • The issue here is that these cars won't even know how to get off the god damn parking lot without GPS and mappings

        That's a good example

    • Frankly, this is BS... I drive a large portion of my day for work (not a trucker; an IT guy going to clients). I run into "diversions or changes in lane markings" and have to stop and think about what to do at times too! Why should an AI have to understand the intentions of a road worker/civil engineer better than we do before it can be accepted as intelligent?

      As long as it is feasible and SAFE for it "to stop and think about what to do" in these situations, that's fine. When you're on a highway traveling in a pack of bumper-to-bumper traffic at 60mph+ between concrete barriers on both sides in a construction zone and the lane changes and signs come suddenly, I don't think just stopping in the middle of the road seems like a good idea.

      Almost every time I travel any significant distance on highways, I end up driving through such construction zones (including im

      • by RR ( 64484 )

        Frankly, this is BS... I drive a large portion of my day for work (not a trucker; an IT guy going to clients). I run into "diversions or changes in lane markings" and have to stop and think about what to do at times too! Why should an AI have to understand the intentions of a road worker/civil engineer better than we do before it can be accepted as intelligent?

        As long as it is feasible and SAFE for it "to stop and think about what to do" in these situations, that's fine. When you're on a highway traveling in a pack of bumper-to-bumper traffic at 60mph+ between concrete barriers on both sides in a construction zone and the lane changes and signs come suddenly, I don't think just stopping in the middle of the road seems like a good idea.

        What are you talking about? If you are in a construction zone and the lane changes suddenly, I think you will find yourself in a traffic jam. For that matter, I don’t think you will be driving “60mph+” in a construction zone; I think you will already be in a traffic jam.

        • by vovin ( 12759 )

          Regardless. If the cars in front know where to go ... the route is by definition mapped and the data shared into the pool of knowledge about the route change and how to handle it.
          Done and Done.

    • " that can figure out where it is even if it has no map or GPS" ... OK, I'm going to drop you off in the middle of Kentucky mountain area with no GPS and no map, leave you stranded with noone to talk to and you should just magically know where you are.... sorry but NO. Unless I had been there before (i.e. prior knowledge or.... mapping) I will have no clue where I am and will have to basically start driving in one direction (which these cars can do) until I figure out where I am.

      Note that a self-driven car will almost always have GPS and know its exact location. It will just sometimes not have an accurate map of the area directly around it. Both a self-driven car and a car driven by me will proceed to the nearest road, then make a guess which direction to turn. The difference is that the self-driven car will always know where it is and what direction it is going. It can't get lost. If it returns to a place where it was before, it can take that into account.

    • by sl149q ( 1537343 )

      If you have used Google Maps / Waze / Apple Maps recently you will have noticed that they do a pretty good job of showing congestion for your route in real time.

      For any obstruction on the highway, the FIRST car may have to figure something out; the SECOND car will simply have an updated "map" saying that there is an obstruction: use the left lane and pay attention for a flagger.
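A minimal sketch of that first-car/second-car idea. All class and field names here are hypothetical, and real services like Waze use far richer models than this:

```python
# Hypothetical crowd-sourced pool of obstruction reports: the FIRST car to
# work out an obstruction publishes what it learned; every LATER car gets
# the updated "map" before it arrives. All names here are illustrative.
from dataclasses import dataclass, field

@dataclass
class ObstructionReport:
    road_segment: str   # identifier for the affected stretch of road
    advice: str         # e.g. "use the left lane, watch for a flagger"

@dataclass
class SharedMap:
    reports: dict = field(default_factory=dict)

    def publish(self, report: ObstructionReport) -> None:
        # First car to encounter the obstruction shares its solution.
        self.reports[report.road_segment] = report

    def lookup(self, road_segment: str):
        # Later cars check the pool before reaching the segment.
        return self.reports.get(road_segment)

pool = SharedMap()
pool.publish(ObstructionReport("I-90 exit 4", "use the left lane, watch for a flagger"))
hit = pool.lookup("I-90 exit 4")
print(hit.advice)                      # use the left lane, watch for a flagger
print(pool.lookup("Main St") is None)  # True: unreported segments stay unknown
```

The hard part, of course, is the first car: someone still has to "figure something out" before there is anything to share.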

  • by monkeyxpress ( 4016725 ) on Saturday November 07, 2015 @09:01AM (#50882741)

    Translation: Toyota is woefully behind in autonomous car development, and rather worried about it.

    The FUD begins.

  • by NostalgiaForInfinity ( 4001831 ) on Saturday November 07, 2015 @09:02AM (#50882745)

    The cars rely on all that data to drive, so they quickly hit problems in areas that haven't been mapped in advance.

    I don't see that as a problem. If it works on most of the roads people drive every day, that's good enough. As with all automation, we automate routine tasks and let humans do the rest.

    A truly intelligent self-driving car needs

    But we don't need "truly intelligent self-driving cars" for self-driving cars to be very useful any more than we need "truly intelligent factory robots" for factory automation to be very useful.

    I regularly drive a stretch of road that's just a few miles long, but between construction, accidents, poor marking, bicycles, and heavy traffic I'd be nervous about letting an AI system navigate.

    Well, then don't. In fact, your AI driver would probably simply avoid that route altogether precisely for those reasons and still get you to your destination safely and efficiently. Nobody says that an automated driver needs to take the same route as you do; after all, bikes, motorcycles, buses and light rail probably don't either.

  • by hey! ( 33014 ) on Saturday November 07, 2015 @09:05AM (#50882757) Homepage Journal

    Well, neither are human drivers.

  • by sjbe ( 173966 ) on Saturday November 07, 2015 @09:09AM (#50882767)

    The biggest problems aren't actually going to be point-to-point navigation or even obstacle avoidance, though those aren't trivial. Navigation when you know the destination is a solved problem, and we've got a pretty good idea how to handle obstacle avoidance and terrain following, though there is progress to be made.

    Possibly the hardest problem to solve if you want completely autonomous cars will be navigation in the last quarter mile, and for destinations where you aren't actually sure exactly where you are going. This is a human-interface problem, and those are always challenging. In those circumstances it is REALLY hard to instruct a computer efficiently without actually taking the controls yourself. For example, how do you explain to the computer that you want the parking space 3 places over, but you want to back in? Or that you don't want to block in the car, so park next to it on the lawn? Sounds easy, but it really isn't - not yet anyway. Humans can do it mostly competently, but we don't have any computer that is anywhere close to human-level processing of verbal commands. Stuff like parking lots will be surprisingly hard to automate in a way that will be pleasing to most people. There are solutions, but they are going to take a long time and require a lot of infrastructure. Probably several decades away at minimum. Sort of how we had autopilot for planes many years before we had the ability to do autonomous takeoffs and landings. (And the aviation problem is arguably easier, as it has fewer variables.)

    I think we will see semi-autonomous systems relatively soon, particularly for stuff like highway driving. But I think driver controls are going to remain for quite some time, because steering into that parking space or instructing the car to back up to the front door is actually pretty hard to do well. What will happen is that you'll program in your destination, the car will take you close to where you want to go, and then you'll probably drive the last little bit yourself in a lot of cases. I think this piece of navigation will be solved last, if at all.

    • I've been thinking about that problem too. A good example would be unstructured situations, such as spill-over event parking on lawns or gravel lots. Given existing cars already there, certain pattern-formation algorithms can be applied, such as continuing this or that line of cars. Otherwise things get even trickier, and some spatial user interface will have to let the passenger point out in an overhead view where exactly to place the car and in what orientation. None of these are unsolvable problems per

    • The real killer for adoption is going to be the couple minutes you have to spend punching in a destination before you get going. We are so used to just getting in and going where we want that such delays will be very frustrating for short trips.

      Worse will be the times where you know where you are headed but don't really know the address or proper name, such as that Italian place downtown, you know, the one with the good meatballs. Or the soccer field just past the railroad tracks. Judging by how awful som

  • "Autonomous Cars Aren't As Smart as They're Cracked Up To Be"

    Let me be the first to say, "No shit."

  • No, autonomous automobiles.

  • How much fog/rain/snow/smoke does it take to degrade the sensing level?

  • ... Autonomous cars might look great in controlled tests or on pristine highways, "but soon fail when faced with tasks that human drivers find simple." ...

    I want to see those so-called self-driving cars navigate a New England winter, or the pothole-filled roads that occur after said New England winter.

    • by sl149q ( 1537343 )

      So you live in the 20% of the 80:20 rule. We'll solve for the 80% for great benefit. And let the 20% hang out to flap in the wind. It simply doesn't matter that we can't drive there. We probably don't want to drive there. And you can be like the Amish driving around in their horse drawn carriages.

  • Fact is, roads and road markings aren't supposed to just pop up out of nowhere. Here in Norway every public road (and many private roads, pedestrian/bike roads, forest roads closed to general traffic, and so on) is mapped out in NVDB (Norwegian Road Database), and it's supposed to be the authoritative guide on speed limits, road signs, pedestrian crossings, speed bumps, bridges, tunnels, road classification including lane types and weight restrictions, railing and so on. This is all public data, I'm looking at i

  • by BenJeremy ( 181303 ) on Saturday November 07, 2015 @10:01AM (#50883023)

    Toyota researcher finds autonomous driving technology is hard to do, beyond the autonomous accelerator pedal.

    On a side note, this stuff has been worked on for ages. I worked with a company in 2000, doing image recognition for lane departure warning systems and other subsystems that are currently in use today. The technology is there, but not all companies are happy that many of those technologies are tied to patents and would rather be able to use in-house sources. Developing those sources now is a bit late in the game.

    In 5 years, comments like Pratt's will be completely laughable. The only reason he's taken remotely seriously now is because the technology isn't ubiquitous yet. Consumers do not have serious experience with autonomous driving, so his FUD is accepted at face value. In reality, he's just faced with a tremendous uphill battle to catch his company up in the game, and it's overwhelmed Toyota to the point that they are sowing caution among the masses, mostly in the hope of catching a breather in the court of public opinion.

  • Everyday driving is filled with problems that would be intractable for a computer. Anyone who wasn't drinking the self-drive koolaid would realise this in a second.

    The best chance self-drive has is on closed loops, e.g. airport terminal transfers, where vehicles can drive separately from other traffic in mostly predictable conditions. Even there, there'll probably be some guy in a booth whose job it is to take over if the car gets stuck, confused, or breaks down.

    On the public roads it would be better f

    • Everyday driving is filled with problems that are intractable for humans. Computers will be better at some things, people at others. How about using the strengths of both?

  • by vtcodger ( 957785 ) on Saturday November 07, 2015 @10:17AM (#50883075)

    Just a few of the many things I've encountered in 60 years of driving that are going to be a problem for computers.

    1. GPS? My wife and I bought a new GPS on sale at a local mall a few years ago. First thing we did when we got in the car was to program the thing to take us home. We hit GO. It thought a while and then told us that home was 2700 odd miles away and that the trip might take a while. Guess what? GPSen don't work in parking garages. It apparently thought it was still in Sunnyvale where last it was turned off, and it was contemplating a trip across the continent.

    2. A couple of days ago I was using that same GPS to navigate through a rural area in Vermont. Seeking the shortest route, it put me on a (dirt) road that ran about a half mile, turned a corner, and ended in someone's barn. Care to try your hand at a program to recognize and deal with that situation?

    3. Many years ago while traveling up the (dirt) road to an obscure National Monument out West, I came around a corner and found myself in a large herd of sheep. Couldn't see the road. Or the ditches. Or anything but sheep. What now Kit?

    Not that cars a few decades from now won't be able to deal with thousands of situations like that. But it'll take a while I think.

  • Answer (Score:2, Insightful)

    by Anonymous Coward
    So nobody is answering the last question in the post - about when would you want to take over. I say never. The whole point of a self driving car is so that I don't have to watch the road, don't have to pay attention, and can sleep, read, or work while the device drives me like a taxi would. If it can't do that, it is less than worthless. Less than worthless because why would I want to pay for a fancy AI in my car if I have to keep my hands free, near the wheel, and my eyes on the road? I do that NOW. If I
  • by Dereck1701 ( 1922824 ) on Saturday November 07, 2015 @10:30AM (#50883109)

    "Autonomous cars might look great in controlled tests or on pristine highways"

    Hasn't Google been testing out their cars in the real world? And if the wiki article is right they've driven over a million miles and only had 14 minor traffic accidents, none of which were the fault of the autonomous system (at least according to Google). If that is true and if my math is correct that puts their accidents per mile ratio at about 1 / 71,400. Again if my math is correct your average human vehicle experiences accidents at a rate of 1 / 66,700. Suggesting Google's autonomous vehicle is safer. Admittedly there are probably limitations: letting one drive in torrential rain or on snow/ice-covered roads may result in far less advantageous statistics, the roads do have to be pre-mapped, and there are almost certainly situations they can't handle. But most of those situations go for any vehicle/driver. I've driven in a variety of terrible weather and I've never been in an accident that was my fault; I have siblings who have been in a half dozen accidents, most of which were in good weather. Most humans generally do well when encountering road work areas; I've seen others driving in oncoming lanes because they failed to notice the gigantic signs pointing them somewhere else. Some people are going to be safer drivers than these autonomous vehicles; some people should be encouraged to let the vehicle drive instead.
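A quick check of the arithmetic above, using the commenter's own figures. The 1-in-66,700 human baseline is the commenter's number, not an official statistic:

```python
# Verify the accident-rate arithmetic from the comment above.
# Figures are the commenter's, not official statistics.
google_miles = 1_000_000      # miles driven by Google's cars (per the comment)
google_accidents = 14         # minor accidents reported

human_miles_per_accident = 66_700   # commenter's figure for average drivers

google_miles_per_accident = google_miles / google_accidents
print(round(google_miles_per_accident))   # 71429, i.e. roughly 1 / 71,400

# More miles between accidents means safer by this (very rough) metric.
print(google_miles_per_accident > human_miles_per_accident)  # True
```

So the quoted 1 / 71,400 figure is consistent with the raw numbers, with the usual caveat that 14 accidents is a small sample.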

    • by Kjella ( 173770 )

      Hasn't Google been testing out their cars in the real world? And if the wiki article is right they've driven over a million miles and only had 14 minor traffic accidents, none of which were the fault of the autonomous system (at least according to Google). If that is true and if my math is correct that puts their accidents per mile ratio at about 1 / 71,400. Again if my math is correct your average human vehicle experiences accidents at a rate of 1 / 66,700. Suggesting Google's autonomous vehicle is safer.

      A much more interesting statistic that Google hasn't released is how often their engineers have either intervened or pre-emptively taken over control in order to avoid a potentially bad situation. Not to oversimplify what Google has done, but if I think of my commute to work and map it out to a computer with relatively simple rules like here's the lanes, there's an intersection and there's the light, there's a crossing that you don't pass until it's clear I would say at least 95/100 times it'd get by on ver

    • Part of Google's testing involved lending out Beta cars to employees. Sure enough, they whipped out laptops, went to sleep, and otherwise were in no position to take over if HAL gave up and handed over control. These were well-educated folks who knew they were in Beta cars (and who should have been fired for such negligence). So as far as the general public goes, we can expect zero backup for the system from the human inside. So the system needs to be truly autonomous in every sense before it gets release

  • by mark-t ( 151149 ) <markt&nerdflat,com> on Saturday November 07, 2015 @11:22AM (#50883401) Journal

    .... is not to literally drive the car, it is to prevent accidents.

    Any autonomy that an automobile might appear to exhibit should be seen as a side effect of that goal, and not a direct manifestation of intent.

  • by mbeckman ( 645148 ) on Saturday November 07, 2015 @11:22AM (#50883409)
    The first step to an intelligent debate on autonomous cars is to eliminate the phrase "artificial intelligence" from the discussion. Autonomous cars are just that: cars that navigate roads without human intervention. They are not intelligent, artificially or otherwise, any more than a 1940s autopilot in a Beechcraft D-18 is.

    "Autonomous" is the perfect adjective, because these cars are automatons, not conscious, thinking beings. Because we have only the foggiest definition of "intelligence", we are in no position to create an artificial one. If someday we do have that knowledge, what will we call artificial intelligence when we actually make one? That'll be a problem if we sully the term today with myth and superstition.
  • by kheldan ( 1460303 ) on Saturday November 07, 2015 @12:01PM (#50883565) Journal
    We can't even, after decades of trying, create an 'artificial intelligence' that can pass the Turing Test, and that's just text on a screen. What makes any of you so sure that 'autonomous cars' were ever so close to being a reality? Even then, as I've said in the past and will keep saying, there's always going to be a full set of manual controls, by law, and you'll always still be required, by law, to be fully educated, trained, tested, licensed, and insured in order to be behind the wheel of any vehicle, regardless of any so-called 'self-driving' feature it might have, because when all is said and done, when human safety and lives are at stake, a human being must be the final 'backup system'. Furthermore, since we all know that any skill that isn't used often tends to atrophy, you'll likely be required to be re-tested by the government more often than you are right now, to ensure that you're still competent to operate a motor vehicle. So get over it: You're still going to be driving yourselves around for a good long time to come, probably the rest of your lives, or at least until you're too old to be a competent operator of a motor vehicle.

    Now, then, for all of you with all your complaints about 'other drivers' being so bad: Hush up already, you're probably at least as bad as the ones you're complaining about. That being said, what we need to do in this country is to improve driver training and education, and tighten up testing procedures and frequency to improve the overall competence of drivers on the roads, and exclude the ones who can't (or won't) show an acceptable and consistent level of competence. This should include tougher and longer-lasting penalties for individuals convicted of DUI. Furthermore any use of any kind of any mobile wireless device while driving should be strictly prohibited and punished severely; I think a six-month suspension of driving privilege with a hefty fine should be sufficient.

    Meanwhile, auto industry, please do continue to develop and produce collision-avoidance systems that warn the driver when they're screwing up.
    • We can't even, after decades of trying, create an 'artificial intelligence' that can pass the Turing Test, and that's just text on a screen. What makes any of you so sure that 'autonomous cars' were ever so close to being a reality?

      Because those are two wildly different problems?

      • No, see, they're not: If you can't even create software that can think enough like a human being to have a casual conversation with someone over text on a screen, which is a relatively simple thing with relatively few variables, then how the hell do you expect to handle something as complex and dynamic as operating a vehicle in the real world, where anything can happen at any time, completely at random? You're expecting us, in a few years, to come up with the equivalent of the human brain, that took millions o
  • I'm sure these cars will do great around the Magic Roundabout: http://basementgeographer.com/... [basementgeographer.com]
