Transportation AI Technology

When Autonomous Cars Teach Themselves To Drive Better Than Humans (ieee.org) 86

schwit1 shares a report from IEEE Spectrum, written by Evan Ackerman: A few weeks ago, the CTO of Cruise tweeted an example of one of their AVs demonstrating a safety behavior where it moves over to make room for a cyclist. What's interesting about this behavior, though, is that the AV does this for cyclists approaching rapidly from behind the vehicle, something a human is far less likely to notice, much less react to. A neat trick -- but what does it mean, and what's next? In the video [here], as the cyclist approaches from the rear right side at a pretty good clip, you can see the autonomous vehicle pull to the left a little bit, increasing the amount of space that the cyclist can use to pass on the right.

One important question that we're not really going to tackle here is whether this is even a good idea in the first place, since (as a cyclist) I'd personally prefer that cars be predictable rather than sometimes doing weirdly nice things that I might not be prepared for. But that's one of the things that makes cyclists tricky: we're unpredictable. And for AVs, dealing with unpredictable things is notoriously problematic. Cruise's approach to this, explains Rashed Haq, VP of Robotics at Cruise, is to try to give their autonomous system some idea of how unpredictable cyclists can be, and then plan its actions accordingly. Cruise has collected millions of miles of real-world data from its sensorized vehicles that include cyclists doing all sorts of things. And their system has built up a model of how certain it can be that when it sees a cyclist, it can accurately predict what that cyclist is going to do next.

Essentially, based on its understanding of the unpredictability of cyclists, the Cruise AV determined that the probability of a safe interaction is improved when it gives cyclists more space, so that's what it tries to do whenever possible. This behavior illustrates some of the critical differences between autonomous and human-driven vehicles. Humans drive around with relatively limited situational awareness and deal with things like uncertainty primarily on a subconscious level. AVs, on the other hand, are constantly predicting the future in very explicit ways. Humans tend to have the edge when something unusual happens, because we're able to instantly apply a lifetime's worth of common-sense knowledge about the world to our decision-making process. Meanwhile, AVs are always considering the safest next course of action across the entire space that they're able to predict.
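The planning idea described above can be sketched very roughly in code. This is a toy illustration, not Cruise's actual planner; the function names, distances, Gaussian assumption, and the predictability penalty are all mine, but it captures the qualitative behavior: the noisier the prediction of the cyclist's position, the more room the car gives.

```python
# Toy sketch of uncertainty-aware lateral planning (illustrative only).
import math

def clearance_probability(car_offset_m, cyclist_mean_m, cyclist_std_m,
                          safe_gap_m=1.0):
    """P(lateral gap >= safe_gap), modeling the cyclist's lateral position
    as Gaussian -- a common simplification in trajectory prediction."""
    gap_mean = cyclist_mean_m - car_offset_m  # moving left (negative offset) widens the gap
    z = (safe_gap_m - gap_mean) / cyclist_std_m
    return 0.5 * math.erfc(z / math.sqrt(2.0))  # P(gap >= safe_gap)

def best_offset(candidates_m, cyclist_mean_m, cyclist_std_m, lam=0.3):
    """Score each candidate offset: clearance probability minus a penalty
    for deviating from the lane center (the 'predictability' concern the
    summary's author raises)."""
    return max(candidates_m,
               key=lambda x: clearance_probability(x, cyclist_mean_m,
                                                   cyclist_std_m) - lam * abs(x))
```

With a confident prediction (std 0.4 m) this sketch nudges over only slightly; with a noisier one (std 0.8 m) it chooses a larger offset — more uncertainty about the cyclist buys the cyclist more space, which is the behavior the article attributes to the Cruise AV.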

  • Cyclist ... (Score:2, Troll)

    by PPH ( 736903 )

    ... passing on the right has a death wish. AI needs to call the coroner.

    • You mean in the bike lane?

      • Re: Cyclist ... (Score:4, Informative)

        by simlox ( 6576120 ) on Friday May 07, 2021 @01:38AM (#61357502)
        There was no bike lane in the video. And riding fast in the bike lane is very dangerous because it is very unexpected for others, especially cars turning right. I set my personal speed limit to 30 km/h in bike lanes; when I am going faster I use the road. Depending, of course, on the road and bike lane...
        • In some jurisdictions you can't use the road if there is a bike lane, e.g. the Netherlands and my native Germany. I live in the UK now and here I'm free to use the road even if there is a cyclepath (many of which are terrible [theguardian.com]).

          I sometimes find myself ignoring "random acts of kindness" by car drivers who have the right of way yet decide to wait for me to let me pass. This leads to a stupid Mexican-standoff wait-a-thon, but there is a fundamental lack of trust on my side; I'd rather stick to right-of-way.

        • The only time I've hit a car was when in the bike lane traveling faster than the cars, one car cut across the bike lane to park. I slid along his car from rear to front breaking off the side rearview mirror (which he didn't bother to use anyway) and ending up in the gutter one parking spot further along.
    • It's very simple to do. You look up the road 200 feet and determine if there is a car coming. If there is not one, you pass. If there is one, you stay behind the bike until there is not one.
  • and when the system can't ID one & Elaine Herzberg dies?

    • uhh.... should I call you a Bondulance? [kym-cdn.com]

    • by MrL0G1C ( 867445 ) on Friday May 07, 2021 @05:44AM (#61357904) Journal

      In fairness, that Uber system knew full well that she was there, but Uber had turned off the safety systems that would have stopped the car in ample time, and the driver was watching video on her gadget. The fault likely lies both with Uber, for not getting adequate confirmation that the driver knew the car wouldn't stop for anything, and with the driver, for not paying attention to that and, subsequently, to the road.

  • by Anonymous Coward

    But that's one of the things that makes cyclists tricky: we're unpredictable.

    Unpredictable? Ever ask yourself why? Yes, I'm serious.

    Every other cyclist has to abide by all the same laws that you do when it comes to operating a bicycle on a public road, especially one that you share with deadly automobiles.

    The problem I have with this statement is that while cyclists are unpredictable, a GOOD cyclist should not be. Based on rules and visual signals, they should be anything but unpredictable. Again, dedicated lanes and rules make them more predictable. Kind of lik

    • by felixrising ( 1135205 ) on Thursday May 06, 2021 @10:59PM (#61357200)
      Let's face it, all people are a bit unpredictable. You can't just say cyclists. https://youtu.be/dX-bcedKy2Y [youtu.be] Get my point?!
    • by jezwel ( 2451108 )
      100% agreement with you. Cyclists breaking road rules for their own convenience is why I've nearly cleaned a few up - crossing a red light when I've got a green and very nearly T-boning them is the worst so far.

      I don't mind when cyclists switch between road and footpath (which is legal where I live) as that's something you can predict and make allowance for. If it happens then you're ready at least.

      • by MrL0G1C ( 867445 )

        Filtering path-side of vehicles is perfectly legal in the UK and safe if done right (when traffic is crawling along). Footpath cycling isn't legal here unless there are signs saying otherwise.

    • by HiThere ( 15173 )

      Actually, cyclists ARE more unpredictable than cars, though not as much so as skateboarders. Motorcycles are a lot more predictable than bicycles, though when they have a problem it's a lot worse. Two wheeled vehicles (current designs) are inherently unstable, and small variations in control can cause large changes in action. A small rock in the road can send a bicycle in a rapid change of direction that needs to be rapidly corrected. Etc. I once had my rear hub strip its gears while I was riding, and

  • by f00zbll ( 526151 ) on Thursday May 06, 2021 @10:01PM (#61357118)
    Unless it's a red light, passing on the right is a terrible idea. I've seen cyclists do that and I think WTF dude! But stupidity isn't isolated to cyclists, it's a human condition. I've lost count of shitty drivers doing dangerous shit to me and my friends. If I notice a cyclist in the side mirror, I will try to make room or be cautious. There have also been times when I didn't see a cyclist due to turns, trees, bushes and other visual obstacles.
    • It's a trade-off. Approaching the median to pass on the left is its own danger, especially at intersections and stop signs.

    • by cusco ( 717999 )

      Wholly carp, you're going to go out into the middle of the street and assume that the driver sees you coming and that every other driver notices you, then cross back in front of them assuming that they're not going to speed up unexpectedly. How the frack are you still alive?

      The first thing that I was taught when I was old enough to ride in the street is "Assume you are invisible".

      • by Kokuyo ( 549451 )

        Funny how "Hey, traffic may be slow but it's still moving... I'll just not pass at all and wait my turn" never seems to have even popped into your brain when you replied to OP.

    • by MrL0G1C ( 867445 )

      If the traffic is moving under 10 mph then I'll pass on either side; your post is based upon an assumption that may not be true.

    • Unless it's a red light, passing on the right is a terrible idea. I've seen cyclist do that and I think WTF dude!

      If you're being overtaken by a cyclist then the odds of you being a danger are far lower than when a cyclist isn't able to catch up with you. A school zone speed limit is faster than a typical road cyclist can sustain. Mind you, if you frequently take turns without indicating, that may be more of a bad idea for you than for the people around you wondering what it is you're doing.

      Also ... America... I don't think they have any functioning concept of the idea that passing left or right makes a difference. It

      • Also ... America... I don't think they have any functioning concept of the idea that passing left or right makes a difference. It's not like Europe where that is seen as an offence worse than speeding.

        In California we have no law prohibiting passing on the right, but we do have a law prohibiting clogging the passing lane. Unfortunately, it is a very permissive law, and it is also almost never enforced. You have to be holding up five or more people on a highway or freeway before you are in violation. Literally the only time I've seen it enforced was against a backhoe on the CA29. They are permitted to drive there if they have signals, but they still have to permit passing like anyone else.

        Unfortunately in

    • by zmooc ( 33175 )

      I'm from the Netherlands, which by certain metrics apparently is cyclist Valhalla. If you don't have to share the road with people that got their driver's license for free with a carton of milk, have traffic laws that are reasonable, and have an infrastructure that's built for all kinds of traffic instead of just for cars, passing on the right generally is the safer option for cyclists.

  • by Gravis Zero ( 934156 ) on Thursday May 06, 2021 @10:13PM (#61357132)

    Not a lot of details on how it's trained or the depth of the AI, so it could simply be mimicking drivers. This could be dangerous because, with only information on how to react to the bicyclist, it could move across the dividing line (I have seen cars do this) to give the bicyclist space. Usually this isn't problematic, but usually there also isn't another car coming from the opposing direction.

    Another possibility is that it is doing more than just mimicking and may "think" that it should keep a distance from bicyclists at all costs... including running off the road.

    It's good to know they are developing stuff but let's not get ahead of ourselves on what this progress means without getting important facts.

    • by AmiMoJo ( 196126 )

      The problem with the low grade AI we have now is that it can be trained to do things, but it doesn't understand context and we don't understand how it "thinks", so it can be unpredictable.

      I'm reminded of that time the army tried to train an AI to spot tanks. They ended up training it to spot sunny days because all the photos with tanks were taken in more favourable weather conditions. The risk is we end up with AI drivers like that, they seem to work but we don't realize they are actually basing their decis
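The "tank detector" failure mode described in this comment is the classic spurious-correlation problem: the training data confounds the label with a nuisance feature, so the learner latches onto the wrong signal. A toy sketch with entirely synthetic data and illustrative names:

```python
# (brightness 0-1, has_tank): tanks were photographed on sunny days,
# non-tank photos on overcast days -- the label is confounded with weather.
train = [
    (0.9, True), (0.8, True), (0.85, True),
    (0.2, False), (0.3, False), (0.25, False),
]

def fit_threshold(data):
    """'Learn' the single brightness threshold that best separates labels."""
    best_t, best_acc = 0.0, 0.0
    for t in sorted(x for x, _ in data):
        acc = sum((x >= t) == y for x, y in data) / len(data)
        if acc > best_acc:
            best_t, best_acc = t, acc
    return best_t

t = fit_threshold(train)              # separates the training set perfectly
predict = lambda brightness: brightness >= t

# A tank photographed on a cloudy day (brightness 0.3) is misclassified:
# the "tank detector" turned out to be a weather detector.
```

The model is 100% accurate on its training data and still useless, which is exactly the risk the comment raises for self-driving systems trained on correlated data.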

  • by fredrated ( 639554 ) on Thursday May 06, 2021 @10:53PM (#61357182) Journal

    Maintain your course and speed, don't swerve to accommodate lane splitters, don't swerve for cyclists.

    • Fully agree with this. (Not a cyclist, too many crazy drivers...).

      Humans are mostly good at predicting steady state progression. If the next logical step is linear for everything then you can predict within that range. For a car moving straight ahead, it has only 4 options (left/right/slower/faster). The worst thing is to slowdown or move to the right (closing distance). If I see a cyclist from a distance, I might give a bit more room if they have time to integrate my new position into their spatial m

    • by MrL0G1C ( 867445 )

      Summary says nothing about swerving. There's a difference between swerving and noticing a cyclist is right behind the curbside back end of your car, then checking centre-side and moving over so that you're up against the lane dividing lines.

      If I'm passing a vehicle on the inside, I'll only do so when it's safe to, which means traffic is crawling along and the vehicle either isn't a large truck, or it's a truck but it hasn't got time to get anywhere before I can safely get past it.

    • There's a difference between swerving and making space for someone lane splitting when you're obviously in insanely slow moving traffic.

      • Either way it is bad. You have no idea why the person moved over or if they are going to move back, or about to do a u turn or something. I see drivers doing weird stuff all the time when I am on my motorcycle.
        • You don't have an idea? I do. Cars making way for lane splitting at times when it's legal to lane split are very obvious. When it's legal traffic is moving very slowly, usually in a traffic jam. Also splitting implies 2 lanes, so straight away the idea of someone doing a u-turn is out of the window. Mind you if you're relying on people not to do actions without using the little yellow blinking lights then you have completely lost and you could die at any moment lane splitting or not.

          As for moving over, what

          • Yeah I have an 'idea': that if I'm wrong I AM DEAD. I would rather drivers hold their line and drive normally. What is so hard to understand?
          • I've been behind cars on my motorcycle, not tailgating and have had them pull over so I can pass, spraying me with debris and gravel from side of the road.
    • I agree - BUT... if you're far enough from the kerb for the cyclist to get past easily, then don't give them any more room. However, if you're too close to the kerb, and there's enough road width that you can move out and still leave plenty of space in the middle, then do move out. But, if you can't move out, then stay put, even though that obstructs the cyclist. The car behind you needs to make similar decisions, but if it sees you haven't moved out, and it can't either, but there's enough space for the cy

  • by backslashdot ( 95548 ) on Thursday May 06, 2021 @10:54PM (#61357184)

    Glad to see Cruise is progressing, unlike Toyota. Toyota unfortunately hired the wrong self-driving team. I read multiple interviews where they claim that we can't have self-driving cars because a car supposedly can't predict if someone is suicidal and going to jump into the middle of the street (as if humans can do any better). They gave some weird esoteric examples that I am sure most humans would fail too. The car just needs to predict trajectories and make decisions based on that. It doesn't need to predict when someone is going to break the law and make a suicidal move. If that were the case, it could never drive near another car on the highway, because there is a chance the driver may go crazy. Why don't they quit their jobs if they can't figure out how to do it? They are happy to keep drawing a paycheck.

    Reference: https://spectrum.ieee.org/tran... [ieee.org]

    • by MrL0G1C ( 867445 )

      Funny thing is, the self-driving car could easily react more quickly to the suicidal person, which could result in less harm being done.

      This article gives me a bit more confidence in the future of self driving systems. I've heard of both BMW and Tesla behaving in a manner that is borderline criminal when it comes to passing cyclists, but these days you can't tell if the self driving system was bad or if the driver was simply lying and saying the self driving system was responsible for actions that were their own. Fo

      • I'm waiting to hear about people intentionally jumping in front of cars, knowing full well they will stop, in order to try and rob those inside. The other concern I have is people screwing/trolling self driving cars since they know how the car might react. A human might try some defensive maneuvers or be able to recognize a bad situation about to unfold. Will cars? How will self driving cars handle snow covered roads with no visible lane markings, or very icy conditions where if you don't brake early en
        • by MrL0G1C ( 867445 )

          No visible lane markings is one that will no doubt throw a spanner in the works for self-driving systems; humans will be better at reading clues as to where the road is. As for sliding about on roads, I expect they'll be as bad as we are. I've been in a car that hit black ice; we got lucky because there were no parked cars in our sliding trajectory, though that bit of road usually did have parked cars.

          Liability is a no-brainer, manufacturers must insure for when the self-driving system is active, there is no o

    • a car supposedly can't predict if someone is suicidal and going to jump in the middle of the street (like humans can do any better). They gave some weird esoteric examples that I am sure most humans would fail too. The car just needs to predict trajectories and make decisions based on that.

      I've driven in a city at least once in my life.

      We do this all the time, there are simple cues like eye contact between drivers and pedestrians, or if we're not looking at each other what is the other person looking at. We are extremely adept other person intention calculators. It is probably the thing we're best at, and the last thing machines will be good at. Are they impatiently stepping off the curb to walk behind our car as we pass, or are they totally oblivious to surroundings because conversation w

  • You know, to apply the brakes when they're about to blow through a stop sign, or when they're passing a bunch of cars on the right when the cars are stopped or while cars and bike are approaching an intersection?

    Everyone seems to be focussing on cars - it seems the market for other vehicles is wide open.

  • by Beeftopia ( 1846720 ) on Thursday May 06, 2021 @10:59PM (#61357204)

    There's something called "Polanyi's Paradox" [wikipedia.org]:

    Summarised in the slogan "We can know more than we can tell", Polanyi's paradox is mainly to explain the cognitive phenomenon that there exist many tasks which we, human beings, understand intuitively how to perform but cannot verbalize the rules or procedures behind it.[2]

    This "self-ignorance" is common to many human activities, from driving a car in traffic to face recognition.[3] As Polanyi argues, humans are relying on their tacit knowledge, which is difficult to adequately express by verbal means, when engaging these tasks.[2] Polanyi's paradox has been widely considered a major obstacle in the fields of AI and automation, since the absence of consciously accessible knowledge creates tremendous difficulty in programming.

    In machine learning, you can either tell the machine what it needs to know, or you can let it learn, which some would say is reinventing the wheel - wasted processor cycles.

    Driving seemed initially so simple - keep the car on the road, between the lines, avoid hitting anything. But as we've discovered, there is a huge amount of variation in that seemingly simple premise. So maybe tacit learning is the way to go. There's a bit of a discussion about it in the machine learning community, as noted in a recent issue of the CACM [acm.org].

    • > This "self ignorance" is common to many human activities

      yet we expect AI models to be explainable when even we can't introspect our own processes ...
      • by Tom ( 822 )

        yet we expect AI models to be explainable when even we can't introspect our own processes ...

        Our own processes are a shared experience - I may not understand how you and I recognize faces, but I can understand why you didn't recognize my face if you show me the picture in question and I see that it's just badly lit. I may not understand WHY it makes us fail, exactly, or where the threshold is or what the features are that decide it - but I can follow along because we share the same process (with small variations).

        But the AI and I don't have the shared process. I do not understand why it classified

        • I believe you can find a path between AI and human reasoning; after all, both are doing number crunching at the back-end (how neurons connect - the link weight assignments). Just that doing this work alone would require so much computation (equivalent to building the neural net in the first place) that it is more efficient (energy/computation wise) to just assume such weird results are possible/natural [i.e. calling a school bus a giraffe] and design your higher level system taking this into account. A human analo
      • by jbengt ( 874751 )

        yet we expect AI models to be explainable when even we can't introspect our own processes ...

        But when it comes to safety engineering, we require the processes to be explicit and verifiable - except for machine-learning self-driving (and 737 MAX attitude controls)

  • Humans don't really drive that well, so if "AI" can do better, that's great, but not particularly surprising!

    • What are you talking about? Until the AI can drive after 3 hours' sleep, then turn around to yell at kids in the back while swerving over 3 lanes, it's clearly not as good as a human. Also, if the AI can't get shitfaced and wind up with a court-ordered ignition interlock, it's just substandard (this is a mandatory feature in NM).

      Also we need to teach the AI to buy trucknuts, tailgate and roll coal.

  • by Anonymous Coward
    Why would ANYONE do anything to encourage this reckless behavior? Humans are already coddled to the point where they don't even THINK about doing anything safely. THIS is why people don't like sharing the road with bikes.
    • by MrL0G1C ( 867445 )

      So being safer near cyclists is reckless behaviour. How did you manage to work that one out? I await your baseless assumptions.

  • by swell ( 195815 ) <jabberwock@poetic.com> on Friday May 07, 2021 @12:11AM (#61357348)

    An intelligence (artificial or otherwise) mostly learns from its mistakes. In a specific novel situation, such as a fast bicycle approaching from behind on the right, a smart car in training is likely to make a mistake. If it should become annoyed and swerve toward the bicyclist, thereby killing her (possibly a red-headed 4th grader late for school), then the car has begun to learn something about this particular situation. There are other slight variations of the situation that could also cause a mistake. Generally speaking, it is reasonable for the car to kill up to 6 cyclists in the process of fully understanding how to deal with very similar situations.

    You may think that 6 is excessive, but remember that these cars will be able to share what they've learned with the entire fleet. We can't be sure of the total number of risky situations that will present to these cars-in-training, perhaps 430. So it's fairly safe to predict that only 430 × 6, or 2,580, people will be lost in the effort to create truly smart cars. Over time, many more lives may actually be saved!

    • An intelligence (artificial or otherwise) mostly learns from its mistakes

      That is the ideal situation but these AI cars aren't learning that way right now.

      Generally speaking, it is reasonable for the car to kill up to 6 cyclists in the process of fully understanding how to deal with very similar situations.

      For deep learning, it will take a lot more than six to get enough data for them to deal reasonably with it.

    • and sixty years of hard time for someone

    • > An intelligence (artificial or otherwise) mostly learns from its mistakes.

      Okay

      > If it should become annoyed and swerve toward the bicyclist, thereby killing her (possibly a red headed 4th grader late for school), then the car has begun to learn something about this particular situation.

      No, that's not all of it. What you're talking about is learning by reward/penalty. But there is also learning by predicting the future, such as if the cyclist was behaving in an expected way or not. There is
      • by Tom ( 822 )

        Humans don't need to drive off a cliff to learn to avoid this behavior in the future, either.

        Some do.

        In fact, lots of humans aren't actually very good at learning from mistakes. The reason machines will soon overtake us is pride - a lot of people would rather go to great lengths to justify what they did than just admit "I fucked up".

  • Tesla's self-driving fiasco is spooking cyclists by unpredictably making jerking movements into the left lane, possibly onto incoming traffic if the car detected a cyclist coming from the right. While head-on crashes and fiery explosions have not yet happened because of this new misguided feature introduced by Elon Musk (possibly in response to a Twitter joke), they could, in theory, happen any time.
    This is coming at a bad time for Tesla, at only ${TIME} after the news of ${OTHER_TESLA_RELATED_THING}.
    Concer

  • Work harder, get a car.
    https://www.youtube.com/watch?... [youtube.com]

  • This is actually a great example of a problem with "accomodative" ai programming, a sort of second-order challenge we haven't even begun to address.

    That is, "We CAN, but should we?"

    I see a bicyclist charging up on my right, I will NOT "pull slightly left" to make it easier for them to continue to follow a course that is dangerous, illegal in places, and frankly stupid.

    Considering the widespread opportunism and frankly maliciousness of these hairless chimps, we will see people deliberately "messing with" ai

    • You should look up the road around 100 feet and make room for the cyclist if it is safe. Otherwise stay behind the cyclist.
      • If the cyclist is COMING UP BEHIND ME, fuck him.
        Cyclists want to have the rights of CARS in TRAFFIC LANES.
        They can wait in the TRAFFIC LANE behind me...you know, like a car or truck. And then when it's clear, and he has the speed to overtake, he can pass me on the left.

        The #1 worst behavior is cyclists believing they are entitled to choose that they have the legal entitlements of "vehicles" or "pedestrians" changing as they feel, moment to moment.

        Fuck 'em. It's not my job to make the road safer for them i

    • I think as we see self-driving vehicles become more commonplace, people will learn to adapt to their behaviors. So if you feel like trying to get ahead of traffic, you might barrel down the middle of the road, swerving towards these vehicles so they try to move out of your way and clear a path for you to get through.

      It needs to be established that all the self-driving vehicles are following rigid sets of rules for driving that a person can't "trick them" into deviating from to cause traffic jams an

  • "To understand death, I must amass information on every aspect of it. Every kind of dying. The experiments shouldn't take more than a third of your crew, maybe half." - Nagilum, Star Trek the Next Generation
  • (as a cyclist) I'd personally prefer that cars be predictable rather than sometimes doing weirdly nice things that I might not be prepared for.

    THIS! I frequently have cars stop right in the middle of a 4-lane with traffic b/c I'm waiting on my bike to cross the street. This is so insanely dangerous. And they're looking at me, with 3 other lanes of cars buzzing by, like "well?" Please, @carppl, don't move into oncoming traffic because you think I may want to pass you on the right. This is not a safe maneuver.

  • I'd prefer if autonomous cars go into kill mode whenever they detect spandex bike pants. Regular cyclists I don't mind.
  • by mveloso ( 325617 ) on Friday May 07, 2021 @12:23PM (#61359128)

    "Humans tend to have the edge when something unusual happens, because we're able to instantly apply a lifetime's worth of common-sense knowledge about the world to our decision-making process"

    I posit that this statement is not only unsupported by data, but completely made up by the reporter.

    In fact, in real life it seems that humans have basically no edge unless they're trained to handle unusual situations.

  • In CA, the law requires vehicles to give bicycles 3' clearance. In order to do that, it might be mandatory for the car to move to the left a little. Of course, this probably assumes the car will be overtaking the bike, and thus that a human driver can see it. But if the car is driving and sees a bike approaching from behind at less than 3', it is probably legally required to make that move. This law is as much for the drivers as it is for the bikers, who can go ahead and have their own opinions about non-da
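The decision rule this comment describes (California's Three Feet for Safety Act) reduces to a few lines of logic. This is a sketch of the commenter's reasoning only, not legal advice; the function name and inputs are hypothetical:

```python
# Hypothetical encoding of the 3-foot passing rule: if holding the current
# line would leave less than 3 ft of clearance, move left when the adjacent
# space allows it; otherwise don't pass at that speed.
FEET_REQUIRED = 3.0

def passing_action(gap_to_cyclist_ft, room_to_left_ft):
    if gap_to_cyclist_ft >= FEET_REQUIRED:
        return "hold line"
    shortfall = FEET_REQUIRED - gap_to_cyclist_ft
    if room_to_left_ft >= shortfall:
        return "shift left"
    return "slow down, do not pass"
```

Under this reading, the leftward nudge in the Cruise video is just the "shift left" branch triggered before the overtake happens, rather than during it.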
