
Tesla Will Allow Aggressive Autopilot Mode With 'Slight Chance of a Fender Bender' (theverge.com) 190

During Tesla's "Autonomy Investor Day" today, Elon Musk said that the company will someday allow drivers to select aggressive modes of its Autopilot driver assistance system that have a "slight chance of a fender bender." "Musk didn't say when Tesla might roll out that option, only that the company would have to have 'higher confidence' in Autopilot's capabilities before allowing it to happen," reports The Verge. From the report: "Do you want to have a nonzero chance of a fender bender on freeway traffic?" Musk asked at the event, which was for investors in the company. He dubbed it "LA traffic mode," because "unfortunately, [it's] the only way to navigate LA traffic." Tesla already allows its owners to select a "Mad Max" setting for Navigate on Autopilot, which is a feature that handles highway driving from on-ramp to off-ramp. The Mad Max setting makes quicker lane changes than if the car is in "Mild" or "Average" modes. Musk suggested Tesla will eventually allow drivers to choose "gradually more aggressive behavior" by "dial[ing] the setting up." Musk also said Tesla's full self-driving computer is now in all new Model 3, X and S vehicles, and a next-gen chip that's "three times better" than the current system is already "halfway done."
  • Slight Chance of a Fender Bender = you may die when it rams into a beam!

    • The man thinks he's the "God Of Innovation". What does he care if his cars BLOW UP ( https://edition.cnn.com/2019/0... [cnn.com] ) or his - cough - Autopilot winds up killing a "mortal" who "isn't a God Of Innovation like Elon".
      • by Anonymous Coward

        In all seriousness, consider how little danger you are actually in when using just a simple lane follower in the middle lane. It's not like the car is depending on a single set of dashes to drive straight. There are TWO sets of dashes guiding the car. One set may be the main guide, but the other set controls the position of the car just as much, and given the explicit purpose is to stay in the lane, if the algorithm is properly employing analysis of both dividers the car has literally almost no chance of crashing.

        • by Anonymous Coward

          The minute any autonomous guidance system takes enough control over any vehicle driven in public is the moment the maker of that system assumes any and all responsibility for crashes, damages, casualties, etc. it might produce.

          Musk is an absolute idiot to push that as a sales pitch. He thinks that because it is done in the aviation and military markets that it can and should be done everywhere possible (for huge potential profits). His greed blinds him and all the people he has convinced to buy in on his be

        • by Rei ( 128717 ) on Tuesday April 23, 2019 @03:29AM (#58476002) Homepage

          There are TWO sets of dashes guiding the car. One set may be the main guide, but the other set controls the position of the car just as much, and given the explicit purpose is to stay in the lane, if the algorithm is properly employing analysis of both dividers the car has literally almost no chance of crashing.

          It's actually a lot more complicated than that still. These brief "soundbite" Slashdot headlines are sort of annoying because they miss all of the really interesting detail during yesterday's presentation (note: I'm not a FSD optimist... but even I found it fascinating). Here's a brief rundown of the process.

          1) Humans annotate images from the vehicles' cameras as to where the safe driving areas are in a video (including where shoulders are), manually label objects, etc
          2) The neural nets are tasked with identifying the safe driving areas and all objects in the scene, and trained to the dataset.
          3) Wherever a weakness shows in the net, a campaign is launched for that weakness. Simulators create endless variants of the aforementioned scenario, while customer vehicles are polled to collect real-world data on similar tricky circumstances, which are annotated by human annotators to expand the training data.
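          If it helps to see the shape of it, the campaign loop in steps 1-3 can be sketched in a few lines of Python. Everything here is invented for illustration (the function names, the scenario buckets, the toy always-predicts-1 model used in testing), not Tesla's actual pipeline:

```python
def evaluate(model, dataset):
    """Fraction of (input, label) examples the model gets right."""
    return sum(model(x) == y for x, y in dataset) / len(dataset)

def campaign_loop(model_factory, datasets, collect_more, rounds=3):
    """datasets: scenario name -> list of (input, label) pairs.

    Each round: score the model per scenario, find the weakest one
    (step 3's "weakness"), gather more annotated data for it (from the
    fleet and simulators), and retrain on the expanded set.
    """
    model = model_factory(datasets)
    for _ in range(rounds):
        scores = {name: evaluate(model, data) for name, data in datasets.items()}
        weakest = min(scores, key=scores.get)       # where the net struggles
        datasets[weakest] += collect_more(weakest)  # targeted data campaign
        model = model_factory(datasets)             # retrain
    return model
```

The point of the sketch is just the feedback loop: training data grows fastest exactly where the model scores worst.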

          Note what's not mentioned in the above: lane lines. It's never taught what a lane line looks like. Just like, for example, in determining another driver's intent to change lanes, it's never taught what a blinker is. The neural net is allowed to use any and all clues in the scene to determine where lanes are, not just specific ones that humans might offhand think are important but might not apply in all circumstances or could fail in some circumstances.

          For example, on lane-change intent, the network might notice that a car ahead has started drifting toward the edge of the lane, that they suddenly changed their speed, etc, and find these clues to be more reliable than a blinker. Or it may find blinkers particularly useful for prediction in some geographic area but not others. It'll use whatever combination of factors yields the best training score.
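          As a toy illustration of how such cues might combine (the weights here are invented; a trained net learns its own, and may weight a blinker differently per region), a simple logistic score over drift, speed change, and blinker state:

```python
from math import exp

def lane_change_probability(lateral_drift, speed_delta, blinker_on,
                            w_drift=4.0, w_speed=0.5, w_blinker=1.0, bias=-3.0):
    """Toy logistic intent score. lateral_drift: metres toward the lane
    edge; speed_delta: change in speed (m/s); blinker_on: 0 or 1.
    All weights are made up for illustration."""
    z = (w_drift * lateral_drift
         + w_speed * abs(speed_delta)
         + w_blinker * blinker_on
         + bias)
    return 1.0 / (1.0 + exp(-z))
```

With these made-up weights, a car drifting a full metre toward the lane edge scores high even without a blinker, which matches the observation that drift can be a more reliable cue than the indicator.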

          When it comes to lane prediction, it can get almost magical-feeling because the neural net outright predicts where lanes are going to go in places that it can't even see yet, just based on context clues in the scene. But then again, we humans do that too, and an ability to do so is an important aspect of driving.

          Don't forget, there is no autopilot per se, just a collection of different pilot-like systems

          In a way, this is a key aspect. It's easy to think of there being a single "autopilot" neural net, but actually there's numerous subsystems making independent calculations, and then data fusion combines all of these outputs into a model of the world. For example, if you have a jogger running in front of the car, one net might identify and tag them as "jogger". A completely separate visual obstruction-detection system might identify an obstruction at that location. Different cameras may all add their own interpretations, along with radar and ultrasonics. This is all fused together to create an overall sense of the world around the vehicle - which then has to be interpreted not just for "how things are", but also for intent. For example, it's one thing to identify an animal on the side of the road - but is it likely to jump out in front of you? Like the driver lane change example, you can train to intent detection, while also applying various cautionary principles, such as, "If I see X, I better slow down to no more than Y speed given Z environment".
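          A minimal sketch of that kind of late fusion, with made-up detection tuples and a noisy-OR combination of confidences (purely illustrative, not Tesla's actual fusion code):

```python
def fuse(detections, radius=1.0):
    """detections: list of (label, (x, y), confidence) from independent
    subsystems (camera net, obstruction detector, radar, ...).
    Detections within `radius` of an existing track are merged; their
    miss-probabilities multiply (noisy-OR), so agreement between
    subsystems raises overall confidence."""
    tracks = []
    for label, (x, y), conf in detections:
        for t in tracks:
            tx, ty = t["pos"]
            if (tx - x) ** 2 + (ty - y) ** 2 <= radius ** 2:
                t["p_miss"] *= (1.0 - conf)   # independent-evidence assumption
                t["labels"].add(label)
                break
        else:
            tracks.append({"labels": {label}, "pos": (x, y), "p_miss": 1.0 - conf})
    return [(t["labels"], t["pos"], 1.0 - t["p_miss"]) for t in tracks]
```

So a "jogger" from one net and an "obstruction" from another, at the same spot, become a single fused track that the planner is more confident about than either detection alone.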

          There's also the issue of "detections can be right, and yet still wrong". For example, picture a car with a hitch-mounted bike rack (like this [minimania.com]). The net will correctly identify both a car and a bicycle... but shouldn't, because they're actually one object. If it sees the bike and expects it to m

        • Clearly you have never driven in places with snow. Even when they plow the roads, the remaining gravel and frost is enough to hide 70% of the lines.
          • by Rei ( 128717 )

            Exactly - it's an important reason why simply training specifically to recognize lane lines is entirely unsuitable for the task. The neural net has to be free to use any context clues in the scene - not just ones that humans might offhand think important but which can fail in specific circumstances.

    • Ramming speed!
    • by DrXym ( 126579 )
      Or you cause injury to the driver or passengers in the car in front of or behind you, neither of whom may have the reaction speed of a Tesla set to asshole mode.
    • by mark-t ( 151149 )
      Fender benders by definition are not life threatening... they are a specific type of collision between a vehicle and another vehicle or object (as opposed to a pedestrian) at low enough difference in velocity such that the *only* damage caused is to the vehicle(s) involved, and that ordinarily, air bags would not even have needed deployment to further prevent any occupants from injury (injuries that may be a direct result of unnecessary airbag deployment itself notwithstanding).
  • by sphealey ( 2855 ) on Monday April 22, 2019 @06:48PM (#58474552)

    How exactly does that work, legally? Deliberately setting a safety-related system to be less safe than it is capable of and in so doing increasing the chance of injury (physical, financial, medical) to other parties who do not have control over that decision? Seems to me that Musk and Tesla just bought their shareholders unlimited liability for every accident involving a Tesla forever.

    • by ShanghaiBill ( 739463 ) on Monday April 22, 2019 @07:03PM (#58474632)

      Deliberately setting a safety-related system to be less safe than it is capable of

      Human drivers do this every time they depress the accelerator.

      The safest option is to never leave your house.

      • Human drivers do this every time they depress the accelerator.

        Sad to see you modded down for obvious truths. Many luddites on Slashdot would rather die at the hands of incompetent human drivers than let the worst drivers be replaced by more reliable computer systems.

      • Comment removed based on user account deletion
      • by eepok ( 545733 )

        Yes, but it's hard to prove it. This is a switch that says, "Don't be as safe as I should be."

        This is where futurists just don't get it. With any algorithmic driving system (or AI), you can definitively control how "careful" the driver is and telling the driver to be less careful than it can be immediately makes one susceptible to criminal prosecution and civil liability on the basis of intentionally driving without "due care".

    • > How exactly does that work, legally?

      By making it an intentional act by the driver. Of course, there's nothing to stop a plaintiff from cross-claiming Tesla as a deep pocket if the owner can't afford to pay the judgment himself... but as long as Teslas are fairly expensive, and most of their purchasers are relatively wealthy, the risk of Tesla having to bear the brunt of more than an occasional million-dollar lawsuit are relatively low.

      My prediction: as time passes and old Teslas become increasingly aff

      • Let's ask the real question: Will the NTSB recommend "aggressive mode" to be insured at all? If insurance companies know that the car is designed to allow causing accidents (and injury) as part of its behavior, the owner's liability insurance is going higher than any of the numbers you mentioned above. That car would turn into the car that you can afford to buy but not afford to insure. Who wants to drive a car that would cost tens of thousands of dollars per year to insure when the car only costs $50,000? It
        • by AmiMoJo ( 196126 )

          Insurance companies will want to know if the car was in aggressive mode when the accident happened. If it was then liability will shift. Maybe not 100% to the Tesla driver, but it might go 50/50 where it would have been 80/20 in their favour if it's known that they were driving aggressively.

          Oh yeah, it will be considered them driving, not autopilot. Tesla's get-out clause for all this is that autopilot is in perpetual beta and the driver must have hands on the wheel and be paying attention at all times, so

          • Insurance companies will want to know if the car was in aggressive mode when the accident happened.

            No they won't. That kind of investigation costs money, and the last thing they want is for every accident to cost more than it needs to. Tesla owners would just end up covering the cost of the investigation.

    • How exactly does that work, legally? Deliberately setting a safety-related system to be less safe than it is capable of

      Actually having watched the video (shocker, since it took some time) I can actually answer this with a reasonable answer.

      The answer is this: How much like a human did the car drive when the accident occurred?

      This statement made (people will be able to select a mode that allows for a slight fender bender) is really meant for something like L.A. traffic, where you simply have to be aggressiv

    • by eepok ( 545733 )

      This is correct. Regardless of EULA that is tapped to enter the mode, making the mode available and taking control of the vehicle with the explicit intention to be less safe than knowingly capable invites liability. Effectively, it's a less-than-"due care" mode when the law requires that one exercise due care on the road.

  • I suppose all is fine so long as you're *more* aggressive than the next guy...
    • Next is the 'very aggressive' mode which will drive you over at the first opportunity, and the 'suicide mode' which drives at speed into the first large object it sees.

  • Might just be a marketing trick. The switch/button will actually do nothing to change the autopilot's behavior, but give the drivers a "Ha ha! My car will push yours into the gravel if you get in my way!" feeling.

    Like how auto manufacturers will put a speedometer that goes to 170mph in a car that can only do 90mph downhill with a heavy tailwind. It's all just a psychological trick to get you to buy their car because you think it is really fast.

    Otherwise I can see Tesla getting sued into oblivion the first time someone gets maimed/killed while the autopilot was in "aggressive" mode.

    • > Otherwise I can see Tesla getting sued into oblivion the first time someone gets maimed/killed while the autopilot was in "aggressive" mode.

      In the US, courts tend to put the value of "wrongful death" at around $1.5-3 million/death. There are occasional outliers, but that's pretty close to the norm, even when the death occurs as a direct result of intentional negligence or criminal conduct by a company's employees. It would hurt Tesla's bottom line, and might lead to policy changes if it happened too often.

    • by kriston ( 7886 )

      I've always found the 140+ MPH speedometer an infuriating gimmick. I recently had to shop for a new vehicle, and the 4-cylinder puddle jumpers with 160 MPH speedometers are ponderous.

      I eventually bought a car that displays the speed as a number instead of a silly gauge that never swoops more than 1/4 of its travel in real-world driving.

      That massive tachometer for an automatic transmission, on the other hand...

      • by Sique ( 173459 )
        As someone who drives a 4-cylinder car whose speedometer goes up to 150 mph (240 km/h), I actually tested it on a flat surface, and the speedometer reached 125 mph (200 km/h). So it made sense to put the speedometer limit to 150 mph.

        On the other hand, I live close to Germany, where you can actually drive 125 mph.

        Don't underestimate the top speed even a puddle jumper can do! Yes, it might take ages to reach top speed, but here it is.

        For those interested: It was a Skoda Octavia Praktik 1.9 TDI with 81 k

  • My guess is that the switch does nothing but lets the Tesla owner assume more responsibility when it gets into an accident. And look, it was accidentally set to "on" in 100% of the cases where Teslas had an autodriving accident!

  • by Grand Facade ( 35180 ) on Monday April 22, 2019 @07:01PM (#58474618)

    Handle traveling on 280 in the bay area?
    There are sections of that freeway that if you are not traveling 75 or 80 you are impeding the flow of traffic!

    Coming soon to your local auto parts store
    The Lewis Hamilton hack for your Tesla AutoPilot
    It slices and dices, it goes to not just 11, 12 will get you there even faster!

    • It crashed into a divider on the 101, just a few miles from their offices, and killed its occupant on a sunny day. I want them to improve on the "let's not accelerate into a brick wall if the lines become confusing" tech.

    • Handle traveling on 280 in the bay area? There are sections of that freeway that if you are not traveling 75 or 80 you are impeding the flow of traffic!

      Yeah, because the big problem in the Bay Area is going TOO fast on freeways.

    • Handle traveling on 280 in the bay area?
      There are sections of that freeway that if you are not traveling 75 or 80 you are impeding the flow of traffic!

      There are lots of lanes on that part of the 280. And being where it is, there are usually lots of slower vehicles off on the right-hand side of the highway. Mostly old people, and people in old Subarus or VWs. You're only impeding the flow of traffic if you're in the wrong lane.

  • What WILL the insurance companies and lawyers do?

    HINT: Can you say higher insurance rates?

    • Can you say higher insurance rates?

      Insurance rates go by averages of large numbers. The question is whether a somewhat higher level of aggression will cause more damage in the long run. The answer is not so simple, because you have to take into account the effects on the predictability of your driving, and the resulting effects on smooth traffic flow. For instance, if there's a 0.1% chance of a fender bender that can be reduced to 0% by slamming on the brakes, it does not automatically mean that you should therefore slam on the brakes, because braking hard creates its own risks for the traffic behind you.
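      To make that concrete with invented numbers: suppose the fender bender would cost $2,000, and suppose hard braking avoids it but creates a 0.3% chance of a $5,000 rear-end collision from the traffic behind. Comparing expected costs:

```python
def expected_cost(p_event, cost_event):
    """Average cost of an option: probability of the event times its damage."""
    return p_event * cost_event

# Option A: carry on; accept a 0.1% chance of a $2,000 fender bender.
keep_going = expected_cost(0.001, 2_000)    # 2.0 expected dollars
# Option B: slam the brakes; fender bender avoided, but assume a
# 0.3% chance of a $5,000 rear-end collision (number invented).
slam_brakes = expected_cost(0.003, 5_000)   # 15.0 expected dollars
```

Under these made-up numbers, the "safer-looking" hard brake is the worse bet on average, which is the point: the actuarial answer depends on second-order effects, not just the headline collision probability.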

  • Hopefully this dangerous fidgety twerp will get reined in once grown up leaders return to the U.S. Capitol.
  • by skam240 ( 789197 ) on Monday April 22, 2019 @07:11PM (#58474668)

    I have about zero interest in the current state of autopilot for cars. If I can't take my focus off the road then what the hell is the point (at least the vast majority of the time)? Sure, a future where I could take a nap, read a book, or do anything else I can dream up, while moving between points A and B would be fantastic but if I have to continuously monitor the road I might as well just be driving.

    As far as I can see, current autopilot tech is purely a novelty with no real value beyond that.

    • by Anonymous Coward

      You can do that now. It is called "being chauffeured". There is also a version for poor people like you. It's called public transportation, or the night train.

    • If I can't take my focus off the road then what the hell is the point (at least the vast majority of the time)? Sure, a future where I could take a nap, read a book, or do anything else

      With current Tesla autopilot you could easily read a book while the car handled stop-n-go traffic, where you creep forward a few feet at a time... simply stop reading anytime the car starts moving with any speed, or a lot of people are trying to merge. I have wasted too many hours sitting there watching cars way up ahead mo

    • Tesla has killed 3 times as many people [wikipedia.org] in self-driving car accidents as Uber (the only other company with fatalities). Further, anyone stupid enough to trust Tesla's FSD deserves a Darwin award. Taking rich idiots out of the gene pool is a universal good and I'd like to thank Elon Musk for his public service.

      • by brunes69 ( 86786 )

        That one guy ran into the attenuator, yes. And from that point on, the software installed ON ALL CARS, FOR ALL TIME, will never make that mistake again.

        The benefit of these self driving systems is not that they are perfect. It is that they will become very close to perfect, over a very short period of time, because of positive and negative feedback loops of accidents on the road.

        Even the most basic of automation will, over a very very short period of time, be far safer than any human driver on the road

  • by wolfheart111 ( 2496796 ) on Monday April 22, 2019 @07:14PM (#58474676)
    Takes you through the most beautiful part of the city. Something that pertains to the more moderate people who don't mind leaving a little early so they don't have to drive in ludicrous mode...
  • suing Tesla owners in my future "fender benders".

  • by thedarb ( 181754 ) on Monday April 22, 2019 @08:06PM (#58474898)

    Or are you just trying to get bought out by BMW?

  • Stop promising things.

    How long until Elon starts selling .JPGs of his cars?

  • by SuperKendall ( 25149 ) on Monday April 22, 2019 @09:11PM (#58475110)

    What I want is lane changing and driving behavior that responds to those driving around me.

    If there's someone simply slow in a lane, pass them smoothly and merge back in way ahead of them. No need to grief anyone who's just being cautious.

    The guy who has been driving in your blind spot for 10 miles matching your speed almost exactly and refusing to pass on the left? Oh that's easy, gun it for a second and cut that bastard off to move over if it's at all useful.

    In all seriousness it would be good if self driving cars could estimate a danger level for cars around you and be extra ready for action based on danger from those quadrants with more erratic or mean drivers. Some cars you can tell are very passive aggressive and will do things like try to anticipate your moves and cut you off, or pass you only to slow way down, and it would be nice to have a self driving car able to deal with strange things like that in different ways.

    • The guy who has been driving in your blind spot for 10 miles matching your speed almost exactly and refusing to pass on the left?

      Oh this gives me the shits. The fast lane is for overtaking, not cruising in. The police here won't fine you for doing 10km/h over, especially not if you do it for 10 seconds just to pass someone and then slow down again.

  • Apparently LA, or more accurately CA, is a land full of idiot drivers where people hit each other all the time even though it's virtually flat, dry and sunny all the time. If a fender bender is acceptable behavior, perhaps develop your car elsewhere. In the Northeast of the US, people drive as fast as in CA in ice, snow, sleet, rain and fog, in full darkness, while weaving over hills and crummy roads at 65-75mph, and we barely have any fender benders.

    • Apparently LA or more accurately, CA, is a land full of idiot drivers where people hit each other all the time even though it's virtually flat, dry and sunny all the time.

      People stuck in traffic are not at their best. They're breathing a lot of bad air, and they're under a lot of stress, so their brains aren't working optimally.

      If a fender bender is acceptable behavior, perhaps develop your car elsewhere, in the Northeast of the US people drive as fast as CA in ice, snow, sleet, rain, fog in full darkness while weaving over hills and crummy roads at 65-75mph and we barely have any fender benders.

      No, you have freeway pileups. In any case, the state with the most automobile accidents is Florida, because of course it is. California has neither the most accidents, nor the most accidents per capita.

    • by hawk ( 1151 )

      >Apparently LA or more accurately, CA, is a land full of idiot drivers where people hit each other all the time
      >even though it's virtually flat, dry and sunny all the time.

      And water is wet.

      This has been known for a *long* time . . .

      I had to drive through LA every few months during the freeway shootings. I suspect that a large number were justified self defense . . .

      Although only in Northern California have I seen people *slow down* to prevent you from pulling *behind them* to get off a freeway! I h

  • This is ludicrous. Without LIDAR you can't have full self-driving cars.

    So he's saying all Teslas have LIDAR? Are they using stereoscopic cameras? What's really going on?

    What about his announcement that they're abandoning nVidia chipsets?

    • Without LIDAR you can't have full self-driving cars.

      Most human drivers seem to be doing just fine without LIDAR, only using two eyes at suboptimal positions.

      • Most human drivers have image recognition hardware far beyond anything that is currently technically possible.

        • Depends. There are certainly plenty of situations where human drivers excel, but in other situations, the machine vision is already better.

          But the question wasn't about processing, but about LIDAR vs camera. Replacing the cameras with LIDAR does not remove the need for advanced image recognition. If the LIDAR picks up a bicycle near the edge of the road, you still need to process the data to predict whether it's going to interfere with the car's motion planning, and that's just as hard as using camera images.

          Most human drivers have image recognition hardware far beyond anything that is currently technically possible.

          My two eyes and brain are superior to any computer with two cameras, but I can't do everything that a computer with eight cameras can do. I'm better at some things, and not as good at others. And if it's got LIDAR, then forget it. Its depth perception is dramatically better than mine. Of course, Tesla doesn't...

  • by OrangeTide ( 124937 ) on Monday April 22, 2019 @11:12PM (#58475448) Homepage Journal

    New "Hyper-Aggressive" mode executes jaywalkers and cyclists. ALL cyclists, for any reason, even if they are sleeping in their homes.

    The only AI system trained on data from SF MUNI bus drivers.

  • Here I was thinking that the self driving cars of the future would solve highway safety issues and allow me to finally enjoy riding in the car.
    Instead we get "aggressive" and "mad max" modes. Jeez. Why don't you also install some Boring Company flamethrowers on the front bumper while you're at it? Plus a spiked cow catcher.

  • This whole self-driving car thing seems like a total fail to me.

    • This is needed if you want to have the car drive just as well as yourself, because human drivers are also driving "with a slight chance of a fender bender".

      If you're not a little bit aggressive, you can't merge in busy traffic.

  • Comment removed based on user account deletion
    • That tech isn’t ready yet to be entrusted with MY life.

      But you trust other random human drivers with your life ?

    • "... entrusted with MY life."
      We do it all the time. Pilots, cab drivers, bus drivers, train engineers, even uber drones.
      Also there's the people that just mounted your new tires, or shocks, or ball joints.
      You are depending on someone else keeping them up to snuff, but that's just kicking
      the trust down the chain one or two steps.

      I'm not a huge proponent of self driving, my closest exposure is old-school cruise control.
      I suspect I'd really, really enjoy owning a Tesla variant of any type more than a 2008 Civic...

      No

  • "Mild" or "Average" modes.
    "LA traffic mode
    "Mad Max"

    So where's Robo-Cop mode? Or tank mode, I don't care what you call it.

    "He's in my way. OK Google, solve the problem." "Firing solution found. Ready to engage." NOW you're talking cars. Or cars talking, whichever.
  • Seriously, can I order a Tesla that has none of this shitty hardware or software that I don't want at all? I just want to drive, not play Russian Roulette with my life.
  • 1) If the cars can't cope with humans being around, they shouldn't be on the roads with humans (whichever way you want to do that).
    2) If the cars are literally set to "allow collision", fender-bender or not, then you're into a world of hurt liability-wise. Every time you use that mode, you're basically admitting driving without due care and attention.
    3) If you do have an accident with that enabled, instant liability. No questions asked. You selected a mode that made it drive badly, game over.
    4) Encouraging such action (even though I suspect it's nonsense that'll never see production) is positively dangerous

    • 2) If the cars are literally set to "allow collision", fender-bender or not, then you're into a world of hurt liability-wise. Every time you use that mode, you're basically admitting driving without due care and attention.

      That's how a normal human driver operates, and it hasn't stopped insurance companies from paying up.

      4) Encouraging such action (even though I suspect it's nonsense that'll never see production) is positively dangerous

      On the contrary. Ultra safe driving, like Waymo does, is more dangerous, because you'll get the car slamming on its brakes at random times to rule out a tiny chance of a fender bender. This has the effect of causing more serious collisions by other drivers who are not anticipating that kind of behavior.

      • by ledow ( 319597 )

        If you have contact with a car, you are responsible unless you're basically hit from behind. Changing lanes, merging, driving close and failing to brake in time, or any other manoeuvre can make any "accident" (what I call a "deliberate") automatically your fault in an insurance claim. Yes, if you're merging and contact a car that's also merging, most insurers will hold you *both* at fault for failing to yield.

        Again, 4)... if that depends on 1) then you need to implement 1) first.

        • Merging is a good example. It's a delicate balance between giving and taking space. If you engage in ultra safe driving, and are never prepared to take a risk of a fender bender, other people will simply claim the space, and cut you off. The end result is that you're stuck at the end of the merge zone, waiting for rush hour to pass.

          A bit more aggressive driving means that you steer the car in a small opening, forcing other cars to slow down to avoid a collision. The trick is that you must be aggressive en
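          A toy gap-acceptance model shows the tradeoff (gap widths and thresholds are invented for illustration):

```python
def merge_position(gaps, min_gap):
    """Return the index of the first gap at least min_gap seconds wide
    that the car will take, or None if it reaches the end of the merge
    zone without merging. A lower min_gap = more aggressive driving."""
    for i, gap in enumerate(gaps):
        if gap >= min_gap:
            return i
    return None

# Example: headway gaps (in seconds) drifting past the merge zone.
gaps = [1.2, 0.8, 2.5, 1.0]
```

With these numbers, an aggressive threshold of 1.0s takes the very first gap, a cautious 2.0s waits for the wide third gap, and an ultra-safe 3.0s never merges at all, which is exactly the stuck-at-the-end-of-the-merge-zone failure mode described above.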

          • by Zak3056 ( 69287 )

            This is perfectly in line with behavior of other human drivers, and it's a routine case for insurance companies. None of those companies are interested in doing an expensive investigation into the precise liability, because that would be more expensive than just paying out for a couple of new fenders.

            The reason is not "expensive investigation" it's "impossible investigation" as the claim typically devolves into "he said, she said." Add a dash cam, and liability is easier to apportion. Add a literal software setting that says "act aggressively" and that's going to be instant liability, no investigation required.

  • One of the benefits of self-driving cars, to me, was that you'd have all the assholery removed from traffic, as the 'AI' would nicely follow the rules and make the safest and most sensible decisions.

    But no, it seems you can select a level of douchebaggery for the driving style. What are they thinking?
    The end result will be that everybody will drive in the most aggressive mode, otherwise the other AIs will just bully you off the road.

    • One of the benefits of self-driving cars, to me, was that you'd have all the assholery removed from traffic, as the 'AI' would nicely follow the rules and make the safest and most sensible decisions

      Yes, that's what a lot of people thought. And then we saw Waymo cars slamming on their brakes for no apparent reason, and getting stuck at the end of the merge zone.

      the end result will be that everybody will drive in the most aggressive mode, otherwise the other AIs will just bully you off the road.

      No, because that's not what's happening between human drivers. A little bit of aggression is useful, but too much, and you'll get yourself in more accidents than it's worth.

      Also, as more and more self driving cars appear on the road, and they interact together, we can use more cooperative algorithms, or vehicle-vehicle signalling.

  • this will be the lawsuit that finally muzzles Musk

  • by Daralantan ( 5305713 ) on Tuesday April 23, 2019 @07:57AM (#58476664)
    I wasn't so into the Tesla autopilot before. But AGGRESSIVE MODE? Sign me up for my car cutting people off, honking the horn like a samurai, and flipping them some kind of giant LED HUD middle finger.

    a "Mad Max" setting

    Hell yes. Now I can ride on top of my Tesla, playing a guitar with giant flame throwing speakers?!

  • All of life involves risk calculations and tradeoffs.

    I can't speak to the calculations/tradeoffs that Musk is specifically addressing here, but I'm not going to get outraged just at the concept.

    Just driving the thing (or anything) at all is riskier than leaving it at home.

  • Because "mild fender benders" can cause major brain and other damage to infants and young children.

    Maybe someone should actually hire an ethicist at Tesla.

  • by kaatochacha ( 651922 ) on Tuesday April 23, 2019 @01:42PM (#58478542)

    Jerk driver mode. I deal with people who drive like this all the time. You know what? YOU DON'T GET THERE ANY FASTER.
