Tesla Will Allow Aggressive Autopilot Mode With 'Slight Chance of a Fender Bender' (theverge.com) 190
During Tesla's "Autonomy Investor Day" today, Elon Musk said that the company will someday allow drivers to select aggressive modes of its Autopilot driver assistance system that have a "slight chance of a fender bender." "Musk didn't say when Tesla might roll out that option, only that the company would have to have 'higher confidence' in Autopilot's capabilities before allowing it to happen," reports The Verge. From the report: "Do you want to have a nonzero chance of a fender bender on freeway traffic?" Musk asked at the event, which was for investors in the company. He dubbed it "LA traffic mode," because "unfortunately, [it's] the only way to navigate LA traffic." Tesla already allows its owners to select a "Mad Max" setting for Navigate on Autopilot, which is a feature that handles highway driving from on-ramp to off-ramp. The Mad Max setting makes quicker lane changes than if the car is in "Mild" or "Average" modes. Musk suggested Tesla will eventually allow drivers to choose "gradually more aggressive behavior" by "dial[ing] the setting up." Musk also said Tesla's full self-driving computer is now in all new Model 3, X and S vehicles, and a next-gen chip that's "three times better" than the current system is already "halfway done."
Slight Chance of a Fender Bender = you may die! (Score:2, Insightful)
Slight Chance of a Fender Bender = you may die! when it rams into a beam!
Lord Father Musk Doesn't Care About "Mortals" (Score:1)
Re: Lord Father Musk Doesn't Care About "Mortals" (Score:2, Insightful)
In all seriousness, consider how little danger you are actually in when using just a simple lane follower in the middle lane. It's not like the car is depending on a single set of dashes to drive straight. There are TWO sets of dashes guiding the car. One set may be the main guide, but the other set controls the position of the car just as much, and given the explicit purpose is to stay in the lane, if the algorithm is properly employing analysis of both dividers the car has literally almost no chance of cr
Re: Lord Father Musk Doesn't Care About "Mortals" (Score:2, Interesting)
The minute any autonomous guidance system takes enough control over a vehicle driven in public is the moment the maker of that system assumes any and all responsibility for the crashes, damages, casualties, etc. it might produce.
Musk is an absolute idiot to push that as a sales pitch. He thinks that because it is done in the aviation and military markets that it can and should be done everywhere possible (for huge potential profits). His greed blinds him and all the people he has convinced to buy in on his be
Re: (Score:3)
Re: Lord Father Musk Doesn't Care About "Mortals" (Score:5, Interesting)
It's actually a lot more complicated than that still. These brief "soundbite" Slashdot headlines are sort of annoying because they miss all of the really interesting detail from yesterday's presentation (note: I'm not an FSD optimist... but even I found it fascinating). Here's a brief rundown of the process.
1) Humans annotate images from the vehicles' cameras as to where the safe driving areas are in a video (including where shoulders are), manually label objects, etc
2) The neural nets are tasked with identifying the safe driving areas and all objects in the scene, and trained to the dataset.
3) Wherever a weakness shows up in the net, a campaign is launched against that weakness. Simulators create endless variants of the problem scenario, while customer vehicles are polled to collect real-world data on similar tricky circumstances, which human annotators label to expand the training data.
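The three steps above amount to what's usually called hard-example mining: measure where the model is weak, then grow the training set with more examples of exactly those cases. A rough sketch of that loop (all function names, labels, and thresholds here are hypothetical, not anything Tesla has published):

```python
# Hypothetical sketch of the "campaign" loop: find the classes the model
# is weakest on, then collect more annotated examples of just those cases.

def weakest_classes(per_class_accuracy, threshold=0.9):
    """Return the labels whose validation accuracy falls below threshold."""
    return [label for label, acc in per_class_accuracy.items() if acc < threshold]

def run_campaign(per_class_accuracy, collect_examples, threshold=0.9):
    """For each weak class, collect more training examples (step 3 above)."""
    new_data = {}
    for label in weakest_classes(per_class_accuracy, threshold):
        # In the real pipeline this would poll the fleet and run simulators;
        # here collect_examples is just a stand-in callback.
        new_data[label] = collect_examples(label)
    return new_data

# Toy usage: "bike_on_car" is the weak spot, so only it gets a campaign.
accuracy = {"lane_area": 0.97, "pedestrian": 0.95, "bike_on_car": 0.71}
campaign = run_campaign(accuracy, lambda label: [f"synthetic_{label}_{i}" for i in range(3)])
```

The point of the threshold is that you don't retrain on everything, only on the scenarios where measured performance lags.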
Note what's not mentioned in the above: lane lines. It's never taught what a lane line looks like. Just as, for example, in determining another driver's intent to change lanes, it's never taught what a blinker is. The neural net is allowed to use any and all clues in the scene to determine where lanes are, not just the specific ones that humans might offhand think are important but that may not apply, or could fail, in some circumstances.
For example, on lane-change intent, the network might notice that a car ahead has started drifting toward the edge of the lane, that they suddenly changed their speed, etc, and find these clues to be more reliable than a blinker. Or it may find blinkers particularly useful for prediction in some geographic area but not others. It'll use whatever combination of factors yields the best training score.
When it comes to lane prediction, it can get almost magical-feeling because the neural net outright predicts where lanes are going to go in places that it can't even see yet, just based on context clues in the scene. But then again, we humans do that too, and an ability to do so is an important aspect of driving.
In a way, this is a key aspect. It's easy to think of there being a single "autopilot" neural net, but actually there's numerous subsystems making independent calculations, and then data fusion combines all of these outputs into a model of the world. For example, if you have a jogger running in front of the car, one net might identify and tag them as "jogger". A completely separate visual obstruction-detection system might identify an obstruction at that location. Different cameras may all add their own interpretations, along with radar and ultrasonics. This is all fused together to create an overall sense of the world around the vehicle - which then has to be interpreted not just for "how things are", but also for intent. For example, it's one thing to identify an animal on the side of the road - but is it likely to jump out in front of you? Like the driver lane change example, you can train to intent detection, while also applying various cautionary principles, such as, "If I see X, I better slow down to no more than Y speed given Z environment".
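The fusion step described here can be illustrated with a toy sketch. The subsystem names, the distance threshold, and the noisy-or confidence rule below are all assumptions chosen for illustration, not Tesla's actual method:

```python
# Minimal sketch of multi-subsystem data fusion: several independent
# detectors each report what they see, and overlapping reports are merged
# into a single object hypothesis in the world model.

def fuse(detections, merge_distance=2.0):
    """Greedily merge detections whose positions are within merge_distance.

    Each detection is (x, y, label, confidence). Merged confidence uses the
    noisy-or rule, 1 - prod(1 - c_i), so agreeing sensors reinforce each other.
    """
    fused = []
    for x, y, label, conf in detections:
        for obj in fused:
            if abs(obj["x"] - x) <= merge_distance and abs(obj["y"] - y) <= merge_distance:
                obj["labels"].add(label)
                obj["confidence"] = 1 - (1 - obj["confidence"]) * (1 - conf)
                break
        else:
            fused.append({"x": x, "y": y, "labels": {label}, "confidence": conf})
    return fused

# The jogger example: the camera net, a generic obstruction detector, and a
# radar track all report something at roughly the same spot; fusion yields
# one object with higher combined confidence than any single report.
world = fuse([
    (10.0, 3.0, "jogger", 0.80),         # vision classifier
    (10.4, 3.1, "obstruction", 0.60),    # obstruction detector
    (10.2, 2.9, "moving_object", 0.70),  # radar track
])
```

Intent prediction would then run on these fused objects rather than on raw per-sensor detections.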
There's also the issue of "detections can be right, and yet still wrong". For example, picture a car with a hitch-mounted bike rack (like this [minimania.com]). The net will correctly identify both a car and a bicycle... but shouldn't, because they're actually one object. If it sees the bike and expects it to m
Re: (Score:2)
Indeed, all training data has to come with counter-training data. One thing that they discussed a lot during the event was that there's no point - and it's actually counterproductive - to just record everything and train with everything, or a random selection thereof. You want cases that illustrate tricky situations / edge cases which run opposite to each other - for example, bikes on cars vs. bikes passing behind cars vs. bikes reflected in puddles vs. pictures of bikes on signs, and so forth. It's all
Re: (Score:2)
Re: (Score:2)
Exactly - it's an important reason why simply training specifically to recognize lane lines is entirely unsuitable for the task. The neural net has to be free to use any context clues in the scene - not just ones that humans might offhand think important but which can fail in specific circumstances.
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Re:Slight Chance of a Fender Bender = you may die! (Score:5, Informative)
Watch the included video. Note the time on the screen. Watch how the time changes during the "explosion". Second 19 (the "explosion") is shortened to half the length of the others. Second 20 is dragged out to 1 1/2 times the length of the others. Someone has clearly modified this video to try to make a fire look like an explosion by speeding up second #19, without altering the audio track or total video length.
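One way to make this check mechanical: in a constant-frame-rate video, every on-screen second should span about the same number of frames, so a retimed second shows up as a frame count far from the norm. A minimal sketch, with frame counts invented to mirror the pattern described above:

```python
# Flag seconds of footage whose frame count deviates from the median,
# i.e. seconds that have been sped up or slowed down relative to the rest.

def retimed_seconds(frames_in_each_second, tolerance=0.25):
    """Return (second_index, speed_ratio) for seconds whose frame count
    deviates from the median by more than tolerance (0.5 = played at 2x)."""
    counts = sorted(frames_in_each_second)
    median = counts[len(counts) // 2]
    return [
        (i, n / median)
        for i, n in enumerate(frames_in_each_second)
        if abs(n / median - 1) > tolerance
    ]

# Normal seconds hold 30 frames; one holds 15 (halved, i.e. 2x speed) and
# the next holds 45 (dragged out to 1.5x length), as in the parent's account.
anomalies = retimed_seconds([30, 30, 30, 30, 15, 45])
```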
Who? Why? Were they involved in the incident? Not a clue. But it's clearly been manipulated.
But let's just take it at face value. BMW in the past year has had hundreds of spontaneous fires. This would, if actually "spontaneous", be the third for Tesla (all Model S, none in the Model 3 - indeed, I'm still awaiting a single report of a fire in a customer Model 3 in any circumstance, even extreme high-speed collisions). The other two "spontaneous" Model S fires were:
* Mary McCormack: The cause has not been publicly disclosed, but was believed to have been a prior debris strike.
* John Schneider: The cause was determined to be a bullet fired into the battery pack from the back seat.
As for fires in general, not "spontaneous" ones: As of early last year, there had been a total of 40 fires in Teslas over 7.5 billion miles driven, equating to around 5 fires per billion miles. By contrast, the rate of fires in gasoline cars is 55 fires per billion miles.
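Spelled out, the rate comparison is simple arithmetic (using the commenter's own figures, which are not independently verified here):

```python
# Checking the arithmetic on the quoted figures.
tesla_fires = 40
tesla_billion_miles = 7.5
tesla_rate = tesla_fires / tesla_billion_miles  # ~5.3 fires per billion miles

gasoline_rate = 55.0  # quoted rate for gasoline cars, same units
ratio = gasoline_rate / gasoline_rate * gasoline_rate / tesla_rate  # ~10x
```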
Re:Slight Chance of a Fender Bender = you may die! (Score:4, Insightful)
The cause was determined to be a bullet fired into the battery pack from the back seat.
Seriously, what is wrong with you people...
Re: Slight Chance of a Fender Bender = you may die (Score:2)
This is the land of the free, and if we want to shoot up the inside of our cars as God Almighty intended, we will, you cheese sniffing rifle dropper!
Re: (Score:3)
The cause was determined to be a bullet fired into the battery pack from the back seat.
Seriously, what is wrong with you people...
It was the only way to kill the new aggressive autopilot. :p
Imposing harm on others (Score:4, Interesting)
How exactly does that work, legally? Deliberately setting a safety-related system to be less safe than it is capable of and in so doing increasing the chance of injury (physical, financial, medical) to other parties who do not have control over that decision? Seems to me that Musk and Tesla just bought their shareholders unlimited liability for every accident involving a Tesla forever.
Re:Imposing harm on others (Score:5, Insightful)
Deliberately setting a safety-related system to be less safe than it is capable of
Human drivers do this every time they depress the accelerator.
The safest option is to never leave your house.
Modded down for truth? (Score:3)
Human drivers do this every time they depress the accelerator.
Sad to see you modded down for obvious truths. Many luddites on Slashdot would rather die at the hands of incompetent human drivers than let the worst drivers be replaced by more reliable computer systems.
Re: (Score:2)
Re: (Score:3)
Re: (Score:2)
Yes, but it's hard to prove it. This is a switch that says, "Don't be as safe as you can be."
This is where futurists just don't get it. With any algorithmic driving system (or AI), you can definitively control how "careful" the driver is, and telling the driver to be less careful than it could be immediately opens you up to criminal prosecution and civil liability for intentionally driving without "due care".
Re: (Score:2)
> How exactly does that work, legally?
By making it an intentional act by the driver. Of course, there's nothing to stop a plaintiff from cross-claiming Tesla as a deep pocket if the owner can't afford to pay the judgment himself... but as long as Teslas are fairly expensive, and most of their purchasers are relatively wealthy, the risk of Tesla having to bear the brunt of more than an occasional million-dollar lawsuit is relatively low.
My prediction: as time passes and old Teslas become increasingly aff
Re: (Score:3)
Re: (Score:2)
Insurance companies will want to know if the car was in aggressive mode when the accident happened. If it was then liability will shift. Maybe not 100% to the Tesla driver, but it might go 50/50 where it would have been 80/20 in their favour if it's known that they were driving aggressively.
Oh yeah, it will be considered them driving, not autopilot. Tesla's get-out clause for all this is that autopilot is in perpetual beta and the driver must have hands on the wheel and be paying attention at all times, so
Re: (Score:2)
Insurance companies will want to know if the car was in aggressive mode when the accident happened.
No, they won't. That kind of investigation costs them money, and the last thing they want is for every accident to cost more than it needs to. Tesla owners would just end up covering the cost of the investigation anyway.
Re: (Score:2)
As opposed to someone who buys, say, a Lamborghini? <Sarcasm>Because, you know, there are SO MANY poor/middle-class people driving expensive Italian sports cars, and everyone who DOES have one always obeys the speed limit</Sarcasm>
Re: Imposing harm on others (Score:3)
The thing is, NORMAL CARS "allow dangerous behavior" by default.
If someone with bipolar disorder drives a Mustang GT recklessly while having a manic episode, can somebody sue Ford for "allowing" the car to be driven in an unsafe manner? No, because the driver has ultimate legal responsibility. As long as autopilot systems maintain the polite fiction that drivers are theoretically in control, they can deflect most lawsuits unless someone can prove that the system to hand control to the driver failed to work
Re: (Score:2)
Re: (Score:2)
You could reasonably argue that autopilot, even in "aggressive" mode, is probably safer when driving in a zero-visibility downpour (like the one that swept across southwest Florida on Friday afternoon) than any human driver, because autopilot knows (from millisecond to millisecond) what each wheel's traction is, can see through rain & relative darkness,
Yes, that's all benefit.
and can react to hydroplaning adjacent vehicles faster than ANY human can.
Well, I don't know about that. It doesn't function instantaneously either. It has to actually perceive that the other vehicle is behaving oddly before it can react to that. It's a good and useful driving aid, but the driver is still responsible for reacting to that vehicle.
The computer never gets tired, and applies the same diligence to every situation. But it's also a bit short on sensors, and it has to figure everything out from image processing. It's better than the human in some
Re: Imposing harm on others (Score:2)
I think even imperfect Autopilot is likely to be a net improvement over humans in Florida-style torrential downpours. Other states have downpours, of course... but in Florida, they're a regular occurrence that often coincides with evening rush hour. In other states, they're sufficiently uncommon that most people will just wait for them to pass. It's different when storms like that happen 18-24 out of every 30 days between June and October.
Ultimately, the big improvement in Autopilot safety will come when th
Here's how it works (Score:3)
How exactly does that work, legally? Deliberately setting a safety-related system to be less safe than it is capable of
Actually having watched the video (shocker, since it took some time) I can actually answer this with a reasonable answer.
The answer is this: How much like a human did the car drive when the accident occurred?
This statement made (people will be able to select a mode that allows for a slight fender bender) is really meant for something like L.A. traffic, where you simply have to be aggressiv
Re: (Score:2)
This is correct. Regardless of the EULA that is tapped through to enter the mode, making the mode available and taking control of the vehicle with the explicit intention of being less safe than it is knowingly capable of invites liability. Effectively, it's a less-than-"due care" mode, when the law requires that one exercise due care on the road.
Re: (Score:2)
Arms race (Score:1)
Re: (Score:2)
Next is the 'very aggressive' mode, which will run you over at the first opportunity, and the 'suicide mode', which drives at speed into the first large object it sees.
Marketing stunt (Score:2)
Might just be a marketing trick. The switch/button will actually do nothing to change the autopilot's behavior, but give drivers a "Ha ha! My car will push yours into the gravel if you get in my way!" feeling.
Like how auto manufacturers will put a speedometer that goes to 170 mph in a car that can only do 90 mph downhill with a heavy tailwind. It's all just a psychological trick to get you to buy their car because you think it is really fast.
Otherwise I can see Tesla getting sued into oblivion t
Re: (Score:3)
> Otherwise I can see Tesla getting sued into oblivion the first time someone gets maimed/killed while the autopilot was in "aggressive" mode.
In the US, courts tend to put the value of "wrongful death" at around $1.5-3 million/death. There are occasional outliers, but that's pretty close to the norm, even when the death occurs as a direct result of intentional negligence or criminal conduct by a company's employees. It would hurt Tesla's bottom line, and might lead to policy changes if it happened too of
Re: (Score:2)
A million cars times $1.5 million per fender bender is a trillion and a half.
A fender bender isn't going to cost $1.5 million, right ? And "a slight chance" does not result in a million incidents.
Re: Marketing stunt (Score:3)
$1.5-3 million is PER WRONGFUL DEATH, not "per fender bender"
In the US, you're generally unlikely to be successful suing for more than the documented cash value of an automobile accident that involves only property damage. Punitive damages generally require GROSS negligence or intent to cause harm... neither of which is likely to be demonstrable in court.
"More dangerous than the safest possible option" is NOT automatic evidence of "gross negligence", especially if it's statistically no worse than the averag
Re: (Score:2)
I've always found the 140+ MPH speedometer an infuriating gimmick. I recently had to shop for a new vehicle, and the parade of 4-cylinder puddle jumpers with 160 MPH speedometers was ponderous.
I eventually bought a car that displays the speed as a number instead of a silly gauge that never swoops more than 1/4 of its travel in real-world driving.
That massive tachometer for an automatic transmission, on the other hand...
Re: (Score:3)
On the other hand, I live close to Germany, where you can actually drive 125 mph.
Don't underestimate the top speed even a puddle jumper can do! Yes, it might take ages to reach top speed, but here it is.
For those interested: It was a Skoda Octavia Praktik 1.9 TDI with 81 k
Wrong marketing trick. (Score:2)
My guess is that the switch does nothing but lets the Tesla owner assume more responsibility when it gets into an accident. And look, it was accidentally set to "on" in 100% of the cases where Teslas had a self-driving accident!
Re:Marketing stunt (Score:4, Interesting)
If the "less safe" button is actually fake, Tesla will get sued for false advertising. And if Tesla denies it is fake in the first few court cases, but later admits it was all fake, then Tesla will face a second set of liabilities: pretending that the drivers were willing to accept an increased risk, so the drivers were actually liable. In other words, fraud.
Unfortunately, in the USA what will happen is a settlement while claiming no foul play at all.
Like... come on. This is a dude who manipulated stock prices with fake news. Would he really lie about Autopilot, a feature whose timeline he has already lied about around a dozen times? The new chip is always the one that will allow fully autonomous driving, while it clearly isn't going to be the one.
How does the Tesla Auto-Pilot (Score:3, Interesting)
Handle traveling on 280 in the bay area?
There are sections of that freeway that if you are not traveling 75 or 80 you are impeding the flow of traffic!
Coming soon to your local auto parts store
The Lewis Hamilton hack for your Tesla AutoPilot
It slices and dices, it goes to not just 11, 12 will get you there even faster!
Re: (Score:2)
It crashed into a divider on the 101, just a few miles from their offices, and killed its occupant on a sunny day. I want them to improve on the 'let's not accelerate into a brick wall if the lines become confusing' tech.
Re: (Score:2)
You can't accelerate into a divider. Is your spatial awareness even at the mongoloid level?
The logs didn't show evidence of slowing down; they showed the car speeding up to almost 71 mph when it crashed. The car did drive into the divider, killing its occupant, and it was accelerating in the process.
This is important because not only did it fail to avoid the threat, it wasn't even attempting to mitigate it.
Re: (Score:2)
Handle traveling on 280 in the bay area? There are sections of that freeway that if you are not traveling 75 or 80 you are impeding the flow of traffic!
Yeah, because the big problem in the Bay Area is going TOO fast on freeways.
Re: (Score:2)
Handle traveling on 280 in the bay area?
There are sections of that freeway that if you are not traveling 75 or 80 you are impeding the flow of traffic!
There are lots of lanes on that part of the 280. And being where it is, there are usually lots of slower vehicles off on the right-hand side of the highway. Mostly old people, and people in old Subarus or VWs. You're only impeding the flow of traffic if you're in the wrong lane.
I can't wait (Score:2)
What WILL the insurance companies and lawyers do?
HINT: Can you say higher insurance rates?
Re: (Score:2)
Can you say higher insurance rates?
Insurance rates go by average of large numbers. The question is whether somewhat higher level of aggression will cause more damage in the long run. The answer is not so simple, because you have to take into account the effects on predictability of your driving, and the resulting effects on smooth traffic flow. For instance, if there's a 0.1% chance of a fender bender that can be reduced to 0% by slamming on the brakes, it does not automatically mean that you should therefore slam on the brakes, because tha
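The trade-off being described can be put in expected-cost terms. All probabilities and dollar figures below are invented purely to illustrate the shape of the argument, not taken from any real actuarial data:

```python
# Toy expected-cost comparison: eliminating a tiny fender-bender risk by
# hard braking is not free, because hard braking raises the chance of
# being rear-ended by a driver who didn't anticipate it.

def expected_cost(p_fender, cost_fender, p_rear_end, cost_rear_end):
    return p_fender * cost_fender + p_rear_end * cost_rear_end

# Keep rolling: 0.1% chance of a $2,000 fender bender, no rear-end risk.
keep_rolling = expected_cost(0.001, 2000, 0.0, 8000)
# Slam brakes: fender-bender risk goes to zero, but a small rear-end risk
# of a costlier collision appears.
slam_brakes = expected_cost(0.0, 2000, 0.0005, 8000)
```

Under these made-up numbers, slamming the brakes is the worse choice in expectation, which is the parent's point: zeroing out one risk is not automatically optimal.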
Dangerous Dude (Score:2)
Who even wants the current state of autopilot? (Score:5, Insightful)
I have about zero interest in the current state of autopilot for cars. If I can't take my focus off the road then what the hell is the point (at least the vast majority of the time)? Sure, a future where I could take a nap, read a book, or do anything else I can dream up, while moving between points A and B would be fantastic but if I have to continuously monitor the road I might as well just be driving.
As far as I can see, current autopilot tech is purely a novelty with no real value beyond that.
Re: (Score:2, Insightful)
You can do that now. It is called "being chauffeured". There is also a version for poor people like you. It's called public transportation, or the night train.
Think about stop-n-go traffic. (Score:3)
If I can't take my focus off the road then what the hell is the point (at least the vast majority of the time)? Sure, a future where I could take a nap, read a book, or do anything else
With current Tesla autopilot you could easily read a book while the car handled stop-n-go traffic, where you creep forward a few feet at a time... simply stop reading anytime the car starts moving with any speed, or a lot of people are trying to merge. I have wasted too many hours sitting there watching cars way up ahead mo
Darwin award for rich people (Score:2)
Tesla has killed 3 times as many people [wikipedia.org] in self-driving car accidents as Uber (the only other company with fatalities). Further, anyone stupid enough to trust Tesla's FSD deserves a Darwin award. Taking rich idiots out of the gene pool is a universal good and I'd like to thank Elon Musk for his public service.
Re: (Score:2)
That one guy ran into the attenuator, yes. And from that point on, the software installed ON ALL CARS, FOR ALL TIME, will never make that mistake again.
The benefit of these self driving systems is not that they are perfect. It is that they will become very close to perfect, over a very short period of time, because of positive and negative feedback loops of accidents on the road.
Even the most basic automation will, over a very, very short period of time, be far safer than any human driver on the road
Re: (Score:2)
Gees, How about Scenic Mode, (Score:5, Funny)
I look forward to... (Score:2)
suing Tesla owners in my future "fender benders".
Sooo, you're building a BMW? (Score:5, Funny)
Or are you just trying to get bought out by BMW?
Elon Musk is becoming Chris Roberts... (Score:2)
Stop promising things.
How long until Elon starts selling .JPGs of his cars?
Proportional Response please (Score:3)
I want lane-changing and driving behavior that responds to those driving around me.
If there's someone simply slow in a lane, pass them smoothly, merge back in way ahead of them. No need to grief anyone just being cautious.
The guy who has been driving in your blind spot for 10 miles matching your speed almost exactly and refusing to pass on the left? Oh that's easy, gun it for a second and cut that bastard off to move over if it's at all useful.
In all seriousness it would be good if self driving cars could estimate a danger level for cars around you and be extra ready for action based on danger from those quadrants with more erratic or mean drivers. Some cars you can tell are very passive aggressive and will do things like try to anticipate your moves and cut you off, or pass you only to slow way down, and it would be nice to have a self driving car able to deal with strange things like that in different ways.
Re: (Score:2)
The guy who has been driving in your blind spot for 10 miles matching your speed almost exactly and refusing to pass on the left?
Oh this gives me the shits. The fast lane is for overtaking, not cruising in. The police here won't fine you for doing 10km/h over, especially not if you do it for 10 seconds just to pass someone and then slow down again.
Re: (Score:2)
The fast lane is for overtaking, not cruising in.
Unless the other lanes are too crowded for cruising.
Develop self-driving in the Northeast, not in LA (Score:2)
Apparently LA, or more accurately CA, is a land full of idiot drivers where people hit each other all the time even though it's virtually flat, dry and sunny all the time. If a fender bender is acceptable behavior, perhaps develop your car elsewhere. In the Northeast of the US, people drive as fast as in CA through ice, snow, sleet, rain and fog in full darkness while weaving over hills and crummy roads at 65-75 mph, and we barely have any fender benders.
Re: (Score:2)
Apparently LA or more accurately, CA, is a land full of idiot drivers where people hit each other all the time even though it's virtually flat, dry and sunny all the time.
People stuck in traffic are not at their best. They're breathing a lot of bad air, and they're under a lot of stress, so their brains aren't working optimally.
If a fender bender is acceptable behavior, perhaps develop your car elsewhere, in the Northeast of the US people drive as fast as CA in ice, snow, sleet, rain, fog in full darkness while weaving over hills and crummy roads at 65-75mph and we barely have any fender benders.
No, you have freeway pileups. In any case, the state with the most automobile accidents is Florida, because of course it is. California has neither the most accidents, nor the most accidents per capita.
Re: (Score:2)
>Apparently LA or more accurately, CA, is a land full of idiot drivers where people hit each other all the time
>even though it's virtually flat, dry and sunny all the time.
And water is wet.
This has been known for a *long* time . . .
I had to drive through LA every few months during the freeway shootings. I suspect that a large number were justified self defense . . .
Although only in Northern California have I seen people *slow down* to prevent you from pulling *behind them* to get off a freeway! I h
Without LIDAR you can't have full self-driving (Score:2)
This is ludicrous. Without LIDAR you can't have full self-driving cars.
So he's saying all Teslas have LIDAR? Are they using stereoscopic cameras? What's really going on?
What about his announcement that they're abandoning nVidia chipsets?
Re: (Score:2)
Without LIDAR you can't have full self-driving cars.
Most human drivers seem to be doing just fine without LIDAR, only using two eyes at suboptimal positions.
Re: (Score:2)
Most human drivers have image recognition hardware far beyond anything that is currently technically possible.
Re: (Score:3)
Depends. There are certainly plenty of situations where human drivers excel, but in other situations, the machine vision is already better.
But the question wasn't about processing, but about LIDAR vs camera. Replacing the cameras with LIDAR does not remove the need for advanced image recognition. If the LIDAR picks up a bicycle near the edge of the road, you still need to process the data to predict whether it's going to interfere with the car's motion planning, and that's just as hard as using camera ima
Re: (Score:2)
Most human drivers have an image recognition hardware far beyond anything that is currently technically possible.
My two eyes and brain are superior to any computer with two cameras, but I can't do everything that a computer with eight cameras can do. I'm better at some things, and not as good at others. And if it's got LIDAR, then forget it. Its depth perception is dramatically better than mine. Of course, Tesla doesn't...
Hyper-aggressive mode (Score:4, Funny)
New "Hyper-Aggressive" mode executes jaywalkers and cyclists. ALL cyclists, for any reason, even if they are sleeping in their homes.
The only AI system trained on data from SF MUNI bus drivers.
Thanks Elon for shattering my dream (Score:2)
Here I was thinking that the self driving cars of the future would solve highway safety issues and allow me to finally enjoy riding in the car.
Instead we get "aggressive" and "Mad Max" modes. Jeez. Why don't you also install some Boring Company flamethrowers on the front bumper while you're at it? Plus a spiked cow catcher.
...So why this is better than driving yourself? (Score:2)
This whole self-driving car thing seems like a total fail to me.
Re: (Score:2)
This is needed if you want to have the car drive just as well as yourself, because human drivers are also driving "with a slight chance of a fender bender".
If you're not a little bit aggressive, you can't merge in busy traffic.
Re: (Score:2)
Re: (Score:2)
That tech isn’t ready yet to be entrusted with MY life.
But you trust other random human drivers with your life ?
Re: (Score:3)
"... entrusted with MY life."
We do it all the time. Pilots, cab drivers, bus drivers, train engineers, even uber drones.
Also there's the people that just mounted your new tires, or shocks, or ball joints.
You are depending on someone else keeping them up to snuff, but that's just kicking
the trust down the chain one or two steps.
I'm not a huge proponent of self driving, my closest exposure is old-school cruise control.
I suspect I'd really, really enjoy owning a Tesla variant of any type than a 2008 Civic...
No
Navigate on Autopilot modes (Score:2)
"LA traffic mode"
"Mad Max"
So where's Robo-Cop mode? Or tank mode, I don't care what you call it.
"He's in my way. OK Google, solve the problem." "Firing solution found. Ready to engage." NOW you're talking cars. Or cars talking, whichever.
Can I buy a Tesla that has none of this crap? (Score:2)
Self-driving (Score:2)
1) If the cars can't cope with humans being around, they shouldn't be on the roads with humans (whichever way you want to do that).
2) If the cars are literally set to "allow collision", fender-bender or not, then you're into a world of hurt liability-wise. Every time you use that mode, you're basically admitting driving without due care and attention.
3) If you do have an accident with that enabled, instant liability. No questions asked. You selected a mode that made it drive badly, game over.
4) Encouraging such action (even though I suspect it's nonsense that'll never see production) is positively dangerous
Re: (Score:2)
2) If the cars are literally set to "allow collision", fender-bender or not, then you're into a world of hurt liability-wise. Every time you use that mode, you're basically admitting driving without due care and attention.
That's how a normal human driver operates, and it hasn't stopped insurance companies from paying up.
4) Encouraging such action (even though I suspect it's nonsense that'll never see production) is positively dangerous
On the contrary. Ultra safe driving, like Waymo does, is more dangerous, because you'll get the car slamming on its brakes at random times to rule out a tiny chance of a fender bender. This has the effect of causing more serious collisions by other drivers who are not anticipating that kind of behavior.
Re: (Score:2)
If you have contact with a car, you are responsible unless you're basically hit from behind. Changing lanes, merging, driving close and failing to brake in time, or any other manoeuvre can make any "accident" (what I call a "deliberate") automatically your fault, in an insurance claim. Yes, if you're merging and contact a car who's also merging, most insurers will hold you *both* at fault for failing to yield.
Again, 4)... if that depends on 1) then you need to implement 1) first.
Re: (Score:2)
Merging is a good example. It's a delicate balance between giving and taking space. If you engage in ultra safe driving, and are never prepared to take a risk of a fender bender, other people will simply claim the space, and cut you off. The end result is that you're stuck at the end of the merge zone, waiting for rush hour to pass.
A bit more aggressive driving means that you steer the car in a small opening, forcing other cars to slow down to avoid a collision. The trick is that you must be aggressive en
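The gap-acceptance tradeoff described above can be sketched as a toy model. Everything here is invented for illustration (the thresholds, the linear scaling, the "aggressiveness" dial); it has nothing to do with any real Autopilot parameter:

```python
# Toy gap-acceptance model for merging: an "aggressiveness" dial trades
# a small collision risk against the chance of getting stuck at the end
# of the merge zone. All numbers are made up for illustration.

def accepts_gap(gap_m: float, aggressiveness: float) -> bool:
    """Return True if the controller takes this gap.

    aggressiveness in [0, 1]: 0 = ultra-safe (needs a huge gap),
    1 = "Mad Max" (squeezes into almost anything).
    """
    # Required gap shrinks linearly from 40 m (timid) to 8 m (aggressive).
    required = 40.0 - 32.0 * aggressiveness
    return gap_m >= required

# A timid controller rejects a 15 m gap; an aggressive one takes it.
print(accepts_gap(15.0, 0.1))  # False (needs 36.8 m)
print(accepts_gap(15.0, 0.9))  # True (needs only 11.2 m)
```

The point of the sketch: at aggressiveness 0, every realistic rush-hour gap is rejected and the car waits forever, which is exactly the "stuck at the end of the merge zone" failure mode.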
Re: (Score:2)
This is perfectly in line with the behavior of other human drivers, and it's a routine case for insurance companies. None of those companies are interested in doing an expensive investigation into the precise liability, because that would be more expensive than just paying out for a couple of new fenders.
The reason is not "expensive investigation" it's "impossible investigation" as the claim typically devolves into "he said, she said." Add a dash cam, and liability is easier to apportion. Add a literal software setting that says "act aggressively" and that's going to be instant liability, no investigation required.
what the?? (Score:2)
one of the benefits of self-driving cars was to me that you'd have all the assholery removed from traffic as the 'AI' would nicely follow the rules and make the safest & most sensible decisions.
but no, it seems you can select a level of douchebaggery into the driving style, what are they thinking?
the end result will be that everybody will drive in the most aggressive mode, otherwise the other AIs will just bully you off the road.
Re: (Score:2)
one of the benefits of self-driving cars was to me that you'd have all the assholery removed from traffic as the 'AI' would nicely follow the rules and make the safest & most sensible decisions
Yes, that's what a lot of people thought. And then we saw Waymo cars slamming on their brakes for no apparent reason, and getting stuck at the end of the merge zone.
the end result will be that everybody will drive in the most aggressive mode, otherwise the other AIs will just bully you off the road.
No, because that's not what's happening between human drivers. A little bit of aggression is useful, but too much, and you'll get yourself in more accidents than it's worth.
Also, as more and more self driving cars appear on the road, and they interact together, we can use more cooperative algorithms, or vehicle-vehicle signalling.
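One way vehicle-to-vehicle signalling could sidestep the aggression arms race: if both cars broadcast intent and apply the same deterministic rule, neither has to bluff for the gap. This is a hypothetical sketch (the message format and tie-break rule are invented; real V2V stacks like DSRC/C-V2X are far more involved):

```python
# Toy cooperative-merge sketch: each car broadcasts its intent, and a
# shared deterministic rule (earliest arrival wins, car ID breaks ties)
# resolves the merge without either car guessing or bullying.

from dataclasses import dataclass

@dataclass
class MergeIntent:
    car_id: str
    eta_to_merge_s: float  # estimated seconds until reaching the merge point

def resolve_merge(a: MergeIntent, b: MergeIntent) -> str:
    """Both cars run the same rule on the same data, so they agree on
    who goes first with no further negotiation."""
    first = min((a, b), key=lambda m: (m.eta_to_merge_s, m.car_id))
    return first.car_id

print(resolve_merge(MergeIntent("car_a", 2.5), MergeIntent("car_b", 3.1)))  # car_a
```

Because the rule is deterministic and symmetric, cranking up an "aggressiveness" setting buys nothing once both vehicles cooperate.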
Hopefully (Score:2)
this will be the lawsuit that finally muzzles Musk
NOW I'M INTERESTED (Score:3)
a "Mad Max" setting
Hell yes. Now can I ride on top of my Tesla, playing a flame-throwing guitar in front of giant speakers?!
risk (Score:2)
All of life involves risk calculations and tradeoffs.
I can't speak to the calculations/tradeoffs that Musk is specifically addressing here, but I'm not going to get outraged just at the concept.
Just driving the thing (or anything) at all is riskier than leaving it at home.
Translation: dead kids are ok (Score:2)
Because "mild fender benders" can cause major brain and other damage to infants and young children.
Maybe someone should actually hire an ethicist at Tesla.
so, essentially (Score:3)
Jerk driver mode. I deal with people who drive like this all the time. You know what? YOU DON'T GET THERE ANY FASTER.
Re: (Score:2)
Re: (Score:3)
If it was anyone else, your skepticism would be warranted; however, Elon already has two "impossible things" in the bag: popular/high-performance/mass-produced EVs, and re-usable/self-landing/commercially-available rockets.
That doesn't mean FSDs are guaranteed to be a success, but if Musk is willing to bet his reputation by being extremely bullish regarding FSD, then it could be that he knows something that his critics don't know. Again.
If I was betting my own money (which I'm not), I'd bet that full self
Re: (Score:2)
Don't forget the man-rated space vehicle.
Re: I Choose More Aggressive (Score:2)
My vehicle dribbles a little brake fluid. Not a lot. You would need to be following a little too close for it to spray on your hood. Brake fluid will dissolve paint, by the way.
I've been meaning to get the leak fixed, but plausible deniability is convenient.
Re: (Score:2)
In my youth, I drove dump trucks for the city.
One of 'em had a hydraulic fluid leak and would spray lots of fluid off to the right side when
you pulled the bed-raising lever while the bed was locked. It was fun.
Man, that one guy in the convertible was -really- pissed off.
It needs "Miami" mode (Score:3)
In hypothetical "Miami" mode, you can get out of your car while it's at a dead stop in traffic on the Palmetto Expressway and stand in front of the car in the lane you want to move into so it can't move forward & block you. As that car's driver angrily calls you a "pendejo", your Tesla drives in front when the car in front of him finally moves a few feet and opens up a gap, then you get back in your car.
People in L.A. don't appreciate just how good their traffic is compared to the hopeless, tangled clus