People Are Losing Faith In Self-Driving Cars Following Recent Fatal Crashes (mashable.com) 446
oldgraybeard shares a report from Mashable: A new survey (PDF) released Tuesday by the American Automobile Association found that 73 percent of American drivers are scared to ride in an autonomous vehicle. That figure is up 10 percentage points from the end of last year. The millennial demographic has been the most affected, according to the survey of more than 1,000 drivers. From that age group, 64 percent said they're too afraid to ride in an autonomous vehicle, up from 49 percent -- making it the biggest increase of any age group surveyed.
"There are news articles about the trust levels in self-driving cars going down," writes oldgraybeard. "As a technical person, I have always thought the road to driverless cars would be longer than most were talking about. What are your thoughts? As an individual with eye problems, I do like the idea. But technology is not as good as some think."
The Mashable article also references a separate study from market research company Morning Consult "showing increased fear about self-driving vehicles following the deadly March crashes in the Bay Area and Arizona." Another survey from car shopping site CarGurus set to be released Wednesday found that car owners aren't quite ready to trade their conventional vehicles for self-driving ones. "Some 84 percent of the 1,873 U.S. car owners surveyed in April said they were unlikely to own a self-driving car in the next five years," reports Mashable. "79 percent of respondents said they were not excited about the new technology."
Amazing (Score:5, Insightful)
How many crashes happen every day because of humans? Yes, I know it is sad; no one wants bad things to happen. But in the long run this is going to save far more lives than it takes.
Re: (Score:3, Insightful)
Sure, but maybe we should be more careful with deployment than Tesla and Uber have been. See Waymo for example (and I'm sure there are others); I don't know of any fatal incident there. Studies comparing accident rates of driverless and normal cars would also be useful.
Re: Amazing (Score:2, Troll)
I don't know of any fatal incident there.
And if self-driving becomes dominated by the most powerful information clearinghouse on the planet, there's a good chance you never will.
Re:Amazing (Score:5, Insightful)
I would like to know where autonomous cars can drive. Can they handle construction zones? Can they even recognize stop lights? Can they drive down a country dirt road?
The only examples I've seen are of them driving down straight freeways. I've not seen them negotiate any traffic situations at all.
So... before you compare accident rates, those questions need to be answered.
I'm still very skeptical that they can handle anything but the most trivial of driving situations. Perhaps I'm wrong. If anyone knows for sure and can point to maybe a YouTube or something, I'd like to see.
Re:Amazing (Score:5, Insightful)
I'm pretty sure that no commercially available so-called self-driving options can do much more than lane-following, adaptive cruise control (including coming to a complete stop) and possibly obstacle-aware lane changing.
Speaking of Tesla, at least, it can only "self-drive" on roads with lane markings i.e. no dirt roads. There is no recognition of stop lights, stop signs or other indications of actions needed to be taken by the driver. It will update speed limits as you're driving so you're aware of them but that's about it. I'm not including here anything not related to driving e.g. self-parking, summoning etc.
Honestly, anyone who doesn't think of self-driving as a future promise at this time needs to have their head examined. Just looking at the problem space, one would realize that we've only dipped our toes into this pool.
Re: (Score:3, Insightful)
Tesla is partially at fault for this bad press. They call their system "autopilot", which to the general public means "this thing drives all by itself". But that's explicitly not what it does. It drives without human intervention only in specific conditions.
I expect aircraft pilots to know the difference, but not the general public.
Re:Amazing (Score:5, Insightful)
This is a silly meme. There's no evidence whatsoever that people driving Teslas don't know that autopilot is an assistive technology, not one that drives itself without monitoring.
Sure, some people have done stupid things to override the failsafe that checks for hands on the steering wheel. And some have even then got into the passenger seat or even the back of the car.
But they've done this in spite of knowing what the autopilot system does, not because they are ignorant of it.
Re:Amazing (Score:4, Insightful)
There may be some validity in this, but not that much. It's also quite natural for the media to create boom-bust cycles of trust and distrust, fed by the occasional instance of people who delegate too much responsibility to the autopilot and crash.
I don't have any experience with Tesla but it does look like a challenge to allow the car to drive by itself, reducing yourself to the role of supervisor, and then not letting your attention wander off occasionally.
Re:Amazing (Score:5, Insightful)
That has a Wile E. Coyote problem. (Score:4, Interesting)
We can't build AI that relies on special signs or road markings or vehicle-to-vehicle communication. That's a terribly brittle approach, and way too easy to maliciously or accidentally defeat.
We need to build AI that relies on sensing its environment and behaving safely in all situations, including by pulling over and handing control over to a human when it gets confused.
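As a minimal sketch of that "behave safely when confused" idea -- a supervisory loop that falls back to a minimal-risk maneuver and only hands over once a human confirms. Every name and threshold here is a hypothetical illustration, not any real vehicle's logic:

```python
# Sketch of a confidence-gated fallback loop: pull over instead of guessing,
# hand control to a human only once they confirm readiness. All names and
# thresholds are hypothetical.
from enum import Enum, auto

class Mode(Enum):
    AUTONOMOUS = auto()
    PULLING_OVER = auto()
    HUMAN_CONTROL = auto()

CONFIDENCE_FLOOR = 0.95  # below this, the system no longer trusts its world model

def step(mode: Mode, perception_confidence: float, human_ready: bool) -> Mode:
    """One tick of the supervisory loop."""
    if mode is Mode.AUTONOMOUS and perception_confidence < CONFIDENCE_FLOOR:
        return Mode.PULLING_OVER      # degrade safely instead of guessing
    if mode is Mode.PULLING_OVER and human_ready:
        return Mode.HUMAN_CONTROL     # hand over only once the human confirms
    return mode

# Example: confidence collapses (blinded sensor, unmapped construction zone...)
mode = Mode.AUTONOMOUS
for conf, ready in [(0.99, False), (0.80, False), (0.80, True)]:
    mode = step(mode, conf, ready)
    print(conf, ready, mode)
```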
Re: (Score:3)
The police already do that
https://commons.wikimedia.org/... [wikimedia.org]
Re:Amazing (Score:5, Insightful)
Agreed. But even with those incidents my understanding was that the death rate per mile is much lower than with human drivers. Am I wrong on that?
Actually, yes, you are. The number of miles driven autonomously for the same reasons as regular driving is still too low to draw any statistically significant conclusions either way. We need to get at least an order of magnitude more data in before we can even speak of significant tendencies.
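To put rough numbers on why: a back-of-envelope sketch, where the human fatality rate (~1.2 deaths per 100M vehicle miles, roughly the recent US figure) and the fleet mileages are approximate assumptions for illustration, not data from this thread:

```python
# Why a clean record over ~10M autonomous miles proves little: at roughly the
# US human fatality rate (an assumed, approximate figure), zero deaths is the
# *expected* outcome at that scale.
import math

HUMAN_FATALITY_RATE = 1.2 / 100_000_000  # deaths per mile, approximate

def p_zero_deaths(miles: float, rate: float = HUMAN_FATALITY_RATE) -> float:
    """Poisson probability of seeing zero fatalities in `miles` of driving
    even if the true risk equals the human average."""
    return math.exp(-rate * miles)

for miles in (10e6, 100e6, 1e9):
    print(f"{miles/1e6:>6.0f}M miles: P(0 deaths | human-level risk) = "
          f"{p_zero_deaths(miles):.2f}")
# ~10M miles: 0.89 -- zero fatalities is the likely outcome either way.
# It takes hundreds of millions to a billion miles before a clean record
# (or a handful of deaths) starts to distinguish the two hypotheses.
```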
Re: (Score:3)
Agreed. But even with those incidents my understanding was that the death rate per mile is much lower than with human drivers.
Tesla, at least, cherry-picked where autopilot could be enabled, limiting it to the easiest scenarios, and since the driver is meant to give 100% attention, the real question is how often drivers needed to intervene to prevent an accident.
Re:Amazing (Score:4, Informative)
Re: (Score:3)
Humans ... driving over 585,000 miles successfully per crash.
Doesn't pass the smell test. That's more than a lifetime of driving. How many drivers over 30 do you know who never had a fender bender?
Re: (Score:3)
Re:Amazing (Score:4, Insightful)
Re: (Score:3)
At-fault crashes per million miles is the only relevant metric for comparison purposes. The total number of miles driven just gives you the margin of error. You really don't need billions of miles driven to make a valid comparison, IMO.
That said, Tesla's actual self-driving mile count is still zero, to the best of my knowledge. Their current setup is incapable of making a number of critical driving decisions, including lane changes, turns, exits, stopping at traffic lights or stop signs, etc. Comparing
People aren't logical (Score:3, Insightful)
How many crashes happen every day because of humans? Yes, I know it is sad; no one wants bad things to happen. But in the long run this is going to save far more lives than it takes.
Several problems with that argument. 1) You are assuming people are rational when they aren't. 2) People don't care much about the long run. They especially don't care when they are afraid of something (see nuclear power). 3) Your claim that it will save lives is at this point pure conjecture, albeit based on reasonable logic. We don't actually have any proof that self-driving tech does or will save lives. 4) Certain high-profile companies are pushing the technology out there in some arguably irresponsible...
Re: (Score:2)
The real difference is that with automated driving cars, their safety will only get better with new technology and applied lessons learned over a long period of time.
So the young adult on the road today may have only a few hundred hours of driving experience; by the time they get a lot of experience, they are at an age where their reflexes are slower.
A self-driving car is different: for every new one made, the lessons learned from past cars are copied into the software, along with newer technology to let it understand its environment...
Re: (Score:3)
The real difference is that with automated driving cars, their safety will only get better with new technology and applied lessons learned over a long period of time.
How is that a difference? Human driven cars most certainly get safer too - just look at the statistics.
Human drivers have in general gotten better too; in parts of the world through programs like mandatory slippery-road driving and obstacle-avoidance courses, or through it becoming easier to lose a license.
Re: (Score:2)
How many crashes happen every day because of humans? Yes, I know it is sad; no one wants bad things to happen. But in the long run this is going to save far more lives than it takes.
Stupid question. How about "How often do SDCs need intervention?" Humans may be poor drivers, but at least they can go 250k miles without an accident. SDCs need active human participation every 5k miles or so.
The average human driver (including unlicensed, drunk, tired and old) *averages* 250k miles without an accident. Call me when SDCs can go that far without having a human take over.
Re: (Score:2)
So, using your numbers, and assuming that human intervention would require 10 minutes of attention per instance (call it five miles' worth of attention), a human using an SDC would have an accident about once every 250,000,000 miles traveled. Sounds...
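For what it's worth, that back-of-envelope arithmetic spelled out (all figures are the thread's own rough claims, not measured data):

```python
# The parent's estimate: if accidents only happen during the human-driven
# miles, and the human drives at the quoted human accident rate, then...
human_miles_per_accident = 250_000    # claimed human average (this thread)
miles_between_interventions = 5_000   # claimed SDC intervention interval
miles_per_intervention = 5            # ~10 minutes of human attention

# Fraction of all miles the human actually drives:
human_fraction = miles_per_intervention / miles_between_interventions  # 0.001

# Accidents per total mile scale down by that fraction:
combined = human_miles_per_accident / human_fraction
print(f"{combined:,.0f} miles per accident")  # 250,000,000 miles per accident
```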
Re: (Score:3)
How many crashes happen every day because of humans? Yes, I know it is sad; no one wants bad things to happen. But in the long run this is going to save far more lives than it takes.
We humans have this organ on top of our bodies that has evolved to be able to assess risks and rewards, weigh them against each other, and make choices accordingly. We accept small risks all the time. Evolution has had a lot of time to weed out both the excessive risk takers and the risk averse.
The risk of driving is minimal compared to the rewards. Reducing the risk is a good thing only as long as it does not reduce the rewards to a higher degree.
And that, I believe, is a problem with autonomous cars, e
Re: (Score:2)
Exactly. There is a very fundamental issue with SDCs: they need a destination. My wife and I go for drives in the country in my 32-year-old Alfa Spider all the time -- it's fun. It's something to do around here. You can push the car into a curve now and again and get a little thrill, all while appreciating the sprawling countryside and the mountains in the distance.
It's going to take a pretty clever AI to be able to tell the car "just take me for a ride exploring the county roads and make it interesting here..."
Re:Amazing (Score:4, Insightful)
Huh?
a) Has anybody planned a car where the autopilot doesn't have an on/off switch?
b) Drivers don't get to look out of the window at the landscapes, they're too busy driving.
c) Teslas have a Ludicrous mode, which is a big selling point -- they're more of a driver's car than anything else you're used to.
Re: Amazing (Score:4, Informative)
The issue is the varying tech. Camera-only models will have drastically reduced abilities compared to lidar/radar/camera models.
Self-driving cars can't go the cheap route like Tesla autopilot. You need the $125,000 package of equipment.
Re: Amazing (Score:3)
Rough estimates of the Waymo system are $100-125k for the system itself plus $50k for the van.
That is why it is being targeted at taxis first: to get the tech and volume levels up for the mass market.
Camera-only systems are behind every single crash that has been in the news. Tesla, Uber, etc. are camera-only systems.
Re: Amazing (Score:4, Interesting)
Conceptually, intervehicle communication is a terrific idea, and there's a standard of sorts (V2V). The problem is the edge cases, where you think you are communicating with vehicle A, which you think is going to let you make a turn across traffic. But vehicle A is not the car you think it is. Vehicle B -- which you have mistaken for A -- is moving toward you at speed and has no intention of letting you make a turn. Oooops.
Just more FUD (Score:4, Insightful)
Re: (Score:2)
Re:Just more FUD (Score:4, Insightful)
The way to reduce automobile accidents is to rid the road of drunk drivers and texting drivers. When you subtract those two causes, humans are pretty good drivers.
Maybe in 10 or 20 years your dream of self-driving cars will come true. They're just not good enough yet.
Do you go out much? Careless, aggressive, inattentive, and plain bad drivers are really big problems. I have driven in the USA, Europe, Africa, and the Middle East. In the West, we have better roads, and that does encourage poor driving. Or perhaps it is the fact that we have ambulances and nice police officers to pick up the pieces (or whole bodies if needed).
Drunks and other idiots deserve whatever they get. I am more in danger from people who tailgate, overtake on the wrong side, cut in front of people and so on. Get them off the road and we will all be safer.
Re: (Score:2)
Aggressive drivers and generally bad drivers tend to be excessively self-confident bro-types who will never engage a self-driving mode on any car they own anyway, so that group factors out. Additionally, programming an automated vehicle to make the decisions necessary to handle when it encounters an aggressively bad driver requires real AI - which doesn't exist - and takes the designer deep into trolley problem space.
Re: (Score:3)
The single greatest thing about self driving cars will be that the police can order the "bro-types" to use one after a driving offense.
The future of motoring laws won't be "banned from driving for six months", it will be "forced to use autopilot for 2 years".
Re: (Score:3)
I agree with this 100%. People do really stupid shit in cars. They pull out blindly into traffic, they drive 30 mph or more faster than the traffic in the lane next to them, they turn where it is clearly marked they aren't allowed to, they take left turns INTO TRAFFIC just because the other lane is finally clear, etc., etc. I don't even live in a big city and I see this stuff every day -- people don't treat the roads or o...
Re: (Score:2)
Re:Just more FUD (Score:4, Insightful)
The statistics are that there are an insignificant number of self-driving cars on the road.
Even if you include things like Teslas, which are NOT SELF-DRIVING.
Sadly, you won't have the statistics to compare accurately until, say, 5% of people have one of those things. Currently... what? SALES of electric cars are 1% of all new car sales. So there is an insignificant percentage of even those currently on the road.
All Teslas ever sold, every single model of them, add up to about 300,000 cars. Worldwide. There are approximately 1bn vehicles in the world. That's 0.03%.
If you go for "certified self-driving cars in private hands", the figure is so near zero that it's not even recordable. Everything is either a "prototype" from a big corporation or deliberately advertised as NOT a self-driving car.
So... sorry... self-driving cars do not have any statistically significant data from which to draw any conclusion whatsoever. Even Teslas don't.
As an IT guy, I fail to see why a computer would be any better than a human at such a human task. If we were talking about isolated, self-driving-only roads -- no human drivers, roads modified to prevent signage and road-marking confusion, etc. -- sure. We call that a railway, though. It's very different.
We couldn't even make a burger-flipping robot that works around humans. Robots/computers are good for one thing. The same task, over and over again, which needs as little interpretation as possible, and no human interference. Anything else is a mess.
And guess what a self-driving car on an ordinary road is.
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
A statistic wouldn't be all that useful at this point, as we don't have any real consumer self-driving cars on the road, just experimental vehicles with safety drivers, and those vehicles are only driven in conditions they are deemed able to handle. We don't even know how much or how little the safety driver had to intervene. Those cars could be terrible, but you still wouldn't notice, since there is a human at the wheel helping out.
I'd be much more interested in seeing the self-driving software being...
Re: (Score:2)
Show me the statistics, not the emotion-laden stories. I'll bet money that self-driving cars are safer now and will be even safer in the future. I'd love to have one, just can't afford it.
You'll lose that money, because we don't actually have self-driving cars now. We have cars that require human intervention every few thousand miles at best, every ten miles at worst.
Re: (Score:2)
As for why people still distrust self-driving cars, my educated guess is that it's due to the way mass media tends to push the stories that pull in the biggest audience rather than the most relevant or truthful ones. We saw this back in the 80s and 90s, when reporting about violent crime went up significantly while actual crime statistics were showing a downward trend that...
Re: (Score:3)
Re: (Score:2)
I really, really hope that those leading the push for self-driving cars (i.e., actually doing the work, or running the business) are simply using it as a means to advance and test assisted-driving technology, and just not telling us that they don't think full self-driving will happen. If they can get self-driving to work acceptably on the road with no one in the car, IMO that'd be enough to use that tech for assisted driving... it's a great test in that way.
I can see self-driving stuff working...
Re: (Score:3)
What do you mean by "assisted driving technology"? It seems to me that the safest option is where the human is doing the driving but the car is ready in case of emergency (i.e., it applies the brakes if you are going to hit something). The absolute worst option is where the vehicle is driving but the human has to be "at the ready" (autopilot). In that case, you are requiring humans to do the thing they are worst at, which is paying attention through long periods of boredom. So, in your example of truckers: as you s...
Good (Score:5, Insightful)
I'm in favour of developing the technology. And very, very much in favour of not overhyping it to destruction.
Re: (Score:2)
I'd rather have Siri behind the wheel of my neighbors' cars than my neighbors. Sure, it'd just stay in Park because Siri can't drive, but then the roadways would be clear for me. Problem solved.
Its the "planes are dangerous" effect (Score:2)
Correct the first PDF link please (Score:5, Informative)
Re: (Score:3)
Bullshit...not "autonomous...fatal crashes" (Score:2)
The old stories referenced are for a Tesla on "Autopilot" (stupid name) and a pedestrian stepping out into traffic and getting (sadly but unsurprisingly) run down. In both cases the human driver is clearly at fault.
Get back to me when truly "autonomous" cars are (a) on the road and (b) killing more people than sleepy or drunk humans.
Re: (Score:3)
She was even walking a bike across, which is a giant radar reflector. And she was a moving target, so shouldn't have been filtered out as clutter. That accident still just amazes me that Uber's system was so bad as to have missed her. Lidar missed her. Cameras missed her. Radar missed her. Ultrasonics missed her. No braking attempt whatsoever until the impact. I still want to know how that happened. I can at least understand the high profile Tesla accidents, but I just can't understand this one.
Re: (Score:2)
Re: (Score:2)
Been There Crashed in That (Score:2)
I imagine civilians lost a lot of faith in "aeroplanes" after the planes dropped a bunch of bombs on them during WW2. After a bunch of test pilots died because parachutes hadn't been invented yet. After a bunch of barnstormers died pushing the limits of airplane controls. After an endless procession of adverse weather, mundane mechanical failures, and human errors.
The bugs were worked out, pilot training was drastically improved, and it was figured out what was needed for safe flight. And now commercial air travel...
Re: (Score:2)
The right question (Score:4, Interesting)
Re: (Score:2)
Re: (Score:3)
The right question to ask is: would you prefer to ride in a self-driven car, or with a drunken driver? And with a very tired driver?
That's called a false dichotomy, and is certainly not the right question to ask.
Re: (Score:2)
Re:The right question (Score:5, Funny)
The right question to ask is: would you prefer to ride in a self-driven car, or with a drunken driver? And with a very tired driver?
I mean, why stop there?
The right question to ask is: Would you like to ride in a self-driving car on a summer's day on a controlled road with no traffic, or in a death-race-style commute with a drunken, tired Donald Trump at the wheel while he listens to the BBC World Service and pops Prozac every 24 seconds?
If you're going to load a question, bloody well load it properly.
Has been suspect (Score:5, Informative)
I do industrial automation for a living, since about 2000. There's a certain class of automation problem where getting to a 90% solution is easy, getting to 95% takes a lot of work, and getting to 97% is extremely hard. That is, 90% of the parts coming down the assembly line are easy to categorize correctly, the next 5% you can do with a lot of effort, and so on. Unfortunately that last 2 or 3% are damn near impossible due to problems with how good our sensors are, or how good our algorithms are, or how good our mechanical sorting solutions are.
These problems are notorious for causing run-on projects that slurp up money but never end. That's because your initial effort appears to produce amazing results - 90% with almost no effort. How hard can the remaining 10% be? My first encounter with one of these problems was a barcode-reading system at an industrial facility reading barcoded tags with a camera instead of a barcode reader. The problem was that the barcodes were becoming more worn and faded over time, and management believed that if we used a camera instead of a barcode reader we'd be able to enhance the image, etc., and get a good read, because a human looking at the picture can clearly see the bars and the human-readable text below them. This project went on for months, and then years, always creeping closer to 100% but never making that leap, having thrown several different engineers at the problem and brought in outside machine-vision specialists.
In most cases these problems come from over-estimating the capability of your sensors. A sensor with a little dirt on it suddenly gives the wrong result, or temperature fluctuations mess up the calibration, or the dreaded, "sensor seems to be giving valid values, but they're just wrong for no reason." Even if your sensor values are reliable, in many cases you'll end up with a measurement that doesn't fall clearly into the known-A or known-B range.
That's where "AI" is supposed to save us, but my limited experience with AI shows it falls into the same class of engineering problem: you can quickly build an AI that correctly categorizes 90% of your input correctly, and then with effort you can improve it and improve it some more, but you'll never reach that always-correct answer.
This is where engineering projects fail, because you can always find a manager or an optimistic engineer who can hand-wave away the ambiguity and say, "humans aren't perfect either" and "we can just keep making the AI better and better." That's convenient when you don't put a physical number on it. How good can you make the AI with the available sensors? We know the sensors are in some ways better than human perception, but in other ways they're worse. In what quantitative ways are they worse, and how are you compensating for that?
If I were going to tackle some problem like this, I'd start with a standardized sensor suite and data format. You can't have everyone developing AI based on proprietary sensor data because it's too opaque. You also need to standardize the system output format (accelerator percent, braking percent, steering value, etc.) Plus you need to standardize the parameters of the vehicle. Once you've got that you need to start collecting and publishing this data in this standard format - hundreds of thousands or millions of test case scenarios available for every researcher to use, and in each case you need to have an expert specify what the correct set of outputs should be (or correct range at least) for each scenario. Then you can develop your AI or algorithms and you can then run these through a test suite so your AI has to pass all of these scenarios before it can be certified. As we have crashes then we add to the list of scenarios, and if you make changes to the AI, it has to pass that new scenario and still pass all the old ones.
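As a sketch of what that certification loop might look like -- every name, the JSON layout, and the output channels here are hypothetical, just to make the proposal concrete:

```python
# Minimal sketch of the standardized scenario test-suite idea described above.
# Scenario, ControlRange, the JSON layout, and the output channel names are
# hypothetical illustrations, not any vendor's actual format or API.
import json
from dataclasses import dataclass

@dataclass
class ControlRange:
    lo: float
    hi: float
    def contains(self, value: float) -> bool:
        return self.lo <= value <= self.hi

@dataclass
class Scenario:
    name: str
    sensor_frames: list   # standardized sensor recordings (shared format assumed)
    expected: dict        # channel -> ControlRange, specified by a human expert

def load_scenarios(path: str) -> list:
    """Load the shared, published scenario database."""
    with open(path) as f:
        raw = json.load(f)
    return [Scenario(s["name"], s["sensor_frames"],
                     {k: ControlRange(*v) for k, v in s["expected"].items()})
            for s in raw]

def certify(driving_policy, scenarios) -> bool:
    """Pass only if every scenario's outputs fall inside the expert ranges.
    After a crash, the new scenario is appended, and a changed policy must
    pass it *and* still pass every old one (pure regression testing)."""
    failures = []
    for sc in scenarios:
        # The policy maps sensor frames to the standardized output format:
        # {"accel_pct": ..., "brake_pct": ..., "steer": ...}
        out = driving_policy(sc.sensor_frames)
        for channel, allowed in sc.expected.items():
            if not allowed.contains(out[channel]):
                failures.append((sc.name, channel, out[channel]))
    for name, channel, value in failures:
        print(f"FAIL {name}: {channel} = {value}")
    return not failures
```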
I get the sense this is what the companies doing research are trying to do, but how do we validate their product? If their databases are proprietary, and their sensor format and data isn't in a standard format, and we can't run the tests ourselves, then how can we trust their systems? Of course we can't.
Pareto Distribution (Score:2)
In QA circles there's a pretty standard distribution that says the first 80% of something will take 20% of your effort. Finishing the last 20% will take 80% of your effort. It's not true for everything, but it's true for quite a lot of things.
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
The Year of the Self-Driving Car (Score:2)
Let's get this straight: (Score:5, Insightful)
...people are losing faith in an overhyped, not-ready-for-prime-time technology in the development stages for a task that takes a colossal synthesis of perception, reflexes, maturity, and training (none of which we have systems capable of duplicating yet individually) for which the infrastructure (physical, legal, social) hasn't even begun to be developed, much less matured to the point of implementation?
It's almost like repeatedly INSISTING that "it's almost here" is ACTUALLY an insufficient substitute for real time in development?
Hm.
Shorting TSLA much, Slashdot owners? (Score:2)
Two anti-Tesla articles in a row on the front page makes you look like curmudgeons.
People who believe God murders babies on purpose still believe in Jesus.
People kill themselves and others while driving every day. I have no faith in humans.
Re: (Score:2)
Uber Conspiracy (Score:2)
Uber got into the ride-sharing business, which has sort of morphed into the taxi business. Then along came Google with their plans to make a self-driving car. Uber saw its future disappearing, and so got into the self-driving car game. They initially did it to give themselves a future, but quickly realised that self-driving cars are actually really, really hard. They then (secretly) pivoted to ensure that SDCs kill a few people, so that the public trusts Uber's human drivers a bit longer.
"People are losing faith" (Score:3)
If they're the ones dying in the fatal accidents (Score:2)
they're losing more than "faith"
Idiots never compare. (Score:2)
You have to compare things; otherwise I can make anything seem scarily dangerous.
Sharks are a great example. One movie and people are terrified of them. But they are basically the same as elephants - more likely to be killed by humans than to kill a human.
E-cars are horribly dangerous - but they are ALREADY safer than human-driven cars.
I guarantee that if you are a parent, of a teenager, or the spouse of someone that drinks alcohol, a self-driving car looks VERY attractive, even today.
Re: (Score:2)
The system of cars on roads is completely broken (Score:5, Interesting)
We climb into a little metal box and hurtle towards another metal box at a closing speed of 200 km/h. Then, to make it safe, we paint a white line on the road and promise to both stay on one side of it. To make life exciting we then add wildlife, children playing, wet weather, and tired alcoholics who have just broken up with their wives...
The system is absurd; it is mind-blowing that it works as well as it does. But all the band-aids like crumple zones, seatbelts, and AI steering can't avoid the fact that the system we have evolved is inherently dangerous. Nobody would ever deliberately design a system like our roads and cars.
As an illustration, where I live people working on the side of the road must have a substantial crash barrier to protect them from the oncoming traffic and provide a safe working environment. That same worker can then get on a motorbike and ride home, protected only by a painted line, and nobody thinks anything of it.
Re:The choice is still clear. Self driving (Score:5, Insightful)
We can safely ignore this headline.
In surveys people think Facebook is evil but I don't see many of them cancelling their accounts.
Surveys also prove that people want more leafy green salads at McDonald's, but nobody ever eats them if they appear.
Nope. The day people figure out they can use Facebook all the way to McDonald's and back will be a good day to look for a second-hand car.
Re:The choice is still clear. Self driving (Score:5, Informative)
Are these reporters pointing out that 17 gasoline cars burst into flames every hour in the USA? That non-Tesla cars are responsible for 6% of all fire-related deaths?
Nope? That's what I imagined.
https://www.nfpa.org/Public-Ed... [nfpa.org]
Re: (Score:3)
Get back to us when there are a statistically significant number of Tesla cars on the roads, you shill. Also, the statistics you cited include Tesla fires and fatalities.
Re:The choice is still clear. Self driving (Score:5, Insightful)
Get back to us when there are a statistically significant number of Tesla cars on the roads, you shill.
Huh?
The whole point is that there aren't a statistically significant number of Tesla cars on the roads but they're making all the headlines.
2000 gasoline cars explode? Nothing to see here.
A story involving a Tesla? Front page news!
Re: (Score:2)
I get it, you don't want to talk about the times that Tesla skirts or skips safety rules that every auto manufacturer adopts as best practice, and Tesla's choices lead to fatalities. You'd rather talk about incidents that are mostly minor and don't involve injuries to any person.
Re: (Score:2)
I've said it before, the main reason I still visit /. is to laugh at the neo-luddites leaving themselves behind.
Re: (Score:2)
It's not Luddite to ask that cars stop pretending they have level 4 or 5 autonomy when they don't.
Re: (Score:2, Interesting)
You're forgetting that self-driving companies are also pushing sensationalist headlines all the time, saying self-driving cars *must* be done so we can reduce fatalities. But in the process, all I see is that we are producing exceptionally bad, inattentive drivers (even more so than people simply using their phones while driving), making for more hazardous conditions. I've seen really bad, inattentive drivers in the last couple of years, and I can't help thinking it is the coming generation that has been addicted to "smar...
Re: (Score:2)
Read the page he linked. It says "automobile fires", without a qualifier for fuel type. The Tesla shill added that himself, hoping that people won't remember all the fires that have happened in Tesla cars.
Re: (Score:3)
I think more in-depth analysis needs to be done. The numbers for gasoline-powered vehicles are for all cars of any age, but most Teslas on the road are only a few years old. What do the numbers look like for gas-powered cars in the same age range? What about when compared only to cars in the same age and price range? What portion of the car fires were set intentionally, vs. the result of bad maintenance, vs. the result of an accident, vs. just spontaneous?
Re:The choice is still clear. Self driving (Score:4, Insightful)
How about this -- 65% of all fires in waste facilities are from lithium batteries.
Those places where they pass lithium batteries through big grinding machines followed by even bigger trash compactors?
Yes, that statistic might be true.
Re: (Score:2)
Re: The choice is still clear. Self driving (Score:5, Informative)
Re: The choice is still clear. Self driving (Score:5, Interesting)
Yes, because as we all know, airplane autopilots are totally designed to replace a pilot, and that's why we don't have pilots anymore.
Meanwhile, it's not Tesla that's calling its cars "self-driving" [evobsession.com].
Re: (Score:2)
And a good number of those are likely suicides. More than we can determine for certain.
Going out in an "accident" is a fairly common way to make a quick escape, and one that has a chance of letting those left behind at least get the insurance money.
With autonomous cars, I predict other death causes will rise, like "accidental" drowning (another big one).
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)