Transportation AI Businesses

Are Tesla Crashes Balanced Out By The Lives That They Save? (eetimes.com) 198

Friday EE Times shared the story of a Tesla crash that occurred during a test drive. "The salesperson suggested that my friend not brake, letting the system do the work. It didn't..." One Oregon news site even argues autopiloted Teslas may actually have a higher crash rate.

But there have also been stories about Teslas that have saved lives -- like the grateful driver whose Model S slammed on the brakes to prevent a collision with a pedestrian, and another man whose Tesla drove him 20 miles to a hospital after he'd suddenly experienced a pulmonary embolism. (Slate wrote a story about the incident titled "Code is My Co-Pilot".) Now an anonymous Slashdot reader asks: How many successes has the autopilot had in saving lives and reducing damage to property? What is the ratio of these successes to the very public failures?
I'd be curious to hear what Slashdot readers think. If you add it all up, are self-driving cars keeping us safer -- or just making us drive more recklessly?
This discussion has been archived. No new comments can be posted.

  • by Anonymous Coward on Sunday November 13, 2016 @06:54PM (#53277699)

    I'd be curious to hear what Slashdot readers think. If you add it all up, are self-driving cars keeping us safer -- or just making us drive more recklessly?

    Who cares what Slashdot readers think? This isn't something where opinions or anecdotes matter. Do (or read) a study, collect data. Then you'll have an answer.

    • like the grateful driver whose Model S slammed on the brakes to prevent a collision with a pedestrian

      There are other cars equipped with lane-keeping technology and automatic emergency braking. However, the makers of those cars don't pretend that they are completely autonomous.

      Some day we will get there, sure, but the "Auto-Pilot" technology in Tesla is no more advanced than what's available in other manufacturers' products.

      • by haruchai ( 17472 ) on Sunday November 13, 2016 @07:37PM (#53277821)

        like the grateful driver whose Model S slammed on the brakes to prevent a collision with a pedestrian

        There are other cars equipped with lane-keeping technology and automatic emergency braking. However, the makers of those cars don't pretend that they are completely autonomous.

        Some day we will get there, sure, but the "Auto-Pilot" technology in Tesla is no more advanced than what's available in other manufacturers' products.

        It is. It routinely performs better than the competition in testing by car reviewers. But it isn't good enough - yet - and I don't know when it'll be.
        My personal opinion is that if I'm buying a $100k performance car, I'M DRIVING, not some autistic software robot.
        But I know one day I'll be too old to care & technology will be good enough - but not today & not soon.
        Elon clearly disagrees, but I worry that the software quality & testing isn't rigorous enough and legislators may crack down, which may be a good thing.

        And George Hotz is dangerously smart - and recklessly stupid.

        • legislators may crack down, which may be a good thing.

          It isn't. By being foolishly aggressive in his sales tactics, Elon is putting the entire industry in jeopardy, potentially slowing down the development of this technology.

          • by haruchai ( 17472 )

            "potentially slowing down the development of this technology"

            Not necessarily. Development will proceed and when it's good enough, I'm sure the insurance lobby will put plenty of pressure on the legislature.
            But the tools & tech needed to improve will proceed: mapping, neural nets, cheaper & better sensors, CPUs/GPUs, etc.

        • by AmiMoJo ( 196126 ) on Monday November 14, 2016 @05:48AM (#53279685) Homepage Journal

          The problem with Auto Pilot is that it ignores human nature. It makes two classic engineering mistakes:

          1. Assume the user is paying attention
          2. Assume the user is the ultimate failsafe device

          If either of those assumptions held we wouldn't have the problems we do with malware or social engineering scams or any number of other things. Yet they are assumptions that engineers keep making, because that last 1%, the corner cases that the machine can't handle, are really hard to deal with programmatically, and really easy for an alert and informed human.

          • It appears that if you want people to do something useful in an unusual situation, then they need to practice. People get things really wrong in unusual situations. Automated systems screw up in unusual situations too. Air crash investigations are littered with examples of automation "helping" pilots and causing disasters. There are examples of pilots relying on automation, which didn't work and caused a disaster. There are examples of pilots getting confused, ignoring training and automation, and crashing

            • The Autopilot won't handle all the simple and elementary situations. If my adaptive cruise control screws up and tries to drive up the tailpipe of the car in front, I hit the brakes. If there's a sudden obstacle, I hit the brakes even if the collision avoidance system has done so faster than I could. So far, it has worked splendidly (well, the adaptive cruise control, haven't really tested the collision avoidance system), but I was not told to rely on it, and the user manual is very emphatic that I shou

              • by TheRaven64 ( 641858 ) on Monday November 14, 2016 @01:12PM (#53282183) Journal
                Adaptive cruise control doesn't do much to take your attention away. You're still focussing on keeping the car in the lane, for example, so you'll notice if the car in front does something dangerous. When you add in lane following, the car is basically driving itself. If your attention wanders, nothing bad happens. Most of the time. Until you're in an unusual situation, and then it's suddenly very bad because you now have the added delay of having to switch your attention back to driving, which adds at least another half second to your response time. At 70mph, that's another 16 metres before you start to react.
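
                A quick sanity check on that 16-metre figure, as a minimal Python sketch (the 70 mph speed and the extra half-second of attention-switching delay are the numbers from the comment above; everything else is unit conversion):

                    # Distance covered during the extra attention-switching delay.
                    MPH_TO_MS = 1609.344 / 3600   # metres per second per mile per hour

                    speed_ms = 70.0 * MPH_TO_MS   # ~31.3 m/s
                    extra_delay_s = 0.5           # time to switch attention back to driving

                    print(f"{speed_ms * extra_delay_s:.1f} m")  # ~15.6 m, roughly 16 metres
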
    • Comment removed based on user account deletion
    • I'd be curious to hear what Slashdot readers think. If you add it all up, are self-driving cars keeping us safer -- or just making us drive more recklessly?

      Who cares what Slashdot readers think? This isn't something where opinions or anecdotes matter. Do (or read) a study, collect data. Then you'll have an answer.

      It's difficult, because many people won't know about an averted crash, and, even if they do, it's far less likely to be considered newsworthy.

      As I've said before, this is the fundamental problem in the "gun" debate. You see in the news when someone is killed, but you don't see the million plus times every year that someone stops a crime using a gun. The "anti" side takes advantage of the imbalance in news coverage to claim the defensive uses are exceedingly rare.

      The bottom line is that we need to look at

  • Demand something that's a plus on both sides. Anything else is defective.

    • by Calydor ( 739835 )

      So you want a utopia where nothing bad ever happens?

      Sure, I want that too - but I'm also a realist.

      • They can get a zero on both sides by doing nothing. If they can't beat that they shouldn't be in the game.

        • by silentcoder ( 1241496 ) on Monday November 14, 2016 @04:37AM (#53279507)

          So what you are saying is that the only morally acceptable kind of car company is one that doesn't make cars.

          Well I'm sure the very, very, very far extreme fringes of the environmental movement will agree with you - that is, if you go live with them in their hippie communes in the woods - but the rest of the world would probably collapse if we tried that. Better to try to build greener and progressively safer cars. There are times when you can and should demand perfection - but this is one of those cases where perfection will never exist, so you can and should demand improvement, which is exactly what Tesla is doing.
          Bad things happening sometimes does not mean it isn't improvement. It just means it's not perfection.

        • Comment removed based on user account deletion
  • This is something that can be easily figured out with statistics. Accident rates and serious injuries per mile driven on autopilot vs. by human drivers. Unfortunately, most people make decisions with gut feelings, not detailed statistical analysis, and politicians are eager to take advantage and score easy points with the mindless masses.
    • But we have no data on miles driven with self-driving cars interacting with each other, which is the real end-game environment of self-driving cars. There are likely huge new failure domains, such as deadly embraces with real death, race conditions on rainy roads, etc.

      So there is no easy statistical answer based on miles driven.

    • Re:Show me the data (Score:5, Interesting)

      by SolemnLord ( 775377 ) on Sunday November 13, 2016 @07:39PM (#53277827)

      An article over on Forbes already looked into this. [forbes.com]

      The TL;DR version is that Tesla's autopilot has 1 fatality per 130M miles driven, while the US average of all vehicle-related fatalities comes out to about one per 94M miles. But that one-per-94M figure covers all roads and all conditions, while Tesla's autopilot is used almost exclusively on highways.
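
      To put those two figures on the same scale, here is a minimal Python sketch (the 130M and 94M numbers are the ones cited above; the only step is normalizing both to fatalities per 100M miles):

          # Normalize both cited figures to fatalities per 100 million miles.
          tesla_rate = 1 / 130e6 * 100e6   # ~0.77 per 100M Autopilot miles
          us_rate    = 1 / 94e6  * 100e6   # ~1.06 per 100M miles, all driving

          print(f"Autopilot:  {tesla_rate:.2f} per 100M miles")
          print(f"US average: {us_rate:.2f} per 100M miles")
          # Caveat from the thread: Autopilot miles are mostly highway miles,
          # which are already safer per mile than the all-roads average.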

      • by Kjella ( 173770 )

        The TL;DR version is that Tesla's autopilot has 1 fatality per 130M miles driven

        Actually the important fact is that they have only 130M miles total; one ugly head-on collision and the numbers would be completely different. The national average is statistically solid across 32k deaths, but extrapolating from one death is folly.
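
        That point can be made quantitative. An exact Poisson confidence interval on a rate estimated from a single event is enormously wide; here is a sketch using SciPy's chi-squared quantiles (the standard exact interval for a Poisson count -- the 130M-mile figure is from the comment above):

            from scipy.stats import chi2

            k = 1          # observed fatalities
            miles = 130e6  # Autopilot miles driven

            # Exact (Garwood) 95% confidence interval on the Poisson count.
            lo = chi2.ppf(0.025, 2 * k) / 2        # ~0.025 events
            hi = chi2.ppf(0.975, 2 * k + 2) / 2    # ~5.57 events

            scale = 100e6 / miles
            print(f"95% CI: {lo * scale:.2f} to {hi * scale:.2f} deaths per 100M miles")
            # ~0.02 to ~4.3 per 100M miles -- easily wide enough to contain the
            # ~1.06 US average, so one fatality tells us almost nothing either way.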

        • by MrL0G1C ( 867445 )

          Spot on. The autopilot only works on highways - and not all highways all the time, AFAIK - and there simply isn't enough data to go on.

      • Re: (Score:2, Interesting)

        by Anonymous Coward

        It is so much more complicated than that. Here are two perhaps more reasoned points to consider...

        First, fatalities aren't all that matter - injuries and property damage should also be considered. It would be awesome if someone could come up with a cost-of-accidents-per-million-miles rating that puts a price on the lives and adds it all in to get a single number (a sketch of the idea follows below).

        Second, it needs to be considered systemically and with a long-term point of view. A transition that costs a few more lives in the short term but bring
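
        A single-number rating like the one wished for above is easy to sketch in Python. Every input below is an illustrative placeholder rather than real data; the $9.6M value of a statistical life is roughly the figure US DOT guidance used around 2016, and the structure (monetized fatalities, injuries, and property damage summed per million miles) is the point:

            # Hypothetical composite "cost per million miles" metric.
            # All inputs are illustrative placeholders, not real data.
            VALUE_OF_LIFE   = 9.6e6   # ~US DOT value of a statistical life, circa 2016
            COST_PER_INJURY = 200e3   # placeholder average cost of an injury
            COST_PER_CRASH  = 10e3    # placeholder property-damage-only cost

            def cost_per_million_miles(fatalities, injuries, crashes, miles):
                """Monetized losses per million miles driven."""
                total = (fatalities * VALUE_OF_LIFE
                         + injuries * COST_PER_INJURY
                         + crashes * COST_PER_CRASH)
                return total / (miles / 1e6)

            # Made-up counts for two hypothetical fleets, just to show the comparison:
            print(cost_per_million_miles(1, 80, 500, miles=130e6))
            print(cost_per_million_miles(340, 30_000, 180_000, miles=32e9))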

        • We should also consider other advantages of rushing to the eventual arrival of fully autonomous vehicles.

          Oh, absolutely. I'm not inherently against self-driving vehicles: the benefits are massive and far-reaching. There are immense economic pitfalls that need to be navigated, but in the long run it's a net gain. As a rough snapshot of where we stand today, however, it's fair to compare Tesla's fatality rate with that of traditional vehicles (even if it is more nuanced than that).

          We all too often miss the advantage of paying a price up front.

          We do, but we cannot forget that sometimes the price is people.

        • The law already puts a price on lives. The EPA has a standard calculation, based on the most comprehensive research ever done into the economic impact of a lost life, which is used as the baseline for environmental regulations and fines. It happens to be the best answer there is, and likely the best there can be.

          The current figure is about 7 million dollars (it's wrong to just count what a person is likely to earn in his lifetime, you have to count the impact on the family, lost time for funerals, reduced inco

      • That actually isn't all that impressive, as they are comparing it to the average. Let's do a comparison of cars in the same price range. And by price range I mean pre-subsidy.
      • For that matter, Tesla and its autopilot are not really valid test cases for "Are self-driving cars safer?". Misconceptions by idiots aside, Tesla's autopilot is functionally equivalent to every other autopilot in existence: a tool to reduce the workload for the pilot, but not something that allows for fully autonomous self-operation of the vehicle.

        A better place to look for statistics is with Google's experiments with self-driving cars, which are designed to operate entirely without human control.

      • But remember, Teslas are being driven on roads next to non-autopilot vehicles. You know, us idiots. So that likely brings the rate up. The true question is how much better the autopilot safety record would be if ALL cars were on autopilot.
    • by vux984 ( 928602 )

      This is something that can be easily figured out with statistics.

      It's not easily figured out without data. And we have no data.

      Accident rates and serious injuries per mile driven on autopilot vs. by human drivers.

      On the one hand, human drivers, with a couple hundred million people driving billions of miles in the USA, rack up on average 1.08 deaths per 100 million miles. All miles. All weather. All types of driving. Includes drunks. Includes iced-up roads. Includes stolen cars. Includes accidents caused by poorly maintained vehicles losing a wheel. It even includes suicides.

      On the other hand, you have autonomous vehicles that haven't racked up remotely enou
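
      A rough power calculation puts a number on "not remotely enough". This sketch assumes Poisson fatality counts and a one-sided normal approximation at the 95% level; the 1.08-per-100M-miles baseline is from the comment above, and the assumed 2x safety improvement is a made-up target:

          # Miles of autonomous driving needed before a true improvement over
          # the human fatality rate becomes statistically detectable.
          r_human = 1.08 / 100e6    # deaths per mile, human baseline (from above)
          r_auto = r_human * 0.5    # suppose autonomous driving is truly 2x safer
          z = 1.645                 # one-sided 95% critical value

          miles = z**2 * r_human / (r_human - r_auto) ** 2
          print(f"~{miles / 1e9:.1f} billion miles needed")   # ~1.0 billion

      Even under the generous assumption of a full 2x improvement, roughly a billion autonomous miles are needed before the signal rises above the noise -- several times what Tesla had logged at the time.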

      • by mysidia ( 191772 )

        as for drunks and suicides and stolen vehicles ... clearly that doesn't represent human driving ability

        Eliminating suicides and stolen vehicles is fair, but eliminating drivers under the influence is not.
        It is common for human drivers to be drunk, exhausted, or under the influence of a substance; that's a common contributor to accidents, and one of the variables that autonomous cars should eliminate.

        • by vux984 ( 928602 )

          First, I mostly agree with you.

          However, the issue I have with drunks in the stats is essentially that the addition of autonomous cars into the system makes the human statistics better, as the 'least able humans' would avail themselves of the autonomous option to get home.

          This really counts as a point FOR autonomous cars; but it needs to be noted because it suggests the autonomous systems in some sense don't need to be better than the average human drivers... they just need to be better than the worst human

    • This is something that can be easily figured out with statistics. Accident rates and serious injuries per mile driven on autopilot vs. by human drivers. Unfortunately, most people make decisions with gut feelings, not detailed statistical analysis, and politicians are eager to take advantage and score easy points with the mindless masses.

      And there's a logical reason for that. If you take the life expectancy of the average human (70 years), would you opt to live for that many years plus one (71), or take your chances on making it to the average in your local area (mine is 82)?
      Surely 71 would be an improvement overall? Unless you are 72 or older, that is...

  • Without being subsidized, would Tesla be a viable company? On a complete life-cycle basis, does driving a Tesla actually put less CO2 into the atmosphere?

    • by 0100010001010011 ( 652467 ) on Sunday November 13, 2016 @07:32PM (#53277797)

      Without being subsidized would GM or Chrysler have been viable companies?

      • Without being subsidized would GM or Chrysler have been viable companies?

        Yes, they would have declared bankruptcy, restructured, and continued on... which is actually what they did in the end anyway.

  • Nothing. We know nothing about how self-driving cars interact with each other. And even less about how millions of self-driving cars interact with each other. And even less than that about a mix of millions of self-driving and human-driving cars. We know nothing. So predictions about long term safety balance are meaningless. Wait until we know something; until then do nothing.
    • Add to that the fact that no one knows how enough people are supposed to afford these things to make a statistical difference. Past developments in the automobile have not tended to trickle down to the masses unless forced by law, which is not a good sign that the average person will ever obtain automation.
  • Secretiveness (Score:3, Interesting)

    by speedplane ( 552872 ) on Sunday November 13, 2016 @07:28PM (#53277783) Homepage

    The biggest problem with Tesla is not their technology, but their communication. They call their system "Autopilot", but backpedal on that claim in their legalese and fine print. They say their car is safer, but only acknowledge accidents after investigative reporters uncover them (the attached article is a perfect example). Further, they always shift as much blame onto the driver as possible, while giving as few details about the crash as possible. This is poor communication. Tesla should be transparent about how well they're doing. It's to their benefit, as many people (myself included) would be more open to trying out potentially unsafe technology if the risks were clearly explained and could be mitigated.

    Back to the original point of the question, "Are Tesla Crashes Balanced Out By The Lives That They Save", Tesla could easily answer that question if they wanted to, they have the data. Unfortunately, due to their secretiveness and poor communication, they only share self-serving pieces of what they know, and no one gets the full picture. Based on the way Tesla has conducted itself, we have to assume that these vehicles are unsafe unless proven otherwise.

    • by AK Marc ( 707885 )
      Autopilot in an aircraft requires a pilot in the pilot's seat. So how is "autopilot" incorrect? Because it's a term used in sci-fi to refer to (true) AI-driven spacecraft? That seems like an issue with the users. I never assumed "autopilot" meant "AI-driven car".
  • On the one hand, for many Slashdotters - Elon Musk can do no wrong.

    On the other hand, this is Slashdot - and Betteridge dictates that the answer to the question has to be no.

  • by macsimcon ( 682390 ) on Sunday November 13, 2016 @07:47PM (#53277869)

    A few score may die now to save hundreds of thousands later. Have you driven on the freeways of America lately? People drift into your lane, they don't stay centered in their own lane. Drivers are looking at their phones while they're driving, no matter what the laws say. People are NOT as qualified to drive a car as a computer which checks its sensors hundreds of times per second.

    Autonomous cars are the future, and Tesla is pushing that forward. There are going to be mistakes in the beginning, and people will die and be injured.

    Driving is dangerous, but we don't outlaw cars because their utility outweighs the risk. Same here.

    • Driving is dangerous,

      Is it? I drive a fair bit, and sure, it's more risky than lying on your couch, but not by much. I think the word 'danger' gets overused these days, considering how safe just about everything is relative to even 50 years ago.

      • by AthanasiusKircher ( 1333179 ) on Monday November 14, 2016 @01:45AM (#53279073)

        Driving is dangerous,

        Is it? I drive a fair bit, and sure, it's more risky than lying on your couch, but not by much. I think the word 'danger' gets overused these days, considering how safe just about everything is relative to even 50 years ago.

        Umm, yes, driving IS dangerous -- it's basically one of the most dangerous things people do. "Unintentional injury" is the leading cause of death in people age 1-44 (and the third highest after cancer and heart disease in people aged 45-64), according to CDC stats [cdc.gov].

        And of those causes classified as "unintentional injury" again according to the CDC [cdc.gov], motor vehicle accidents are either the LEADING or second-highest cause of death for all of those age groups.

        Bottom line -- being involved with cars (either as driver, passenger, or as a pedestrian around cars) is basically the MOST dangerous single activity people deliberately choose to do on a regular basis.

        • Bottom line -- being involved with cars (either as driver, passenger, or as a pedestrian around cars) is basically the MOST dangerous single activity people deliberately choose to do on a regular basis.

          Most dangerous is not the same as dangerous. Out of a pillow and a kitten, one of them is more dangerous, but that doesn't make either of them particularly dangerous.
          The reason car accidents are leading is because we've made everything else so safe. And even with cars, if you have a modern, well-maintained vehicle with ABS, airbags, and crumple zones, wear seatbelts, and don't drive drunk, speed, use your phone while driving, or drive fatigued, you have next to no chance of being killed in a car accident (seriously look it up, you'll be surprised how many accidents could be avoided with these simple measures).

          • The reason car accidents are leading is because we've made everything else so safe.

            True. But just because we've gotten so much safer in most things, does that mean we should stop and not worry about making things even better?

            And even with cars, if you have a modern, well-maintained vehicle with ABS, airbags, and crumple zones, wear seatbelts, and don't drive drunk, speed, use your phone while driving, or drive fatigued, you have next to no chance of being killed in a car accident (seriously look it up, you'll be surprised how many accidents could be avoided with these simple measures).

            I'm well aware of such stats. One thing you should note, however, is that some of your things have to do with driving a well-maintained modern vehicle, and other things have to do with personal behavior choice (drunk driving, texting, driving fatigued, etc.) While the former mitigates your risk in all cases, you have less control over the latter in other drivers (or

          • One last thing to note about life expectancy -- it's important to note that medical science 100 years ago was nowhere near as advanced as today. Thus, injuries were much more likely to result in death.

            For every car-related fatality, there are nearly 100 injuries, and roughly 10 times as many serious injuries/hospital stays. Many of the latter result in serious disabilities or permanent health issues.

            A century ago, most of those hospital stay cases would likely have resulted in death. Just because med

      • Driving is dangerous,

        Is it? I drive a fair bit, and sure, it's more risky than lying on your couch, but not by much. I think the word 'danger' gets overused these days, considering how safe just about everything is relative to even 50 years ago.

        You're sitting in a ton and a half of steel, glass, and liquid explosive, as is everyone around you, and you're all navigating a concrete obstacle course with varying degrees of concentration at speeds you're not evolved to cope with.

        I guess 50 years of cleverly abstracting this reality away with safety features has worked.

    • First let me be clear that I completely agree that "self-driving" features on cars so far likely save a LOT more people than they injure. That said...

      People are NOT as qualified to drive a car as a computer which checks its sensors hundreds of times per second.

      I'd just change this slightly to people on average are not as qualified. If you look at stats, certain demographic groups and personality traits make up a disproportionate number of accidents. (For example, males in their late teens are something like seven times as likely as females in their late teens to drive drunk.)

      Back when Google first started tou

  • Comment removed based on user account deletion
  • Obsession (Score:4, Insightful)

    by shaitand ( 626655 ) on Sunday November 13, 2016 @08:14PM (#53277967) Journal
    You don't see a news story every time a Mazda is crashed during a test drive. Stop giving clicks to this drivel.
    • Mazda isn't trying to make the case that they should be given special treatment because their cars kill fewer people.
    • by AmiMoJo ( 196126 )

      Because Mazda didn't call its driver-assistance tech "autopilot" and isn't promising fully autonomous driving via future software updates, when the current version is apparently a lot less wonderful than they make out.

      Tesla's initial pitch was hands-off cruising. Demo videos showed drivers not touching the wheel at all. Trying to backtrack now doesn't excuse that.

  • But there have also been stories about Teslas that have saved lives -- like the grateful driver whose Model S slammed on the brakes to prevent a collision with a pedestrian, and another man whose Tesla drove him 20 miles to a hospital after he'd suddenly experienced a pulmonary embolism. (Slate wrote a story about the incident titled "Code is My Co-Pilot".)

    That's one incident of a dangerous situation where the Tesla acted appropriately, and another where a user in a medical crisis chose a course of action based on the Tesla's abilities.

    In neither case do we know what would have happened without the autopilot.

    Even some of the accidents have the same ambiguity. I'm personally a skeptic that the Tesla is saving lives, but it's a fundamentally difficult thing to measure.

  • I find it curious that AI-guided vehicles appear to be held to a standard -- perfection -- that is never expected of humans. This is especially important given that driving is the kind of task that humans do especially poorly: it requires extended attention to something that is, for the most part, repetitive and boring. Given those kinds of tasks, humans easily lose focus, where computers do not.

    Given that the US has averaged 35,480 deaths on the road over the last 10 years (https://en.wikipedia.org/wiki/List_of_

    • When I hop on a bus I expect to get to my destination without being in a terrible accident. This is no different from the expectations that people have for autopilot.
    • I think, psychologically speaking, that people are more scared of things that feel out of their control. As such, even though flying is statistically far safer than driving, as a passenger you have no control over your safety. I think it's the same thing when trusting a computer with driving your car. It *feels* like you have more control when you're driving manually, and so you feel safer. Of course, the safety is something of an illusion, since so much of your safety relies on other people driving safel

      • It doesn't just feel like you have more control, you do have more control. If you truly fear accidents, in a manual car you can choose to drive 20 mph everywhere you want. Everyone who drives a manual car is actively balancing convenience with safety, so when you get in an accident it is attributable to the choices you made while driving. In an automated car you have no choice, so it is somewhat akin to getting into a bus. When you give control to someone else you make a risk estimate based on what you
        • If you truly fear accidents, in a manual car you can choose to drive 20 mph everywhere you want.

          No, you actually can't. It's illegal to impede the normal flow of traffic.

          Besides which, being human, everyone is bound to make stupid mistakes when they're momentarily distracted. At some point, we'll invent cars that are less likely to make dangerous mistakes, simply because they can see in every direction at once, and never, ever get distracted, tired, or drunk, or decide to check their smartphone or put on makeup or eat breakfast. And the point about it being an illusion is because a good portion of your

            • Not only does the technology have to be able to perform, it's not going to make a lick of difference if 90% of the population can't afford it. I don't personally think seeing in all directions and making split-second decisions gets us very far towards being a good driver overall.
            • Powerful electronics - super-computers of a few decades ago - are practically dirt cheap these days, and software costs are amortized over time and across many products. I don't see any reason why self-driving cars won't be affordable by the masses in the future. Sure, the first few iterations will be on high-end luxury vehicles. Fine with me. Let the 1%ers beta test the technology, and by the time it trickles down to the masses, it'll be commodity hardware that only adds a relatively modest amount to t

  • by rbrander ( 73222 ) on Sunday November 13, 2016 @08:35PM (#53278033) Homepage

    We spend far more attempting to avert air crashes than car crashes. The regulators of both forms of transportation have struggled with why they are pushed by the political forces above them to show such different levels of concern for the same lives. People doing polls and focus groups, and professors doing anthropological studies, have formed the opinion that it relates to control.

    We chafe at having our autonomy restricted in cars - speed limits, four-way stops, seat belt laws, helmet laws, all unpopular, though such restrictions seem small prices to pay for your life. The cost of a highway interchange, at $50 million plus a million or more a year to maintain and replace, can be controversial even though it would save a life per year in perpetuity - a couple or three million per life (a rough check of that figure follows below). We feel a death is a lot less anybody else's fault if we were in control of the vehicle at the time. A death in air traffic, on the other hand, is just being tossed into the ground at 600 MPH by somebody else who screwed up. We really hate that a lot more.

    So, yeah, autopilots are always going to have to do twice as well to be half as appreciated. It's a glitch in human nature. Sorry.
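
    The "couple or three million per life" figure checks out under simple assumptions. In this sketch the $50M capital cost, the $1M/yr upkeep, and the one-life-per-year savings are the comment's own numbers; the 50-year service life is an added assumption:

        # Cost per life saved for the interchange example above.
        capital = 50e6          # one-time construction cost (from the comment)
        upkeep = 1e6            # annual maintenance ("a million or more a year")
        lives_per_year = 1      # "a life per year in perpetuity"
        service_years = 50      # assumed service life of the interchange

        annualized_cost = capital / service_years + upkeep
        print(f"${annualized_cost / lives_per_year / 1e6:.1f}M per life saved")  # ~$2.0M

    A shorter 25-year service life pushes the figure to about $3M, which is where the "couple or three million" range lands.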

  • by jfdavis668 ( 1414919 ) on Sunday November 13, 2016 @08:44PM (#53278069)
    It would be hard to determine what it prevented when no one is reporting it. Some of the drivers may not have even been aware of the incident the car avoided.
    • It would be hard to determine what it prevented when no one is reporting it.

      Tesla certainly knows. The fact that they're not sharing it does not bode well.

    • Yep, a good outcome isn't headline news. I use this very point whenever a conversation steers towards someone mentioning that they don't want to fly because it's "unsafe" based on crash reports in the news. I ask people if they still drive, despite hearing about 10+ accidents every morning and another 10+ in the evening on the radio traffic reports, and they say sure. Then I ask how often a plane crash makes the news. Certainly not every day, right? Then, if possible, I show them this video:

      https://www.you

    • by AmiMoJo ( 196126 )

      That's a failure.

      The system requires that the driver is always aware. It beeps and eventually comes to a stop if they are not. If the driver doesn't notice things that might have caused an accident but were avoided, then the system has failed to keep them alert and ready to take over in the event that it can't handle those things itself.

      So what you are saying is we have an unknown number of very difficult to detect failures that Tesla gets a free pass on.

  • The lawyers will answer this question. Will automated driving be able to survive the slew of lawsuits that will occur when people die because they used the product in a way that they wouldn't otherwise have?

    Since there is no realistic plan for how we get automation into the hands of everyone and not just the wealthy to achieve these "lives saved", we can't accept this as an excuse.
    • Comment removed based on user account deletion
        • You think insurance companies are going to be happy paying for a machine error? If anything, automated car companies should be more afraid of insurance companies, because they have the money to back lawsuits. If the automated car company has its own insurance and it is worked into the cost of the car, then fine: I don't pay for insurance, it is all between them and their insurers, and I get a big settlement from them if anything happens. For a fully automated car, we will just be paying to insure the
  • by XSportSeeker ( 4641865 ) on Sunday November 13, 2016 @09:43PM (#53278321)

    In the very grand scheme of things, autonomous driving technology has the potential to save hundreds of thousands of lives every year.

    The idea is that while autonomous driving can advance, become better and actually learn from past experience... human drivers cannot.

    Things have been getting slightly better in recent years, but if you take overall statistics ranging from the '70s or '80s till today, the numbers are pretty constant. I mean the number of accidents, and the number of deaths in crashes. Too many people die every year in car crashes, and a whole lot of those deaths come from problems we are tired of knowing about: driving under the influence, speeding, not paying attention to road signs, underestimating the severity of handling a multi-ton metal box at high speed. Ramping up fines, making it harder to get a license, educational efforts, changes in law, among several other measures, might have helped a bit, but not enough. And there don't seem to be many better solutions for human drivers.

    Autonomous driving has the potential to drastically reduce numbers when it eliminates the human factor. There will always be crashes, accidents will still happen, and I don't think autonomous driving can ever reach a point of perfection... but if done right, it could eliminate a whole lot of erratic and problematic behavior behind wheels.

    But this is about the overall technology, not about Tesla in particular. It's a huge and drastic change, and if I'm honest about it, I think Tesla is rushing things out, not taking lots of factors into consideration, and turning the whole thing into hype and a selling point - with a big risk of making it a step back instead. It doesn't take many car crashes while using the autonomous mode for people to start avoiding the technology altogether.

    Tesla is selling it as if they had the complete solution already, but these are really the first steps into the technology, which is a really questionable strategy. It's skipping ahead by using consumers as testers, quality assurance, and research and development, instead of doing it the way other manufacturers are - in controlled environments, by employees.

    So no, if a Tesla car crashes and kills the driver because the technology isn't working as it should, that's Tesla's responsibility. Nothing balances it out. Saving the lives of others won't bring back the lives of those who were killed, it won't fix things for the families of those who were lost, and it's no excuse for putting out a technology prematurely. But the technology itself is worth investing in and worth insisting on.

    • Things have been getting slightly better in recent years, but if you take overall statistics ranging from the '70s or '80s till today, the numbers are pretty constant.

      That's completely wrong. Road deaths have decreased significantly since the '70s. They are a third of what they were in the '70s in my country, just on raw numbers. If you include the number of vehicles and kms traveled, the rate is much lower again. I just checked the USA stats, and fatalities per vehicle-mile traveled are also a third of what they were in the '70s.

      • True, but that's because laws were introduced to modify both driver behaviour and car design. You can no longer have a few beers and drive legally, and steering wheels no longer leap into the driver's chest cavity when they have a fender-bender. These have reduced injuries and deaths.

        In terms of safety features, you now have seat belts, crumple zones, strengthened chassis frames, airbags, ABS, traction and stability control, etc. These are all great, but they are mere mitigation of the #1 cause of vehicular

  • Our Tesla MS is a 2013, so we do not have the hardware for AP. However, a number of other owners have told me that AP saved them from crashes. Apparently with other cars, it is primitive and just does simple braking. These folks mention that it detects others coming into their lanes, esp. in blind spots; warns them of going too fast; etc. These folks have sold their Mercedes, Audis, BMWs, Caddies, etc. They tell me those were inferior. We are keeping our old Highlander with 160k miles. Will try to keep it going
    • However, a number of other owners have told me that AP saved them from crashes.

      So prior to owning a Tesla, these same people were crashing all the time?

  • Repeat after me: "a Tesla on Autopilot is NOT a self-driving car." Do not take your hands off the wheel. Be ready to hit the brakes when necessary. I'm sure it's there in the manual: read it.

    It's probably time for Tesla to publicly announce that they're changing the name of this feature since it's the public (and sales staff) perception of its capabilities that is causing problems. Didn't the German government demand this change recently too?

  • People always care about something or other, without understanding.

    For example, wind generators are blasted because they kill almost as many birds as a tenth of a glass building or a third of a cat, but nonetheless there are people denouncing this as a 'problem'.

  • Safety requires an attentive driver AND a vehicle with the reaction times to respond to emergency situations. By allowing the driver to check out, Tesla is NOT being safe. The vehicle software has already demonstrated that it can fuck up badly, and an attentive driver may be all that's there to prevent a fatal accident.

    The car cannot be said to be as safe as it could and should be until it starts forcing the driver's attention. Allowing them to take their hands off the wheel for more than a few seconds is unac

  • https://en.wikipedia.org/wiki/... [wikipedia.org]

    Risk compensation is a theory which suggests that people typically adjust their behavior in response to the perceived level of risk, becoming more careful where they sense greater risk and less careful if they feel more protected. Although usually small in comparison to the fundamental benefits of safety interventions, it may result in a lower net benefit than expected.

    By way of example, it has been observed that motorists drove faster when wearing seatbelts and closer

  • Human drivers won't improve much at all. Autonomous cars will improve continuously, and there is no end to how much they can be improved. Tesla is already a very safe car, and we can expect them to get better and better every year. Frankly, we will soon get to the point where human input to a vehicle will probably be illegal. As usual, this means vast social change. For example, motorcycles may have no place to exist, as a computer can plan safe situations that would cause the driver of a motorcycle

"More software projects have gone awry for lack of calendar time than for all other causes combined." -- Fred Brooks, Jr., _The Mythical Man Month_

Working...