Transportation

Many US Drivers Treat Partially Automated Cars As Self-Driving (reuters.com) 117

An anonymous reader quotes a report from Reuters: Drivers using advanced driver assistance systems like Tesla Autopilot or General Motors Super Cruise often treat their vehicles as fully self-driving despite warnings, a new study has found. The Insurance Institute for Highway Safety (IIHS), an industry-funded group that prods automakers to make safer vehicles, said on Tuesday that a survey found regular users of Super Cruise, Nissan/Infiniti ProPILOT Assist and Tesla Autopilot "said they were more likely to perform non-driving-related activities like eating or texting while using their partial automation systems than while driving unassisted."

The IIHS study of 600 active users found 53% of Super Cruise, 42% of Autopilot and 12% of ProPILOT Assist owners "said that they were comfortable treating their vehicles as fully self-driving." About 40% of users of Autopilot and Super Cruise -- two systems with lockout features for failing to pay attention -- reported systems had at some point switched off while they were driving and would not reactivate. "The big-picture message here is that the early adopters of these systems still have a poor understanding of the technology's limits," said IIHS President David Harkey.

This discussion has been archived. No new comments can be posted.

  • Nope. (Score:4, Insightful)

    by Locke2005 ( 849178 ) on Tuesday October 11, 2022 @08:05PM (#62958213)
    Having worked with computers for 40 years, I don't even trust my radar equipped cruise control to keep me from ramming the car in front of me!
    • Truth.

      • Re: Nope. (Score:3, Insightful)

        by saloomy ( 2817221 )
But the same can be said of human drivers. In fact, I would bet more rear end collisions are caused by human drivers than by faulty radar guided cruise control vehicles. There are plenty of examples of human-caused crashes to dive deep into; surely the statistics are out there.

        My question to everyone is this: at what metric or data point should we trust the computers over the humans (or at least to the same extent)? There are already so many activities that computers perform that humans could simply not d
        • Re: Nope. (Score:5, Insightful)

          by Berkyjay ( 1225604 ) on Tuesday October 11, 2022 @10:26PM (#62958555)

People constantly misunderstand the discrepancy in scale between the number of miles humans drive and the number of miles driven with/by tech assistance. On the whole, Americans drive 3.2 TRILLION miles each year. [thezebra.com]. Compare that to Tesla's "Full Self-Driving", which has clocked a total of only 35 million miles in nearly 2 years [electrek.co]. So trying to compare the results from those two data sets is just laughable.

          My question to everyone is this: at what metric or data point should we trust the computers over the humans (or at least to the same extent)?

Again, you misunderstand your data. All of your examples happen within a very confined environment and are backed up by human guidance. The Dragon capsule isn't docking with the ISS while everyone else is off doing other activities. It is an intensely monitored and choreographed event. The same with planes. I could go on to dispute your other examples, but really it is all about the environment it is happening in. There are NO autonomously controlled systems operating in an environment like a car on a crowded highway or city street. There's just no other current analog to compare it with.

So you ask when I would trust autonomous cars? Well, I'll tell you how I think this should have been handled. These companies should have started with highway driving instead of trying to get their cars working like a human driver on a city street. A dedicated highway lane and standardized communication protocols would have gone a long, long way towards bringing about practical and useful autonomous driving. Imagine driving from SF to LA. You drive your Tesla onto the highway and engage your FSD, which queues you up for access to the autonomous lane. Your car starts communicating with the lane and all the cars in the vicinity in that lane. They all synchronize themselves to drive at an 80 mph clip and you just sit back and close your eyes.

This is something that is completely attainable with our current hardware/software tech. But it is all but ignored by every company working on the tech. They are all foolishly trying to create this mythical computer program that can navigate any terrain like a human would, and they are going to fail at it. Sure, they'll get to a point like Tesla has with their FSD and make it look like their cars really are safely autonomous. But that is just a smoke screen that will evaporate when their cars start hitting that trillion miles driven each year. The level of complexity is just too great and entropy will put its heavy hand on the scale.
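
A minimal sketch of the lane-admission handshake and speed synchronization imagined above; the message fields, thresholds, and the LaneCoordinator name are illustrative assumptions, not any real V2V standard:

from dataclasses import dataclass

TARGET_SPEED_MPH = 80  # the synchronized cruise speed imagined above

@dataclass
class AdmissionRequest:
    vehicle_id: str
    protocol_version: str  # proof the car speaks the lane's coordination protocol
    position_m: float      # position along the lane, in meters
    speed_mph: float

class LaneCoordinator:
    """Hypothetical roadside controller for a dedicated autonomous lane."""
    SUPPORTED_PROTOCOLS = {"av-lane/1.0"}  # invented protocol identifier
    MIN_GAP_M = 50.0                       # required headway between platoon members

    def __init__(self):
        self.platoon = []  # admitted vehicles, ordered by position

    def admit(self, req):
        # Reject cars that cannot speak the coordination protocol.
        if req.protocol_version not in self.SUPPORTED_PROTOCOLS:
            return False
        # Reject if there is no safe gap at the requested merge position.
        if any(abs(v.position_m - req.position_m) < self.MIN_GAP_M for v in self.platoon):
            return False
        self.platoon.append(req)
        self.platoon.sort(key=lambda v: v.position_m)
        return True

    def speed_commands(self):
        # Every admitted car converges on the same target speed.
        return {v.vehicle_id: TARGET_SPEED_MPH for v in self.platoon}

# Example: a protocol-speaking car gets in, a legacy car does not.
lane = LaneCoordinator()
print(lane.admit(AdmissionRequest("tesla-1", "av-lane/1.0", 1000.0, 72.0)))  # True
print(lane.admit(AdmissionRequest("legacy-2", "none", 1100.0, 65.0)))        # False
print(lane.speed_commands())  # {'tesla-1': 80}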

          • Re: (Score:2, Troll)

            by Joce640k ( 829181 )

People constantly misunderstand the discrepancy in scale between the number of miles humans drive and the number of miles driven with/by tech assistance. On the whole, Americans drive 3.2 TRILLION miles each year. [thezebra.com]. Compare that to Tesla's "Full Self-Driving", which has clocked a total of only 35 million miles in nearly 2 years [electrek.co]. So trying to compare the results from those two data sets is just laughable.

            It's almost as if you've never heard of percentages.

Or standard deviations in statistics. Hint: he hasn't.
            • OK, well instead of the snark, explain how I'm wrong.

              • by mspohr ( 589790 )

                Look up the concept of "statistically significant". Doesn't depend on absolute numbers but percentages.

                https://measuringu.com/statist... [measuringu.com]

                • I understand what statistical significance is. Does that apply to the question of "Are AV's safer than humans"? I personally don't believe that there is enough data to show that AVs are indeed safer in terms of mixed use roads (both human and AVs) with any level of confidence. But I'd love to see research on this.

                  • Tesla safety report.
                    1 billion miles of travel.
                    4x as safe

                  • One of these three statements must be wrong if your answer is honest.
                    1. You understand statistical significance
                    2. You understand the scope of data domain of the ADAS systems in use today (Tesla FSD Beta in particular)
                    3. You understand how much driving the system(s) do on a daily basis.

                    If you understand those three things, the only logical conclusion is, based on the ample amount of data for their domain (city, Highway, and freeway travel), they are safer than humans already, since they are statisti
No. These are your personal opinions. You are trying to hide behind the concept of statistical significance when in fact Tesla FSD has (in 2 full years) only driven about 0.001% of the 3.2 trillion miles US drivers drive each year. That is NOT statistically significant.

Ok. So you don't understand statistical significance. Got it.

Just for your edification, the data point you are looking for is miles between crashes. The miles driven on AP and FSD are in the billions. You have absolutely enough data to make statistically significant deductions about its safety with tight, tight margins of accuracy given that much data. That humans have driven so much more won't make their standard deviation more than negligibly tighter. When you are looking for such frequent occurren
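
To make the "miles between crashes" argument concrete, here is a back-of-the-envelope sketch using a simple Poisson model; the crash counts and mileages below are made-up placeholders, not real Tesla or NHTSA figures:

import math

def crash_rate_ci(crashes, miles, z=1.96):
    """Approximate 95% CI for crashes per million miles (Poisson, normal approximation)."""
    rate = crashes / miles * 1e6
    half_width = z * math.sqrt(crashes) / miles * 1e6
    return rate - half_width, rate + half_width

# Placeholder numbers purely for illustration.
assisted = crash_rate_ci(crashes=500, miles=3e9)             # ~3 billion assisted miles
unassisted = crash_rate_ci(crashes=6_000_000, miles=3.2e12)  # ~3.2 trillion human miles

print("assisted:   %.3f to %.3f crashes per million miles" % assisted)
print("unassisted: %.3f to %.3f crashes per million miles" % unassisted)
# With billions of miles the assisted interval is already narrow; the real argument
# is whether the two fleets drive comparable roads and conditions, not sample size.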
Ok. So you don't understand statistical significance. Got it.

                      Lol, OK buddy.

                    • Anytime. If you still have trouble understanding why, reach out. :)
                    • If you still have trouble understanding why, reach out

                      To an idiot like you? Got it.

                    • Who the fuck are you to jump into this conversation? Dumb ass.
          • by eth1 ( 94901 )

            And where do you propose to get all of these specially equipped dedicated lanes from? They would have to be physically separated from normal traffic to keep idiots out of them, so you'd need at least two lanes of space in each direction to allow for traffic and emergency vehicles to pass stationary cars. Congratulations, you've just *doubled* the size and cost of a normal Interstate highway for a tiny handful of cars to use.

The biggest issue I see with cooperating, communicating self-driving cars is that it requires 100% of the cars on the road to be communicating and functioning 100% correctly. I.e., a single bug in a single car can cause a 100-car pile-up. Of course, given the number of accidents I see on I-5, that might still be an improvement over idiot drivers. (Actually, I'm surprised by how many times I have seen big car-carrier trucks stalled in the middle lane of I-5; that causes huge backups, and a basic mechanical failure
What makes you think a car won't brake for or steer around a failure ahead if it is physically able to? AI does not need 100% functionality to work. It can see an accident or a stopped car just as well as (if not better than) we can. For one, cameras can see more detail further out than we can, given a moderate lens. For another, they almost never stop looking. And if they do stop, they alert you that something in the system is off and demand you take control.
            • The biggest issue I see with cooperating, communicating self-driving cars is it requires 100% of the cars on the road to be communicating and functioning 100% correctly.

Not really. In order to gain access to the autonomous lane, your car would need to have the proper protocol. Also, the purpose of the dedicated lane is to reduce the variables the software has to deal with. Plus, with cars talking to each other, you increase the amount of data about the surrounding area. With proper redundancy, you could get to a pretty safe and reliable system.

              I'm also not saying that this is how it has to be forever. I just feel that it's a much safer foundation for the technology rat

              • What's to keep idiots without the proper protocol from pulling into the lane? I see idiots alone in the HOV lane all the time, although the motorcycle cops do have a field day pulling them over.
OK, well let's play that out real quick. Let's assume that all the cars in the lane are communicating constantly. Let's also assume that they all have sensors like cameras. So what happens if a vehicle is detected in the lane that is not communicating with the group? Those cars can bring themselves to a halt, forcing the interloper out of the lane. They could also take a picture of the car and its license plate and report that to the CHP (or whichever enforcement agency). So sure, if you wanted to trol
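
A rough sketch of the "halt and report the interloper" logic described above, assuming each admitted car can match camera detections against the set of plates it is in radio contact with; all names and thresholds here are hypothetical:

from dataclasses import dataclass

@dataclass
class Detection:
    plate: str             # license plate read by a platoon member's camera
    lane_position_m: float

def handle_detection(det, communicating_plates, report_to_enforcement):
    """Decide how the platoon reacts to a vehicle seen in the dedicated lane."""
    if det.plate in communicating_plates:
        return "known platoon member, no action"
    # Unknown vehicle: slow the platoon to open a gap and force it out of the lane,
    # and report the sighting (plate + position) to the enforcement agency.
    report_to_enforcement(det.plate, det.lane_position_m)
    return "slowing platoon and reporting interloper"

members = {"8XYZ900", "5QRS777"}
seen = Detection(plate="7ABC123", lane_position_m=1520.0)
print(handle_detection(seen, members, lambda plate, pos: None))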

                  • I like the idea of the cars that play nicely with others ratting out the ones that don't. I'm not sure if legally they can fine someone based on a picture from a private vehicle, but they don't seem to have any problem fining people based on video from red light cameras or school zone cameras. So an almost certain instant fine should be enough to keep all but the most dedicated idiots out of the lane. (Pretty sure the pictures need to include an identifiable face of the driver to be used in court.)
                    • Can you imagine the AVs responding to an unknown vehicle in the AV lane by boxing it in so it can take pictures of the license plate and driver? Lol!!

Your proposal would require CalTrans to upgrade the freeways with FSD lanes and for all other drivers to stay the hell out of them. This is the same CalTrans that took more than two DECADES to replace the eastern span of the Bay Bridge after part of it collapsed during Loma Prieta. And the same CalTrans that employed incompetent welders, bought dodgy sub-standard steel, screwed up the foundation, allowed some of the steel to rust away before waterproofing it, screwed up the waterproofing that was completed, [wikipedia.org] and u [sfist.com]

          • by dgatwood ( 11270 )

            These companies should have started with highway driving instead of trying to get their cars working like a human driver on a city street.

            What rock have you been living under for the past four years? Tesla Autopilot is exactly that — autonomous (but supervised) highway driving. It has supported freeway-only navigation (including lane changes to pass slow traffic, exits, etc.) since late 2018.

            On the whole, Americans drive 3.2 TRILLION miles each year. [thezebra.com]. Compare that to Tesla's "Full Self-Driving" which has only clocked a total of 35 million miles in nearly 2 years [electrek.co]. So to try and compare the results from those two data sets is just laughable.

            That was based on data from late June. The number of FSD vehicles expanded pretty dramatically shortly thereafter, so the current number is probably more than twice that, and growing rapidly.

            Tesla is likely somewhere around half a percent of all

          • So you ask when I would trust autonomous cars?

Not before there is a robocar that can pass a standard driver's license test in every state and province, night or day and in all weather conditions.

            That's what we expect of people, why not machines?

        • by AmiMoJo ( 196126 )

          You can't just compare simple accident statistics because the situation is far more complex than that.

          Automated systems fail in different ways to humans. We have come up with all sorts of safety devices to protect from common and even uncommon human failure modes, but with autonomous systems many of the failure modes are new and unique to them.

          Some of it we can test for. I'm sure everyone has seen video of Teslas ramming into inflatable cars, bikes and pedestrians at full speed. But other failure modes are

99% would be good enough if humans are right only 98% of the time. Also, at some point, unrealistic or not, Tesla cannot be in every vehicle watching it all the time. The human who is present has to be the end of liability.
          • by nasch ( 598556 )

            I don't know about a spate; as far as I know that has happened twice.

        • by mspohr ( 589790 )

          Tesla publishes a quarterly safety report which shows that its cars are consistently multiple times safer than human drivers.
          https://www.tesla.com/VehicleS... [tesla.com]

          Active safety features come standard on all Tesla vehicles made after September 2014 for an added layer of safety beyond the physical structure of each car. Because every Tesla is connected, we’re able to use the billions of miles of real-world data from our global fleet – of which more than 1 billion have been driven with Autopilot engaged

          • by dgatwood ( 11270 )

            Because every Tesla is connected, we’re able to use the billions of miles of real-world data from our global fleet – of which more than 1 billion have been driven with Autopilot engaged

            That bit is a little confusing to me. Tesla said that they had 3 billion miles on Autopilot way back in 2020 [evnewsdaily.com]. Given that it was growing by a billion miles a year from 2018 to 2020 and the size of the fleet has probably at least doubled since then, I'd expect them to be approaching the 10 billion mark by now.

            I mean yes, 10 billion is technically more than 1 billion, but...
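
The "approaching the 10 billion mark" guess can be reproduced with a rough extrapolation; the inputs below come from the figures in the comment plus an assumed post-2020 fleet-growth factor:

# Rough extrapolation of cumulative Autopilot miles -- illustrative only.
miles_2020 = 3e9         # "3 billion miles on Autopilot" reported in 2020
annual_rate_2020 = 1e9   # "growing by a billion miles a year" through 2020
fleet_growth = 2         # assumption: the fleet at least doubled after 2020
years_since = 2.5        # roughly early/mid 2020 to late 2022

estimate = miles_2020 + annual_rate_2020 * fleet_growth * years_since
print(f"~{estimate / 1e9:.0f} billion cumulative Autopilot miles")  # ~8 billion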

        • by dgatwood ( 11270 )

          But the same can be said of human drivers. In fact, I would bet more rear end collisions are caused by human drivers than by faulty radar guided cruise control vehicles.

          Only if you count the crashes caused by radar-guided cruise control vehicles slamming on their brakes unnecessarily as being caused by the human driver who rear-ends them. :-D

          • I would gladly give you that pool of crashes, and it would still be safer. By miles. Humans are TERRIBLE at driving.
            • by dgatwood ( 11270 )

              I would gladly give you that pool of crashes, and it would still be safer. By miles. Humans are TERRIBLE at driving.

              This is true, but I couldn't resist a dig at Tesla's infamous phantom braking problem.

    • Having worked with computers for 40 years, I don't even trust...

      I'd just stop right there. Don't trust.

    • Re:Nope. (Score:5, Insightful)

      by swillden ( 191260 ) <shawn-ds@willden.org> on Wednesday October 12, 2022 @08:09AM (#62959307) Journal

      Having worked with computers for 40 years, I don't even trust my radar equipped cruise control to keep me from ramming the car in front of me!

      Having worked with people for 50 years, I don't even trust myself or other drivers to remain attentive to avoid accidents.

      As with many tasks that computers do imperfectly, we often mentally compare the computer's performance with an idealized human performance, not with actual human performance. Tesla claims (though I'd like to see those claims independently evaluated and verified) that drivers using their Autopilot system crash less than drivers not using it. If the claims are true and you were driving a Tesla, your position would be foolish, unless you have good evidence (not your own perception) that you are actually a better driver than average.

      I'm not disagreeing with you about the state of computer and software systems, especially AI systems that were trained by trial and error. I've been programming computers for 40+ years, too, and know a little about machine learning. What I'm pointing out is that the neural networks in our heads are also, essentially, hardware and software systems that were trained by trial and error, and are demonstrably bad at many tasks, very much including the tasks involved in driving. This is why 30,000+ people die in US automobile accidents annually, because people are actually not very good at driving.

      At present, people are better at handling unusual or complex driving situations than computers are. But I think that computers are currently better than human brains at driving down wide, smooth, well-marked roads for hours on end, mostly because they don't get distracted, bored, or tired. I expect self-driving systems to continue increasing the space of situations in which they're better than humans... and I fully expect them still to be imperfect and those imperfections to cause deadly accidents, just as human drivers do (though they'll almost certainly fail in different ways).

      But when computer performance is better than human performance, as measured in crashes and deaths, you're an idiot to rely on human performance rather than delegating to a device that does the job measurably better, keeping you safer. Curmudgeonly cynicism is popular, but data-based decisions are better.

  • Well yes (Score:5, Funny)

    by quonset ( 4839537 ) on Tuesday October 11, 2022 @08:08PM (#62958223)

    People are stupid. News at 10.

    • People are stupid.

      Perhaps. But not for the reasons in TFA.

If trusting their cars were an actual problem, there would be frequent reports of vehicles crashing while Autopilot was engaged. There aren't. Accidents are less common than for human-driven cars.

      This is simply a case of legislation failing to keep up with reality. Drivers are legally required to pay attention, but they do not need to do so.

  • ... don't have these features in their cars.
    Stop headlining soon. Kthx

    • ... don't have these features in their cars.

      Over a million Americans have self-driving features in their cars.

      That qualifies as "many".

      • by Arethan ( 223197 )

        https://www.statista.com/stati... [statista.com]

I think that chart shows quite clearly that, comparatively, they number in the 'few'.
For context, if you don't want to follow the link, there are over 8 million licensed drivers in the US that are 19 years old or younger - a cohort that is handily outnumbered by every 5-year age group up until you reach 70+ years of age.

        And don't be fooled by the link slug, the data is updated to cover 2020.

        • A non-paywalled link was so easy to find. [hedgescompany.com]

Even if the total registered drivers were an accurate measure of this problem, which it really isn't, 1 in 238 would qualify as "many" when the issue is vehicles driving around under the control of the flimsiest of AI and no human ready to intervene. 238 vehicles in an urban area is a small interstate traffic jam.
      • ... don't have these features in their cars.

        Over a million Americans have self-driving features in their cars.

        That qualifies as "many".

        That's not even one percent of the cars on the road. So, no, that's not "many".

    • and yet people still treat the vehicles as though they are self-driving- eating, shaving, fornicating while on the road.
  • So this sounds like Super Cruise and Autopilot are convincing users that they're doing a better job than the human would, while ProPILOT isn't. I believe SuperCruise is restricted to carefully mapped highways, while Autopilot will engage almost anywhere, leading to more places where Autopilot shows its weaknesses. In most cases, using these systems with an experienced driver also paying attention is the best option, but for much of highway driving, we're really at the point where letting the computers tak

    • by Miamicanes ( 730264 ) on Tuesday October 11, 2022 @08:28PM (#62958287)

The thing these sensationalistic articles overlook is that most of the people who aren't paying "full attention" while the computer is driving wouldn't be paying "full attention" if their car didn't have self-driving capabilities, either. For most of these people, the fact that their car is actively paying attention to the road ahead when the drivers aren't represents a net improvement in overall safety vs the previous status quo.

      The fact is, if cars with self-driving capabilities had accident rates that were statistically worse, insurance companies would have noticed by now & raised their rates. The fact that they haven't is almost proof that the insurance companies themselves breathe a collective sigh of relief when people upgrade to self-driving features. It might technically be "two steps forward, one and a half steps back"... but it's still a net improvement of a half-step forward.

In any case, Carol Burnett did an amusing skit 30-40 years ago on the topic of "eating while driving" & pointed out that people who are eating are probably more dangerous than people who are drunk, because at least people who are drunk are trying to drive well instead of trying to dig the last french fry from a semi-crumpled bag that fell onto the floor when they slammed the brakes 3 blocks earlier.

      • Agreed. Tired drivers, distracted drivers, multitasking drivers, high drivers, and other classifications of voluntarily dangerous drivers can be every bit as dangerous as a buzzed driver. Legally alcohol gets treated differently, even if the choice to drive while unsafe is comparable and the devastating outcomes are the same. Boggles the mind.

        One aspect that concerns me about these systems is that they dumb down the driver for when they actually need to drive. If you spent the last 9 months being chauff

        • Legally alcohol gets treated differently

          That's because it is easy to prove a driver was drunk.

          Proof of driving while daydreaming is more difficult.

      • The fact is, if cars with self-driving capabilities had accident rates that were statistically worse, insurance companies would have noticed by now & raised their rates

        You make a good point but it also depends on how many people use/abuse the self-driving features. If most people do not use it because they don't trust it, the insurance premium will not be much higher since it is based on the collective.

        It is also worth noting that Tesla has high premiums (after a quick googling https://insuraviz.com/vehicles... [insuraviz.com]). It is comparable to Audi and BMW which are known for being used and driven by idiots. Regular brands like Ford, Toyota, Volvo have 20-30% lower premiums. Of cour

  • by Ed Tice ( 3732157 ) on Tuesday October 11, 2022 @08:20PM (#62958269)
Admittedly I've only ever used the driver assistance features when I've encountered them in rented cars, but I was thoroughly unimpressed. Lane keeping "worked" in the sense that you didn't go out of the lane, but the tracking was so poor that I wouldn't have trusted the safety of my family to the system. The systems didn't kick in until you were uncomfortably close to the line. Why the heck not just stay dead center in the lane the way a human driver would? I did not test any of the emergency braking features!
    • You clearly don't have experience with the current generation of systems being discussed here. They don't behave anything like what you described. Yes, there are systems like that, and they specifically avoid doing anything until you're drifting out of the lane so that there is no question that you have to pay attention and the automaker doesn't get bad press about people letting the cars drive. The Tesla Autopilot (version 1) in 2015 would stay dead center in the lane, and was pretty good on highways.

      • You clearly don't have experience with the current generation of systems being discussed here. They don't behave anything like what you described.

        It depends on the system. The Toyota system warns you when you are near the line, and nudges you back into the lane, but if you let it, it will just ping pong against the two lines because it does not center the vehicle.

        I can't speak to other systems, other than second hand knowledge. The Tesla system I hear is quite nice about actually driving the car and keeping it in the lane.

For what it's worth, Autopilot does keep you dead center in the lane, and will actually swerve to avoid the inattentive dumbass in the next lane over who is paying more attention to their FaceTime call than to driving.

      • by swillden ( 191260 ) <shawn-ds@willden.org> on Wednesday October 12, 2022 @08:17AM (#62959325) Journal

        For what it's worth, Autopilot does keep you dead center in the lane

        Indeed, and I consider it a defect, because it tries too hard to keep you dead center in the lane. This results in some annoying and sometimes frightening behaviors.

        First, when you're on the freeway passing an entrance ramp that doesn't have a dashed line separating it, Autopilot will often decide that your "lane" just became 30 feet wide and will move over (sometimes fairly aggressively) to get to dead center in this ultra-wide "lane". If the entrance ramp ends fairly quickly, the car often has to move rather aggressively back to the left as the "lane" narrows. In this case it's not really that the car doesn't keep centered, it's that it fails to recognize that the lane didn't actually get wider just because the line moved.

        Second, and more subtly, human drivers don't keep dead center in the lane, for good reasons, and Autopilot's failure to mirror normal human driver behavior is often offputting and sometimes mildly frightening. For example, if there's no one to the left of you while you're passing a vehicle that comes close to the near edge of its lane, either because it's a wide vehicle or because that driver is hugging the left edge of their lane, a human driver will slide a little to the left to maintain some separation. Autopilot stubbornly stays in the center, right up to the moment that the other vehicle begins crossing into your lane, at which point it brakes or swerves unnecessarily aggressively. Similarly, a human driver with vehicles to left and right will attempt to stay centered between those vehicles, even if that's not the center of the lane. Basically, Autopilot tracks lane lines, regardless of the position of adjacent vehicles. A human uses the lane lines as guides, but focuses primarily on maintaining distance from other vehicles.

        Another case where human drivers don't stay centered in the lane is when going around a bend, especially in the outside lane with a guard rail near the outside lane -- or perhaps especially when there is no guard rail and the road drops off rapidly. In this case, a human driver will generally slide a bit further in, away from the edge. The human can feel the centrifugal force pushing the vehicle to the outside of the turn, and naturally wants to generate a little more space and therefore reaction time in case something happens to cause the car to follow a straight line off the road. Again, a human driver uses the lane lines as guides, but really positions the car away from edges/obstacles, while Autopilot stays in the center of the lines.

        This second case is often quite unnerving, because the tendency to move away from the edge is so strong that when Autopilot fails to do it, the humans often feel like the car is moving toward the edge. If you watch closely you can see it really isn't, it's maintaining center. But it feels wrong, sometimes frighteningly so.
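
A toy illustration of the difference described above between pure lane-line centering and a human-like policy that also shades away from close neighbors and from the outside of a bend; the gaps and gains are invented for illustration, not any automaker's actual controller:

def target_lateral_offset(left_gap_m, right_gap_m, curve_outside=None):
    """
    Desired offset from lane center in meters (positive = toward the left).
    Pure lane-line centering always returns 0; this heuristic mimics the human
    habit of shading away from close vehicles and from the outside of a bend.
    """
    COMFORT_GAP_M = 1.0  # invented comfort threshold to a neighboring vehicle
    offset = 0.0
    if right_gap_m < COMFORT_GAP_M:   # neighbor close on the right: shade left
        offset += 0.5 * (COMFORT_GAP_M - right_gap_m)
    if left_gap_m < COMFORT_GAP_M:    # neighbor close on the left: shade right
        offset -= 0.5 * (COMFORT_GAP_M - left_gap_m)
    if curve_outside == "right":      # guard rail / drop-off on the right: shade left
        offset += 0.2
    elif curve_outside == "left":
        offset -= 0.2
    return offset

print(target_lateral_offset(left_gap_m=3.0, right_gap_m=0.4))  # about 0.3, shade left
print(target_lateral_offset(left_gap_m=3.0, right_gap_m=3.0, curve_outside="right"))  # 0.2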

People already take their eyes off the road for several seconds; having the car do more and more to assist just adds to the complacency. Automatic braking, adaptive cruise control, and lane keeping systems are pretty common now, so that inattention only grows. If they think it's harmless the vast majority of the time, they presume it always will be. Which makes my straw man seem rather juvenile with the defence of "I never thought it would do that, it never did before." Back in my day (not really) you drove st

    • by crow ( 16139 )

      Stop-and-go traffic is so much better with Autopilot. You see so many secondary accidents in traffic jams from humans momentarily failing to pay attention and rear-ending the car in front of them. That doesn't happen with Autopilot. Yes, people will behave differently, but expect accident rates to go down, not up.

      I know Autopilot slammed on the brakes for a deer that I didn't see in time to react to, saving my car and possibly my life, not to mention my vacation.

  • by kriston ( 7886 ) on Tuesday October 11, 2022 @08:46PM (#62958317) Homepage Journal

    My Honda Sensing drives like a drunkard. If I let it do its lane-keeping thing on the highway it just drifts to the left dotted line and to the right and back, then a warning pops up on the dashboard: "Steering Required."

    • by crow ( 16139 )

      The systems the article is talking about are good enough that other drivers won't notice that the car is doing anything unusual. They fully take over steering, keeping the car smoothly in the lane. It's not at all the same feature.

  • 40,000 (Score:1, Troll)

    by backslashdot ( 95548 )

    Over 40,000 people are killed in traffic accidents every year by human-driven vehicles. We need to ban humans from driving ASAP.

    • by jsonn ( 792303 )
If a human survives the crash, they can be held accountable for their (in)actions. As long as mistakes by "self-driven" cars are shrugged off as "Software problem, nothing we can do", I certainly don't want them anywhere near me. As long as companies that care more about hype and coolness are pushing the industry, I'm deeply concerned. What we see right now is the opposite of the approach taken in aviation: lack of redundancy in the name of cost savings, as well as lack of understanding and documentation as a res
      • by nbvb ( 32836 )

        So you find it more important to punish someone who causes an accident than actually trying to develop a superior system altogether?

        That’s pretty backwards.

        • Re:40,000 (Score:5, Insightful)

          by jsonn ( 792303 ) on Wednesday October 12, 2022 @06:08AM (#62959151)
Without accountability, there is little incentive to create something that actually is superior. Capitalism is fundamentally about taking any shortcut that you can get away with. Self-driving cars provide so many opportunities to take shortcuts to save money on research or hardware. Take the whole discussion about Tesla using only optical sensors. Consider what would happen if Boeing or Airbus proposed removing the radar used for the ground proximity warning system because they can compute the altitude and position precisely enough from GPS and therefore check directly against a 3D model of the earth whether a collision is possible. They would become the laughingstock of the industry. At the same time, the whole 737 MAX situation also shows the fundamental legal limitations of our accountability.
Capitalism is fundamentally about taking any shortcut that you can get away with.

            Wait, what? That's NOT what capitalism is. You are talking about ethical flaws and misuse. If you want to be cynical, any economic system can be about certain people exploiting others for power and control.

          • by jjo ( 62046 )

>Capitalism is fundamentally about taking any shortcut that you can get away with

It is silly to ascribe that behavior only to capitalism. That is precisely what happens under communism, with the added problem that you can get away with a lot more when you can silence (temporarily or permanently) anyone who complains.

      • If a human survives the crash, they can be held accountable for their (in)actions.

        Which does nothing to solve the problem. Humans will continue to make mistakes.

        As long as mistakes by "self-driven" cars are shrugged off as "Software problem, nothing we can do",

        Bullcrap. Nobody says that.

        What happens is that there is an NTSB investigation, the cause of the accident is identified, and the software is improved.

        • by jsonn ( 792303 )
          You are kind of proving my point. When was the last time a Tesla employee or manager went to jail? Oh wait, Tesla's autopilot is not full driving and the company doesn't pretend in its advertisement material that it is? Guess Tesla can't be held accountable!
          • As long as mistakes by "self-driven" cars are shrugged off as "Software problem, nothing we can do",

            Bullcrap. Nobody says that.

            What happens is that there is an NTSB investigation, the cause of the accident is identified, and the software is improved.

            You are kind of proving my point. When was the last time a Tesla employee or manager went to jail?

            When an airliner crashes, NTSB investigates, identifies systemic problems and specifies fixes... but no one goes to jail. And that's a good thing!

            If you actually want a system to improve it's very important not to threaten people with jail or other serious consequences, because that motivates them to try to cover it up. It's far better to take that off the table to begin with and instead focus on identifying root causes so that the system can be improved and the problems avoided in the future.

            The incred

            • by jsonn ( 792303 )
The difference is that in aviation, you have a culture of redundancy and failure analysis. People can go to jail if they falsify those records. In fact, I'm curious to see if the investigation of the 737 MAX crashes will result in exactly that kind of prosecution. "Blameless post mortem" is a wonderful concept and I'm absolutely in favor of it, but it is part of a social contract that requires a very defensive mindset. I don't see tech companies in general and self-driving startups specifically havin
Cars have fatal crashes and get recalled due to mechanical failure all the time, yet nobody goes to jail. By the way, all of the Tesla deaths so far happened with drivers who were inattentive and didn't take evasive action. But whatever. My point is that you haven't asked for anyone to be arrested in those cases. I mean, about 10 years back, GM deliberately covered up a flaw in the ignition switch that led to 124 deaths, yet nobody was arrested for that. 16 people died due to Toyota's unintended acceleration is

  • My car only has adaptive cruise with lane centering, but even after a few minutes on a straight road you develop bad habits. I tend to start to look at the scenery as if I'm a passenger.

But then my leg cramps up because wtf do you put your foot with cruise control? I hover over the brake pedal and that gets tiresome quickly, yet I don't feel comfortable putting it flat on the floor in front of the pedal.. what if a situation arises?
  • by 140Mandak262Jamuna ( 970587 ) on Tuesday October 11, 2022 @11:15PM (#62958623) Journal
There is a newly opened highway near my town. It's not yet on Tesla's speed limit map. I set the cruise to 70 mph. All OK, till I go over an overpass. The road below is 40 mph. The car thinks I am doing 70 in a 40 mph zone and slams the brakes...
There is a newly opened highway near my town. It's not yet on Tesla's speed limit map. I set the cruise to 70 mph. All OK, till I go over an overpass. The road below is 40 mph. The car thinks I am doing 70 in a 40 mph zone and slams the brakes...

      Sure. Until the next mapping update ... then they'll all work perfectly.

      (did you report it?)

      Meanwhile: The humans are constantly removing all the experienced drivers from the pool of drivers and replacing them with ignorant new ones.
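
The overpass behavior described above is consistent with naive map-matching: if the car snaps its GPS fix to the nearest road segment in its speed-limit database and the new highway isn't in that database yet, the 40 mph road underneath wins. A toy sketch with invented segment data:

import math

# Toy speed-limit map: (name, (lat, lon), limit_mph). The new highway is missing.
SEGMENTS = [
    ("Old road under the overpass", (37.1000, -95.7000), 40),
    ("Frontage road",               (37.1020, -95.7005), 35),
]

def nearest_segment(lat, lon):
    """Snap the GPS fix to the closest known segment (naive nearest-point match)."""
    return min(SEGMENTS, key=lambda seg: math.hypot(lat - seg[1][0], lon - seg[1][1]))

# The car is on the unmapped overpass, directly above the 40 mph road:
name, _, limit_mph = nearest_segment(37.1000, -95.7000)
print(name, limit_mph)  # -> the 40 mph road: the car concludes it is in a 40 zone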

Simply because about half of the population or more _are_ idiots who cannot read a manual, and if they do, they do not understand what is in there. The other thing is that many people have gotten so comfortable that they forget that some things, like driving a car, are very dangerous and need to be done right.

  • "said they were more likely to perform non-driving-related activities like eating or texting while using their partial automation systems than while driving unassisted."

If a car's autonomous navigation detects the driver not paying attention, it should sound the horn, flash the blinkers continuously, and gradually bring the car to a safe stop in an emergency lane. An impaired, unsafe driver should be public knowledge to the people they may impact (quite literally, too).

    Car companies are only embracing this behaviour now, with reluctance. You'd think monitoring the driver is a much easier task than monitoring the road. Instead, most companies chased the glamour of autonomous dr
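
A minimal sketch of the escalation sequence suggested above (warn, then horn and hazards, then a controlled stop); the timing thresholds are arbitrary assumptions, not any shipping system's behavior:

def escalation_step(seconds_inattentive):
    """Map continuous driver-inattention time to an escalating response."""
    if seconds_inattentive < 3:
        return "no action"
    if seconds_inattentive < 8:
        return "visual and audible attention warning"
    if seconds_inattentive < 15:
        return "sound horn, flash hazard lights"
    return "slow down, pull into the emergency lane, and stop"

for t in (1, 5, 10, 20):
    print(t, "->", escalation_step(t))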

If you can't even take your hands off the wheel to dip your nuggets or take a drink, then why even put those systems in a car? We put this tech in your car that can kind of self-drive your car, but it's not safe to use it that way.
  • When you use a name like "Autopilot", many lay-people will expect that this means "autopilot", like how a plane will fly itself.

    Many people don't have the technical awareness or knowledge to know that this is not the case.

    • Airplane autopilots are only useful and stay engaged in the cruise phase of the flight. So, much like car "Autopilot".

  • Many US drivers treated completely unautomated cars as self-driving.

"It's the best thing since professional golfers on 'ludes." -- Rick Obidiah

Working...