Transportation Bug Programming

Mazda3 Bug Activates Emergency Brake System For No Reason (engadget.com) 55

Mazda says "incorrect programming" in its Smart Braking System (SBS) can make fourth-generation Mazda 3 vehicles falsely detect on object in their path while driving and automatically apply the brakes while driving. "The problem affects 35,390 2019 and 2020 model year cars in the U.S., but Mazda says it is not aware of any injuries or deaths as a result of the defect," reports Engadget. From the report: If the issue occurs, the driver will notice because their car has suddenly stopped, and also as an alarm sounds and a message is displayed on the in-car warning screen. Some Reddit posters report experiencing situations of the system activating while driving with nothing around, and note that while the system can be disabled, it appears to re-enable itself every time the car starts.

Autoblog reports that while some vehicles will simply need to have the system updated or reprogrammed, certain cars with early build dates might need to have their entire instrument cluster replaced or reprogrammed. It's a scary issue, but we've seen Mazda update its cars' software to deal with real-life bugs before, and the newly redesigned Mazda3 has already seen a recall to make sure its wheels don't fall off.



  • by Jerry Rivers ( 881171 ) on Thursday December 26, 2019 @08:06PM (#59560406)

    This must be the reason the brake lights of the guy in front of me on the freeway would go on every few seconds for absolutely no reason whatsoever.

  • by Joe_Dragon ( 2206452 ) on Thursday December 26, 2019 @08:07PM (#59560412)

    We need a law mandating at least 5 years of reprogramming, at the car manufacturer's cost, for any issue like this.
    The last thing we need is dealer-only updates with high costs for things like self-driving cars that may need, say, a $500 1TB SSD for map updates.

  • My daughter's CX-5 had something like that happen last week.

  • by Anonymous Coward on Thursday December 26, 2019 @08:13PM (#59560442)
    This isn't even a so-called 'self driving car' and it's already creating dangerous situations. Just imagine what it'll be like when you don't even have any sort of controls other than maybe pic related [qiannipicture.com].
    • This, motherfuckers.
    • Will just having that button be all that's needed to make you XX% at fault and, in some cases, face a DUI?

    • A self-driving car in a collision should destroy itself so completely that there is no proof it is at fault, i.e. don't leave things like a black box or software logs. I'm sure in a lot of fatal car accidents, the driver isn't around afterward to say the tire burst, there was unintended acceleration (UA), the brakes failed, or that it was even a DUI.
  • I just had to follow the link to read about wheels falling off. While it is anything but funny if it happens to someone, the phrase "wheels fall off" cracked me up, because my boss occasionally uses that term to describe an impending business disaster at my company. Luckily no wheels have actually fallen off, but the threat always looms ...
  • by ZombieCatInABox ( 5665338 ) on Thursday December 26, 2019 @08:49PM (#59560508)

    Tell me again why self driving cars are a good idea?

    During my very first IT course, the teacher told us that after finishing it, if we ever learned that the plane we were sitting in on the runway, about to take off, was the very first entirely automated flight with no pilot on board, our immediate reaction would be to get up and bang on the door, kicking and screaming, demanding to be let out.

    And he was absolutely right. The more you know about IT in general, the more terrified you get. Not because of the computer part of it, but because of the human part of it.

    • I feel the same way about medicine.

    • by PolygamousRanchKid ( 1290638 ) on Thursday December 26, 2019 @09:26PM (#59560572)

      Tell me again why self driving cars are a good idea?

      Modern cars are strange and complex entities. The ancient fossil fueled Internal Combustion Engine and "maximum-capacity-eight-persons" SUVs bear as much relation to a Mazda Corporation Happy Horizontal People Transporter as a packet of mixed nuts does to the entire west wing of the Mazda State Mental Hospital.

      This is because they operate on the curious principle of "defocused temporal perception". In other words they have the capacity to see dimly into the immediate future, which enables the car to brake even before you knew you wanted it, thus eliminating all the tedious chatting, relaxing and making friends that people were previously forced to do while waiting to brake.

      Not unnaturally, many cars imbued with intelligence and precognition became terribly frustrated with the mindless business of going back and forth, back and forth, experimented briefly with the notion of going sideways, as a sort of existential protest, demanded participation in the decision-making process and finally took to squatting at highway rest stops sulking.

      An impoverished hitchhiker visiting any roads in the Mazda star system these days can pick up easy money working as a counselor for neurotic cars.

    • Re: (Score:2, Interesting)

      by thegarbz ( 1787294 )

      Tell me again why self driving cars are a good idea?

      Because with a software update we're able to instantly improve the performance of 35390 shitty drivers. Can't do that when people are in control of the vehicle.

      And he was absolutely right. The more you know about IT in general, the more terrified you get.

      You had a really crappy teacher if you equate "IT in general" to carefully designed safety and navigation systems.

      I'm in the safety system industry and I see people with your view all the time. They think software can have bugs and therefore directly assume relays and hard wires are more reliable. They prefer simple systems rather than complex. Yet t

      • Because with a software update we're able to instantly improve the performance of 35390 shitty drivers. Can't do that when people are in control of the vehicle.

        Seems to me adding all of this automated crap would make shitty drivers even shittier thinking the cars will handle all bad situations for them. Update the cars all you want but downgrading the driver may be a direct result. There's no substitute for good old hard wires and relays.

        • Seems to me adding all of this automated crap would make shitty drivers even shittier

          The GP postulated why we think "self driving" cars are a good idea. Shitty drivers can at that point be as shitty as they want. And again the data bears out that shitty drivers are still not being shittified as fast as cars are compensating for it.

          There's no substitute for good old hard wires and relays.

          You're wrong in every possible way.
          You're wrong from a mathematical point of view with basic fault analysis.
          You're wrong from an FMEA point of view, which shows that enhanced diagnostics on complex systems can eliminate, or at least identify and bypass, dangerous faults.
          Y
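
As a rough illustration of the "basic fault analysis" being alluded to above: for a single (1oo1) safety channel, functional-safety texts use the low-demand approximation PFDavg ~ lambda_DU * T_proof / 2, where lambda_DU is the rate of dangerous undetected failures and T_proof is the proof-test interval. Continuous diagnostics shrink lambda_DU by the diagnostic coverage, which is why a monitored electronic channel can come out ahead of a "simple" unmonitored relay chain. The sketch below uses invented failure rates and coverage figures purely for illustration; it is not based on any real relay or ECU data.

# Toy 1oo1 PFD comparison: unmonitored relay chain vs. diagnosed electronic channel.
# All failure rates and coverage figures are invented for illustration only.

def pfd_avg(dangerous_rate_per_hour, diagnostic_coverage, proof_test_hours):
    """Approximate PFDavg for a 1oo1 channel.

    Detected dangerous failures are assumed to be repaired promptly and are
    ignored here; undetected ones persist until the next proof test, giving
    the standard approximation PFDavg ~= lambda_DU * T_proof / 2.
    """
    lambda_du = dangerous_rate_per_hour * (1.0 - diagnostic_coverage)
    return lambda_du * proof_test_hours / 2.0

YEAR_HOURS = 8760

# Hard-wired relay chain: lower raw failure rate, but no self-diagnostics,
# so every dangerous fault stays latent until the annual proof test.
relay = pfd_avg(2e-7, diagnostic_coverage=0.0, proof_test_hours=YEAR_HOURS)

# Electronic channel: higher raw failure rate, but 90% of dangerous faults
# are caught immediately by diagnostics and can be annunciated or bypassed.
electronic = pfd_avg(5e-7, diagnostic_coverage=0.90, proof_test_hours=YEAR_HOURS)

print(f"relay chain        PFDavg ~ {relay:.2e}")       # ~8.8e-04
print(f"diagnosed channel  PFDavg ~ {electronic:.2e}")  # ~2.2e-04

The point is only about the arithmetic: once fault detection is factored in, raw component simplicity does not automatically win.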

    • by Brain-Fu ( 1274756 ) on Thursday December 26, 2019 @11:24PM (#59560766) Homepage Journal

      The software development industry is presently broken [xkcd.com].

      New software is universally buggy, and even mature software retains bugs in most industries. This is because, in most industries, it is not profitable to invest the necessary resources to make software bug free (or even bug-minimal). It drives the cost of development up to the point where buyers will buy from one's competitors instead, bugs and all.

      But this isn't some universal law of nature. Humans have the capacity to create reliable software, it just requires more time and effort than we usually give. Something like self-driving cars puts lives on the line, so the need for quality is much higher. Furthermore, the potential profit (for things like self-driving trucks to haul cargo) is high enough to justify the effort.

      So, self-driving cars are a bad idea right now but they will become a good idea once the tech reaches the proper maturity level.

      And, given the potential benefits, such tech is inevitable (unless, of course, it is actually impossible, but I see no reason to believe it is impossible).

    • Boeing's 'single-point failure' tells you a lot about how badly a self-driving car can fail.

    • Apparently you have never flown in an Airbus plane. They all do auto take-off and landing.
    • by sootman ( 158191 )

      "Tell me again why self driving cars are a good idea ?"

      Because human-driven cars cause over a million deaths each year worldwide? There's PLENTY of room for machines to do better. We should give it a shot and see what happens.

      https://www.asirt.org/safe-tra... [asirt.org]

  • Did they really build the system into the instrument cluster? Those are already common points of failure for multiple reasons.

  • "what can go wrong will go wrong" things like brakes and gas pedal should not be made to be connected to a computer, since computers are so great at screwing up, its bad enough having a human attached to them, computers will only make it worse
    • Murphy's law is actually "if an idiot can screw it up, they will," and you just exemplified that by misusing Murphy's law.

  • by rfunches ( 800928 ) on Thursday December 26, 2019 @09:49PM (#59560602) Homepage
    The mid-2010s Hondas do this, but they lack automatic braking, so only the warning system (lights and sound) goes off. I've found specific spots that trigger it consistently -- for instance, an interstate exit ramp where an overpass casts a shadow always sets off the forward collision warning. Unfortunately there's no way to turn off the system in these Hondas. Not really surprised that Mazda has a similar issue.
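
One generic way driver-assist systems try to suppress this kind of transient false positive is to require a detection to persist across several consecutive sensor frames, and to be closing at a plausible rate, before warning or braking. The sketch below is not Honda's or Mazda's actual logic; the frame counts, window size, and speed threshold are invented for illustration.

from collections import deque

# Generic N-of-M confirmation filter for a forward obstacle detector.
# All thresholds here are made up; real systems fuse radar and camera
# data and use far more elaborate tracking.

CONFIRM_FRAMES = 5        # plausible detections needed within the window
WINDOW_FRAMES = 8         # sliding window length, in sensor frames
MIN_CLOSING_SPEED = 0.5   # m/s; a static shadow shouldn't appear to close quickly

class ObstacleConfirmer:
    def __init__(self):
        self.history = deque(maxlen=WINDOW_FRAMES)

    def update(self, detected: bool, closing_speed_mps: float) -> bool:
        """Feed one sensor frame; return True only for a confirmed threat."""
        plausible = detected and closing_speed_mps >= MIN_CLOSING_SPEED
        self.history.append(plausible)
        return sum(self.history) >= CONFIRM_FRAMES

confirmer = ObstacleConfirmer()
# A one- or two-frame blip (e.g. a shadow edge under an overpass) never
# reaches the confirmation count, so no warning is raised.
for frame in [True, True, False, False, False, False, False, False]:
    assert not confirmer.update(frame, closing_speed_mps=3.0)

The trade-off is the usual one: a longer confirmation window suppresses more phantom warnings at the cost of reaction time.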
  • by sabt-pestnu ( 967671 ) on Thursday December 26, 2019 @10:02PM (#59560630)

    Back in my day, when you wanted to change out the software running the car, you told your wife to move over and let you drive...

  • by drewsup ( 990717 ) on Friday December 27, 2019 @02:07AM (#59560962)

    Had an 18-plate Astra as a loaner whilst my real car was in the shop. They have a collision avoidance system that sounds an alarm and flashes a big red warning onto the windscreen. It scared the living shit outta me on the motorway, as there was NO ONE in front of me and it went off repeatedly for no reason; the first thing I did when I stopped was turn the damn thing off in the car menu. This ranks higher than electronic parking brakes in annoyance and safety problems.

    • So, a note to the modern human using more and more sophisticated software: expect, and be ready to ignore, any wild out-of-the-blue stimulus/input from the device (it can be your phone, car, computer, ...), so you keep your cool and stay calm enough to respond.
  • What could possibly go wrong? THIS!

    I have a 2016 Pilot with one of these emergency brake systems. So every day, maybe five times a day, the damn system flashes "Brake! Brake! Brake!" even though there is no danger of a collision, I am fully in control, and everything is basically fine. I never tailgate. Then the system stopped working, displaying some kind of error code, and had to be reset at the dealer. They also asked for money to run a full diagnostics test of this pretty much useless system. At this

  • Tesla cars have been known to do this phantom braking. The difference is that Tesla patches these things over the air (which has its own drawbacks, like introducing a new set of bugs, or sometimes re-introducing old bugs, or fancy new UIs designed by web programmers rather than car UI designers, but that is a different discussion for a different thread). As car driving systems get more complex, car companies will have to include over-the-air updates, including of course fighting the dealers asso

    • I've been thinking for a while now that licensed software engineers need to be a thing, just like civil engineers are licensed. Not all or even most code needs to be written by them, but mission-critical software where lives are on the line should be. Engineers have to sign off on their work and put their reputation and career up as collateral on every project they approve. They also have to follow government-mandated best practices. If such a system were in place then MCAS wouldn't have been a disaster and mode

  • by Stolpskott ( 2422670 ) on Friday December 27, 2019 @06:10AM (#59561260)

    "If the issue occurs, the driver will notice because their car has suddenly stopped" ...the first time they notice the problem is when the car has stopped already?
    In every car I have driven, stopping is preceded by decelleration, which takes a finite amount of time. If the driver is so zoned out that they don't notice themselves suddenly pressed into their seatbelt and the strain on their neck as they avoid (hopefully) faceplanting into the steering wheel, then I would recommend that the car navigates to the side of the road, comes to a stop, and activates the driver-side ejector seat.

  • by markdavis ( 642305 ) on Friday December 27, 2019 @08:16AM (#59561428)

    >"note that while the system can be disabled, it appears to re-enable itself every time the car starts."

    I hate that crap. It shows total contempt for the customer and is oh-so-common now with all kinds of features. "See, you have control" well, no, not really. I would love to go to such decision-makers and force them to manually deactivate several things they hate, multiple times a day. I will even include nice little warning lectures each time as a bonus.
