
Toyota's Killer Firmware

Posted by Soulskill
from the skynet-draws-first-blood dept.
New submitter Smerta writes "On Thursday, a jury found Toyota's ECU firmware defective, holding the company responsible for a crash in which a passenger was killed and the driver injured. What's significant about this is that it's the first time a jury heard about software defects uncovered by a plaintiff's expert witnesses. A summary of the defects discussed at trial is interesting reading, as well as the transcript of court testimony. 'Although Toyota had performed a stack analysis, Barr concluded the automaker had completely botched it. Toyota missed some of the calls made via pointer, missed stack usage by library and assembly functions (about 350 in total), and missed RTOS use during task switching. They also failed to perform run-time stack monitoring.' Anyone wonder what the impact will be on self-driving cars?"
This discussion has been archived. No new comments can be posted.


  • by Anonymous Coward on Tuesday October 29, 2013 @04:29PM (#45272869)

    Those working on self-driving cars, and those who are watching the technology, already know that any such car would need an absolutely 100% rock-solid OS.

    This changes nothing.

  • by dclozier (1002772) on Tuesday October 29, 2013 @04:31PM (#45272879)
    The owner of a self-driving car will have had to accept the EULA [] and agree not to hold the manufacturer liable for software defects. (half joking but I wouldn't rule it out)
  • by freakingme (1244996) on Tuesday October 29, 2013 @04:35PM (#45272913)
    Sure, they will be safer. Just like in the aviation industry, where each incident/crash is investigated meticulously, and flying has become safer ever since 1903. With non-self-driving cars, 99% of incidents were caused by human error. Now no more, so we can fix it!
  • by neoritter (3021561) on Tuesday October 29, 2013 @04:38PM (#45272941)
    If there's no human fallback or ability to override the computer's control of the car, I'll never drive one. I don't think this will change anything, except maybe give the people rushing toward self-driving cars some pause. Every developer knows the risks of self-driving, computer-controlled cars (and if they don't, they're naive). Between human error in programming and human maliciousness, there are two camps: people who think they can overcome the possibility of putting a semicolon in the wrong place and prevent hackers from compromising the software's integrity, and people who realize the first group is fooling itself.
  • Re:The Toyota Way (Score:5, Insightful)

    by div_2n (525075) on Tuesday October 29, 2013 @04:40PM (#45272953)

    Your post demonstrates a complete lack of understanding of what JIT manufacturing (i.e. lean) is and what it tries to accomplish. Hint: it's not about doing more with less. Further, you either willingly fail to mention Kaizen (continuous improvement) or just aren't aware that THIS is the heart and soul of the true Toyota Way.

    Whatever the reasons they failed in software engineering, neither JIT nor Kaizen would be to blame, because neither of those tries to, nor should, translate to "engineer badly".

  • by neoritter (3021561) on Tuesday October 29, 2013 @04:49PM (#45273037)
    Or we could present this as the new Therac-25 and learn from it. :)
  • by Anonymous Coward on Tuesday October 29, 2013 @04:49PM (#45273043)

    Won't do any good. I can agree to a hold-harmless provision (and, despite the language of the EULA, such clauses are not actually universal). What I cannot do is agree to it for someone else. You'd better believe a pedestrian hit by my self-driving car can sue the living daylights out of them. Heck, as previously mentioned, depending on what the particular problem is, *I* can still sue them.

  • by NatasRevol (731260) on Tuesday October 29, 2013 @04:50PM (#45273047) Journal

    I'd be happy with a car OS that kills fewer than 30,000 people per year. []

    Or even causes fewer than 10 million accidents a year. []

  • by Impy the Impiuos Imp (442658) on Tuesday October 29, 2013 @04:56PM (#45273145) Journal

    Not necessarily. If said cars kill fewer people than humans, it's still an improvement that should be done.

    The problem is lawsuits. A drug that saves 90% of cancer patients but independently kills 1 in 10 will have its ass handed to it in civil court -- assuming it makes it past the FDA.

    Would that such outcomes analysis were applied to government activities and to civil-lawsuit lawyers' claims of bettering the system as they fatten their wallets.

  • by mjr167 (2477430) on Tuesday October 29, 2013 @05:03PM (#45273217)
    You don't trust the engineer, but you trust the 16 year old girl trying to apply makeup and text her boyfriend while driving on the highway?
  • by Anonymous Coward on Tuesday October 29, 2013 @05:05PM (#45273235)

    As an old mechanic: if you believe for one second that autonomous cars are going to be maintained and inspected the way that planes are, then I've got a bridge to sell you.

    The question to me is not whether we can build these things; it's whether we can reliably maintain them in any capacity. As a mechanic, I would take on liability for the parts I repaired. Can you imagine the legal infrastructure required to allow someone other than the manufacturer to maintain and repair these things? How do you compensate for a wheel bearing going bad, or a brake that is dragging, or any other small thing that will throw the whole calibration off?

  • Re:The Toyota Way (Score:3, Insightful)

    by thesupraman (179040) on Tuesday October 29, 2013 @05:11PM (#45273295)

    Actually, there is absolutely zero proof that they did fail.
    NASA certainly could not find any way to fault the system.

    What this decision is based on is a bunch of technical argument that Toyota could have tried harder to prove the system could not fail, with absolutely zero proof that it does or even can fail. No procedure to make the software fail was presented, and no theory of a set of inputs that could produce the theorized output was presented; only a critique of their testing and analysis procedure that poked a few holes in it.

    This is a VERY concerning direction for programmers in the USA, since complex software by definition cannot be proven correct (at least no known way currently exists). It opens the door for all sorts of development-process-based litigation, which is a very, very bad direction for things to take.

    Again, so far ZERO evidence, proof, or test case has been provided that the software is in any way responsible for this incident.

  • by Rising Ape (1620461) on Tuesday October 29, 2013 @05:24PM (#45273417)

    Yes, but software failures like this are a very rare cause of accidents. Vastly more common is human error, which your classic car won't help with. However, when some human cockup results in a crash, you'll be more likely to be injured or killed thanks to the much poorer crash safety of old cars. This easily outweighs the tiny reduction in risk from having no software.

  • by es330td (964170) on Tuesday October 29, 2013 @05:28PM (#45273471)
    The problem with "a new car" is that some of the functionality has been taken away from the driver. In a classic car, if I put it in neutral, the gears disengage, especially if it is a stick. I may blow the engine if I push in the clutch while the throttle is stuck, but power will be disconnected from the drive wheels. If I turn the key counterclockwise, the car WILL shut off. In a push-button-start, drive-by-wire car, the driver uses physical inputs to tell the computer to do something and then the computer does it. If due to a software glitch it suddenly decides to max the throttle, there isn't much I can do as the driver to stop it, at least not in the very limited time I have before I collide with another car or a wall. It isn't the probability of collision with which I have a problem, but the fact that significant parts of the control of a two-ton machine powered by flammable fuel are put under the control of a computer program.
  • by SleazyRidr (1563649) on Tuesday October 29, 2013 @05:29PM (#45273479)

    Yeah, the point of crumple zones is that the car gets damaged instead of the people inside. In fender benders old cars do better, but in a serious accident you'll be hurt worse in an older car. That doesn't stop me from using an old car as my primary transportation, but I am aware that I am taking a risk doing so.

  • by ttucker (2884057) on Tuesday October 29, 2013 @05:43PM (#45273589)

    Good points. I guess the 1949 Chevy truck my dad and I rebuilt back in the 1990s wasn't very safe for passengers. You'd get thrown from it or something. But it sure was safe itself. One time a car came flying around the corner too close and slammed into the left rear wheel well of the truck. The car was totaled. The truck had a small dent on the fender. (The metal is so much thicker on those old cars that we had to use a sledgehammer instead of a normal body-work hammer to take the dent back out.) But again, if we had been IN the truck when that happened, we probably would not have fared so well.

    Modern steel is much stronger, the cars just crumple because they are supposed to.

  • by TapeCutter (624760) on Tuesday October 29, 2013 @06:39PM (#45274171) Journal
    A big red button on the dash marked "emergency stop". As I said elsewhere, I've experienced a jammed mechanical throttle on a Honda 750 motorbike. Because I had a clutch, the incident was no danger to anyone or anything except the engine, which screamed its guts out before I turned it off.
  • by Frobnicator (565869) on Tuesday October 29, 2013 @07:02PM (#45274393) Journal

    Obviously it can fail, but it's a soft fail. The engine won't run, or more likely won't run well. Sudden acceleration or unstoppable engine though? Forget it. With the throttle plate closed there's no way you can get any more than the power produced at idle, no matter what the ECU does.

    That is exactly the thing that makes this jury verdict so suspicious.

    The driver was 76 years old at the time. This crash was subject to an NTSB investigation, and investigators found no evidence that it was a software fault or a hardware fault. The crash recorder says the driver pushed the accelerator and was not pushing the brakes, and then the car was hit.

    And most interestingly from TFA is the last line. Ten of the 12 jury members said they wanted to punish Toyota.

    If he was pushing on the brakes he could have probably overcome most of the force of a sudden accidental acceleration. If he had more time there were other options like shifting to neutral, but he was approaching an intersection.

    When I look at it, an older driver and vehicle recording systems saying the accelerator was pressed and the brakes were not, investigators finding no evidence to support the claim of a software failure, and then the jury stating they want to punish Toyota, I don't see this as a good verdict.

  • More Details (Score:5, Insightful)

    by rabtech (223758) on Tuesday October 29, 2013 @11:54PM (#45276607) Homepage

    Couple of details here:

    Toyota had no software testing procedures, no peer review, etc. The secondary backup CPU code was provided by a third party in compiled form, Toyota never examined it.

    Their coding standards were ad hoc and they failed to follow them. Simple static analysis tools found massive numbers of errors.

    They used over ten thousand global variables, with numerous confirmed race conditions, nested locks, etc.

    Their watchdog merely checked that the system was running; it did not respond to task failures or CPU overload conditions, so it would not reset the ECU even if most of the tasks crashed. Since catching exactly those failures is the basic function of a watchdog, they may as well not have had one.

    They claimed to be using ECC memory but were not, so everything from single-bit errors to whole-page corruption went undetected and uncorrected.

    A bunch of logic was jammed into one spaghetti task that was responsible for calculating the throttle position, running various failsafes, and recording diagnostic error codes. Any failure of this task went undetected by the watchdog and disabled most of the failsafes. With no ECC and the stack issue below, a single bit error could clear the runnable flag for this task and cause it to stop being scheduled for CPU time. No error codes would be recorded.

    They did not do any logging (eg of OS task scheduler state, number of ECU resets, etc), not even in the event of a crash or ECU reset.

    The code contained various recursive paths and no effort was made to prevent stack overflows. Worse, the RTOS kernel data structures were located immediately after the 4K stack, so stack overflows could smash these structures, including disabling tasks from running.

    They were supposed to be using mirroring of variables to detect memory smashing/corruption (write A and XOR A to separate locations, then compare them on read to make sure they match). For some inexplicable reason they were not doing this for certain critical variables, including the throttle position, so memory corruption could write a max throttle value and go undetected.

    Instead of using the certified, audited version of the RTOS like most automakers, they used an unverified version.

    Thanks to not bothering to review the OS code, they had no idea the OS data structures were not mirrored. A single bit flip can start or stop a task, even a life-safety critical one.

    These are just some of the massive glaring failures at every level of specifying, coding, and testing a safety-critical embedded system.

    I am now confident in saying at least some of the unintended acceleration events with Toyota vehicles were caused by software failures due to gross incompetence and negligence on the part of Toyota. They stumbled into writing software, piling hack on top of hack, never bothering to implement any testing, peer review, documentation, specifications, or even the slightest hint that they even considered the software something worth noticing.

"If that makes any sense to you, you have a big problem." -- C. Durance, Computer Science 234