
Stack Overflow Could Explain Toyota Vehicles' Unintended Acceleration 664

New submitter robertchin writes "Michael Barr recently testified in the Bookout v. Toyota Motor Corp lawsuit that the unintended acceleration in the Toyota Camry may have been caused by a stack overflow. Recursion overwrote critical data past the end of the stack and into the real-time operating system's memory area, leaving the throttle in an open state after the process that controlled it was terminated. How can users protect themselves from sometimes life-endangering software bugs?"
  • Go Amish? (Score:5, Insightful)

    by dbc ( 135354 ) on Friday February 21, 2014 @08:03PM (#46307769)

    "How can users protect themselves from sometimes life endangering software bugs?"

    Amish buggies typically don't have software throttle failures. Run-away horses can be an issue... and actually having to share the road with dipshit drivers who don't realize how many slow moving vehicles (not just buggies) are out in farm country is a real danger.

    Seriously, software has bugs. Mechanical throttle linkages can stick, too. Life has risks.

  • Re:Wow (Score:0, Insightful)

    by Anonymous Coward on Friday February 21, 2014 @08:10PM (#46307829)

    Is there anything Stackoverflow can't do?

    Fix beta?

  • Re:Go Amish? (Score:4, Insightful)

    by Anonymous Coward on Friday February 21, 2014 @08:22PM (#46307905)

    Coming from the aerospace industry: you cannot have software that has bugs. And if there is the possibility of a software bug, you have to prove that you can mitigate its effect in hardware. So just saying "software has bugs... life has risks" isn't an acceptable answer (in my opinion). We have to remember this is not an apples-to-apples comparison. Just because traditional consumer software always has bugs in it (which are acceptable) doesn't mean they are acceptable in other industries. Considering that this failure puts someone's life at risk, I would think it should be considered unacceptable in the automotive industry as well.

  • Can != did (Score:4, Insightful)

    by 140Mandak262Jamuna ( 970587 ) on Friday February 21, 2014 @08:24PM (#46307921) Journal

    "We've demonstrated how as little as a single bit flip can cause the driver to lose control of the engine speed in real cars due to software malfunction that is not reliably detected by any fail-safe," Michael Barr, CTO and co-founder of Barr Group, told us in an exclusive interview. Barr served as an expert witness in this case

    Emphasis mine.

    Yes, a single bit flip can cause unpredictable behavior in any code. You could say that without any analysis. A single mistake in sign can get you a goose egg on an Algebra paper. So many people could have won the lottery if only one digit were different. These are well known. But can != did. Did the stack actually overflow? Did it actually happen? That is the question.

  • by EmperorOfCanada ( 1332175 ) on Friday February 21, 2014 @08:24PM (#46307923)
    Quite simply, absolute control should not be handed over to the computer. Doing something like pulling on the handbrake should physically cut the throttle. Or stomping on the brakes should activate a simple solenoid that cuts the throttle. This mechanism should be 100% separate from the computer and override most computer outputs.

    I see this as critical in a driverless car. There needs to be a way for people to pull the plug, and there needs to be a way for people to phone in an emergency. So if someone is lying in a pothole being run over by car after car, or the bridge is failing, there needs to be a way for 911 to say that a stretch of road is now cut off. The key is that this cannot be abused by officials. I do not want my car grinding to a halt because the police are looking for some runaway or a bank was robbed.
  • by Futurepower(R) ( 558542 ) on Friday February 21, 2014 @08:57PM (#46308117) Homepage
    Technically knowledgeable people often give very poor names to their efforts.
  • by hermitdev ( 2792385 ) on Friday February 21, 2014 @08:59PM (#46308133)

    I've known a lot of high-quality developers over my 15 years of professionally developing software. The reason I don't want an automated car is because of these people. People make mistakes, intentionally or otherwise. There are unforeseen circumstances that the software may not understand. A foreseen circumstance: I've yet to see a demo of an automated car navigating anything resembling an icy surface (safely or otherwise), let alone in stop & go traffic in a city such as Chicago, where such things are quite common.

    Yeah, but we know today who pays: the insurance company of the at-fault driver (provided they have the legally required insurance - I think collision is required in all US states). Failing that, the at-fault driver. Failing that, the dead at-fault driver's estate.

    The question at hand is, in the case of an automated car, who is at fault (when the automated car is deemed to have caused the accident)? The manufacturer, because, it must have been a design/implementation flaw? The owner? The driver (because owner/driver aren't necessarily the same)? It becomes more difficult when you divest yourself from current paradigms of car transport. Oh, I sent my 6 year old daughter to school as a passenger, and along the way it ran over someone. There was no driver (the daughter was a passenger), but the car still killed someone. Am I at fault because I ordered the car to make the trip? Is the manufacturer at fault because the car didn't detect and prevent the collision that killed the other party? Is my 6 year old daughter at fault, because she was the only human occupant? This is the question that is being posed.

  • by Demonantis ( 1340557 ) on Friday February 21, 2014 @09:05PM (#46308173)
    That is a software failure. It isn't failing safe at all. A watchdog timer, of sorts, should be running that would detect a failed sensor assembly.
  • Re:Go Amish? (Score:5, Insightful)

    by amorsen ( 7485 ) <benny+slashdot@amorsen.dk> on Friday February 21, 2014 @09:06PM (#46308181)

    Killing the ignition also means killing power steering and power braking. There is a quite widespread belief that it can also engage the steering wheel lock, but AFAIK no one has been able to name a car where that happens so far. The next challenge is that in many modern cars the ignition switch is just a button which is handled in... software. You could throw the key out of the window and wait for the anti-theft device to kill the fuel supply, but that does not seem like something that most people would try.

    In most cars you can put the gear box in neutral. The car will likely have a rev limiter (possibly in software, but it might still work). Worst case the engine breaks, but in almost all cases that would not be fatal to the people in the car.

    However, in almost all cars, when not going down a steep hill, the brakes are actually more powerful than the acceleration. Just do not let off the brakes once you get the car slowed down and you think things are under control -- then the brakes overheat and you have a stuck accelerator combined with no brakes, and that has killed at least one driver already.

  • Re:Go Amish? (Score:5, Insightful)

    by arkhan_jg ( 618674 ) on Friday February 21, 2014 @09:15PM (#46308219)

    Even in the aerospace industry, there are software bugs. Very few, yes, because a lot more time and money is spent to track them down. There are mechanical failures too, despite the best engineering efforts. But anything we build has the potential to be flawed somewhere in the process. That's why we still put highly trained pilots in the things, for when something goes wrong - and even then, human error causes unintended faults, sometimes catastrophically.

    If a car cost as much as a jet, and drivers went through as much training as a passenger pilot - and had to have a co-driver at all times - we'd be far safer on the roads.
    After all, the vast majority of car crashes are driver error. And failure modes when you're at 30mph on wheels are a lot better on the whole than when at 30,000 feet. But if we built cars to the same standard, and held drivers to the same standard as aerospace engineering, only the rich could afford to.

    There's a trade off between acceptable risk, and cost. Even though the designs get safer every year, maybe we allow too much risk in drivers and their cars. But absolute flaw free cars? Impossible.

  • by JMZero ( 449047 ) on Friday February 21, 2014 @09:16PM (#46308223) Homepage

    People make mistakes, intentionally or otherwise

    Yes, people do make mistakes. Often while driving. The test shouldn't be whether automated cars make mistakes, but rather whether they do better than an average driver. Can they deal with icy roads as well as an average driver? That bar's pretty low, even here in Edmonton.

    Once they've reached that average competence and start being deployed, they'll also improve rapidly over time; computers have the potential to be much safer drivers than humans. They'd know where other cars are and where they're going, they'd be able to apply brakes to wheels independently with lightning reactions, and would not be subject to health conditions, intoxication, aging, or inexperience.

    I'm not sure how far off we are, but it's definitely coming.

  • Re:Live in a cave (Score:0, Insightful)

    by Anonymous Coward on Friday February 21, 2014 @09:20PM (#46308239)
    You're one to talk, you can't tell break from brake. Idiot.
  • by ebno-10db ( 1459097 ) on Friday February 21, 2014 @09:25PM (#46308269)

    I've known a lot of high-quality developers over my 15 years of professionally developing software. The reason I don't want an automated car is because of these people. People make mistakes, intentionally or otherwise.

    When it comes to true high-rel software, like that written to DO-178B Level A (an avionics software standard used for things like fly-by-wire) it's almost never the software per se that's at fault. The stuff is amazingly good. It's also amazingly expensive to write and test. You might also find it frustrating because it brings new meaning to the idea of conservative design. For example, I don't think it allows recursion. I know it doesn't allow dynamic allocation.

    There are unforeseen circumstances that the software may not understand.

    That's more the point. When things like fly-by-wire systems have problems, it's almost never the software itself, but something that was unforeseen by the system designers.

    A foreseen circumstance: I've yet to see a demo of an automated car navigating anything resembling an icy surface (safely or otherwise), let alone in stop & go traffic in a city such as Chicago, where such things are quite common.

    Agreed. This technology is interesting, but it's way over-hyped. It's impressive to be able to drive on a nice clear day, but a far cry from the real world, even in Silicon Valley, let alone Chicago.

    Ever hear of the flying cars that in the 1960's they said we'd all have soon? I haven't seen any lately. I suspect driverless cars robust enough not to require human intervention when the going gets tough may be developed on the same schedule.

  • Re:Live in a cave (Score:5, Insightful)

    by D'Sphitz ( 699604 ) on Friday February 21, 2014 @09:29PM (#46308297) Journal

    During the trial, embedded systems experts who reviewed Toyota's electronic throttle source code testified that they found Toyota's source code defective, and that it contains bugs -- including bugs that can cause unintended acceleration.

    I'm currently weighing your hunch against the industry experts who reviewed the source code, identified specific flaws in detail in an 800+ page report, and testified to their findings, presumably under oath. I'm on the fence here, it's a tough call. I'll let you know what I decide.

    Although I'll admit I get the feeling that there is some kind of conspiracy going on here, probably related to an Obama Muslim invasion. I mean, what are the odds that the "several hundred [people] contending that Toyota's vehicles inadvertently accelerated" all happened to be driving Toyotas with the same electronic throttle control system?

  • by phantomfive ( 622387 ) on Friday February 21, 2014 @10:04PM (#46308559) Journal

    he was sitting at a stop sign waiting for a train to pass when the car inexplicably accelerated and lurched forward.

    Why did he take his foot off the brake?

  • by Cassini2 ( 956052 ) on Friday February 21, 2014 @10:19PM (#46308661)

    Almost no safety systems in Ontario have been designed or written by a PEO-licensed engineer. The PEO is the same organization that tried to get Microsoft to stop using the term "Microsoft Certified Systems Engineer (MCSE)" and largely lost. If you start analyzing real and deployed systems, you will be shocked at what you find.

    Yes, there are a few very well designed machines out there that do hardware and software interlocks properly, and in an obviously safe fashion. These are the exception, and I am delighted to find the few exceptions that exist.

    However, if you want excellent examples of obviously unsafe things, consider:

    - The gas pumps at Shell, Esso, and Petro-Canada. How many brands have an Emergency Stop button? One?

    - Toyota cars have a push to start button that is also a push and hold to stop button. So how do you stop the car quickly? Shouldn't a car that has push-button start, also have a push-button stop, that is a different button and works quickly? Why would Toyota follow the Microsoft standard of using a start button to stop, instead of following the very well thought out emergency stop button standards?

    - Hospitals have implemented a number of computer systems that are networked and make the job of nurses quicker, easier and more productive. This reduced nursing costs considerably, and fewer nurses are looking after more patients. However, these systems are not reliable, and the official backup plan is that a nurse will step in and do the job manually if the system fails. Unfortunately, many of these core systems are also running on Microsoft Windows (often Windows XP). It would take only one virus, or one bad update written by a non-engineer, to wipe out many core systems. A major hospital had its Internet-linked systems disrupted because too many people watched Olympic hockey over the critical internal network. Has any engineer approved any of this? Does any hospital have enough nurses to cover in the event of a computer failure?

    - Most servo-motor drives are sold with a "not recommended that power be cut by an emergency stop/safety system" warning buried somewhere in the documentation. Suppose this warning is ignored, braking resistors are used, and power really is cut. Most motors will follow an exponential stopping curve and appear to coast to a stop. A mechanical engineer doing a PSHSR (Pre-Start Health and Safety Review) will expect the machine to stop quickly, not coast. The cheapest way to do that is to brake dynamically into a braking resistor under full software and transistor control. The second cheapest way is to use a parking brake, but those are not rated for safety, and only a fraction of the servo-motor market uses parking brakes anyway. How many PEO-licensed mechanical engineers doing PSHSR reviews have passed systems with incorrect software E-STOP circuits and safety circuits, and failed E-STOP circuits and safety circuits that cut power to the motors in hardware?

    Do not depend on the PEO and statutes to keep you safe ...

  • by GrahamCox ( 741991 ) on Friday February 21, 2014 @10:36PM (#46308745) Homepage
    They didn't speculate. They analysed the source code, which they had clean-room access to, as well as the actual compiler and test harness used by Toyota. Toyota testified that the stack utilisation was 41%, whereas the analysis showed that it was actually 97% *before* the recursion was taken into account. It looks pretty certain that stack overflow could occur. Following the stack are key system structures used to control the scheduling of threads on the CPU, and damage to these structures could cause one or more of the threads to never get any scheduled time. One of the threaded tasks not only controls the throttle, but also the failsafes for some scenarios, and also the code that writes fault codes to the battery-backed RAM. Basically, if that task dies, then the throttle is left uncontrolled, the failsafes don't kick in, and no fault codes are written that would reveal the problem after the fact. It's a terrible design; a disaster waiting to happen.

    Users can protect themselves against this sort of thing by not buying a Toyota until they are compliant with the relevant standards. Only hurting them in the marketplace will get them to fix this problem.

    The testimony from both expert witnesses, Barr and Koopman, is now a matter of public record and actually makes fascinating reading - it'll be especially interesting to computer guys because it goes into a lot of detail about the code design, though frequently translated into layman's language for the benefit of the court. It's going to go down in history, I reckon, as a classic case of how not to design a safety-critical system.

    A great summary and links to the public court documents can be found here: http://www.safetyresearch.net/2013/11/07/toyota-unintended-acceleration-and-the-big-bowl-of-spaghetti-code/ [safetyresearch.net]
  • by Anonymous Coward on Friday February 21, 2014 @11:32PM (#46308969)

    I've worked with embedded systems for the past 8 years, right out of university. You know what I've noticed?

    Most embedded code is terrible, and the programmers who write it are completely full of themselves. And it's not because "it's built to be reliable", or "it's maintainable", or whatever else. It's because the programmers who wrote the code barely even know C. They walk around on their high horse thinking they are vastly superior to every other programmer because they "work in embedded", are "close to the bits", and whatever else. And when they encounter a function pointer, or some CPP statements, they are taken aback and immediately start in on "how dangerous" it is.

    Here is a great gem of an exchange:
    Me: the variable you declared in that .c file is not visible to the other compilation unit - you need to make it visible. That's why it won't build.
    Them: make it 'static'!
    Me: that won't work - you're making the problem worse.
    Them: oh, these C compilers are so stupid! They didn't used to be like this! Somebody needs to fix C!

    Bad programmers are bad programmers. If you write "pointer heavy" code and you are getting out-of-bounds faults all the time - you're doing it wrong. If you are getting alignment errors, you're doing it wrong. And if you have global variables scattered all over the place, tangled up in hundreds of horribly named flags, and are marking every variable as 'volatile' because you think it means "don't optimize", you are doing it wrong.

    Sorry, I obviously disagree with you (quite strongly). Shitty programmers are shitty programmers, young or old. It has nothing to do with today's education.

  • Re:Can != did (Score:4, Insightful)

    by Cochonou ( 576531 ) on Saturday February 22, 2014 @01:04AM (#46309245) Homepage
    Did it actually happen? That is the question.

    From an engineering standpoint, it's not really the question - if there is a design flaw such that the system can fail with non-negligible probability, it will eventually fail. Bits flip every day, everywhere, but there should be mitigation in place to take care of that (at least a watchdog).
  • by firewrought ( 36952 ) on Saturday February 22, 2014 @01:43AM (#46309343)

    Technically knowledgeable people often give very poor names to their efforts.

    I thought "Stack Overflow" was great branding for a website aimed at helping programmers solve technical problems. It's a distinctive, cheeky in-reference understood by its intended audience. (And honestly, it didn't hurt that most developers enjoy being made to feel clever about themselves.) That's what a brand is supposed to do [about.com], and it partially* explains their overwhelming success [alexa.com]. And hey, much better branding choice than ExpertSexChange.com!

    *Of course, branding is just one of many things they did right. They also filled a unique niche, understood their community (because it was started by programmers, for programmers), and made the site super-easy to use by (here comes the important part...) NOT crapping over the UI with a fake paywall that sought rent for years' worth of good-faith user contributions. However, they are sort of starting to be dicks about subjective questions (such as help with API choices, etc.). That may provide a niche for a new competitor to fill...

  • Re: Live in a cave (Score:4, Insightful)

    by viperidaenz ( 2515578 ) on Saturday February 22, 2014 @01:47AM (#46309353)

    I even found some information for you:
    http://www.caranddriver.com/fe... [caranddriver.com]

    A 268-horsepower Toyota Camry can stop from 70 mph in 190 feet with the engine at full throttle - only 16 feet longer than with no throttle. At 100 mph the stopping difference was 88 feet.

    When they tested a full throttle stop from 120mph, the car slowed to 10mph before the brakes overheated.

    Perhaps you should put your foot harder on your brake pedal.
    You're American though, so your Impala is probably an automatic. In first gear, with the torque converter multiplying the torque, it may be able to creep slowly forward even with the brakes fully applied.

  • by Attila Dimedici ( 1036002 ) on Saturday February 22, 2014 @09:17AM (#46310465)
    Actually, the best analysis of the situation I have seen came from someone who had investigated a similar problem reported some years back for cars with purely mechanical throttle linkages. They had studied that situation and discovered that the overwhelming majority of those experiencing the problem were in a particular age range (I forget now if it was 55-65, or somewhat older). Further analysis revealed that most people go through a kinesthesia change during that period (kinesthesia is the awareness of where a body part is based on the sensations you receive from it). While going through that change, people often believe their feet are positioned a few inches from where they actually are. As a result, drivers in this age range are likely to be positive that their foot is on the brake when it is actually on the accelerator. The interesting thing is that once the person becomes used to the change, they are perfectly capable of being aware of where their foot is once more. The person who did that original analysis looked at the Toyota data and found that the majority of reported cases involved drivers in the same age range. When he removed the data points that looked suspect (drivers who appeared to be cashing in on the publicity, either for money or to explain away their own bad behavior; accidents where no one in the vehicle survived but the problem was invoked to explain irrational behavior on the part of the driver; etc.), the overwhelming majority of the remaining cases were in this age range, and most of the rest were inexperienced drivers.
  • Re:Can != did (Score:4, Insightful)

    by AmiMoJo ( 196126 ) * on Saturday February 22, 2014 @10:11AM (#46310653) Homepage Journal

    A watchdog isn't the best solution to this problem. You don't really want your ECU to reboot randomly due to a fixable error. Just use ECC RAM and keep redundant copies of critical values in memory.

    There is no way this issue could have caused the number of events people are claiming. Such a bit flip has never been observed in real life (they used a debugger to simulate it), and the chance of that particular bit, out of millions, being corrupted is extremely low. If bit flips were common enough to cause this, we would see the effects as other systems like the dashboard display and head unit crashed, not to mention other parts of the ECU.
