Security Software Google

Hackers Could Blow Up Factories Using Smartphone Apps (technologyreview.com) 125

An anonymous reader quotes a report from MIT Technology Review: Two security researchers, Alexander Bolshev of IOActive and Ivan Yushkevich of Embedi, spent last year examining 34 apps from companies including Siemens and Schneider Electric. They found a total of 147 security holes in the apps, which were chosen at random from the Google Play Store. Bolshev declined to say which companies were the worst offenders or reveal the flaws in specific apps, but he said only two of the 34 had none at all. Some of the vulnerabilities the researchers discovered would allow hackers to interfere with data flowing between an app and the machine or process it's linked to. So an engineer could be tricked into thinking that, say, a machine is running at a safe temperature when in fact it's overheating. Another flaw would let attackers insert malicious code on a mobile device so that it issues rogue commands to servers controlling many machines. It's not hard to imagine this causing mayhem on an assembly line or explosions in an oil refinery. The researchers say they haven't looked at whether any of the flaws has actually been exploited. Before publishing their findings, they contacted the companies whose apps had flaws in them. Some have already fixed the holes; many have yet to respond.
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • FUD (Score:5, Insightful)

    by Anonymous Coward on Thursday January 11, 2018 @11:38PM (#55912757)

    Oh look, it's the "hackers can bomb you with your own computer" headline again.
    This time featuring smartphones and apps. Oh boy, that changes everything!

    • by hey! ( 33014 )

      Well, factories are full of stuff that can kill people and controlling those things with something an operator might treat as a personal device certainly increases the attack surface.

      So maybe we're not talking about new possibilities here, but we may be talking about a new set of probabilities.

      • Factories are full of stuff that can kill people, and preventing them from killing people has nothing to do with controlling them, and everything to do with independent safety mechanisms.

        Any modern plant maintained to any HSE or OSHA minimum standards would allow the control system to do whatever the hell it wants without blowing something up or killing anyone.

        Sure there's a shutdown risk, but the major risks should be controlled in a way independent of something someone at a console could do.

        • by mushero ( 142275 )

          Ah, yes & no - those protections you speak of live in the PLC and controller code, which may well be changeable via these apps, or via vulnerabilities exposed to or by these apps.

          Of course, we try to ensure no console/operator can blow things up, but they can do many bad things, like mix explosive chemicals, run at unsafe speeds/temps with various material mixtures, over-tension, etc. The control system can't know everything in complex systems.

          Plus lots of systems have manual modes and sequencing that

            Ah, yes & no - those protections you speak of live in the PLC and controller code, which may well be changeable via these apps, or via vulnerabilities exposed to or by these apps.

            No and no. There's no safety-systems vendor in the world that provides an "app" that can write to a safety system, and PLCs and controller code are far from the only systems. Thermal protection for machines is often independent of the safety system and the controllers; for electrical equipment it sits in the electrical-protection domain, even for things like temperature. For pressure protection there are relief valves and bursting discs. For flow protection we have check valves (which admittedly spend more time in a jammed state

            • While the article doesn't give exact details on what apps they looked at, the category seems to be 'Industrial equipment control apps'. This could mean a lot of things - these could be apps that just act as a remote control interface to a machine unable to cause the machine to act unsafely.

              ...Or it could be an app used to update/program the machine's firmware... in which case all bets are off when you can inject whatever you want into the brains of the machine, if any of its safety features are dependent on s
    • Re:FUD (Score:4, Insightful)

      by Darinbob ( 1142669 ) on Friday January 12, 2018 @01:27AM (#55913085)

      Why would any important system be controlled by a smartphone app anyway? That's just dumb. And why would these apps be put on Google Play for the public to see? No operator is going to use an app to control machinery; instead they're going to look at the dials, use an official computer on-site, and so forth. Maybe in the IT world the sysadmin works from home, but in any mission-critical application the workers are always on site.

      Any apps used are likely for field service workers to get a quick update (what jobs are left to do, verify that changes are being propagated before packing up, etc.). Even then, have you tried using a smartphone or tablet while wearing safety gloves?

      It would be nice to see some examples of the kind of apps that are being used this way in the article.

      • Raw access to writing back-end registers would seem like an extremely odd design choice to be sure. There are apps that can initiate pre-programmed sequences (with safety interlocks handled upstream), adjust setpoints (with range checking still handled upstream), and pull telemetry and production data.

        I can see how you could be a nuisance via tablet/smart phone app, but hopefully it would at least require a password. Re-programming safety checks though seems like terrible design.
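
        The "range checking still handled upstream" idea mentioned above can be sketched in a few lines. This is only an illustrative model: the tag name, limits, and function are all invented for the example, and no vendor's actual API is implied.

        ```python
        # Illustrative sketch of upstream range checking: the app may request
        # any setpoint, but the server/PLC side clamps it to a safe envelope
        # before acting on it. Tag names and limits are made up.
        SAFE_RANGE = {"reactor_temp_c": (20.0, 180.0)}

        def apply_setpoint(tag: str, requested: float) -> float:
            lo, hi = SAFE_RANGE[tag]
            # Clamp rather than trust the client: a compromised app can at most
            # push the setpoint to the edge of the safe range, never beyond it.
            return min(max(requested, lo), hi)
        ```

        With this discipline, a rogue app asking for 500 degrees simply gets the 180-degree ceiling; the danger the researchers describe arises when the check lives only in the app itself.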

        • I think there's a group of FUD people out there with regards to SCADA, smart grids, or even embedded systems in general. So we see these sorts of doom and gloom stories quite often that turn out to not have much to them except for the initial panic.

      • by nnull ( 1148259 )
        I don't know. But go to anyone hosting conferences for Siemens, Rockwell, etc. The big talk is about having things controlled with your smartphone app and being able to upload changes while sitting on a beach. Try to mention anything about the dangers of such a system and you get talked down to.
        • If ease of access and remote accessibility takes priority over safety when the dangers are to life and limb of employees, well whoever is pitching that needs to be talked down to. Bring proof and we'll get whoever is hosting these conferences shut down, and shut out of whatever business they are in entirely.
      • I am a Controls Engineer, i.e. I maintain, code, spec, etc. systems like this. Not a programmer for the vendors who make the software, but end user at a plant using controls software and hardware to make things happen.

        The smartphone is not controlling anything, it is the window to look into the controls system to see what is happening.

        All of the major companies are designing applications that can do the same thing the operator interfaces do from a smart phone that is connected to the same network as the ma

      • An important system being controlled by a smartphone app isn't dumb. It is an option as long as the process is locked down and secure. If there is any desire to improve manageability and access, then a smartphone app is a good thing. Now, important systems which contain sensitive financial/health/etc. information, or which pose a risk to a person's health or life, should not be accessible outside of trusted on-site users. Smartphone apps should not be allowed for these systems, to avoid creating a bridge between
    • Re: (Score:2, Insightful)

      Yep, FUD. Any half-decent engineer puts electrical and mechanical limits in place to prevent multi-million-dollar equipment from doing things it should not, even when the electronics (the computer) try to order it to. This is the fault of those ridiculous Hollywood movies that try to pass off the idiotic idea that a script kiddie with a computer can control anything.
      • Not all of it is FUD but it is likely fairly overblown. Remember there was this [wikipedia.org] from 10 years ago where INL let the smoke out of an expensive generator. There is a lot more to the test that was not released to the general public but to those in the industry but it isn't entirely FUD. It also isn't surprising that companies want phone apps to interface with the factory floor devices because that kind of stupid shiny sells to MBA types.
        • I see you did not understand my point, right? It's okay, I understand that not everyone is aware of how heavy power equipment works in real life.

          The idea is that in a system that has a minimum of good sense at the time of design, you have layers of protection of different kinds that prevent a potentially catastrophic command from being executed, and you also design knowing that your control system may have problems and may try to execute exactly these commands that can be catastrophic. Then you put prote
          • You would be surprised at the dumb shit I have seen in dealing with securing similar systems. Yes it is layer upon layer of security measures, or it should be. But far too often someone forgets about that ancient tape changer in storage room b-37 that is still connected, or some PHB decides that they want to be able to check in on machines and shut them down from their cellphone while at home.

            One of the problems with ICS systems and others like them is that they assume that the operator knows what they are
          • by nnull ( 1148259 )

            I'd like to know what fantasy world you live in where I have competent people designing such systems all the time. My auto-tie baler destroyed itself because the physical limit switch failed, brand new by the way from a very reputable manufacturer. It should have never happened, but it did.

            If what you stated is true, I wouldn't have half the problems I have with quite a lot of manufacturers. Point being, not everything is so cut and dry as you state and there are a whole lot of incompetent people building e

            • It was not a fantasy world. I worked in a power plant, and in this plant you do things right as I described or really, really bad things happen. And to be honest, if the situations you described could not be avoided with failsafes, then the engineer who designed those failsafes did not know what he was doing, which falls under the "incompetent engineer" case I have described.
              • by nnull ( 1148259 )

                "I worked in a power plant and in this plant you do things right as I described or really, really bad things happen."

                It'll come, don't worry. Our utility company here, Edison, is already full of growing incompetence. I already have an undersized transformer that glows in the dark supplying my service. Edison laid off a lot of important people, people whose work I've actually had the pleasure to study, because they're the ones who wrote the book on anything medium- to high-voltage.

                "And to be hon

          • Short version: Equipment which can "explode" because of ridiculous "superhackers" only happens in Hollywood or when you have a completely incompetent engineer, and I seriously doubt you're going to entrust a multi-thousand dollar rig to an incompetent engineer.

            I replied to another of your posts, but let me say again here:

            I am a controls engineer, do this for a living, know industry standards.

            Yes, you have layers of protection to prevent things from happening, but the electrical with a mechanical back up you

              • No no, the mechanical protection I have described is of another type. There are several examples I can give, but let's take one of the simple ones: imagine some system where if valve A is open then valve B must be closed, and vice versa; the valves MUST NOT open at the same time. In a normal situation you have a PLC deciding when to open and close the valves, but the valves contain a mechanical limiter such that when valve A opens the mechanism locks and prevents opening of valve B (and vice versa)
                • No no, the mechanical protection I have described is of another type. There are several examples I can give, but let's take one of the simple ones: imagine some system where if valve A is open then valve B must be closed, and vice versa; the valves MUST NOT open at the same time. In a normal situation you have a PLC deciding when to open and close the valves, but the valves contain a mechanical limiter such that when valve A opens the mechanism locks and prevents opening of valve B (and vice versa). Then even if the PLC orders the two valves to open, only one will be able to open, because of the mechanical blocking (this also exists for electric keys)

                Yes, those things exist and are used, but more often they are not used.

                Even if you use those kinds of mechanical limits, there are more scenarios than I can count where they are not practical or even possible, and with access to the code you can force open two valves and blow stuff up, or vent something to atmosphere, or overwhelm a wastewater treatment plant.

                When it comes down to it, most things in life are protected by the code of the systems, either process controls systems or safety instrumented
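
                The software half of the two-valve interlock debated in this exchange can be sketched as follows. Everything here (class and valve names) is illustrative only; as the reply above argues, a real plant would back this logic with a mechanical limiter that holds even if the code is rewritten.

                ```python
                class ValveInterlock:
                    """Sketch of the PLC-style logic for a two-valve interlock:
                    valve A and valve B must never be open at the same time.
                    Illustrative only; a mechanical limiter is the real backstop."""

                    # Each valve names the valve it is mutually exclusive with.
                    EXCLUSIVE = {"A": "B", "B": "A"}

                    def __init__(self):
                        self.open_valves = set()

                    def request_open(self, valve: str) -> bool:
                        blocker = self.EXCLUSIVE.get(valve)
                        if blocker in self.open_valves:
                            # Interlock trips: refuse the command rather than execute it.
                            return False
                        self.open_valves.add(valve)
                        return True

                    def close(self, valve: str) -> None:
                        self.open_valves.discard(valve)
                ```

                The point of the thread stands either way: if an attacker can modify this logic, the mechanical limiter is all that remains between a rogue command and an incident.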

      • by Anonymous Coward
        Not quite true. The advent of cheap servos and encoders has made those limits programmable. I'm starting to see bad design decisions because of it: removing physical limit switches, etc. And if what you say were true, all the big CNC manufacturers wouldn't be replacing spindles all the time because someone made a mistake. There goes 30k for your beautiful Mori Seiki CNC, because for some reason they still can't prevent crashes with modern tech.

        So yeah, anyone that really wanted to be nefarious can seriously d
        • Well... You and I would not put cheap servos and non-physical protections on equipment that weighs several tons and easily costs over a million dollars, right? I know that many people do stupid things when designing safeguards on equipment, but those are the incompetent engineers from my example
    • by Anonymous Coward

      Anyone old enough to have watched the "Mission Impossible" TV series knows too well how this goes...

      They fed one single punch card into a card-reading machine, and suddenly the bad guy's computer (a cabinet with lots of flashing lights) went totally haywire, smoke billowing out.

      Fifty-something years later (this is 2018, btw), do we have to keep being bombarded with this kind of bullshit?

    • Re: (Score:3, Informative)

      by johnsie ( 1158363 )
      Actually... I know of several energy companies whose generators and intake valves are controlled by PLCs. Those PLCs are on the same network as PCs (bad practice, I know). Technically it would be possible for a hacker to use an infected computer as a stepping stone to controlling the valves and generators. This would let a hacker completely destroy the generator and a lot of the equipment the generator is hooked up to.
      • I would say to report them to NERC if they are in the US or Canada.
        • by nnull ( 1148259 )
          And do what? Most inspectors don't have the knowledge to deal with this. Most inspectors will not ask what the hell the PLC is doing because they don't even know how a PLC works. Edison stole my transformer for my 4000Amp gear, reduced me to 2000kVA, so now I have a hot boiler outside. Utility commission doesn't care and the city is helpless about doing anything about it. Think NERC will be any different?
      • I have followed, in person and with great interest, the operation of a power plant where PLCs are used to command heavy equipment. All the equipment had fail-safes in case the PLCs tried to send invalid or potentially destructive commands, so the worst a "Hollywood superhacker" could do would be to shut down the plant (without damage), and even that I believe would not work, because the operators on site also had secondary independent controls in case the main ones (the PLCs) had problems.
        • by nnull ( 1148259 )
          Give it time, the incompetence will sweep through power plants soon when all the older guys retire just like the rest of the industry.
    • Re: (Score:2, Interesting)

      by Anonymous Coward

      While you're correct, I would point out that it *is* a direction which several separate things are actively *attempting* to move us towards.

      On one side you've got businesses who will cut costs at any opportunity, and only ever keep the bare minimum of safety the law mandates - or lie about having it as we may recall with the BP spill among other incidents. The more that can be done from across the globe with the less workers possible, the better. As long as it can be someone else's fault when everything goe

    • Oh look, it's the "hackers can bomb you with your own computer" headline again. This time featuring smartphones and apps. Oh boy, that changes everything!

      That said, poor security and factory machines accepting commands from smartphone apps does sound like a rather bad idea.

    • And "nasty" people can blow up factories using dogs! (but never cats, they are too lazy)
  • by Anonymous Coward

    OK let's say you have enough knowledge to do this remotely. Even if you can manipulate process automation through a smartphone app, it's a sure bet you can't change most of the limits or permissives. There are specific reasons why process and power are designed to prevent this and covered by ASME or API codes. It's not random or arbitrary design. And while there are industrial accidents they are usually a chain of multiple failures or unforeseen problems in the design no one anticipated.

    This article is

    • by Anonymous Coward

      You have no idea how insecure some industrial systems are. I remember finding unauthenticated remote-administration modems directly connected to industrial production robots on a pentest project. You could do a lot of bad stuff with such access - kill the machine, or even the operator, if you are lucky (or not). Granted, this was some 10 years ago, but I doubt the situation is much different today, as you don't replace industrial systems that frequently. The systems I was testing were from the 8

  • by Anonymous Coward

    1st rule of internet security: Only hook something to the net if it must be hooked to the net to do its job.
    2nd rule of internet security: If a system is hooked to the net to allow monitoring, make it only capable of transmitting onto the net, and not receiving from the net.
    3rd rule of internet security: Do not hire morons who will plug a memory stick into a unit that's not on the net, after that stick has been in a unit that is on the net.
    4th rule of internet security: Disable any wireless connectivity on systems you are not intentionally hooking to the net.

    • by Reverend Green ( 4973045 ) on Friday January 12, 2018 @12:26AM (#55912911)

      Organizations that blame their security issues on "morons" are unlikely to develop an effective security posture.

      • By "moron" this means the people creating the security procedures, or the workers who refused to take the proper training. The solution is to fire those workers. I.e., the poster did not mean you should blame the workers who are morons, but that essentially no company is being this stupid unless it's actually run by morons. In that case, you can blame the morons who are running the company.

    • Re: (Score:3, Interesting)

      by AHuxley ( 892839 )
      Re Only hook something to the net if it must be hooked to the net to do its job.

      But that would need more workers on site. They will fully unionize over the long shifts and demand a "living wage".
      The idea of hooking something to the net was so one trusted engineer could do the jobs of many on site workers.
      Without the internet local workers would have to be hired on site again and they will unionize.

      Re Do not hire morons who will plug a memory stick into a unit that's not on the net, after that stick h
    • 3rd rule of internet security: Do not hire morons who will plug a memory stick into a unit that's not on the net, after that stick has been in a unit that is on the net.

      Not possible. If you don't want a memory stick plugged in then you will have to physically remove access. Even smart people with the best of intentions make mistakes or sometimes are duped.

      4th rule of internet security: Disable any wireless connectivity on systems you are not intentionally hooking to the net.

      Wireless (and wired) connectivity systems should be disabled by default and require positive action to enable. End users should not have the rights to enable this functionality.

      5th rule of internet security: Do not hire anybody who would violate the preceding four rules.

      And how do you propose to identify these people ahead of time, since they don't carry Bill Engvall "I'm stupid" [wikipedia.org] signs?

    • The wireless access is provided to address real-world problems: start-up/commissioning support is the most common from the manufacturer/OEM side, along with giving status and supply-level data to floor managers and eliminating the need for everything to be controlled from the control room.

      These things all increase attack surface, but they are ultimately part of running a lean operation, so they are here to stay.

  • Anyone remember the oil refinery scene at the beginning of Red Storm Rising? Now the fundie engineer doesn't even have to go near the refinery to cause chaos.
    • But then you have to blow yourself up on your own at home... Where's the fun in that? (Shut up, Khomeini! I wasn't asking you)
  • by AHuxley ( 892839 ) on Friday January 12, 2018 @12:04AM (#55912851) Journal
    Some nice fictional movie script could go like this:
    Someone preppy who is photogenic has a modem and a new computer.
    They had the phone number of their local power plant.
    They created a script to dial every extension and keep only the numbers that answered with a modem.
    A day later they got a direct line to a modem in the power plant and could interact in computer ways with the local power company...
    Black helicopters, federal law enforcement in suits swarm the local town looking for the computer owner.
    In 2018 the movie has to have an app. The messages to and from the power plant are now all on social media and have a pretty GUI.
  • by schematix ( 533634 ) on Friday January 12, 2018 @12:13AM (#55912879) Homepage
    Security in automation controls is an absolute joke. In the world of Rockwell Automation (if you're not familiar: roughly 70% of the US automation market), with network access to a single device anywhere on the automation network you can go in, upload a controller's entire program, and see the full source. Their only 'security' is easily bypassed by a program on sf. Once you have said program, there is nothing, literally nothing, to stop you from changing the program logic to do whatever you want. If you like, you can even make temporary 'test' changes until poop hits the fan, then cancel them, returning things to normal. There's no logging of any of these changes and no security to prevent you from doing it. This is scarier than Meltdown/Spectre, and I'm utterly amazed we haven't seen more disasters given how simple it is to access and modify these systems.
    • by nnull ( 1148259 )
      The sabotage is already happening. It doesn't get reported. I've already witnessed it at customer plants. It's going to get worse. Siemens and Allen-Bradley are by far the worst in security. And of course, everyone now has to load TeamViewer on every HMI, with a static password and ID, to offer support and punch through firewalls.
    • Anything dangerous has hardware failsafes. If you didn't have that, the programmers would blow up the factory by accident; no hackers needed. Plus you can't get access over the internet; if you could, the viruses would have gotten there first, since all industrial automation runs outdated Windows with no updates, ever. To actually mess with industrial automation, instead of just faulting it out as any random virus could, you need to be on the level of the Stuxnet creators and have your personal spy agency to do the homework on t
  • Phewww - that was close! But thanks to the diligent bi-partisan efforts of our legislators and the brilliant patriotic leadership of our businesspersons, the United States is safe from this threat. We have no factories left for anyone to blow up.

    • by rtb61 ( 674572 ) on Friday January 12, 2018 @01:01AM (#55913017) Homepage

      Of course, if you were going to be that destructive, it would be much safer to drive around in a white diesel van with a PTO and an electromagnetic-pulse generator and simply cause widespread chaos on the move. Pretty hard to track you down, as all the tracking systems and agencies go down, and you are only noticeable by the fact that you are still moving whilst everything else is coming to a halt, with the damage and impact tied to the power output of your EMP device and how many kilometres you can travel with it pulsing away. Don't do this, it would be bad, seriously; but you know where this is going, it has been said again and again. When governments hack governments, the next step is EMP attacks; it is inevitable that it will escalate to this, and you can bet corporations will attack corporations, with billions at stake.

  • by Pinky's Brain ( 1158667 ) on Friday January 12, 2018 @12:45AM (#55912981)

    If you allow remote access to factory systems with anything but special-purpose laptops with hardware VPN and zero internet access, you're doing it wrong. Any data crossing from internet to intranet should require red tape, and any software, mountains of red tape (all on physically archived paper). Any data from intranet to internet should cross buses verified to be strictly unidirectional (i.e. not TCP/IP with some ungodly complex stack written in C).

    Almost everyone is doing it wrong ... the only place you should BYOD is the unemployment line.

    • by AHuxley ( 892839 ) on Friday January 12, 2018 @01:11AM (#55913043) Journal
      Re "Any data crossing between from internet to intranet should require red tape"
      East Germany faced just that problem. One day a trusted member of staff walked out with a list of East German spies in other nations.
      Before creating new trusted spy networks with new names, something had to be done to prevent a list of spies from ever walking out again.
      Details about the mission, the spy's codename, and the real identity got split up into very different physical files kept separated.
      Nobody could ever put the real name to the results of a mission without mountains of red tape to walk each file together and see a person's name linked to a mission.
      East Germany then went digital.
      The East Germans thought it would be good to have a full list that could be accessed if spies had to be given new missions very quickly.
      The CIA walked out with the list of all their spies.
      The same approach was used for NSA compartmentalization, until the political rush for private-sector contractors resulted in walk-outs.
      The storing of some US gov/mil/contractor/worker information, clearance levels, past work, mission history, and lifestyles in plain text on internet-facing computers.
      Political parties whose trusted staff walk unencrypted data to the waiting media.
      So much is done to save time, for politics, for cost savings, that later results in vast amounts of data walking out.
      No apps needed, as everything is in plain text, because that's how it's been used every day.
      • by rastos1 ( 601318 )
        Was it, by any chance, called a NOC list [youtu.be]?
        • by AHuxley ( 892839 )
          East Germany lost its actual spy contact lists. Names, locations, everything needed to find the person in another nation quickly and contact them.

          The US stored some of its workers', contractors', and gov/mil background information in plain text on internet-facing networks.
          That copy, kept in plain text and copied out onto the internet, gave away everything about some workers' lives and some of their work within the US gov/mil. The skill sets they had. Any past lifestyle issues with, say, gambling, healthcare, past le
    • by Anonymous Coward

      I was developing an experimental medical monitoring device; we couldn't legally use anything electrically connected to a patient without FDA approval, and we couldn't get FDA approval without patient testing.

      The standard approach in such situations is to send the data over unidirectional fibre optics from the device to the data logger (a laptop in this case). It is physically impossible to send anything back along that connection with the hardware we were using; the transmitter had no ability to receive signals a
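
      The send-only discipline described above can be modelled in software. The sketch below uses UDP on localhost purely to illustrate the idea; the address, names, and message format are made up, and in the real setup the "no reply" property was enforced by the fibre hardware, not by code.

      ```python
      import socket

      LOGGER_ADDR = ("127.0.0.1", 9999)  # illustrative address and port

      def send_reading(sock: socket.socket, reading: float) -> None:
          # Fire-and-forget: the sender never calls recvfrom(), so nothing
          # the logger does can influence the monitored device.
          sock.sendto(f"{reading:.3f}".encode(), LOGGER_ADDR)

      def run_logger(expected: int) -> list:
          # Receive-only side: binds, reads, and never replies.
          with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as rx:
              rx.bind(LOGGER_ADDR)
              readings = []
              for _ in range(expected):
                  data, _addr = rx.recvfrom(64)
                  readings.append(float(data))
              return readings
      ```

      Software alone only promises not to send back; a true unidirectional link removes the physical ability to, which is why it satisfied the safety constraint here.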

  • by Gravis Zero ( 934156 ) on Friday January 12, 2018 @12:47AM (#55912985)

    The only way we are going to see any change in the industry is if it starts costing them money; simply cleaning up the messes of careless companies over and over isn't going to change their attitude toward security. The reality is that you are actually enabling them to continue with their poor security practices.

    • by nnull ( 1148259 )
      Unfortunately, there are still too few people interested in exploiting such companies. I'm pretty sure it will come. Crypto-locking machines seems like it could be a very lucrative business.
  • Two security researchers, Alexander Bolshev of IOActive and Ivan Yushkevich of Embedi

    Just out of curiosity, do all "security researchers" come from shithole countries?

    • Damn Slashdot stepped on my joke. The subject line of my above comment was supposed to be,

      Two security researchers, Ivan Yaganoff and Ima Chirkoff

  • "Two security researchers, Alexander Bolshev of IOActive and Ivan Yushkevich of Embedi, have been playing WatchDogs 2 way too much."
  • by Anonymous Coward

    It's not hard to imagine this causing mayhem on an assembly line or explosions in an oil refinery.

    Some /. headlines and summaries are bad, some are misleading, and some are unconscionable. It is hard to imagine that competent companies and engineers would design their systems so stupidly as to allow "hackerZ to BLOW UP FACTORIES USING SMARTPHONE APPS". Yes, incompetence happens. Yes, competent terrorism/vandalism happens. But no, the presumption is that this jump of imagination is simply an unethical sen

  • by thegarbz ( 1787294 ) on Friday January 12, 2018 @03:50AM (#55913361)

    Any refinery or chemical plant that is even remotely compliant with HSE rules should have very limited exposure to anything the control system can do to cause a truly major incident.

    Sure it is trivial to shut it down or trivial to do something like cause catalyst or product to go to where it shouldn't. But any scenario that could cause something like an explosion should be identified and protected by safety systems independent of control systems and unable to be directly controlled.

    Even when you look at recent oil-industry incidents, you can see the majority of accidents are due to mismanagement or bypassing of safety barriers for abnormal reasons which aren't properly risk-assessed.

    This potential scenario is one of the reasons the TRITON / TRISIS [slashdot.org] malware we covered recently got so much interest, and likely one of the reasons why the attacker was attempting to modify the code in the safety system.

    • by Anonymous Coward

      rule. When I was working with high-voltage semiconductor equipment, the rule was that there had to be 2 electromechanical (i.e. not computer-controlled) backup systems to 'safe' things before they could be accessed. Seemed sensible to me. Is this not followed anymore?

  • Real life "Watchdogs". Nice. Gotta love this IoT nonsense everybody's into lately.

  • So the guy from Mr. Robot wasn't such a genius after all?
  • by Anonymous Coward

    SCADA (process control) networks have long been known to have vulnerabilities that can be exploited in the real world. Further, project Aurora proved you could cause a generator to explode with the proper SCADA inputs. Just because they are front-ending the mess with apps doesn't change anything.

  • Damn y'all naysayers forgot about Stuxnet fast.

  • by Anonymous Coward

    I will just leave this here:

    https://www.youtube.com/watch?... [youtube.com]

    I think people consistently overestimate engineers and fail to understand the context of an engineer's work in today's world. It's all fine and dandy to say that proper engineers would never do things like this, or allow control of dangerous processes to have contact with the outside world, but engineers are people too, people who have bosses who tell them what to do. They are also afflicted by project costs and inter-office politics, so much so that

  • It's not hard to imagine this causing mayhem on an assembly line or explosions in an oil refinery.

    Yeah I can imagine a lot of things. Can these flaws actually be used to blow something up, or just imagine it?
