Software

'The Year That Software Bugs Ate the World' (fastcompany.com) 95

FastCompany's harrymcc writes: It's not like there's ever a year that isn't rife with stories about buggy software. But 2017 seems to have had an unusually rich supply of software flaws that fouled up major products -- from Twitter to iOS 11 to the Google Pixel 2 -- in ways that were very noticeable and sometimes even funny. Sample this: A nagging flaw in Google's Play Services software for Android causes Gmail to demand access to "body sensors" before it will let users send email. Android Police's Artem Russakovskii discovers that his Google Home Mini is recording audio 24/7 and storing it on Google's servers. I rounded up a bunch of them over at Fast Company.

Comments Filter:
  • by Anonymous Coward on Monday December 18, 2017 @03:36PM (#55763551)

    Programming in traditional programming languages instead of the latest fad language and framework, and developing in our own countries instead of outsourcing it.

    • by CastrTroy ( 595695 ) on Monday December 18, 2017 @03:55PM (#55763699)

      What we really need are programmers who actually know what they are doing. The problem is that there really aren't enough programmers out there to get all the development projects done by knowledgeable programmers. It doesn't matter how much you pay them; the programmers simply don't exist.

      I think that the latest fad language and framework is actually just a symptom of the underlying problem. With a good enough tool set, you can fake your way through it for the most part and make it look like the system works from the outside. But you eventually hit a wall where the framework can't make up for the lack of skill of the developers, and this is where you run into problems.

      • by chill ( 34294 ) on Monday December 18, 2017 @04:13PM (#55763803) Journal

        Why would they do that? They'll just address it in the next sprint! If you're agile enough, that is. Just add those bugs to the backlog! We've got features to ship!

        • Re: (Score:3, Funny)

          by Anonymous Coward

          At the beginning of a project, it doesn't make sense to invest a lot of development effort into a comprehensive, secure, bug-free, scalable, and robust foundation. Doing so costs a fortune, and your business flops before it is finished. And anyway, the market hasn't tested your offering yet, so you don't know if it is going to live long enough to need a foundation that advanced.

          During the mid-life of the project the need for a better foundation starts coming up, but it still doesn't make sense to sp

          • by Anonymous Coward

            When will you suckers realize that a solid, well-engineered foundation is the key to every successful construction project???

            You "strengthen-it-later" types are why we can't have nice things; you merely sit back in your rubble and smugly proclaim to be the great first-mover innovators.

            • by Rakarra ( 112805 )

              When will you suckers realize that a solid, well-engineered foundation is the key to every successful construction project???

              That's great, except plenty of these programs/apps/what-have-you are made by startups. Startups have to show immediate RESULTS when it comes time for the second round of funding or else there won't be a second round. They have to build the application first, then fill in features and fix bugs.

      • The problem is that there really aren't enough programmers out there to get all the development projects done by knowledgeable programmers.

        This isn't actually true, though. In the late '90s, when it was really true, the market responded: pay went up, and job availability went up too. The situation now is that pay isn't going up significantly, and jobs remain "open" forever without any attempt to hire the most qualified person who applied. You might have 1000 applicants, and the "job" remains "open" and the work later gets outsourced.

        If there was a real shortage, hiring would instantly increase!

      • by AmiMoJo ( 196126 )

        The idea of frameworks and new super high level languages is to make it so people don't have to understand the hard stuff to write good software.

        Even the best programmers struggle to write crypto, for example. Most people would be crazy to write their own; it's better to use a well-tested library (see the sketch below).

        We need to make better frameworks.
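
To make the "use a well-tested library" point concrete, here is a minimal sketch. It assumes the third-party cryptography package (just one example of such a library, not one the comment names), installed with pip install cryptography:

```python
# Minimal sketch: authenticated symmetric encryption via a well-tested
# library (the `cryptography` package's Fernet recipe) rather than
# hand-rolled cipher code.
from cryptography.fernet import Fernet

key = Fernet.generate_key()          # random 32-byte key, urlsafe-base64 encoded
f = Fernet(key)

token = f.encrypt(b"meet at dawn")   # AES-128-CBC plus HMAC-SHA256, handled for you
print(f.decrypt(token))              # b'meet at dawn'
```

The point isn't this particular package; it's that padding, IVs, and authentication are decisions the library has already gotten right, which is exactly the "hard stuff" the comment says most of us shouldn't be reimplementing.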

    • Language, country of origin, and even skill, to a degree, don't really affect the quality of the software. It is management that wants the product done ASAP, sets up rigorous timelines and loosely gathered specifications, and sells the product to the market before any single feature is tested.

      In a lot of our software, I wonder how much proof-of-concept code is out there without being fully fleshed out, because it technically works; however, the details to prevent it from breaking and access via ways that it

      • Language affects software quality a lot, because ultimately software quality is determined by the user based on how well their use case is served. Understanding the use case is a very human, language-and-communication type of problem. Even poorly written software can eventually be bugfixed to quality, if the management understands the use case and continues to apply resources.

        Language differences don't prevent that, but they do make understanding use cases harder, so the average maximal result will be lower.

    • My favorite bug of the year was the Bluetooth one that caused a bunch of idiots to whine and cry that "everybody" was remotely rooted, while in reality RHEL/CentOS users were only ever exposed to a DoS bug. (The box would crash instead of being exploited, because Red Hat had turned on the Bluetooth memory protections already available in the kernel.)

    • I don't know about the other platforms, but when I look at the list of things on the security advisories issued for macOS, they are pretty much entirely C/C++ code. None of this is the latest fad language or framework; it's simply a function of complexity. I remember some old research from IBM that claimed that programmers produced five lines of bug-free code per day, independent of language. The difference now is that there are so many complex interconnected parts in a modern system that the probability of a
  • by Computershack ( 1143409 ) on Monday December 18, 2017 @03:37PM (#55763571)
    99 bugs in the code to be fixed, 99 bugs in the code. Fix a bug, wrap it up, 148 bugs in the code...
  • by gweihir ( 88907 ) on Monday December 18, 2017 @03:40PM (#55763587)

    The average person still does not care at all. Hence software can still get worse and even cheaper to make before it starts to cut into profits. And it will.

    • by antdude ( 79039 )

      Yep, companies too. They don't even care about QA. MS axed its QA department years ago. I'm still unemployed after a year. :(

    • This is the real problem. People have gotten used to bad software as the normal level of service.
      • by gweihir ( 88907 )

        Indeed, it is. People perceive pathetically bad quality as "normal".

      • Partly. Software engineering is a very young discipline and we still don't really know how to make good software. We know a few things that are pretty much guaranteed to make terrible software, but the closest thing to a process that produces bug-free software is that used by the seL4 team, which is estimated to cost around 30 times as much as a conventional process with a very good set of automated tests.
        • Ooops, hit submit without thinking.

          The second problem is the lack of knowledge in consumers. Given two pieces of software that fill the same function, do you have any mechanism to say which one is likely to be more secure? Creating good metrics for evaluating software security is an open research question in cybersecurity. When we don't even have research that can do the comparisons usefully, expecting consumers to make informed decisions with no information seems a bit of a stretch.

        • Bullshit. The competent among us know how to produce GREAT software. The real problem is that very few people have what it takes to do it, but a pervasive meme that anyone can do it has resulted in a situation where 90% of the people getting paid to develop software shouldn't be in the field at all.
  • The examples listed are not necessarily bugs, even if they get labeled that way when they're found out.

    Never attribute to malice that which can be explained by stupidity. But then again, never attribute to stupidity that which can be explained by corporate greed.

    • explained by corporate greed.

      There is no such thing as corporate greed; all greed is personal when you look under the hood.

  • by nwaack ( 3482871 ) on Monday December 18, 2017 @03:56PM (#55763717)
    I think a major contributor to all these bugs is that every. single. thing. has to be connected to every. other. thing. My computer has to talk to my phone, which has to talk to my watch, which has to talk to my refrigerator, which has to talk to my toaster. All that connectivity makes software waaay more complicated than it needs to be. Now throw in some corporate greed, where software design goes to the lowest bidder, and you get what we have today.
    • Oh God this. And this is why I will never buy IoT appliances while "dumb" ones are an option. And why I will never spend one penny on the monstrosity IOTA [iota.org]
    • I think a major contributor to all these bugs is

      Apple.

  • by Anonymous Coward

    Next year: Even more software, even more bugs.

  • I would go with shorter development time; nowadays less and less testing is done before a release.
    "Beta testing? Bahhhh... that is what users are for."
    Add to that your boss telling you to release now because he had a quick look (a five-minute glance) and didn't see any problems...

  • Formal validation of software using math is already difficult and will be more so when applied to the AI domain. Just the definition of what constitutes correctness is a challenge in such systems.
    The demarcation between traditional programming bugs and undesirable outcomes due to flawed learning blurs as software complexity increases. Subtle biases or other instabilities can be introduced that influence cognition, and they will be nearly impossible to trace.
    If the app misbehaves, trying to trace and attribute it
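
For a sense of what "validating software using math" looks like at toy scale, here is a hypothetical Lean 4 sketch (the function and theorem names are invented); scaling this kind of proof to real systems, let alone to learned models whose correctness criteria are themselves fuzzy, is exactly the difficulty the comment describes:

```lean
-- Toy sketch of formal validation: prove that a tiny "max" function meets
-- part of its specification. Assumes a recent Lean 4 toolchain (the `omega`
-- tactic ships with current releases).
def myMax (a b : Nat) : Nat :=
  if a ≥ b then a else b

-- Specification: the result is at least as large as the first argument.
theorem le_myMax_left (a b : Nat) : a ≤ myMax a b := by
  unfold myMax
  split <;> omega   -- case-split on the `if`, then linear arithmetic
```
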
  • I mean, I remember the "Good Old Days" when the system would crash if you looked at it wrong or typed too fast. SQL injection errors were common...

    The bugs that came out this year, while bugs, are a far cry from the risk of trying to use a computer during the 1990s or before.
    I haven't seen a BSOD (or its equivalent) in nearly a decade now. The glitches we get today, while some are serious, are rather small in the big picture.
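
As a concrete reminder of what those 1990s-era SQL injection bugs looked like, and of the parameterized-query habit that largely killed them off, here is a minimal sketch using Python's built-in sqlite3 module (the table and the "attacker" input are invented for illustration):

```python
# Minimal sketch of classic SQL injection and the parameterized query that
# prevents it. Standard library only; the data is made up.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, secret TEXT)")
conn.execute("INSERT INTO users VALUES (?, ?)", ("alice", "hunter2"))

user_input = "nobody' OR '1'='1"  # classic injection payload

# Vulnerable: the payload is spliced into the SQL text, parsed as SQL,
# and matches every row.
leaked = conn.execute(
    f"SELECT * FROM users WHERE name = '{user_input}'").fetchall()
print(leaked)   # [('alice', 'hunter2')]

# Safe: the driver binds the value separately; it is never parsed as SQL.
safe = conn.execute(
    "SELECT * FROM users WHERE name = ?", (user_input,)).fetchall()
print(safe)     # []
```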

  • the year the frog noticed the water was getting kind of hot.

    • Doubtful, since most people know that the frog thing was just some bullshit some asshole made up and not a real effect.

      The reality is that frogs in heated water have nowhere to escape. That's the whole story. Give them a chance to escape, and they will; they do understand the problem, and all evidence confirms that. There was never any reason given for believing the cliche; it is just a sort of IQ test; people who are credulous to the point of mental disability will believe it, and everything else they hear.

  • Have you taken the time to consider how many libraries are used in the average project? Most of those are open source projects, continually updating and relying on other libraries from other projects. Like coding inception... a bug or change inside a library, inside a library, inside a library, inside a program messes everything up. Vigorous testing is the only hope.
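
To make the "library inside a library" point concrete, here is a rough sketch that peeks one level into the dependency tree of an installed package, using only Python's standard library ("requests" is just an example root; substitute any package you have installed):

```python
# Rough sketch: list one level of an installed package's declared
# dependencies via importlib.metadata. "requests" is only an example.
import re
from importlib import metadata

def direct_deps(package: str) -> list[str]:
    """Bare names of the packages that `package` declares it depends on."""
    requirements = metadata.requires(package) or []
    # Entries look like "urllib3<3,>=1.21.1; extra == 'socks'"; keep the name.
    return [re.split(r"[ ;<>=!~\[]", req)[0] for req in requirements]

for dep in direct_deps("requests"):
    try:
        print(f"{dep} -> {direct_deps(dep)}")
    except metadata.PackageNotFoundError:
        print(f"{dep} -> (optional extra, not installed)")
```

Each of those names has its own dependency list underneath it, which is the "inception" the comment is describing: a behavior change several layers down can surface as a bug in your program.
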
  • Fast Company - for those who find The Verge too technical.

  • by ka9dgx ( 72702 ) on Monday December 18, 2017 @07:53PM (#55765473) Homepage Journal

    If we had capability-based security in our systems, this kind of stuff would require the user to knowingly allow these types of activities. Until then, we're all screwed. Stop blaming everything but the OS. It's not the programmers or the users.
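
ka9dgx is talking about OS-level enforcement, which no application-level snippet can actually provide, but as a rough illustration of the capability style being advocated (all names below are invented, and Python itself cannot confine the callee):

```python
# Toy illustration of a capability-style interface. This is NOT real
# confinement, only an OS or language runtime can enforce that; it just
# shows the shape: code is handed an object standing for the one resource
# it may use, instead of ambient authority to open anything it likes.
from pathlib import Path

class ReadCapability:
    """Grants read access to exactly one file."""
    def __init__(self, path: str):
        self._file = open(path, "rb")

    def read(self) -> bytes:
        return self._file.read()

def word_count(cap: ReadCapability) -> int:
    # This function never sees a file name and has no way to name one;
    # it can only read through the capability it was handed.
    return len(cap.read().split())

Path("notes.txt").write_text("capabilities instead of ambient authority")
print(word_count(ReadCapability("notes.txt")))  # 4
```

Under a capability-based OS, launching an app without handing it a microphone capability would make "recording audio 24/7" impossible rather than merely against policy, which is the comment's point.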

  • "2017 seems to have had an unusually rich supply of software flaws that fouled up major products -- from Twitter to iOS 11 to the Google Pixel 2 .. Google's Play Services software for Android ..

    Something is missing from that story, just on the tip of my tongue... is it any wonder this has become known as the Microsoft Slashdot?

    Two Bytes to $951M [blogspot.co.uk]