
Google Bug Hunter Urges Apple To Change Its iOS Security Culture; Asks Tim Cook To Donate $2.45 Million To Amnesty For His Unpaid iPhone Bug Bounties (threatpost.com) 79

secwatcher writes: Prolific Google bug hunter Ian Beer ripped into Apple on Wednesday, urging the iPhone maker to change its culture when it comes to iOS security. "Their focus is on the design of the system and not on exploitation. Please, we need to stop just spot-fixing bugs and learn from them, and act on that," he told a packed audience. Per Beer, Apple researchers are not trying to find the root cause of the problems. "Why is this bug here? How is it being used? How did we miss it earlier? What process problems need to be addressed so we could [have] found it earlier? Who had access to this code and reviewed it and why, for whatever reason, didn't they report it?" He said the company suffers from an all-too-common affliction of patching an iOS bug, but not fixing the systemic roots that contribute to the vulnerability. In a provocative call to Apple's CEO Tim Cook, Beer directly challenged him to donate $2.45 million to Amnesty International -- roughly the equivalent of the bug bounty earnings for Beer's 30-plus discovered iOS vulnerabilities.
This discussion has been archived. No new comments can be posted.


  • by Anonymous Coward

    The company with the most Swiss cheese mobile OS on the market today? Unbelievable.

    • Re: (Score:1, Troll)

      by AmiMoJo ( 196126 )

      You can buy a device that unlocks the supposedly super secure iPhone. Every time they update the iPhone software and hardware, the device gets updated very quickly. That strongly suggests that he is right, Apple just fix each bug as they find it and don't fix the underlying flaws.

      On the other hand, no such box exists for Google Pixel phones, for example.

      • Re: (Score:1, Interesting)

        by Anonymous Coward

        "The Box" is simply a PR trick invented to provide a win-win for Apple and the Government.

        1. Apple spawns new company under untraceable ownership.
        2. Comply with government requests to unlock phones by providing the fake company vulnerabilities.
        3. Win with consumers because they don't know it's Apple selling backdoors.

        The whole thing reeks of a collaboration with the government to both comply with FISA/NSLs and appeal to consumers by pretending not to at the same time.

        Wake up.

        • If that were true (not totally discounting it), it would also be a way to actually get paid for all that effort rather than doing it for free or nearly free.

      • by DarkOx ( 621550 )

        Or if you are cynical it suggests Apple wants to have it both ways. They want to show the public they are not kowtowing to anyone and offering consumers good privacy protection. At the same time they don't want to make the devices so secure LEOs can't get into them when the crimes get serious enough to justify paying some security "researchers" a few hundred thousand dollars. I for one think Tim Cook is not dumb.

        I suspect he was pretty confident for example that the FBI was going be able to get

      • Re:From Google? (Score:4, Insightful)

        by TheFakeTimCook ( 4641057 ) on Thursday August 09, 2018 @11:35AM (#57097058)

        You can buy a device that unlocks the supposedly super secure iPhone. Every time they update the iPhone software and hardware, the device gets updated very quickly. That strongly suggests that he is right, Apple just fix each bug as they find it and don't fix the underlying flaws.

        On the other hand, no such box exists for Google Pixel phones, for example.

        No.

        It strongly suggests that that device maker is being helped with Industrial Espionage.

      • by Demena ( 966987 )

        Given the lack of actual facts supplied in your post, it remains an unsubstantiated, unquantified allegation; i.e., noise.

        Please support your assertions or shut the fuck up.

    • by Demena ( 966987 )
      Yep, if this is not sarcasm then it just is not rational.
  • by malchus842 ( 741252 ) on Thursday August 09, 2018 @09:19AM (#57096328)
    Apple does have a well-thought-out security design. Maybe there are things wrong with it, but to say they 'just fix bugs' and don't think about overall security ignores the truth. But I suppose that's what you get when you're click-seeking. See: https://www.apple.com/business... [apple.com] Can we find holes in that? I'm sure. But they do have a plan. And that's the public one. I'd wager there's an even more detailed internal one.
    • Re: (Score:2, Interesting)

      by Anonymous Coward

      You're kidding, right?

      Apple's stance on bugs is "we don't care until it makes the press."

      Remember that bug where you could log in as root with a blank password on almost every Mac? Turns out Apple knew about it for months. They only bothered fixing it when the tech press found out about it.

      This is pretty much the only way security fixes ever happen for Apple products: the tech press hears about the flaw, then Apple decides "OK, now we'll fix it."

      • by orlanz ( 882574 ) on Thursday August 09, 2018 @10:05AM (#57096572)

        So true. Our company's iOS count is in the mid 5 digit range. And early on, there was an Exchange Calendar glitch that we just couldn't solve. It would only appear on iOS and not on the numerous non-iOS devices.

        It took us MONTHS to get Apple to even see that there was an issue. Some guy in a forum figured it out but it took us MONTHS to have them accept that it was an issue with how they implemented the ActiveSync protocols. It took almost 18 months for Apple to actually fix the problem (the fix itself was fairly simple, related to assigning a meeting ID properly).

        In one meeting, we were literally told: "Corporate isn't really our target audience, so this is a low priority issue." Which is FINE, just don't be telling us this 6 months into the discussions! At least accept the fact that something is wrong and put out a communication about it.

        • by Dr. Evil ( 3501 )

          "Corporate isn't really our target audience, so this is a low priority issue."

          This has always been stunning to me about Apple. They're madly successful and have machines snuck in the back doors of corporate, but they seem to show no interest in selling hardware into corporate.

          Even the AppleID scheme is a pain in the butt in Corporate environments. Who owns the Apple ID? My last talk with Apple, they said they would have the employee carry it from employer to employer... We preferred to give the emplo

        • "Corporate isn't really our target audience, so this is a low priority issue." Yeah, corporate SALES are not their target SALES audience. Guess who is their target audience? Higher-income corporate employees, mostly. Who need to connect to their work accounts. It's amazing how narrow-sighted their approach to playing nicely with ActiveSync is. It makes all of their most valuable individual customers extremely frustrated.
      • This is pretty much the only way security fixes ever happen for all "$OEM" products: the tech press hears about the flaw, then the $OEM decides "OK, now we'll fix it."

        FTFY.

    • by Jaime2 ( 824950 ) on Thursday August 09, 2018 @09:42AM (#57096450)
      Guy who found more than 30 iOS bugs says he sees a pattern that indicates Apple is failing at the fundamentals. Guy with access to a PDF says he's wrong. Guess who has the stronger case?
    • by bill_mcgonigle ( 4333 ) * on Thursday August 09, 2018 @09:51AM (#57096504) Homepage Journal

      No, you're talking about something completely different. Back when Apple was working on the 5S and developed the whole Secure Enclave architecture, it did have some really good engineers working out good security for the system. What this guy's talking about is the past few years, where iOS bugs get identified and patched, and then in the next go-round we find out that they only patched the extremely specific bug, on one line. The next exploit is a few lines down, the same darn thing, in a slightly different way. The most likely explanation for this is that they lost the talent that was working there, making the system good. Why would top people stay when Apple doesn't innovate any more? It's clear from the results that they lost their performance engineering people for about four major iOS releases, with only iOS 11 having any kind of decent performance again. Now that they are going into the thought police business, good luck getting anyone worth their salt to work there.

    • Apple does have a well-thought-out security design. Maybe there are things wrong with it, but to say they 'just fix bugs' and don't think about overall security ignores the truth. But I suppose that's what you get when you're click-seeking.

      See: https://www.apple.com/business... [apple.com]

      Can we find holes in that? I'm sure. But they do have a plan. And that's the public one. I'd wager there's an even more detailed internal one.

      Yeah. It is EXTREMELY suspicious why a non-Apple "engineer" would have ANY special knowledge of what Apple's bug-fixing policies are.

      EXTREMELY suspicious.

      Or, as is much more likely, he is talking out his ass.

      • Ian Beer has found numerous, significant iOS bugs. Significant enough and low-level enough that two of his discoveries made the most recent two iOS jailbreaks possible. If he sees a pattern, it is assuredly there.

    • Apple does have a well-thought-out security design. Maybe there are things wrong with it, but to say they 'just fix bugs' and don't think about overall security ignores the truth.

      His point seems to be not that Apple doesn't have a well-thought-out security system, but rather how they respond to bugs. Patching them is important, but he is advocating also looking at the root causes of how they came into being, so you can rework the process to reduce the chances bugs are introduced. Patching problems rather than fixing causes is not unusual; I once had a plant manager tell me he didn't have time to fix small problems because he had too many big ones. He was not happy when I pointed

  • We shall soon see how tight Cook's ass really is.

    What?

  • Why would a jewelry maker be interested in security?
    You could remove the touch screen and replace it with a "predictive AI" user interface animation that does whatever it assumes the "owner" most likely wants to see, and half of Apple's clients would take longer to notice than the lifetime of the soldered-in battery.

    Google shouldn't talk about security anyway, when their primary business model is snooping on users to enable sleazebags to lie to them (aka advertisement) and rip them off better.

  • by jellomizer ( 103300 ) on Thursday August 09, 2018 @09:49AM (#57096500)

    You have software that took months/years to plan and develop.
    A problem is found.
    You need to Fix it Fast, before it goes out to the wild.
    It will need to be tested to make sure it doesn't break compatibility or break something else.

    If asked to change the infrastructure every time there is a bug, the fix will take years to get out, and a new infrastructure will introduce new, untested flaws.

    A security-first design of software made in the 1980s would just have a password login and permissions on what the user could see and do.
    1990s: memory checking and limits to prevent buffer overflows.
    2000s: memory randomization, and moving from default-allow to default-deny, where you need to do extra work to allow.
    2010s: application sandboxing, full encryption, tiered design, redundant checking...

    iOS, being a product of the 2000s, is actually stronger than some other systems, but it has a lot of once-good practices which are now bad practices in place. Still, there hasn't been a massive outbreak of iOS security issues like Windows had a decade ago. That makes me figure the current patching routine is still good enough.

    Will they need an architectural redesign in the future? Probably. Like when Apple moved from MacOS (Classic) to OS X. They will need to upgrade iOS to a new system at some point just to stay current.

    • by orlanz ( 882574 )

      Delaying a release for a better fix is not what Mr Beer is complaining about. Basically he is saying Apple releases a bug fix (assuming they agree it is important enough) and then just moves on. They don't do a process or infrastructure review to see if other similar bugs exist or if future similar bugs will be created.

      Few bugs actually need infrastructure changes. But many bugs hint at process problems that could have been prevented and could still be prevented.

      This ignorance is generally true of most companies. However, Apple really takes it to a new level.

      • by Anonymous Coward

        Few bugs actually need infrastructure changes. But many bugs hint at process problems that could have been prevented and could still be prevented.

        Two recent Apple bugs are perfect examples of this: "goto fail" and "passwordless root." Both are symptoms of either Apple not testing, or more likely, Apple only doing positive testing (this works with the values it's supposed to), but never any negative testing (this fails properly when given bad values). As I recall, both bugs were also likely caused by bad merges: fixes were merged from other branches but done incorrectly, so that essential "else if" blocks were missing from the final result, leaving th

        • > Two recent Apple bugs are perfect examples of this: "goto fail" and "passwordless root." Both are symptoms of either Apple not testing, or more likely, Apple only doing positive testing (this works with the values it's supposed to), but never any negative testing (this fails properly when given bad values). ...
          > Both of those are process problems. Failing to do negative tests is a very common process flaw: it's very easy to write a "positive test" that ensures that a correct value produces a correct

          • Wow that had a lot of typos. Let's try that paragraph again:

            In the famous "go-to fail" bug, a TLS certificate was accepted if it was valid - and accepted just the same if it was invalid. They probably tested that it worked - that it trusted a valid cert. But they didn't test that it did not trust an invalid cert.

            There is no "architectural redesign" required to start testing the negatives, checking that NOT entering a password does NOT log you in.

      • It is still a case of back-seat development.
        Sometimes what may seem like they are not doing a full review is actually the outcome of a full review. Sometimes that quick fix is the safest fix, because fixing it across the board may affect a set of other systems.

        Let's say the fix for the USB hack into the phone is due to a software design problem needed for the licensed repair team. If they fix the software to stop that hack, then the repair teams will need system updates as well, and need to ma

      • Apple really takes it to a new level.

        Unless you are actually ON the iOS OS Development Team, how would you know that?

      • Delaying a release for a better fix is not what Mr Beer is complaining about. Basically he is saying Apple releases a bug fix (assuming they agree it is important enough) and then just moves on. They don't do a process or infrastructure review to see if other similar bugs exist or if future similar bugs will be created.

        Few bugs actually need infrastructure changes. But many bugs hint at process problems that could have been prevented and could still be prevented.

        This ignorance is generally true of most companies. However, Apple really takes it to a new level.

        Now add to this the fact that Apple typically fails to merge fixes from master into dev, and you have a situation where the team working on next year's release pushes out an OS that contains many of the security issues and other bugs that existed in the previous iteration, and the new master branch takes a month or two to eventually pick up the fix from what was the previous version of the OS.

        • Finally, an explanation for why everyone always waits for an x.1 OS release for anything Apple. I've done it, but I never knew why it was so bad.

    • If asked to change the infrastructure every time there is a bug, the fix will take years to get out, and a new infrastructure will introduce new, untested flaws.

      Precisely.

    • Their major security bugs show a simple PROCESS issue, not an architectural issue.

      You don't have to completely rewrite the architecture in order to test that NOT entering a password, leaving the password field empty, doesn't log you in. You just have to start testing not only "it does the good thing with good input", but also "it does the negative / error case with bad input".

      Their famous "go-to fail" is another example. Their code was basically:
      If certificate is valid {
      trust the certificate
      }
