
Google Accused of 'Trust Demolition' Over Health App (bbc.com)

A privacy expert is criticizing Google for taking over a controversial health app developed by AI firm DeepMind. The app in question -- Streams -- was first used to send alerts in a London hospital but hit headlines for gathering data on 1.6 million patients without informing them. DeepMind now wants the app to become an AI assistant for nurses and doctors around the world. BBC reports: One expert described the move as "trust demolition." Lawyer and privacy expert Julia Powles, who has closely followed the development of Streams, responded on Twitter: "DeepMind repeatedly, unconditionally promised to 'never connect people's intimate, identifiable health data to Google.' Now it's announced... exactly that. This isn't transparency, it's trust demolition," she added.


Comments:
  • Trust (Score:4, Funny)

    by 110010001000 ( 697113 ) on Wednesday November 14, 2018 @07:33AM (#57641570) Homepage Journal
    In Google We Trust. It is on all new dollars.
    • by gweihir ( 88907 )

      Naaa, just another false God. We already have more than enough of them.

    • The check's in the mail.

      No, that dress doesn't make you look fat.

      Despite the fact that our entire history screams otherwise, we won't compromise your privacy to monetize your data.

  • Comment removed based on user account deletion
  • by CanadianMacFan ( 1900244 ) on Wednesday November 14, 2018 @09:09AM (#57641934)

    Just like Facebook, which said it wouldn't take any user data from WhatsApp when it bought that company. These companies thrive on user data and aren't going to pass up a chance to get more if they can, no matter what they've said before.

    • The difference, perhaps, is no one believed Facebook's statement for a second.

    • Absolutely. I am convinced that the FB app sees WhatsApp conversations too, via the 'app family' facility in iOS and data sharing on Android. After all, they only claim end-to-end encryption, and the 'ends' are plaintext.
  • Hysterical (Score:5, Interesting)

    by jbmartin6 ( 1232050 ) on Wednesday November 14, 2018 @09:09AM (#57641940)
    There's nothing in the story about sharing data, just the technology. DeepMind said "Patient data remains under our NHS partners' strict control, and all decisions about its use will continue to lie with them. The move to Google does not affect this." The knee-jerk hysteria of privacy nuts is sadly counterproductive: fewer people will listen each time, until eventually they're left shouting at each other in an isolated room. Or has that happened already?
    • by gweihir ( 88907 )

      And if you believe that ... never mind.

        • I believe it to a point. There is a certain class of identifier that has no value to Google, such as name and specific street address. So a lot of medical data (and other kinds) is shared widely (not just with Google) after being 'anonymized' by removing the most specific identifiers. This is done for statistical analysis and a host of other reasons. But de-anonymizing isn't overly difficult if you already have other data to match against. Which Google and thousands of other organizations do. (A minimal sketch of that record-linkage idea follows the comments.)
        • by gweihir ( 88907 )

          Well, I do agree that de-anonymizing "anonymized" data is routinely very easy, especially when you only need 95% or so accuracy.

          However, I do not get your point. Are you saying this latest development changes nothing and they were directly lying before and are just maybe a bit more honest now?

          • I'm saying I believe that the data as held by DeepMind likely will not be shared. An "anonymized" subset of the data has probably already been shared multiple times.
    • and all decisions about its use will continue to lie with them

      is actually

      and all lies about its use will continue to be their decision

  • by gweihir ( 88907 )

    Pretty much what more perceptive people have predicted is happening. Also, anybody working at Google should think very hard about what it means to be complicit and whether that is something they want to be.

  • They should be accused of making shoddy products. They are supposed to be geniuses, and (e.g.) Android remains an unholy piece of crap, that sort of works, except when it doesn't, at which point nobody seems to know why, and the default remedy seems to be a factory reset. What a bunch of ridiculous clowns. As for the Apple fans, do not rejoice too much, for Apple's offerings in this space are at least as obnoxious and pathetic.
  • by Gravis Zero ( 934156 ) on Wednesday November 14, 2018 @09:43AM (#57642080)

    Never trust an application not to behave maliciously. If it has the technical capability to copy your information and phone home, then you should assume that's exactly what it will do. We need to develop security-hardened OSes to prevent this kind of privacy infringement, because it will not stop on its own, but it can be prevented from happening in the first place.

  • Don't cross the Streams
  • The ONLY thing that will stop these large corporations is 2-3 years jail time for senior management. Paying fines does not work, they simply view that as a cost of doing business.
  • Really, the NHS should have an in-house data exploitation centre. Research should be invited (and the data 'rented out') but should be done on the premises, using NHS hardware; the data should never leave.
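
One comment above argues that removing names and street addresses is not real anonymization, because the quasi-identifiers that remain can be matched against other data the attacker already holds. A minimal sketch of that record-linkage idea in Python with pandas; every dataset, column name, and value below is invented for illustration and is not drawn from Streams, DeepMind, or the NHS:

    # Minimal sketch of re-identification by record linkage: joining an
    # "anonymized" medical extract to an auxiliary dataset on shared
    # quasi-identifiers. All records below are invented for illustration.
    import pandas as pd

    # "Anonymized" release: direct identifiers (name, full address) removed,
    # but coarse quasi-identifiers kept for "statistical" purposes.
    released = pd.DataFrame({
        "postcode_district": ["NW1", "NW1", "E14", "SE10"],
        "birth_year":        [1983,  1983,  1975,  1990],
        "sex":               ["M",   "M",   "M",   "F"],
        "diagnosis":         ["asthma", "diabetes", "hypertension", "migraine"],
    })

    # Auxiliary data the attacker already holds (a marketing or
    # voter-roll style list) that still carries names.
    auxiliary = pd.DataFrame({
        "name":              ["A. Smith", "B. Jones", "C. Patel"],
        "postcode_district": ["NW1",      "E14",      "SE10"],
        "birth_year":        [1983,       1975,       1990],
        "sex":               ["M",        "M",        "F"],
    })

    quasi_identifiers = ["postcode_district", "birth_year", "sex"]

    # How many released records share each quasi-identifier combination.
    counts = released.groupby(quasi_identifiers).size().reset_index(name="k")

    # Inner join on the quasi-identifiers: every match is a candidate
    # re-identification; a unique match (k == 1) links a name to a diagnosis.
    linked = (released.merge(auxiliary, on=quasi_identifiers, how="inner")
                      .merge(counts, on=quasi_identifiers))
    confident = linked[linked["k"] == 1]

    print(confident[["name", "diagnosis"]])
    # A. Smith matches two records and is discarded as ambiguous;
    # B. Jones and C. Patel are each linked to a single diagnosis.

Real re-identification studies run the same join at population scale: the more quasi-identifier columns survive "anonymization", the more combinations become unique, and each unique combination ties a name to a medical record.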
