Google Privacy

Google Almost Made 100,000 Chest X-rays Public -- Until it Realized Personal Data Could Be Exposed (washingtonpost.com) 49

Two days before Google was set to publicly post more than 100,000 images of human chest X-rays, the tech giant got a call from the National Institutes of Health, which had provided the images: Some of them still contained details that could be used to identify the patients, a potential privacy and legal violation. From a report: Google abruptly canceled its project with NIH, according to emails reviewed by The Washington Post and an interview with a person familiar with the matter who spoke on the condition of anonymity. But the 2017 incident, which has never been reported, highlights the potential pitfalls of the tech giant's incursions into the world of sensitive health data. Over the course of planning the X-ray project, Google's researchers didn't obtain any legal agreements covering the privacy of patient information, the person said, adding that the company rushed toward publicly announcing the project without properly vetting the data for privacy concerns. The emails about Google's NIH project were part of records obtained from a Freedom of Information Act request. Google's ability to uphold data privacy is under scrutiny as it increasingly inserts itself into people's medical lives. The Internet giant this week said it has partnered with health-care provider Ascension to collect and store personal data for millions of patients, including full names, dates of birth and clinical histories, in order to make smarter recommendations to physicians. But the project raised privacy concerns in part because it wasn't immediately clear whether patients had consented to have their files transferred from Ascension servers or what Google's intentions were.
  • by Joe_Dragon ( 2206452 ) on Friday November 15, 2019 @11:47AM (#59416970)

    Google selling health data?? We need single payer badly to take the profit out of healthcare. The USA pays the most and we rank lower than Cuba.

    • Google selling health data?? We need single payer badly to take the profit out of healthcare. The USA pays the most and we rank lower than Cuba.

      What do you think Google Fit is for?

      • Google selling health data?? We need single payer badly to take the profit out of healthcare. The USA pays the most and we rank lower than Cuba.

        What do you think Google Fit is for?

        Advertising. It will collect personal, individual-level data to be used for targeted advertising. ;)

    • For clarity's sake, what metric are you referring to?
    • by jwhyche ( 6192 )

      How is this even legal? I had to go down and sign a ton of papers before I could send my own records to my new doctor. My daughter can't even have access to my health records without a note from me signed in blood.

      Assuming they remove the identification from this, what use is it to have anonymous x-rays public?

      • A couple things:
        1. I believe it's okay, under HIPAA, to share anonymized data with third-party organizations for certain reasons.
        2. It's also okay to share non-anonymized data with third-party organizations, so long as doing so is strictly to help run and administer your medical institution (and those third parties keep the data private).
        3. Everything I've read indicates you don't own your test results, imaging, tissue samples, blood samples, etc. Sure, you pay for them, but the medical institutio

    • Please tell me you do not buy into the myth of good Cuban health care. This has been discredited multiple times.
      • I wouldn't say that. I think it's been misunderstood. Cuba has very good healthcare for such a poor country; this is true. It is not (double-lol) better than in the US.
  • Since the article is blocked, for what purpose was Google going to make 100K chest X-rays publicly available? I could certainly see it if they were going to be used for medical research by people in the field, but for the average person, other than curiosity or the hypochondriacs, why release them into the wild?

    • by ceoyoyo ( 59147 )

      Google would like to create a big public dataset so people will work on it. When someone comes up with something useful, Google can then look at commercializing it.

    • by strech ( 167037 )

      From the article:

      Google planned to use its cloud service to publicly host the images, according to the person and the records. Li wanted to showcase how Google’s tool for teaching machines to learn, called TensorFlow, could be used to solve some of the most complex problems in medicine, the person said. TensorFlow could train computers to understand which images contained the markings of different diseases. Google would also make the raw X-ray data available to outside AI researchers via its cloud.

      Goo
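
      To make the quoted description concrete, here is a minimal sketch (not Google's actual pipeline; the data directory and label layout are hypothetical) of training a TensorFlow/Keras classifier to flag which de-identified X-ray images show markings of different findings:

      import tensorflow as tf

      IMG_SIZE = (224, 224)

      # Hypothetical folder of de-identified X-ray images, one subfolder per finding.
      train_ds = tf.keras.utils.image_dataset_from_directory(
          "xrays/train", image_size=IMG_SIZE, batch_size=32)

      # Small convolutional classifier; real work would use a larger pretrained model.
      model = tf.keras.Sequential([
          tf.keras.layers.Rescaling(1.0 / 255, input_shape=IMG_SIZE + (3,)),
          tf.keras.layers.Conv2D(32, 3, activation="relu"),
          tf.keras.layers.MaxPooling2D(),
          tf.keras.layers.Conv2D(64, 3, activation="relu"),
          tf.keras.layers.MaxPooling2D(),
          tf.keras.layers.Flatten(),
          tf.keras.layers.Dense(128, activation="relu"),
          tf.keras.layers.Dense(len(train_ds.class_names), activation="softmax"),
      ])

      model.compile(optimizer="adam",
                    loss="sparse_categorical_crossentropy",
                    metrics=["accuracy"])
      model.fit(train_ds, epochs=5)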

  • by Anonymous Coward
    If NIH supplied the data to Google, isn't it NIH that caused the privacy breach?
  • by bagofbeans ( 567926 ) on Friday November 15, 2019 @12:14PM (#59417086)

    Google realized nuthink. It had to be told, and probably told hard.

  • This is Stupid (Score:2, Insightful)

    by Paxtez ( 948813 )

    I was curious what sort of data it was, since the summary was vague. Buried in the middle of a random paragraph:

    QUOTE: ...found dozens of images still included personally identifying information, including the dates the X-rays were taken and distinctive jewelry that patients were wearing when the X-rays were taken, the emails show.

    So: no names, no birth dates, not even medical record numbers. Just jewelry and maybe the date, in a few dozen images out of 100,000 (see the sketch below).

    Machine Learning is already better at spot
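
    As a rough illustration of why vetting this is hard (an assumed workflow, not NIH's or Google's): a script can blank the standard DICOM header fields, but dates or jewelry burned into the pixel data itself, as described in the quote above, would survive and still need visual review.

    import pydicom

    # Common identifying header fields; real de-identification profiles are much longer.
    PHI_TAGS = ["PatientName", "PatientID", "PatientBirthDate",
                "StudyDate", "AccessionNumber", "InstitutionName"]

    def scrub(in_path, out_path):
        ds = pydicom.dcmread(in_path)      # load the DICOM file
        for tag in PHI_TAGS:
            if tag in ds:                  # blank the field if it is present
                setattr(ds, tag, "")
        ds.save_as(out_path)

    scrub("xray_raw.dcm", "xray_clean.dcm")  # hypothetical file names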

    • by vux984 ( 928602 )

      " In a few dozen images out of a 100,000 images."

      Are you sure about that?
      I mean, if they'd vetted all 100,000 images and found a few dozen issues, they could have just removed them and released the rest of the set. So I doubt this is the case.

      It strikes me as unlikely a human personally inspected 100,000 images. It's much more likely they looked at a small sample, perhaps a few hundred, and tagged dozens of issues -- and from that extrapolated that the rest of the set would be similarly problematic.
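
      Back-of-the-envelope, with made-up numbers, that extrapolation looks like this:

      sample_size = 300      # hypothetical number of images actually reviewed
      flagged = 36           # hypothetical images found with identifying details
      total = 100_000        # size of the full X-ray set

      rate = flagged / sample_size
      projected = flagged * total // sample_size
      print(f"Estimated problem rate: {rate:.1%}")      # 12.0%
      print(f"Projected across the set: ~{projected}")  # ~12000 images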

      • by Paxtez ( 948813 )

        I agree, I do think it is unlikely that they looked at all of them. But I can't see where they said how many were inspected.

        But it seems like if they inspected 100 and found 24 issues they would have said that.

        Since they didn't specify how many, and the goal of the article was to be "OMG this is bad", I'm assuming it is a large number.

        In either case, 100 or 100,000: the "data" released is pretty harmless, they realized the issue, deleted the data, aren't going to deal with the agency anymore, and this ty

    • Uh, sure they are, Potsy. What should be encouraged is universal healthcare. Because, you see, at the end of the day this project WILL NOT provide universal healthcare, nor will it be cheap enough to be used by all doctors, because Google is going to charge $$$ for it.

      For the good of mankind? No.

      For the good of Google's bottom line? Yes.
      • by Paxtez ( 948813 )

        What are you talking about? Having an AI spend 0.003 seconds looking at an image of a lung/brain/whatever to look for tumors/embolisms/whatever is certainly going to be cheaper than having a trained doctor or imaging tech do it.

        So it will bring down costs while catching more issues earlier (early detection also means lower costs).

        Just because it is potentially good for Google doesn't mean it isn't good for mankind also, they aren't mutually exclusive.

        Just look at self-driving tech. Good for Google/Te

  • So Ascension didn't scrub personal info from the data before turning it over to Google...but that's Google's fault somehow.
    I doubt any patients agreed to let Ascension store their data either....

  • "Admin almost makes a mistake, doesn't. News at 11."

  • Yep, we're movin' fast and breakin' things ... Your things, needless to say. Ours are vital IP, and not to be f'd with.
  • This just strikes me as creepy. What's next, human cadavers?
