Privacy Technology Science

Protecting Our Brains From Datamining 100

Posted by Soulskill
from the we-all-know-what-your-brain-would-tell-us dept.
Jason Koebler writes: 'Brainwave-tracking is becoming increasingly common in the consumer market, with the gaming industry at the forefront of the trend. "Neurogames" use brain-computer interfaces and electroencephalographic (EEG) gadgets like the Emotiv headset to read brain signals and map them to in-game actions. EEG data is "high-dimensional," meaning a single signal can reveal a lot of information about you: whether you have a mental illness, whether you're prone to addiction, and your emotions, mood, and tastes. If that gaming data were collected and mined, it could theoretically be matched with other datasets culled from online data mining to create a complete profile of an individual that goes far beyond what they divulge through social media posts and emails alone. That has led some researchers to develop privacy systems that protect your thoughts from hackers.'
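To make the summary's "map them to in-game actions" concrete, here is a hedged sketch of how a neurogame might translate a handful of EEG features into commands. The feature names, thresholds, and actions are invented for illustration; a real headset SDK such as Emotiv's exposes its own, different API.

```python
# Illustrative sketch only: maps a tiny EEG feature dict to an in-game
# action. The "alpha"/"beta" band-power features and the thresholds are
# assumptions for this example, not any vendor's actual interface.
def choose_action(features):
    """Pick an in-game action from relative alpha/beta band power."""
    if features["beta"] > 2 * features["alpha"]:
        return "fire"      # beta dominates: user is alert/focused
    if features["alpha"] > features["beta"]:
        return "shield"    # alpha dominates: user is relaxed
    return "idle"

print(choose_action({"alpha": 0.2, "beta": 0.9}))  # fire
print(choose_action({"alpha": 0.8, "beta": 0.3}))  # shield
```

Even a toy mapping like this shows why the data is sensitive: the same band-power features a game consumes are exactly the ones that correlate with mood and mental state.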
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • Ridiculous (Score:4, Insightful)

    by sexconker (1179573) on Tuesday June 03, 2014 @06:57PM (#47160899)

    Here we propose an integration of a personal neuroinformatics system, Smartphone Brain Scanner, with a general privacy framework openPDS. We show how raw high-dimensionality data can be collected on a mobile device, uploaded to a server, and subsequently operated on and accessed by applications or researchers, without disclosing the raw signal. Those extracted features of the raw signal, called answers, are of significantly lower-dimensionality, and provide the full utility of the data in given context, without the risk of disclosing sensitive raw signal. Such architecture significantly mitigates a very serious privacy risk related to raw EEG recordings floating around and being used and reused for various purposes.

    So MIT pisses away cash on research that comes up with "Just anonymize the data, sorta, before shipping it off to advertisers, and you'll be protected, sorta"? And of course it's peppered with meaningless shit like "personal neuroinformatics system", "smartphone", and "privacy framework".
    Hey MIT, give me a research grant and I'll come up with an actual solution. Hint: don't let people put EEG sensors on or around your head for a game, a video, etc. in the first place, and you won't have the problem of that data being sold to nefarious parties who would use it against you. Much more effective than the proposed equivalent of "Do Not Track" for brainwaves.
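The pipeline the quoted abstract describes — raw high-dimensional EEG reduced on-device to low-dimensional "answers" before anything leaves the phone — could be sketched roughly as follows. The sampling rate, band definitions, channel count, and synthetic signal are all assumptions for illustration, not the paper's actual code or the openPDS API.

```python
# Sketch of the "answers" idea from the quoted openPDS abstract: instead of
# uploading raw EEG samples, the device extracts a few band-power features
# and shares only those. All constants here are illustrative assumptions.
import math
import random

FS = 128            # assumed sampling rate in Hz
BANDS = {           # classic EEG frequency bands, in Hz
    "theta": (4, 8),
    "alpha": (8, 13),
    "beta":  (13, 30),
}

def band_power(samples, fs, lo, hi):
    """Naive DFT-based power in [lo, hi) Hz for one channel."""
    n = len(samples)
    total = 0.0
    for k in range(1, n // 2):
        freq = k * fs / n
        if lo <= freq < hi:
            re = sum(s * math.cos(2 * math.pi * k * i / n)
                     for i, s in enumerate(samples))
            im = sum(-s * math.sin(2 * math.pi * k * i / n)
                     for i, s in enumerate(samples))
            total += (re * re + im * im) / n
    return total

def answers(raw_channels):
    """Reduce raw high-dimensional EEG to a small feature dict (the 'answer').
    Only this dict would leave the device; the raw signal stays local."""
    return {name: sum(band_power(ch, FS, lo, hi) for ch in raw_channels)
                  / len(raw_channels)
            for name, (lo, hi) in BANDS.items()}

# One second of synthetic 4-channel "EEG": a 10 Hz (alpha) tone plus noise.
random.seed(0)
raw = [[math.sin(2 * math.pi * 10 * i / FS) + 0.1 * random.gauss(0, 1)
        for i in range(FS)] for _ in range(4)]
feats = answers(raw)
print(feats)  # 3 numbers shared instead of 4 * 128 raw samples
```

The point of contention in the comment above is visible here too: the three "answers" are far less revealing than the raw signal, but they are still exactly the features that correlate with mood and attention, so reduction alone is mitigation, not elimination, of the risk.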

"Bureaucracy is the enemy of innovation." -- Mark Shepherd, former President and CEO of Texas Instruments
