
Protecting Our Brains From Datamining

Posted by Soulskill
from the we-all-know-what-your-brain-would-tell-us dept.
Jason Koebler writes: 'Brainwave-tracking is becoming increasingly common in the consumer market, with the gaming industry at the forefront of the trend. "Neurogames" use brain-computer interfaces and electroencephalographic (EEG) gadgets like the Emotiv headset to read brain signals and map them to in-game actions. EEG data is "high-dimensional," meaning a single signal can reveal a lot of information about you: if you have a mental illness, are prone to addiction, your emotions, mood, and taste. If that data from gaming was collected and mined, it could theoretically be matched with other datasets culled from online data mining to create a complete profile of an individual that goes far beyond what they divulge through social media posts and emails alone. That's led some to develop privacy systems that protect your thoughts from hackers.'
This discussion has been archived. No new comments can be posted.

  • by QuietLagoon (813062) on Tuesday June 03, 2014 @06:37PM (#47160807)
    A: To get larger sensors closer to the brain.
  • by Anonymous Coward

    I've got plenty of tinfoil hats to sell!

  • by Anonymous Coward

    ... with a Google search you reveal a lot of thoughts. Same goes for email.

    Let's not forget all cellphones are tapped and conversations recorded. So it's not like they don't already have everything at this point. Technology has made it trivially easy to just harvest everything and you're not going to put the genie back in the bottle.

    • by Anonymous Coward

      It's not the tech that's doing it... it's the idiots using the tech who have all the money, that is.

      Employers who require you to use cell phones and pagers, who require you to post images of yourself on their company websites, who require you to haul around laptops... Schools that conduct classes online, whose teachers force you to have Twitter accounts, Google Docs, etc. The companies who pay their employees via electronic bank account deposits instead of an old-school paper check...

      All these things--

  • by CheezburgerBrown . (3417019) on Tuesday June 03, 2014 @06:47PM (#47160855)

    Just wait until data from people like me with ADHD and PTSD starts corrupting their Hey look a Squirrel!

  • Increasingly common? (Score:2, Interesting)

    by Anonymous Coward

    Is it really? Or is it a click-bait headline that really means "here are a couple of companies with a product that does it, but nobody else does"?

    • by dpidcoe (2606549)

      Or is it a click-bait headline that really means "here are a couple of companies with a product that does it, but nobody else does"?

      Definitely a click-bait headline. They have enough trouble getting the accuracy and resolution required to tell those sorts of things with medical grade EEGs, let alone a consumer grade headset.

      • by tnk1 (899206) on Tuesday June 03, 2014 @08:18PM (#47161321)

        I agree, to an extent. These devices are hardly going to read minds in the sense of providing all of that detail.

        However, whatever they lose in quality (of resolution), they may make up for in quantity. A poor quality device may still be able to provide some useful data points when applied to a larger group of people. Put some branding or situations inside a game, monitor for coarse grained interest or emotion, and you might have something useful to marketers or game designers. Or not.

        When things like this start approaching mass markets, people start thinking of other uses for the data. Working in a field where people are spending good money trying to vacuum up all the data on the Internet, even shitty Facebook posts, I see firsthand how people get excited over any new data point. Most of it is crap, but there's some gold in there, for sure.

        Click-bait, but still interesting to consider.

      • by EvilSS (557649)

        Definitely a click-bait headline. They have enough trouble getting the accuracy and resolution required to tell those sorts of things with medical grade EEGs, let alone a consumer grade headset.

        While I agree that it's an inflammatory article now, it may not always be that way, and "not always" may come sooner rather than later. Using EEGs outside of medicine is in its infancy, but there is huge interest in it from a number of different fields, from consumer products to defense. I have no doubt it's going to heat up in the next decade. The more adoption and the wider the markets, the faster it will evolve. So while it's not likely to be an issue today, it's probably best to start acting to head it off

        • by umghhh (965931)

          My thoughts too. I think it has quite good uses as well - people with missing extremities can get replacements that are controlled well by such devices. I think that Mr. Hawking would appreciate development in this area very much. Judging by how many problems male-female interactions cause, a somewhat more sophisticated device could help distinguish between an 'ooohhh no no no' meaning 'please continue' and one meaning 'f.off or I call the police'. This said I must admit that knowing that somebody can evaluate you

        • by dpidcoe (2606549)
          The problem is that there's only so much you can do to a signal to amplify it before there's no signal left.

          With a current-gen headset, if you were to turn off the 60 Hz notch filter, the signal it would be picking up from the power lines would drown out the brain signals by several orders of magnitude. Even someone waving their hand over the top of your head while you wear one will cause enough interference to blot out the sorts of signals the brain produces. On top of that, those signals that we can pick u
    • by gl4ss (559668)

      click bait headline.

      they can tell if you're squinting really hard, though detecting that usually gets reported as reading your thoughts.

  • Ridiculous (Score:4, Insightful)

    by sexconker (1179573) on Tuesday June 03, 2014 @06:57PM (#47160899)

    Here we propose an integration of a personal neuroinformatics system, Smartphone Brain Scanner, with a general privacy framework openPDS. We show how raw high-dimensionality data can be collected on a mobile device, uploaded to a server, and subsequently operated on and accessed by applications or researchers, without disclosing the raw signal. Those extracted features of the raw signal, called answers, are of significantly lower-dimensionality, and provide the full utility of the data in given context, without the risk of disclosing sensitive raw signal. Such architecture significantly mitigates a very serious privacy risk related to raw EEG recordings floating around and being used and reused for various purposes.

    So MIT pisses away cash on research that comes up with "Just anonymize the data, sorta, before shipping it off to advertisers, and you'll be protected, sorta"? And of course it's peppered with meaningless shit like "personal neuroinformatics system", "smartphone", and "privacy framework".
    Hey MIT, give me a research grant and I'll come up with an actual solution. Hint: Don't let people put EEG sensors on or around your head for a game, a video, etc. in the first place and you won't have the problem of them selling it to nefarious parties who would use it against you. Much more effective than the proposed equivalent of "Do Not Track" for brainwaves.
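    For what it's worth, the "answers" architecture the quoted abstract describes is easy to caricature in code. A minimal sketch (all names and the two example "questions" are invented for illustration; the real openPDS API is nothing this simple):

```python
import numpy as np

class PersonalDataStore:
    """Toy stand-in for an openPDS-style store: raw signals stay inside;
    applications may only request low-dimensional derived 'answers'."""

    def __init__(self):
        self._raw = {}  # user -> raw signal; never exposed directly

    def upload(self, user, signal):
        self._raw[user] = np.asarray(signal, dtype=float)

    def answer(self, user, question):
        """Return a scalar 'answer' computed from the raw signal."""
        sig = self._raw[user]
        if question == "mean_power":
            return float(np.mean(sig ** 2))
        if question == "engaged":
            # Made-up engagement proxy: signal variance above a threshold.
            return bool(np.var(sig) > 1.0)
        raise ValueError(f"unsupported question: {question}")

store = PersonalDataStore()
store.upload("alice", np.random.default_rng(0).normal(0.0, 2.0, size=1024))

# An application never sees the 1024-sample raw signal, only this:
print(store.answer("alice", "engaged"))
```

    Whether shipping scalar answers instead of the raw signal actually protects anyone is exactly what the parent is disputing, but that is the shape of the proposal.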

    • Yes, TFA is a total waste of time. The concepts are reductive and stultifying and the author's evaluation of EEG capabilities is straight from science fiction.

      **we don't know how the brain works**

      We know an EEG and fMRI and other E-M sensitive sensors can receive waves from our brain and represent that data on a chart.

      Beyond that, it's absolutely the Wild Wild West...it's academic anarchy

      It's so bad that now anyone will say "correlation is not causation" to any scientific claim purely as rhetoric to bolster

      • by s.petry (762400)

        You are making it sound like the only person who could possibly correlate data from an EEG is a dude who is dead, instead of looking at the reality the paper points out: much of EEG reading is fully automatic in many types of software. Sure, we have more to learn, and the paper makes that clear. That aside, the portions we are sure of are very accurate. Things such as memory mapping (gauging yes/no responses by thought pattern), detecting certain disorders, detecting specific behavi

        • the portions we are sure of are very accurate. Things such as memory mapping (gauging yes/no responses by thought pattern), detecting certain disorders, detecting specific behaviors, and quite a bit more.

          only if you use some kind of Schrodinger's Cat definition of "sure" and "accurate"

          you cannot use an EEG to detect what you claim at all

          you're in polygraph territory...that's a more precise analogy...

          TFA = polygraphy

          • by umghhh (965931)

            Even a journey of a thousand miles begins with a single step.

            Nobody with a healthy brain can claim that currently used techniques can recognize our thoughts. They can recognize some of our feelings, or rather a general state of mind, and we also have techniques that, after some training, allow some people who lost their limbs to control the devices that replaced them. These are not very accurate, but they are getting better, if one is to believe the media. We started this journey, and if that is possible (why should it not

          • by s.petry (762400)

            You either failed to read the article and its references, or you are trying to deny science with statements that equate to "nuh uh!".

            Polygraphs fail for numerous reasons, but most notable are the external influences, such as strapping a bunch of cables to someone after stuffing them into an unfamiliar location and having them directly interviewed by strangers.

            EEGs in private devices used in private locations are not subject to any of those stresses. EEGs are scientifically proven to be very accurate (8

            • If you want to argue the science with science, fine.

              yes...show me some science to argue with (didn't see any in TFA)

              Polygraphs fail for numerous reasons, but most notable are the external influences, such as strapping a bunch of cables to someone after stuffing them into an unfamiliar location

              polygraphs fail b/c they are not what they claim to be...and their failings are so well documented it's an insult to provide them for you...

              they are completely subjective....so is the science in TFA

              TFA and you are making t

              • by s.petry (762400)

                yes...show me some science to argue with (didn't see any in TFA)

                If you truly read the PDF that TFA points to and still say that, then you don't know how to read. The referenced sources are all available. Skimming the summary of TFA is not doing the work and is not science. Part of my original quoted statement gave the name of one of the numerous studies referenced.

                polygraphs fail b/c they are not what they claim to be...and their failings are so well documented it's an insult to provide them for you...

                We agree that polygraphs don't work; I have read thousands of papers and opinions on their various points of failure, and I named several of the common ones.

                At the same time, you are claiming that the use of EEGs is the sam

                • Until you have science, you are still arguing with "nuh uh!" nonsense.

                  don't pull this crap w/ me...you've posted exactly zero evidence yourself...

                  you're trying to make this into an 'evolution vs creation' style discussion and it's obnoxious

                  you're a **scientist** right? "Senior System Engineer/Architect"....so glad you put your specific job title in your sig so we all know you're a **scientist**

                  here's what you do...

                  put the claims of TFA through the same rigor you are using for my claims...

                  also, show me some

                  • by s.petry (762400)

                    You obviously think that what I wrote in response to you is the only possible response I could have made in the thread, which is foolish.

                    Link to the reference comment here [slashdot.org].

                    Link to PDF here [ssrn.com].

                    I refuse to link to all of the studies referenced in the link above because you refuse to look for answers and continue to argue from ignorance (intentionally or otherwise).

                    • right...you have advanced to posting links...

                      now...post links **that support your contention**

                      you can't link to your own comment then TFA and call it "evidence" of your contention...

                      you can't cite yourself

                      the P-300 wave exists...we are experimenting to see how it works in the brain...that I agree with...

                      what is wrong and foolish is to say that b/c we see P-300 light up on a screen that means we can "read emotions"

                      I know the science...the problem is people like you have built careers around an unscientific a

                    • I want to explain exactly why you're full of shit

                      http://en.wikipedia.org/wiki/P... [wikipedia.org]

                      that's the P-300 wave

                      we can define it and observe it repeatedly to verify that it exists

                      the problem comes with ***connecting that data to human behavior***

                      define emotions...go ahead...

                      it's impossible to define "human emotion" in a way that is testable with p-300 data

                      it's like trying to read War & Peace when you can only see one letter at a time...it's ridiculous

                      however, researchers need hype to stay funded, so they (TFA) *

                    • by s.petry (762400)

                      And here is why you are full of shit.

                      what is wrong and foolish is to say that b/c we see P-300 light up on a screen that means we can "read emotions"

                      Claiming that you have to be able to read emotions to gauge a true/false narrative is astoundingly idiotic. I can't critique you any further; your idiocy speaks for itself.

                      And holy fuck, nothing like cherry-picking a sentence to make your argument.

                      Frank et al. explored in [31] the feasibility of subliminal attacks, where the reaction to a short-lasting
                      information of 13.3 milliseconds was measured. Such stimuli, in theory below conscious perception,
                      could poten

                    • You're avoiding your problem & your data doesn't apply:

                      We cannot quantify the human experience of "emotions" in a way that is scientifically comparable and consistent

                      You're dead in the water on this one...

                    • what are you even defending now?

                      you've dropped your main contention...now you're trying to say P-300 waves can be used for lie detection?

                      you must be a polygrapher or on the MIT team or Ray Kurzweil himself

                    • by s.petry (762400)

                      I have provided scientific papers to back my opinion, and you have provided nothing except your opinion.

                      If you truly think your non-fact based opinion is more valuable, you are truly a moron. Grats either way, because your argument is not an argument but a deranged rant no matter how it's perceived. Go troll someone else.

                      No more of your idiocy, good day.

                    • No more of your idiocy, good day.

                      you smug bastard

                      your "evidence" was a link to your own comment and info from TFA

                      I asked for studies or some kind of proof that ****emotions can be scientifically quantified****

                      YOU ARE AVOIDING THE QUESTION B/C YOU KNOW YOU'RE CONJURING FACTS

    • by s.petry (762400)

      The paper, if you read it, also discusses the "why" of doing this (as most scientific papers tend to do). It is not just about building a method of giving the data anonymity, but about telling people why it should be done. While it's a bit deep for some, I highly recommend reading the paper.

      Companies are already trying to figure out how to cash in on your EEG data. What's the big deal you ask? Well...

      Using more direct attacks to reveal EEG information, Martinovic et al. investigated in [28] how the
      brain's response to a particular stimulus (so-called P300 paradigm) can be used to narrow down the space
      of possible values of sensitive information such as PIN numbers, date of birth, or known people. The
      tasks required the subject to follow the experimental procedure without explicitly revealing the goal of
      the experiment: for example thinking about a birth date while watching flashing numbers. Although the
      presented attacks on the data may not be directly applicable to preexisting EEG data, as they require
      fairly specific malicious tasks, we can expect that, as the subjects participate in multiple experiments,
      correlations violating privacy could be obtained from the raw EEG signal. For example, when a large corpus
      of the user responses to a visual stimuli is collected, it could be used in a P300-based Guilty-Knowledge
      Test, where the familiar items evoke different responses than similar but unfamiliar items [29].

      In other words, and without the formatting and line numbers, people could maliciously collect personal inform
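      The P300-based Guilty-Knowledge Test the quoted passage mentions relies on averaging over many trials: familiar (probe) items evoke a larger response roughly 300 ms after the stimulus than unfamiliar ones, and averaging beats the noise down by about the square root of the trial count. A toy sketch with purely synthetic data (every number here is invented for illustration; no real EEG involved):

```python
import numpy as np

rng = np.random.default_rng(42)
fs = 250           # assumed sample rate (Hz)
epoch = 200        # samples per post-stimulus epoch (800 ms)
n_trials = 100

def synth_epoch(familiar):
    """Synthetic single-trial response: heavy noise, plus a P300-like
    bump around 300 ms when the item is familiar."""
    t = np.arange(epoch) / fs
    noise = rng.normal(0.0, 5.0, size=epoch)  # noise dwarfs a single trial
    bump = 2.0 * np.exp(-((t - 0.3) ** 2) / (2 * 0.05 ** 2))
    return noise + (bump if familiar else 0.0)

# Averaging across trials suppresses the noise by ~sqrt(n_trials);
# this is how ERP components like the P300 become visible at all.
avg_familiar = np.mean([synth_epoch(True) for _ in range(n_trials)], axis=0)
avg_unfamiliar = np.mean([synth_epoch(False) for _ in range(n_trials)], axis=0)

window = slice(int(0.25 * fs), int(0.35 * fs))  # 250-350 ms after stimulus
p300_familiar = avg_familiar[window].mean()
p300_unfamiliar = avg_unfamiliar[window].mean()
print(p300_familiar - p300_unfamiliar)  # a clearly positive gap
```

      Note what this does and does not show: telling familiar from unfamiliar items over many controlled trials is a far cry from "reading emotions", which is exactly the gap the two sides of this thread are arguing about.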

      • You are quite right, and I wish I had mod points. As I was reading your post, the mention of doctors' protocols set off alarm bells.

        Without going too far off into Big Brother Paranoia Land, I wonder exactly how confidential this sort of ostensibly private diagnostic data is. Is data protected by professional protocols and HIPAA somehow immune to MITM packet data-mining by Google, the NSA, et al?

        • by s.petry (762400)

          I can answer some of this since I work in security and compliance as well as architecture of secure systems (software and hardware).

          Is data protected by professional protocols and HIPAA somehow immune to MITM packet data-mining by Google, the NSA, et al?

          If the standards are followed the answer is "yes". Data must be encrypted at rest and again in transit, so when data is on a wire it's doubly encrypted.

          That said, there is no such thing as perfect software or hardware. The majority of errors are operator errors, but still count as errors. The only difference is that ISPs can be fined for server/software errors, and operators
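          The at-rest-plus-in-transit layering described above can be illustrated with symmetric encryption; here is a toy sketch using the `cryptography` package's Fernet recipe, with the second key standing in for what TLS actually does on the wire (the key handling is deliberately naive, and the record contents are made up):

```python
from cryptography.fernet import Fernet

# Two independent layers, as the compliance standards require:
storage_key = Fernet.generate_key()  # protects the record at rest
session_key = Fernet.generate_key()  # stands in for the transport layer

record = b"patient 1234: EEG session, 2014-06-03"

at_rest = Fernet(storage_key).encrypt(record)   # first layer (at rest)
on_wire = Fernet(session_key).encrypt(at_rest)  # second layer (in transit)

# The receiving side peels the layers off in reverse order.
received = Fernet(session_key).decrypt(on_wire)
plaintext = Fernet(storage_key).decrypt(received)
print(plaintext == record)
```

          As the parent says, the crypto itself is rarely the weak point; operator error around keys and configuration is.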

    • by GNious (953874)

      We really shouldn't be discussing these things - if anyone at Facebook reads about it, they'll "upgrade" the Oculus Rift with EEG sensors (since it is already attached to your head..)

      Actually, eff that, Sony et al are just as likely to try and figure out how to get to your skull.

      *paranoid*

  • ... for all those that are placed above us to lead us,
    and of all those that suck up to same.

  • by crioca (1394491) on Tuesday June 03, 2014 @07:49PM (#47161173)
    This information is undoubtedly being caught up in the global surveillance dragnet, which means that government agents are literally spying on people's brainwaves. The most hackneyed conspiracy trope of all time is now a hilarious reality.
    • by tnk1 (899206) on Tuesday June 03, 2014 @08:26PM (#47161363)

      Edward Snowden will shortly be releasing transcripts of this too. Here's one example:

      SUBJECT 1765467-2: K3yseRS0Z3 [[TRANSCRIPT BEGINS]]

      STRAFE
      STRAFE
      STRAFE
      STRAFE
      SCOPE
      FIRE
      FORWARD (RUN)
      TEABAG

      [[TRANSCRIPT ENDS]]

      I'm not worried. I'm already aware of what most 15-year-old boys are thinking about; we don't need the NSA for that.

    • OK, so first we will have the government scanning people in sensitive positions. Then employers will insist as well. And then we will see things like a driver's license brain exam to find out if the person has moments when he drives drunk or on dope, or has hidden medical issues. But the most fun will come when it is routine for courts to require lawyers and all parties giving evidence to also be scanned. And then daddy will insist on scanning his female children to make sure they are not playing na
  • Johnny Mnemonic

  • by Opportunist (166417) on Tuesday June 03, 2014 @11:59PM (#47162171)

    People. Please. Be reasonable.

    I've been doing a bit of research in that matter (because, well, the idea of controlling a computer with your brain IS kinda cool), and we're FAR, FAR away from a mind reading device. If such a device is possible at all.

    Every kind of "mind tracking" technology in existence not only needs a LOT of training (on both sides, the device AND the user), but most of all it needs cooperation to the extreme. Actually, it is pretty HARD to make that device actually "understand you", and that's if you WANT it to understand you.

    Now, going and trying to pick up subconscious thoughts is at best guesswork: you have some brainwave patterns from known people and pretend that readings from another person that resemble them imply some kind of correlation. The whole shit smells like good ol' phrenology.

  • ... Would you like a recommendation for a neurologist and anti-seizure medication?"

    Ugh.

    • ... Would you like a recommendation for a neurologist and anti-seizure medication?"

      Response: GAAHHHHERRRGGGHHHHHHHH........

  • So if I actually buy and wear some overpriced "headset" that has built-in brainwave receptors, then companies could be mining my brainwaves? Well, hold the presses, everyone! Next thing you know, people will "hack" into my bank account because I decided to print my login information on a t-shirt!!

    The main article is a farce. There is no "remote" reading going on against your will; you actually have to wear some useless "headset" and then be exposed to pretty obvious material ("flashing" straight or

    • Google, Facebook, et al, don't force you to submit data to them, or take it without your knowledge, but when you do, they'll mine it for all it's worth. Are concerns about that not legitimate either?

      you actually have to wear some useless "headset" and then be exposed to pretty obvious material ("flashing" straight or gay couples or candidates, really??)

      Like, say, an Oculus Rift that shows you ads between game levels and monitors which ones you find particularly captivating? That doesn't sound so ridiculous.

  • Please agree to our EULA. "... Section 3.a.213.yx - Through the use of gaming software and a neurointerface, the user may be trained, by the Company, to vote for specific candidates in public elections, and/or to rebel against the government in favor of rule by the Company, if it is determined necessary by the Company to enhance the user's gaming experience. Section 3.a.213.yy - ..."
  • ... between data miners:

    "With this PPH guy, we seem to be stuck in a perpetual game of Leisure Suit Larry."
