Protecting Our Brains From Datamining
Jason Koebler writes: 'Brainwave-tracking is becoming increasingly common in the consumer market, with the gaming industry at the forefront of the trend. "Neurogames" use brain-computer interfaces and electroencephalographic (EEG) gadgets like the Emotiv headset to read brain signals and map them to in-game actions. EEG data is "high-dimensional," meaning a single signal can reveal a lot of information about you: whether you have a mental illness, whether you're prone to addiction, your emotions, your mood, and your tastes. If that data from gaming were collected and mined, it could theoretically be matched with other datasets culled from online data mining to create a complete profile of an individual that goes far beyond what they divulge through social media posts and emails alone. That's led some to develop privacy systems that protect your thoughts from hackers.'
Q: Why did Apple buy Beats headphones? (Score:5, Funny)
Re:Q: Why did Apple buy Beats headphones? (Score:4, Funny)
Beats buyers have brains now?
Don't Worry! (Score:1)
I've got plenty of tin foil hats to sell!
Re: (Score:2)
Do I really need tin foil, or is aluminum foil good enough?
Re: (Score:2)
Fuck that! I'm going to build a Faraday cage hat! Tinfoil hats are for poseurs.
Every time you do... (Score:1)
... a google search you reveal a lot of thoughts. Same goes for email.
Let's not forget all cellphones are tapped and conversations recorded. So it's not like they don't already have everything at this point. Technology has made it trivially easy to just harvest everything and you're not going to put the genie back in the bottle.
Re: (Score:1)
It's not the tech that's doing it... it's the idiots with all the money who are using the tech.
Employers who require you to use cell phones and pagers, who require you to post images of yourself on their company websites, who require you to haul around laptops... Schools that conduct classes online, whose teachers force you to have Twitter accounts, Google Docs, etc. The companies that pay their employees via electronic bank-account deposits instead of an old-school paper check...
All these things--
Yeah well, (Score:4, Funny)
Just wait until data from people like me with ADHD and PTSD starts corrupting their Hey look a Squirrel!
Increasingly common? (Score:2, Interesting)
Is it really? Or is it a click-bait headline that really means here's a couple of companies who have a product which does it but nobody else does?
Re: (Score:2)
Or is it a click-bait headline that really means here's a couple of companies who have a product which does it but nobody else does?
Definitely a click-bait headline. They have enough trouble getting the accuracy and resolution required to tell those sorts of things with medical grade EEGs, let alone a consumer grade headset.
Re:Increasingly common? (Score:5, Interesting)
I agree, to an extent. These devices are hardly going to read minds in the sense of providing all of that detail.
However, whatever they lose in quality (of resolution), they may make up for in quantity. A poor quality device may still be able to provide some useful data points when applied to a larger group of people. Put some branding or situations inside a game, monitor for coarse grained interest or emotion, and you might have something useful to marketers or game designers. Or not.
When things like this start approaching mass markets, people start thinking of other uses for the data. Working in a field where people are spending good money trying to vacuum up all the data on the Internet, even shitty Facebook posts, I see first hand how people get excited over any new data point. Most of it is crap, but there's some gold in there, for sure.
Click-bait, but still interesting to consider.
Re: (Score:2)
Definitely a click-bait headline. They have enough trouble getting the accuracy and resolution required to tell those sorts of things with medical grade EEGs, let alone a consumer grade headset.
While I agree that it's an inflammatory article now, it may not always be that way, and "not always" may be sooner rather than later. Using EEGs outside of medicine is in its infancy, but there is huge interest in it from a number of different fields, from consumer products to defense. I have no doubt it's going to heat up in the next decade. The more adoption and the wider the markets, the faster it will evolve. So while it's not likely to be an issue today, it's probably best to start acting to head it off
Re: (Score:1)
My thoughts too. I think it has quite good uses as well: people with missing extremities can get a replacement that is controlled well by such devices. I think that Mr Hawking would appreciate development in this area very much. Judging by how many problems male-female interactions cause, a somewhat more sophisticated device that helps distinguish between 'ooohhh no no no' meaning 'please continue' and 'f. off or I call the police' would be welcome. This said I must admit that knowing that somebody can evaluate you
Re: (Score:2)
With a current-gen headset, if you were to turn off the 60 Hz notch filter, the signal it would be picking up from the power lines would drown out the brain signals by several orders of magnitude. Even someone waving their hand over the top of your head while you wear one will cause enough interference to blot out the sorts of signals the brain produces. On top of that, those signals that we can pick u
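For the curious, here's roughly what that notch filter does, as a minimal Python sketch. The sample rate, signal amplitudes, and 60 Hz hum are illustrative assumptions for a synthetic signal, not readings from any actual headset:

```python
# Minimal sketch: why consumer EEG rigs notch-filter mains hum.
# Synthetic data only; numbers are illustrative, not from any real headset.
import numpy as np
from scipy import signal

fs = 256                                   # sample rate (Hz), assumed
t = np.arange(0, 10, 1 / fs)

brain = 20e-6 * np.sin(2 * np.pi * 10 * t)   # ~20 uV alpha-band "brain" signal
mains = 20e-3 * np.sin(2 * np.pi * 60 * t)   # ~20 mV of 60 Hz power-line pickup
raw = brain + mains                           # mains dwarfs the brain signal

# Standard 60 Hz notch (IIR), applied forward and backward to avoid phase shift
b, a = signal.iirnotch(w0=60, Q=30, fs=fs)
cleaned = signal.filtfilt(b, a, raw)

print("mains/brain amplitude ratio before filtering:", 20e-3 / 20e-6)
print("residual after notch (max abs difference from brain):", np.abs(cleaned - brain).max())
```

The point is just the ratio: the mains pickup is orders of magnitude larger than the brain signal, which is why the filter stays on.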
Re: (Score:2)
Click-bait headline.
They can tell if you're squinting really hard, though; detecting that usually gets reported as reading your thoughts.
Ridiculous (Score:4, Insightful)
Here we propose an integration of a personal neuroinformatics system, Smartphone Brain Scanner, with a general privacy framework openPDS. We show how raw high-dimensionality data can be collected on a mobile device, uploaded to a server, and subsequently operated on and accessed by applications or researchers, without disclosing the raw signal. Those extracted features of the raw signal, called answers, are of significantly lower-dimensionality, and provide the full utility of the data in given context, without the risk of disclosing sensitive raw signal. Such architecture significantly mitigates a very serious privacy risk related to raw EEG recordings floating around and being used and reused for various purposes.
So MIT pisses away cash on research that comes up with "Just anonymize the data, sorta, before shipping it off to advertisers and you'll be protected, sorta"? And of course it's peppered with meaningless shit like "personal neuroinformatics system", "smartphone", and "privacy framework".
Hey MIT, give me a research grant and I'll come up with an actual solution. Hint: Don't let people put EEG sensors on or around your head for a game, a video, etc. in the first place and you won't have the problem of them selling it to nefarious parties who would use it against you. Much more effective than the proposed equivalent of "Do Not Track" for brainwaves.
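For what it's worth, the architecture the quoted abstract describes is simple to picture: extract a handful of low-dimensional features on the device and ship only those, never the raw signal. Here is a rough Python sketch of that idea, where the band definitions, sample rate, and upload_answer() are hypothetical stand-ins rather than the actual Smartphone Brain Scanner or openPDS API:

```python
# Rough sketch of the "answers, not raw signal" idea from the quoted abstract.
# Band limits, sample rate, and upload_answer() are hypothetical placeholders.
import numpy as np

FS = 128  # assumed sample rate in Hz

BANDS = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

def band_powers(raw_window: np.ndarray) -> dict:
    """Collapse a raw EEG window (high-dimensional) into a few band-power numbers."""
    freqs = np.fft.rfftfreq(raw_window.size, d=1 / FS)
    psd = np.abs(np.fft.rfft(raw_window)) ** 2
    return {
        name: float(psd[(freqs >= lo) & (freqs < hi)].sum())
        for name, (lo, hi) in BANDS.items()
    }

def upload_answer(answer: dict) -> None:
    """Stand-in for sending only the low-dimensional 'answer' to the server."""
    print("uploading:", answer)   # the raw window itself is never transmitted

if __name__ == "__main__":
    raw_window = np.random.randn(FS * 4)   # 4 seconds of fake signal
    upload_answer(band_powers(raw_window))
```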
EEG is not causation to Galileo (Score:2)
Yes, TFA is a total waste of time. The concepts are reductive and stultifying and the author's evaluation of EEG capabilities is straight from science fiction.
**we don't know how the brain works**
We know an EEG and fMRI and other E-M sensitive sensors can receive waves from our brain and represent that data on a chart.
Beyond that, it's absolutely the Wild Wild West...it's academic anarchy
It's so bad that now anyone will say "correlation is not causation" to any scientific claim purely as rhetoric to bolster
Re: (Score:2)
You are making it sound like the only person who could possibly correlate data from an EEG is a dude who is dead, instead of looking at the reality that the paper points out: much of EEG reading is fully automatic in many types of software. Sure, we have more to learn, and the paper makes that clear. That aside, the portions we are sure of are very accurate. Things such as memory mapping (gauging yes/no responses by thought pattern), detecting certain disorders, detecting specific behavi
as scientific as a lie detector (Score:2)
only if you use some kind of Schrodinger's Cat definition of "sure" and "accurate"
you cannot use an EEG to detect what you claim at all
you're in polygraph territory...that's a more precise analogy...
TFA = polygraphy
Re: (Score:1)
Even a journey of a thousand miles begins with a single step.
Nobody with a healthy brain can claim that currently used techniques can recognize our thoughts. They can recognize some of our feelings, or rather our general state of mind, and we also have techniques that, after some training, allow some people who lost their limbs to control the devices that replaced them. These are not very accurate but are getting better, if one is to believe the media. We started this journey, and if that is possible (why should it not
Re: (Score:2)
You either failed to read the article and its references, or you are trying to deny science with statements that amount to "nuh uh!".
Polygraphs fail for numerous reasons, but most notably external influences: strapping a bunch of cables to someone after stuffing them into an unfamiliar location, and having them interviewed directly by strangers.
EEGs in private devices used in private locations are not subject to any of those stresses. EEGs are scientifically proven to be very accurate (8
s.petry is right or wrong 80-100% (Score:2)
yes...show me some science to argue with (didn't see any in TFA)
polygraphs fail b/c they are not what they claim to be...and their failings are so well documented it's an insult to provide them for you...
they are completely subjective....so is the science in TFA
TFA and you are making t
Re: (Score:2)
yes...show me some science to argue with (didn't see any in TFA)
If you truly read the PDF that TFA points to and saw no science, then you don't know how to read. The referenced sources are all available. Skimming the summary of TFA is not doing the work and is not science. Part of my original quoted statement gave the name of one of numerous studies referenced.
polygraphs fail b/c they are not what they claim to be...and their failings are so well documented it's an insult to provide them for you...
We agree that polygraphs don't work; I have read thousands of papers and opinions on their various points of failure, and I named several common ones.
At the same time, you are claiming that the use of EEGs are the sam
your rigor on your ideas (Score:2)
don't pull this crap w/ me...you've posted exactly zero evidence yourself...
you're trying to make this into an 'evolution vs creation' style discussion and it's obnoxious
you're a **scientist** right? "Senior System Engineer/Architect"....so glad you put your specific job title in your sig so we all know you're a **scientist**
here's what you do...
put the claims of TFA through the same rigor you are using for my claims...
also, show me some
Re: (Score:2)
You obviously think that what I posted in reply to you is the only response I could have made in the thread, which is foolish.
Link to the reference comment here [slashdot.org].
Link to PDF here [ssrn.com].
I refuse to link to all of the studies referenced in the link above because you refuse to look for answers and continue to argue from ignorance (intentionally or otherwise).
citing yourself = intellectual fapping (Score:2)
right...you have advanced to posting links...
now...post links **that support your contention**
you can't link to your own comment then TFA and call it "evidence" of your contention...
you can't cite yourself
the P-300 wave exists...we are experimenting to see how it works in the brain...that I agree with...
what is wrong and foolish is to say that b/c we see P-300 light up on a screen that means we can "read emotions"
I know the science...the problem is people like you have built careers around an unscientific a
more on P-300 (Score:2)
I want to explain exactly why you're full of shit
http://en.wikipedia.org/wiki/P... [wikipedia.org]
that's the P-300 wave
we can define it and observe it repeatedly to verify that it exists
the problem comes with ***connecting that data to human behavior***
define emotions...go ahead...
it's impossible to define "human emotion" in a way that is testable with p-300 data
it's like trying to read War & Peace when you can only see one letter at a time...it's ridiculous
however, researchers need hype to stay funded, so they (TFA) *
Re: (Score:2)
And here is why you are full of shit.
what is wrong and foolish is to say that b/c we see P-300 light up on a screen that means we can "read emotions"
Claiming that you have to be able to read emotions to gauge true/false narrative is astoundingly idiotic. I can't critique you any further, your idiocy speaks for itself.
And holy fuck, nothing like cherry-picking a sentence to make your argument.
Frank et al. explored in [31] feasibility of subliminal attacks, where the reaction to a short-lasting information of 13.3 milliseconds was measured. Such stimuli, in theory below conscious perception, could poten
can't quantify "emotion" (Score:2)
You're avoiding your problem & your data doesn't apply:
We cannot quantify the human experience of "emotions" in a way that is scientifically comparable and consistent
You're dead in the water on this one...
lie detection is a farce (Score:2)
what are you even defending now?
you've dropped your main contention...now you're trying to say P-300 waves can be used for lie detection?
you must be a polygrapher or on the MIT team or Ray Kurzweil himself
Re: (Score:2)
I have provided scientific papers to back my opinion, and you have provided nothing except your opinion.
If you truly think your non-fact based opinion is more valuable, you are truly a moron. Grats either way, because your argument is not an argument but a deranged rant no matter how it's perceived. Go troll someone else.
No more of your idiocy, good day.
avoiding the question (Score:2)
you smug bastard
your "evidence" was a link to your own comment and info from TFA
I asked for studies or some kind of proof that ****emotions can be scientifically quantified****
YOU ARE AVOIDING THE QUESTION B/C YOU KNOW YOU'RE CONJURING FACTS
Re: (Score:3)
The paper, if you read it, also discusses the "Why" you need to do this (as most scientific papers tend to do). It is not just about building a method of giving the data anonymity, but telling people why it should be done. While a bit deep for some, I highly recommend reading the paper.
Companies are already trying to figure out how to cash in on your EEG data. What's the big deal you ask? Well...
Using more direct attacks to reveal EEG information, Martinovic et al. investigated in [28] how the brain's response to a particular stimulus (so-called P300 paradigm) can be used to narrow down the space of possible values of sensitive information such as PIN numbers, date of birth, or known people. The tasks required the subject to follow the experimental procedure without explicitly revealing the goal of the experiment: for example thinking about birth date while watching flashing numbers. Although the presented attacks on the data may not be directly applicable to preexisting EEG data, as they require fairly specific malicious tasks, we can expect, as the subjects participate in multiple experiments, correlations violating privacy could be obtained from raw EEG signal. For example, when a large corpus of the user responses to visual stimuli is collected, it could be used in a P300-based Guilty-Knowledge Test, where the familiar items evoke different responses than similar but unfamiliar items [29].
In other words, people could maliciously collect personal inform
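For anyone wondering what a P300-based "guilty knowledge" comparison looks like mechanically, it amounts to averaging the response in a window roughly 300 to 500 ms after each stimulus and comparing familiar versus unfamiliar items. A toy Python sketch on made-up data; the window, threshold, and effect size are illustrative assumptions, not the protocol from the cited studies:

```python
# Toy illustration of a P300-style familiarity comparison on made-up epochs.
# The 300-500 ms window and the threshold are illustrative, not from [28]/[29].
import numpy as np

FS = 256                                        # samples per second, assumed
WINDOW = slice(int(0.3 * FS), int(0.5 * FS))    # ~300-500 ms after stimulus onset

def mean_p300_amplitude(epochs: np.ndarray) -> float:
    """epochs: (n_trials, n_samples) array, one row per stimulus presentation."""
    return float(epochs[:, WINDOW].mean())

# Fake data: "familiar" epochs get a small positive bump in the P300 window.
rng = np.random.default_rng(0)
unfamiliar = rng.normal(0, 1, size=(40, FS))
familiar = rng.normal(0, 1, size=(40, FS))
familiar[:, WINDOW] += 0.8                      # the simulated familiarity effect

diff = mean_p300_amplitude(familiar) - mean_p300_amplitude(unfamiliar)
print(f"familiar - unfamiliar mean amplitude: {diff:.2f}")
print("looks familiar" if diff > 0.3 else "no clear effect")
```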
Re: (Score:2)
You are quite right, and I wish I had mod points. As I was reading your post, the mention of doctors' protocols set off alarm bells.
Without going too far off into Big Brother Paranoia Land, I wonder exactly how confidential this sort of ostensibly private diagnostic data is. Is data protected by professional protocols and HIPAA somehow immune to MITM packet data-mining by Google, the NSA, et al?
Re: (Score:2)
I can answer some of this since I work in security and compliance as well as architecture of secure systems (software and hardware).
Is data protected by professional protocols and HIPAA somehow immune to MITM packet data-mining by Google, the NSA, et al?
If the standards are followed, the answer is "yes". Data must be encrypted at rest and again in transit, so when data is on the wire it's doubly encrypted.
That said, there is no such thing as perfect software or hardware. The majority of errors are operator errors, but still count as errors. The only difference is that ISPs can be fined for server/software errors, and operators
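To make "encrypted at rest and again in transit" concrete, here is a minimal Python sketch: a symmetric cipher for the at-rest layer and TLS (HTTPS) for the transit layer. The endpoint URL and key handling are placeholders, not any particular compliance setup:

```python
# Minimal sketch of "encrypted at rest, encrypted again in transit".
# Endpoint and key management are placeholders, not a real compliance setup.
from cryptography.fernet import Fernet
import requests

# At rest: encrypt the record with a symmetric key before it ever hits disk.
key = Fernet.generate_key()          # in practice this lives in a key-management system
f = Fernet(key)
record = b'{"patient": "anon-123", "eeg_features": [0.12, 0.98]}'
ciphertext = f.encrypt(record)

with open("record.enc", "wb") as fh:
    fh.write(ciphertext)

# In transit: ship the already-encrypted blob over TLS, so it is doubly wrapped.
try:
    resp = requests.post("https://example.invalid/upload", data=ciphertext, timeout=10)
    print(resp.status_code)
except requests.RequestException as exc:
    print("upload failed (placeholder endpoint):", exc)
```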
Re: (Score:2)
We really shouldn't be discussing these things - if anyone at Facebook reads about it, they'll "upgrade" the Oculus Rift with EEG sensors (since it is already attached to your head..)
Actually, eff that, Sony et al are just as likely to try and figure out how to get to your skull.
*paranoid*
Re: (Score:1)
Curious: wouldn't it be possible, before connecting or uplinking the neural device, to have separate software that simply maps the patterns that are found on command? In games we use directions, for example: say "left", turn left, until the pathway is mapped. No storage for advertisers, because the developers of the device simply program a pointer in the band: look for the "left" brain command, the device triggers left, in-game you turn left.
This presumes you control the device's output and the software works (and works well) with the "less dimensional" output your privacy layer gives it, and that the privacy layer doesn't hinder the experience by introducing additional delays, removing too much data, etc.
You won't get to control the device's output until these things become so commoditized that you're building your own with an Arduino, and then there's the whole issue of trusting the software.
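The grandparent's idea, a thin local layer that emits only discrete commands and throws the raw signal away, would look something like this in Python. read_raw_window() and classify_direction() are hypothetical stand-ins for whatever a headset SDK and per-user calibration would actually provide:

```python
# Sketch of a local mapping layer: raw headset output in, discrete game commands out,
# raw signal discarded immediately. read_raw_window() and classify_direction() are
# hypothetical stand-ins for an SDK and a trained per-user classifier.
import random
from typing import Optional

COMMANDS = ("left", "right", "forward", "back")

def read_raw_window() -> list[float]:
    """Pretend to read one window of raw samples from the headset."""
    return [random.gauss(0, 1) for _ in range(256)]

def classify_direction(raw: list[float]) -> Optional[str]:
    """Toy classifier: in reality this is the per-user calibration/training step."""
    score = sum(raw) / len(raw)
    if abs(score) < 0.05:
        return None                      # no confident command detected
    return COMMANDS[int(abs(score) * 100) % len(COMMANDS)]

def next_game_command() -> Optional[str]:
    raw = read_raw_window()
    command = classify_direction(raw)
    del raw                              # only the discrete command leaves this layer
    return command

if __name__ == "__main__":
    for _ in range(5):
        print(next_game_command())
```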
Re: (Score:2)
You'd get a Nobel prize for medicine.
You really do not understand how complex the human brain is, do you?
I'd love to have such datasets (Score:2)
... for all those that are placed above us to lead us,
and of all those that suck up to same.
Let's all take a step back to appreciate this: (Score:3)
Re:Let's all take a step back to appreciate this: (Score:5, Funny)
Edward Snowden will shortly be releasing transcripts of this too. Here's one example:
SUBJECT 1765467-2: K3yseRS0Z3 [[TRANSCRIPT BEGINS]]
STRAFE
STRAFE
STRAFE
STRAFE
SCOPE
FIRE
FORWARD (RUN)
TEABAG
[[TRANSCRIPT ENDS]]
I'm not worried. I'm already aware of what most 15-year-old boys are thinking about; we don't need the NSA for that.
Re: (Score:2)
Well done sir. And me with no mod points.
Re: (Score:2)
You've got it backwards. It's the IRS who will want to use it first.
Re: (Score:2)
It's too bad they can't use this technology to stop the conspiracy theorists from revealing all of their conspiracies.
Johnny Mnemonic? (Score:2)
Johnny Mnemonic
Phrenology much? (Score:3)
People. Please. Be reasonable.
I've been doing a bit of research into the matter (because, well, the idea of controlling a computer with your brain IS kinda cool), and we're FAR, FAR away from a mind-reading device. If such a device is possible at all.
Every kind of "mind tracking" technology in existence not only needs a LOT of training (on both sides, the device AND the user), but most of all it needs cooperation to the extreme. It is actually pretty HARD to make the device "understand you", and that's when you WANT it to understand you.
Now going out and trying to pick up subconscious thoughts is at best akin to phrenology: you have some sort of brainwave patterns from known people and pretend that readings from another person that resemble them carry any kind of correlation. The whole thing smells like good ol' phrenology.
Re: (Score:2)
Again: For it to work out, you need the FULL cooperation of the person you are trying to "read". No cooperation and you just get a lot of garbage out of it.
But if it makes you feel better, keep that tinfoil hat on.
Re: (Score:2)
Let's, just for fun, assume for a moment that you're right. That immediately raises one question: Why the effort? Control? C'mon.
To control people, you don't need that whole shit. It's far cheaper to keep them busy with petty shit and TV. And, lo and behold, it's not only done, it also works. Why bother with highly sophisticated mind control mumbo jumbo when you can accomplish the same with a few shitty reality shows?
"You are having a seizure. ... (Score:2)
Ugh.
Re: (Score:2)
... Would you like a recommendation for a neurologist and anti-seizure medication?"
Response: GAAHHHHERRRGGGHHHHHHHH........
Wait a minute... (Score:2)
So if I actually buy and wear some overpriced "headset" that has built-in brainwave receptors, then companies could be mining my brainwaves? Well, hold the presses everyone! Next thing you know there will be people who "hack" into my bank account because I decided to print my login information on a t-shirt!!
The main article is a farce. There is no "remote" reading going on against your will; you actually have to wear some useless "headset" and then be exposed to pretty obvious material ("flashing" straight or
Re: (Score:2)
Google, Facebook, et al don't force you to submit data to them, or take it without your knowledge, but when you do, they'll mine it for all it's worth. Are concerns about that not legitimate either?
you actually have to wear some useless "headset" and then be exposed to pretty obvious material ("flashing" straight or gay couples or candidates, really??)
Like, say, an Oculus Rift that shows you ads between game levels and monitors which ones you find particularly captivating? That doesn't sound so ridiculous.
Before activating iBrain... (Score:1)
Overheard .... (Score:2)
"With this PPH guy, we seem to be stuck in a perpetual game of Leisure Suit Larry."