California Passes Law To Protect Consumer 'Brain Data' (govtech.com)
On September 28, California amended the California Consumer Privacy Act of 2018 to recognize the importance of mental privacy. "The law marks the second such legal protection for data produced from invasive neurotechnology, following Colorado, which incorporated neural data into its state data privacy statute, the Colorado Privacy Act (CPA) in April," notes Law.com. GovTech reports: The new bill amends the California Consumer Privacy Act of 2018, which grants consumers rights over personal information that is collected by businesses. The term "personal information" already included biometric data (such as your face, voice, or fingerprints). Now it also explicitly includes neural data. The bill defines neural data as "information that is generated by measuring the activity of a consumer's central or peripheral nervous system, and that is not inferred from nonneural information." In other words, data collected from a person's brain or nerves.
The law prevents companies from selling or sharing a person's data and requires them to make efforts to deidentify the data. It also gives consumers the right to know what information is collected and the right to delete it. "This new law in California will make the lives of consumers safer while sending a clear signal to the fast-growing neurotechnology industry there are high expectations that companies will provide robust protections for mental privacy of consumers," Jared Genser, general counsel to the Neurorights Foundation, which cosponsored the bill, said in a statement. "That said, there is much more work ahead."
Re: (Score:1)
California sure likes to pass laws on any topic imaginable. I assume they're the champion, in North America at least, although Trudeau is serious competition.
Re: (Score:1)
But I wasn't being sarcastic; I was serious, not actually trolling.
Re: (Score:1)
Sorry, Republicans are banning people from wearing masks [justia.com] to protect themselves and others.
In most states it is against the law to wear a mask -- these laws date from more than 100 years ago. This of course had nothing to do with Republicans or Democrats. Can you show any case where a random person was arrested just for wearing a mask in modern times? Recently, some have been amended to accommodate Muslims.
Is your complaint that Republicans have amended an old law so that it accommodates religious expression? And what might that have to do with privacy laws about brain scans?
Re: (Score:2)
Sorry, Republicans are banning people from wearing masks [justia.com] to protect themselves and others.
In most states
Why would my response, which is 100% factual and explanatory, and which directly responds to what seems to be a political post, be down-moderated as a "Troll"?
Won't matter. (Score:4, Insightful)
Whether it's through a data breach, an internal leak, a partner with an NDA who ignores their agreement... that data will escape. A medical firm going bankrupt has enough chaos in which to "lose" the information.
I'm not saying there shouldn't be such rules in place. But they're far more likely to be used as punishment than prevention. Consider it "in the wild" from the start.
Re: (Score:2)
I'm not saying there shouldn't be such rules in place. But they're far more likely to be used as punishment than prevention. Consider it "in the wild" from the start.
This, exactly. Except for the punishment part, which depends on your definition of the word "punishment". If you consider Cost of Business a punishment, then yes. Otherwise, no.
Re: Won't matter. (Score:2)
Medical firm?
https://www.msn.com/en-us/mone... [msn.com]
And murderers kill even though it's illegal (Score:3)
Sometimes just discouraging something is enough; we don't have to kill "good enough" in the quest for the perfect.
Brain data (Score:3, Funny)
MAGA folks are safe.
Neural activity is not deidentifiable (Score:1, Insightful)
This basically removes scientists' ability to do any research. You could make a case for actual brain imaging (e.g., using defacing/deskulling techniques, which are problematic in themselves), but neural activity is as unique as a fingerprint, and any sort of brain injury affects regions too. On the other hand, it is not identifiable either, as we currently do not have the technology to do things like functional MRI at scale.
Re:Neural activity is not deidentifiable (Score:4, Informative)
Here is the text: https://leginfo.legislature.ca... [ca.gov]
It puts it in these terms: "Research with personal information that may have been collected from a consumer in the course of the consumer’s interactions with a business’ service or device". This does not apply to the relationship between a scientist and their subjects. Research subjects hired by a university or a private company are not consumers.
What seems to be the intent of the prohibition (developing an example from TFA) is this: you, a consumer, purchase a VR headset to play games; it happens to measure brain waves; from your brain-wave reactions when it presents you with certain images, it determines you must be gay (even if you don't know or accept it), registers you in their database as such, and starts showing you relevant ads.
Re: Neural activity is not deidentifiable (Score:1)
Sure, but that scenario is so far up the realm of sci-fi, it would be laughable even if you put it in the Star Trek universe.
Cortical brain waves carry minimal to no information about your thoughts and desires. At best, brain activity measurements show regions of interest for further imaging, and nobody is putting an MRI on their head (it will physically never be possible to make a device that can do MRI at the scale of a VR headset). Even implants like Neuralink require massive training, which basically c
Re: (Score:2)
that scenario is so far up the realm of sci-fi,
It was an example cited in TFA, based on actual research. "Deep Learning in the Identification of Electroencephalogram Sources Associated with Sexual Orientation" https://karger.com/nps/article... [karger.com]
Re: Neural activity is not deidentifiable (Score:1)
That study is rife with issues, not least that they basically used AI as a buzzword and don't seem to know how it works. The text reads like they changed their methodology when they couldn't get the expected result. From a cursory reading, it seems they trained their network on group 1 and then tested the network against the same group; despite having a larger sample, they did some validation (k-1), but it is light on the details.
Basically if this were true, you would expect morphological
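The leakage worry raised above (training and testing on the same group of subjects) is a well-known failure mode in EEG/BCI studies: because neural signals are as identity-specific as a fingerprint, a classifier can score well by memorizing subjects rather than learning the claimed trait. A minimal sketch of the fix, splitting by subject ID so no subject appears in both sets (all names and the toy data here are hypothetical, not from the study):

```python
# Hypothetical sketch: subject-wise train/test split for neural recordings,
# so a classifier cannot "cheat" by recognizing individual subjects.
import random

def subject_wise_split(records, test_fraction=0.25, seed=0):
    """Split records so that no subject appears in both train and test.

    records: list of (subject_id, features, label) tuples.
    Returns (train_records, test_records).
    """
    subjects = sorted({subj for subj, _, _ in records})
    rng = random.Random(seed)
    rng.shuffle(subjects)
    n_test = max(1, int(len(subjects) * test_fraction))
    test_subjects = set(subjects[:n_test])
    train = [r for r in records if r[0] not in test_subjects]
    test = [r for r in records if r[0] in test_subjects]
    return train, test

# Toy data: 4 subjects, 3 recordings each (features/labels are placeholders).
records = [(s, [s * 0.1], s % 2) for s in range(4) for _ in range(3)]
train, test = subject_wise_split(records)
# Leakage check: the two subject sets must be disjoint.
assert {r[0] for r in train}.isdisjoint({r[0] for r in test})
```

Evaluating within the same subjects (a record-level random split) would put other recordings of each test subject into the training set, which is exactly the inflation the comment above suspects.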
Completely unenforceable (Score:2)
It'll spend years in court with conflicting interpretations and result in absolutely no protection of anything.
Re: (Score:2)
It might be used in the courts on lie detectors as a start. The classic machines do record nervous-system activity, and that has been used by police.
Stop making stupid laws! (Score:2, Insightful)
I understand wanting to make a name for yourself, but seriously stop making stupid laws that add layers of bureaucracy and bullshit to an industry that is just trying to get started.
Re: (Score:3)
Forgot to add: this is the type of regulation that scares off investors, and that's what's so dangerous about it. Many people need this kind of technology, but it will be slowed down or never get developed when you have laws like this. They reduce the size of the investor pool and the amount of money available.
Re: (Score:2)
This is the type of regulation that scares off investors
GOOD.
I'll say it again: GOOD. Not everything needs to be open season for "investors." I'd rather see those "investors" shot dead and their companies burned to the ground along with all of their assets by a lynch mob than allow them to start demanding direct read / write access to my brain. You should too, at least you should if you want to be able to enjoy your gains a little longer. After all, it would only take a few writes to make it so you'd be happy to give them everything for nothing.
Brain data (Score:2)
It also gives consumers ... the right to delete it.
Hand me another beer.
I want more... (Score:1)
Trump supporters (Score:2)