The Brain’s Privacy Battle

Believe it or not, the notion of keeping your thoughts to yourself is becoming increasingly old-school. With ongoing advancements in neurotechnology, our inner musings might soon be up for grabs. Companies like Elon Musk’s Neuralink, which recently implanted its first brain chip in a human, operate under medical regulations and aim to help paralyzed patients control computers through thought alone. The waters get murkier, however, with noninvasive technology that could translate your silent thoughts into text, no surgery required. These devices aren’t regulated like medical implants, so guess who could end up peeking at your brain data? Pretty much anyone.

Consumer Neurotech: Convenient or Creepy?

Right now, merchants like Amazon will happily sell you gadgets that peek at what’s going on inside your head. Take Muse, an EEG headband that reads your brain activity to ostensibly boost your meditation skills and ability to focus, among other things. Sounds cool, but since this high-tech headband and similar gadgets aren’t sold as medical tools, there’s nothing stopping the companies that make them from sharing your neural data.
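
To make it concrete, here’s what “neural data” from a headband like this can boil down to: a stream of voltage readings that an app condenses into scores about your mental state. The minimal Python sketch below estimates alpha-band power, a rough relaxation signal, from one window of simulated EEG; the sampling rate, function name, and made-up data are illustrative assumptions, not Muse’s actual API or firmware.

```python
import numpy as np

def alpha_band_power(samples: np.ndarray, fs: float = 256.0) -> float:
    """Estimate alpha-band (8-12 Hz) power from one channel of raw EEG."""
    # Remove the DC offset so the 0 Hz bin doesn't dominate the spectrum.
    detrended = samples - samples.mean()

    # Single-sided power spectrum via the real FFT.
    power = np.abs(np.fft.rfft(detrended)) ** 2
    freqs = np.fft.rfftfreq(len(detrended), d=1.0 / fs)

    # Sum the power that falls inside the alpha band.
    in_band = (freqs >= 8.0) & (freqs <= 12.0)
    return float(power[in_band].sum())

# Simulated one-second window: a 10 Hz "relaxation" rhythm plus noise.
# A real device would stream actual voltages at its own sampling rate.
fs = 256.0
t = np.arange(0, 1.0, 1.0 / fs)
window = np.sin(2 * np.pi * 10.0 * t) + 0.5 * np.random.randn(t.size)

print(f"alpha-band power: {alpha_band_power(window, fs):.2f}")
```

Even one derived number like that, logged session after session, sketches a picture of your mental state over time, and that’s exactly the kind of information the privacy rules discussed below are meant to treat as sensitive.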

Legal Minds at Work

Thankfully, there’s a push to safeguard our brains. Neuroscientists, legal experts, and legislators have joined forces to set a precedent in Colorado, which tweaked its privacy law this year to shield neural data the same way it shields fingerprints and facial images, thanks to bipartisan support and a nod from Governor Jared Polis. California and Minnesota may be on their way to similar protections.

The Global Race for Mental Privacy

It’s not just about a single chip or device. Big names like Meta, Snapchat, and Apple are all dipping their toes into neurotech waters. Imagine a future where companies casually collect and sell our neural data without our knowledge—a chilling prospect indeed.

That’s why Rafael Yuste, a Columbia University professor of biological sciences and neuroscience, teamed up with human rights lawyer Jared Genser to form The Neurorights Foundation in 2021.

In experiments on mice, Yuste and his research team had been able to use “high-resolution optogenetics” to “target selected neurons with single-cell precision, taking control of the mice’s behavior.” Concerned about where this tech could go when applied to humans, Yuste rallied experts to establish “ethical priorities for neurotechnologies and AI,” like mental privacy and protection against mind manipulation, as outlined in a 2017 paper in the journal Nature:

  1. Privacy and Consent: As neural technologies advance, they present unprecedented privacy risks by potentially allowing unauthorized access and manipulation of neural data. Ethical guidelines must prioritize robust consent procedures, opt-out defaults, and stringent regulations on the use, sale, and transfer of neural data to protect individual privacy.
  2. Agency and Identity: Neurotechnologies can profoundly affect an individual’s sense of self and decision-making capacity, posing risks to personal agency and identity. Protections against these impacts should be enshrined in international human rights treaties, with additional support from an international convention and UN oversight to ensure compliance and education on neurotechnology effects.
  3. Augmentation: The use of neurotechnologies to enhance human capabilities could lead to societal shifts and new forms of inequality and discrimination. Ethical frameworks should set international and national guidelines on the permissible uses of such technologies, with a focus on equitable access and preventing augmentation arms races.
  4. Bias: The risk of systemic biases being embedded in neurotechnologies necessitates proactive measures to ensure fairness in algorithmic decisions and device designs. Inclusion of diverse groups in the development process and rigorous public discourse on defining and combating biases are critical to creating equitable technologies.

What’s Next?

The Neurorights Foundation’s efforts paid off when Chile became the first country to amend its constitution to protect “mental privacy” and neurodata in 2021. And while it’s great to see states like Colorado stepping up in the U.S., the ultimate goal is to set up these protections on a national—and ideally global—scale. The challenge is huge, but so is the potential for safeguarding what’s arguably our last private frontier: our minds.

© crednews a division of cred.ai
