crednews is the original content division of

The Brain’s Privacy Battle

Believe it or not, the notion of keeping your thoughts to yourself is becoming increasingly old-school. With ongoing advancements in neurotechnology, our inner musings might soon be up for grabs. Some companies, such as Elon Musk’s Neuralink, which recently implanted its first brain chip in a human, operate under medical regulations with the aim of helping paralyzed patients control computers through thought alone. However, the waters get murkier with noninvasive technology that could potentially translate your silent thoughts into text, no surgery required. These sorts of devices aren’t regulated like medical implants, so guess who could end up peeking at your brain data? Pretty much anyone.

Consumer Neurotech: Convenient or Creepy?

Right now, you can snag gadgets from merchants like Amazon that examine what’s going on inside your head. Take Muse, an EEG device that reads your brain activity to ostensibly boost your meditation skills and ability to focus, among other things. Sounds cool, but since this high-tech headband and similar gadgets aren’t sold as medical tools, there’s nothing stopping the companies that produce them from potentially sharing your neural data.

Legal Minds at Work

Thankfully, there’s a push to safeguard our brains. Neuroscientists, legal experts, and legislators have joined forces to set a precedent in Colorado, where they’ve tweaked the state’s privacy law this year to shield neural data, just like fingerprints and facial images, thanks to bipartisan support and a nod from Governor Jared Polis. California and Minnesota may be on their way to similar protections.

The Global Race for Mental Privacy

It’s not just about a single chip or device. Big names like Meta, Snapchat, and Apple are all dipping their toes into neurotech waters. Imagine a future where companies casually collect and sell our neural data without our knowledge—a chilling prospect indeed.

That’s why Rafael Yuste, Columbia University professor of biological sciences and neuroscience, teamed up with human rights lawyer Jared Genser to form The Neurorights Foundation back in 2017.

Yuste and his research team had been able to use “high-resolution optogenetics” to “target selected neurons with single-cell precision, taking control of the mice’s behavior.” Concerned about where this tech could go when applied to humans, Yuste rallied experts to establish “ethical priorities for neurotechnologies and AI,” like mental privacy and protection against mind manipulation, as outlined in a 2017 paper in the journal Nature:

  1. Privacy and Consent: As neural technologies advance, they present unprecedented privacy risks by potentially allowing unauthorized access and manipulation of neural data. Ethical guidelines must prioritize robust consent procedures, opt-out defaults, and stringent regulations on the use, sale, and transfer of neural data to protect individual privacy.
  2. Agency and Identity: Neurotechnologies can profoundly affect an individual’s sense of self and decision-making capacity, posing risks to personal agency and identity. Protections against these impacts should be enshrined in international human rights treaties, with additional support from an international convention and UN oversight to ensure compliance and education on neurotechnology effects.
  3. Augmentation: The use of neurotechnologies to enhance human capabilities could lead to societal shifts and new forms of inequality and discrimination. Ethical frameworks should set international and national guidelines on the permissible uses of such technologies, with a focus on equitable access and preventing augmentation arms races.
  4. Bias: The risk of systemic biases being embedded in neurotechnologies necessitates proactive measures to ensure fairness in algorithmic decisions and device designs. Inclusion of diverse groups in the development process and rigorous public discourse on defining and combating biases are critical to creating equitable technologies.

What’s Next?

The Neurorights Foundation’s efforts paid off when Chile became the first country to amend its constitution to protect “mental privacy” and neurodata in 2021. And while it’s great to see states like Colorado stepping up in the U.S., the ultimate goal is to set up these protections on a national—and ideally global—scale. The challenge is huge, but so is the potential for safeguarding what’s arguably our last private frontier: our minds.



