As It Happens

Why your brain could be the next frontier of data privacy

As tech companies and scientists invest in technology that interacts with our brains, some experts say we're far from being able to map moods and thoughts in a meaningful way. Others, however, say brain data is the next frontier of privacy, and we need to pass laws to protect our brain data now.

California, Colorado and Chile have passed laws protecting neural privacy. Here's what's happening in Canada

Sebastian Reul of Germany looks up as he competes during the Brain-Computer Interface Race in Switzerland in 2016. As more neurotech hits the market, some advocates say we need to be proactive in protecting our neural data. (Arnd Wiegmann/Reuters)

Imagine a future in which a wearable device tells advertisers when you're in the mood for chocolate, or lets your employer know when you're not paying attention at work. Or where a medical implant that's supposed to save your life ends up being used against you in court.

These are some of the scenarios that people in the emerging field of neural privacy are worried about, and they say some are already closer to reality than you might think.

As tech companies and scientists invest in technology that interacts with our brains, some experts say these concerns are overblown, and that we're far from being able to map moods and thoughts in a meaningful way.

Others, however, say brain data is the next frontier of privacy, and we need to pass laws to protect our brain data now.

"There are obviously a lot of bad actors out in the world that are going to try to use these devices for really worrying purposes," Jared Genser, a human rights lawyer and co-founder of the Neurorights Foundation, told As It Happens host Nil Kksal.

What is neural privacy?

Neurotechnology is tech that interacts with our brains or nervous systems. It can largely be broken down into two categories: invasive, like implants, and non-invasive, like wearables.

The consumer sphere is dominated by wearables. Think headbands that monitor your state of relaxation to help you meditate, and hats and headsets that measure fatigue to reduce workplace accidents.

Big companies like Snapchat, Meta and Apple are also exploring the neurotech space, with the latter having patented earbuds that measure the brain's electrical activity.

Jared Genser is a human rights lawyer and co-founder of the Neurorights Foundation, an organization that advocates for legislation to protect neural data. (Submitted by Jared Genser)

Invasive neurotech, meanwhile, is mostly limited to the medical sphere. There's deep brain stimulation, which uses wires to send signals to the brain to help manage the symptoms of neurological disorders like Parkinson's disease. Brain implants, placed surgically, can send electrical pulses to the brain to block seizures for patients with drug-resistant epilepsy. And brain-computer interfaces allow people with limited mobility to control robotic limbs.

Some companies are already working to bring invasive neurotech into the consumer sphere. In January, the first human patient received an implant from Elon Musk's computer-brain interface company, Neuralink, which he later used to play Mario Kart with his mind.

"What's coming is both incredibly exciting as well as daunting," Genser said.

WATCH | A look at Neuralink's brain implant:

Elon Musk's Neuralink: Human enhancement or virtual insanity?

Duration 6:30
Tech billionaire Elon Musk announced this week that his company Neuralink has implanted its first wireless brain chip in a human. The National's Ian Hanomansing asks neurology experts Judy Illes and Dr. John Krakauer to weigh in on the development and the future of the technology.

With these advances in neurotech comes the rise of "neurorights" advocacy.

The Neurorights Foundation, born from a three-day academic workshop at Columbia University in 2017, advocates for legislation to protect the information inside our brains.

They've had some success. Last week, California amended its existing consumer privacy legislation to include neural data.

LISTEN: Jared Genser says amended law makes Californians' brains safer:

Jared Genser, a human rights lawyer and co-founder of the Neurorights Foundation, told As It Happens host Nil Köksal that Californians' brains are a lot safer after lawmakers updated the state's consumer protection legislation to include neural data.

Colorado enacted similar legislation in April, and Minnesota is currently considering a bill to enshrine the right to mental privacy.

Chile became the first country to amend its constitution to protect "mental integrity" and neural data in 2021, and several other Latin American countries are considering similar moves.

What's happening in Canada?

Neurorights are on Canada's radar, too.

The federal Office of the Privacy Commissioner says it considers neural data to be a type of biometric information, which means it's protected under the Personal Information Protection and Electronic Documents Act.

Last fall, the office launched a public consultation on new draft guidance on biometric technologies, which it expects to release in the coming months.

"My office will continue to work with our global counterparts to identify ways to promote and protect the fundamental privacy rights of our citizens, while also allowing innovation to flourish in support of the public interest,"privacy commissioner Philippe Dufresne said in an emailed statement.

Judy Illes is the director of Neuroethics Canada, a research institute at the University of British Columbia. (Submitted by Judy Illes)

Health Canada is also working with experts to draft guidelines on the use of neurotech. Dr. Judy Illes, a professor of neurology at the University of British Columbia and director of Neuroethics Canada, is part of the team putting those together.

She says her team's recommendations, which will be published soon, focus less on imposing laws and regulations, and more on developing a framework of shared values to guide work in this field.

"It's good to put in practice good frameworks for guiding good innovation.What we don't want to do is stop it or prevent it from occurring because now smart, well-intentioned researchers and engineers and neuroscientists are getting nervous about what might happen if they overstep."

Not everyone buys the hype over neurotech

Graeme Moffat, a senior fellow at the University of Toronto's Munk School of Global Affairs and Public Policy, agrees. He's been working with Illes on the Health Canada draft guidelines.

He's also worked for decades in the field of neurotech, most recently as chief scientist at the Canadian company Interaxon, and before that, the medical technology company Oticon.

His experience in the field, he says, has led him to conclude that fears about consumer technology are "way, way overblown."

"Ethicists are really, you know, dining out on the worry, and the neurotechnology start-ups are benefiting from the hype," he said.

Graeme Moffat is a senior fellow at the University of Toronto's Munk School of Global Affairs and Public Policy who has worked in the field of consumer neurotechnology. (Submitted by Graeme Moffat)

The non-invasive neurotech devices on the market right now monitor brain waves or electrical signals, information he says can only derive the user's "gross mental state," like whether someone is relaxed or alert, "and not even very reliably."

It's the kind of information he says you can more reliably glean from more commonplace technology, like surveillance cameras and smartphones, which he says we should be "far more concerned about."

"The strongest predictor of future behaviour is past behaviour. So if someone is recording your behaviour all of the time, they don't need to get inside your head to know what you're going to do or what you're thinking," he said.

But Genser says private companies cannot be trusted to self-regulate.

In April, the Neurorights Foundation released a report analyzing the privacy policies and user agreements of 30 companies that sell consumer neurotechnology products.

It found all but one could access the data their devices collect and transfer it to third parties, fewer than half allow users to request their data be deleted, and only three anonymize and encrypt the data they collect.

If data is created, someone will use it: expert

Jennifer Chandler, a law professor at the University of Ottawa who studies biomedical science and technology, says she understands why some people in the tech industry think this issue is exaggerated.

"But I also think they dismissed the potential uses of these technologies," she said.

Just because something doesn't work well doesn't mean people won't use it, she said. And whenever data is created, she says, someone will inevitably use it or misuse it for unintended purposes.

Jennifer Chandler is a law professor at the University of Ottawa who studies the legal and ethical aspects of biomedical science and technology. (Submitted by Jennifer Chandler)

Law enforcement in India has already used brain-based lie detector tests during interrogation of suspects. The idea is to see if a suspect's brain lights up in recognition when told details of a crime.

"You could happen to know something about that stimulus for a totally different reason, which would then lead to a false positive,she said.

There was also a case in Ohio, in 2017, in which pacemaker data was deemed admissible as evidence in an arson trial. It's not unreasonable, Chandler says, to presume data from an implanted brain device could be used in a similar way.

"I think it's worthwhile getting ahead of issues,"Chandler said.

"I don't think there's much harm, and I think [there's] a lot of good, in trying to dig in and anticipate where the field might go and what you do with that information."

Clarifications

  • A previous version of this story stated that Graeme Moffat had worked for Meta. Moffat worked for the Canadian data analysis company Meta UCL before it was acquired by the Chan Zuckerberg Initiative and shut down when Facebook Inc. rebranded itself as Meta.
    Oct 11, 2024 6:27 PM ET

Interview with Jared Genser produced by Chris Trowbridge
