Mind the Gap: Regulating Neurotechnologies in the Consumer Age

Developments in neurotechnologies, which interface with one’s nervous system to monitor, stimulate, or modulate neural activity in the brain, are progressing at a rapid rate.

Author

Hannah Louise Smith

Developments in neurotechnologies, which interface with one’s nervous system to monitor, stimulate, or modulate neural activity in the brain, are progressing at a rapid rate. While we may not have the mind-altering abilities portrayed in some sci-fi works, we can already “mute” traumatic memories.

Alongside technological developments, neurotechnologies are also becoming increasingly available as they expand beyond traditional clinical domains to commercially available products for consumers, rather than patients. Is this a positive development? What are the risks of such technologies when used without medical supervision? Are consumers adequately protected?

The Promise of Consumer Neurotechnologies

Neurotechnology has benefitted from recent advances in neuroscience, computer science, bioengineering, and materials science, but it is not a new field. In 1924, Hans Berger recorded electrical brain signals in humans for the first time, a technique that still underpins neurotechnologies reliant upon electroencephalograms (EEGs). Modern neurotechnologies have been used in clinical settings to restore lower-limb movement in paralysed individuals, to help individuals with advanced amyotrophic lateral sclerosis (ALS) in a locked-in state to communicate, and to treat the symptoms of Parkinson’s disease.

Recent developments have now made neurotechnologies commercially viable, with a predicted market value of more than $50 billion by 2034. Advances in the associated hardware and software have lowered development costs, reduced the size of devices, and made them more portable. Consumer neurotechnologies include devices that claim to alleviate insomnia, treat depression, and improve children’s concentration without the need to engage with clinicians. However, such technologies are not without risks, requiring a careful assessment of whether the current rules governing their development and use are adequate.

Privacy Risks

The collection of neural data can reveal highly sensitive information regarding a person’s health or mental status, raising concerns over the risks to one’s privacy. Some argue that mental data is more sensitive than other categories of data because fears of its exposure may cause unprecedented levels of self-censorship and undermine mental autonomy.

These risks arise not just from the technologies themselves but from the impact they may have on broader societal relationships through their ability to increase the surveillance of individuals. One example is the potential for commercial neurotechnologies to enter the workplace, allowing employers to monitor employees’ productivity or their fitness for work. Such developments are not hypothetical; they are already a reality in more than 5,000 workplaces.

Privacy is therefore a key concern raised by neurotechnologies, and commercial manufacturers of such devices must identify and mitigate any privacy risks. The governance of neurotechnologies would also benefit from including a broader range of stakeholders in regulatory discussions to shed light on the wider ramifications of commercial neurotechnologies.

Risks to Other Rights

Privacy is not the only fundamental right potentially impacted by the introduction of consumer neurotechnologies. The right to non-discrimination is also relevant because devices may reveal information relating to mental health or other characteristics that might be used to discriminate. Moreover, these devices may rely on data that lacks representativeness, to the detriment of certain demographics. Others have also raised concerns about the implications of neurotechnologies for freedom of thought, where such technologies create opportunities for mental manipulation.

Efficacy Concerns

Another major concern with consumer neurotechnologies is their efficacy. Many are marketed by manufacturers as wellness products rather than medical devices, which means they are not rigorously tested to prove their effectiveness or ensure their safety. This gap in consumer protection exposes individuals to devices that do not work as advertised or, worse, cause harm. It is particularly troubling because few consumers will have the expertise to determine whether their purchase is working properly.

Further concerns arise if a manufacturer decides to end support for a device, leaving consumers with a product that no longer works as expected. While the U.S. Federal Trade Commission (FTC) requires manufacturers of software-enabled products to provide information on how long devices will be supported, this rule is rarely followed in practice.

Current Governance of Neurotechnology

The scope of the EU Medical Device Regulation (MDR) depends upon a device’s intended purpose, as stated by the manufacturer, which seems to exclude devices intended for wellness purposes. However, Annex XVI of the MDR lays down requirements for certain non-medical devices, including equipment intended for brain stimulation. Certain neurotechnologies may therefore have to comply with some requirements, including the need to demonstrate the device’s performance. In the U.S., by contrast, the FDA draws a distinction between wellness products and medical devices that may leave users with less protection than they expect.

What Next?

Neurotechnologies span a spectrum, from the rather mundane imaging of brain structures to the extraordinary possibilities of brain-computer interfaces and the manipulation of neural activities. Hence, it is difficult to meaningfully discuss the risks and opportunities of consumer neurotechnologies as a monolithic concept. Nevertheless, like many emerging technologies, there is ongoing debate about how to promote innovation without unduly exposing consumers to harm.

The UN and the Council of Europe have stressed the importance of human rights in the governance of consumer neurotechnologies. Others suggest that novel “neurorights,” such as a right to cognitive liberty, are required to provide sufficient protection, although support for this is not universal and some dismiss neurorights as an example of “rights inflation.” The Organisation for Economic Co-operation and Development (OECD) has stressed the need to promote responsible innovation, recognizing that such technologies hold great promise despite their challenges.

Neurotechnologies may be “skating the line” between medical devices and wellness products. Doing so may spur innovation and facilitate access to these technologies, but it must be carefully balanced against safety concerns. It is crucial for regulators, manufacturers, and consumers to work together to ensure that these tools are used responsibly and ethically, harnessing the benefits of neurotechnologies while minimizing risk.

Acknowledgment: This article was made possible through the generous support of the Novo Nordisk Foundation (NNF) via a grant for the scientifically independent Collaborative Research Program in Bioscience Innovation Law (Inter-CeBIL Program – Grant No. NNF23SA0087056).


About the author

Hannah Louise Smith is a Postdoctoral Fellow at the Centre for Advanced Studies in Bioscience Innovation Law at the University of Copenhagen. Her research explores the regulation of new and emerging technologies through a socio-legal lens, with a particular interest in data protection and privacy concerns.