
Are you safe in your own thoughts?


Photo by KOMMERS on Unsplash

The sanctity of our inner thoughts is the latest frontier in the fight for privacy, but the collection of neural data is nothing new. Hans Berger recorded the first human electroencephalogram in 1924, widely considered the first collection of brain data. Initially, the data was rudimentary, poorly understood, and used purely for research purposes. As technology has progressed, however, scientists can now translate some of that raw data into a person's emotions, mental states, and intended or unintended actions. For instance, Brain-Computer Interfaces (BCIs) can already discern when a patient is about to have a seizure, so timely intervention can reduce the seizure's duration. While the technology is a boon to those with drug-resistant epilepsy, it also foreshadows an intrusive regime of health data collection that records every instance of a seizure. The latest neural data technology even ventures into decoding users' thoughts: in December 2023, a group of scientists used a portable, non-invasive technique to decode silent thoughts into text.


So far, neural data technology has mostly been used in clinical settings such as medical research and treatment, so patient data constitutes "protected health information" protected by HIPAA and other health laws. However, as consumer wearable technology flourishes, neural data collected in commercial settings has been largely unregulated until recently.


Neural Data & Privacy 

Neural data is generally defined as information generated by a person's brain, spinal cord, or nervous system that is collected and interpreted by a device. Unlike conventional biometric data, neural data constitutes a distinct category of personal information because it not only describes a person but also reveals a person's inner states. Therefore, before discussing the new legislation, it is important to understand the subject of regulation.

Neural data is typically divided into two groups: first-order data and second-order data. First-order data is the raw data that is directly gathered from the primary device and the direct interpretation of that data. This is the data that comes from the implant, headband, or any similar device and is then interpreted to determine factors such as the wearer's level of alertness. Second-order data is a further extrapolation of that data, which might infer future performance or behavior. Progress in neural technology allows a more accurate and broader collection of first-order data and better interpretation of second-order data. 
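The first-order versus second-order distinction above can be sketched in code. The following is a minimal, hypothetical illustration; the function names, thresholds, and labels are invented for the example, and no real device API is implied:

```python
# Hypothetical sketch of the first-order / second-order data distinction.
# All names, thresholds, and labels are illustrative only.

def first_order_alertness(raw_eeg_samples):
    """First-order data: the direct interpretation of raw device
    readings, e.g. a simple average amplitude mapped to an
    alertness label."""
    mean_amplitude = sum(raw_eeg_samples) / len(raw_eeg_samples)
    return "alert" if mean_amplitude > 0.5 else "drowsy"

def second_order_inference(alertness_history):
    """Second-order data: a further extrapolation from first-order
    data, e.g. inferring likely future performance from a history
    of alertness labels."""
    drowsy_ratio = alertness_history.count("drowsy") / len(alertness_history)
    if drowsy_ratio > 0.5:
        return "performance likely to decline"
    return "performance likely stable"

# First-order: interpreting raw samples from the device.
label = first_order_alertness([0.7, 0.8, 0.6])
# Second-order: inferring future behavior from interpreted data.
forecast = second_order_inference([label, "drowsy", "drowsy"])
```

The distinction matters for regulation because, as discussed below, some statutes clearly reach only the first function's output, while their application to the second is unclear.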


New Legislation

In the United States, states have led the charge in regulating neural data by extending existing privacy laws to a new category of data. In 2024, two states, Colorado and California, pioneered this approach. 


Colorado:

Colorado passed the first law protecting consumers' neural data in April 2024. The law amends the Colorado Privacy Act to expand the definition of "sensitive personal information" to include biological and neural data. Colorado's definition of "neural data" includes "measurement… and that can be processed by or with the assistance of a device," leaving it unclear whether the law covers both first-order and second-order data, that is, whether it reaches only rudimentary data processing or also further inferences drawn from that data. The amendment also imposes new obligations on businesses that collect neural data, including providing additional privacy notice disclosures, obtaining consumer consent, and safeguarding collected data.


California:

In September 2024, California followed suit and passed an amendment to the California Consumer Privacy Act. The amendment extended the class of protected personal information to include neural data. An earlier version of the Act protected "biological" data, which did not include neural data because brainwaves are electrical in nature. The Act defines neural data as "information that is generated by measuring the activity of a consumer's central or peripheral nervous system, and that is not inferred from nonneural information." The devil is in the details, and California and Colorado differ in some key areas.

First, California's definition of neural data is narrower than Colorado's, which also includes information that "can be processed by or with the assistance of a device." The California definition covers only first-order data, while the Colorado definition's coverage is unclear. Second, the Colorado law mandates that covered businesses obtain opt-in consent from users before collecting and processing data, while California requires only an opt-out mechanism. Third, California's approach covers both employee and consumer data, while Colorado leaves employees outside the new law's protection.


Other states:

Several other states, including Connecticut, Massachusetts, Minnesota, Illinois, and Vermont, have proposed bills in 2025 to protect neural data. These states have adopted different legislative approaches, from extending existing consumer privacy law to cover neural data (Connecticut, Illinois) to enacting standalone acts (Minnesota, Massachusetts). Substantively, some track California's narrower definition, while others follow Colorado's broader one.


European Union:

The EU's General Data Protection Regulation (GDPR) does not specifically cover neural data, but consumer wearable technologies that collect neural data most likely fall within the GDPR's principles-based approach. Recently, the Spanish supervisory authority (AEPD) and the European Data Protection Supervisor (EDPS) issued a joint report titled "TechDispatch on Neurodata," signaling regulators' attention to neural technology under the EU's existing data protection law.


All existing regulatory efforts regarding neural data take the approach of expanding privacy law; some commentators have noted the limitations of this approach and advocated for a broader regulatory scheme to oversee cognitive biometrics. As neurotechnology advances, the legal concerns will most likely extend beyond data privacy and reach fundamental legal concepts such as mental state.


Guidance for Businesses

Regulatory efforts to expand the existing privacy law framework provide broad guidance on the issue. Businesses can no longer do whatever they want with the massive amounts of data they collect from consumers. Until recently, consumers were unaware of what brain data was even being collected, let alone what companies could do with it, such as selling it to be used by anyone for any reason. Now, businesses are constrained in that regard. While selling consumer brain data remains an option for businesses to pursue, consumers will, in at least some states, need to opt in before it is sold. Protections under the California law include the right to know what brain data is being collected, to limit its disclosure, and to opt out or have it deleted.


Limiting access to data may slow the rate at which neurotechnology developers can make progress. Because brain data is complex and naturally encoded, AI has been one of the main ways scientists have made headway in decoding the data into usable form. With consumers now able to limit the spread of this data, the amount of data available to train those AI models will shrink as well.

Ambiguous legislation does not give businesses clear guidelines, but in the meantime, regulated companies can take some proactive steps. First, they should innovate with integrity and incorporate privacy-by-design principles to ensure compliance at the source of the technology.


One technical solution to the privacy concerns around neural data is edge processing, in which data is processed at or close to its source. For neural data and other biometric information, that source is usually a wearable smart device or a smartphone that both collects and processes the data. Edge processing has several advantages. First, regardless of the level of encryption, it entirely eliminates the risks associated with transmitting data from a local device to a centralized server. Second, storing and processing sensitive data on user-controlled devices avoids exposing that information to public servers. Third, the limited processing power of local devices restricts use cases to functions that serve limited purposes, which users are more likely to have consented to. However, that limited capacity may also constrain the user experience for more advanced analytics. Edge processing is only one of many tools; building privacy-by-design features into products and services reduces legal risk at the source.
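The edge-processing idea can be sketched as follows, under the assumption that only a coarse, purpose-limited result should ever leave the device. All function names, fields, and values here are illustrative, not a real device API:

```python
# Illustrative sketch of edge processing for neural data: raw signals
# are interpreted on-device and discarded; only a minimal derived
# result is ever prepared for transmission.

def process_on_device(raw_eeg_samples):
    """Runs on the wearable or phone. Interprets the raw samples
    locally and returns only the derived result; the raw samples
    are never stored or transmitted."""
    mean_amplitude = sum(raw_eeg_samples) / len(raw_eeg_samples)
    # Only a coarse label is retained; the raw signal is dropped here.
    return {"alertness": "alert" if mean_amplitude > 0.5 else "drowsy"}

def payload_for_server(local_result):
    """Builds the off-device payload from the local result alone,
    so the server never sees anything but the coarse label."""
    return {"alertness": local_result["alertness"]}

# The raw samples exist only inside this call chain on the device.
payload = payload_for_server(process_on_device([0.4, 0.3, 0.5]))
```

The design choice is that the trust boundary sits at `payload_for_server`: everything upstream of it stays on hardware the user controls, which is what removes the transmission and server-exposure risks described above.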


In other areas, businesses should conduct comprehensive privacy assessments and further strengthen their compliance programs, including updating notice and disclosure regimes as well as user consent mechanisms.    


Conclusion 

The proliferation of consumer wearable technology that incorporates neural technology has highlighted the need for supervision and regulation of neural data. So far, states have pioneered the effort by extending existing privacy laws to a new category of data. It remains unclear what kind of regulatory framework will be sufficient once the technology's implications reach far beyond privacy. In the meantime, consumer wearable technology businesses have joined a growing list of companies subject to a maturing regime of data privacy law with a familiar playbook.


*The views expressed in this article do not represent the views of Santa Clara University.



