In an age where personal data is regularly collected and tracked online, it can feel like our brain is the last truly private place. While people are generally aware that their clicks, likes, and scrolls are recorded and stored, many take solace in the idea that their thoughts remain private. Yet, our “neural” or “mental” privacy is being threatened by the introduction of commercial products that measure and track our brain signals. These brain-based technologies record electrical activity and motor function in the brain, which companies might use to try to discern information about our emotions, preferences, and mental health.
Despite the novelty of these products, technology measuring brain activity is not new. Brain-based technologies have primarily been deployed in healthcare and research settings, where they are used to diagnose, treat, and monitor patients with brain-related diseases. Progress has been made in treating patients with paralysis or other mobility-limiting diseases through the use of Brain Computer Interfaces (BCIs), an "invasive" version of brain tracking technology. Additionally, a range of brain-based technologies are being developed to address mental health disorders, including depression and anxiety, through neurofeedback and similar treatments.
Commercially, brain tracking technologies are growing. In 2022, a man with ALS successfully used a computer independently after receiving a BCI that converts his neural activity to cursor movement. Elon Musk’s company Neuralink is also pursuing this technology; in January 2024, the company’s first patient received the implant.
Generally, however, commercial brain tracking technologies are still largely non-invasive and appear in the form of wearable headbands and headphones. Estimates show that the burgeoning neurotech industry is growing at an annual rate of 12% and is expected to reach $21 billion by 2026. Companies like Muse and Brainbit have developed headbands that collect brain activity to improve meditation and sleep. Further, NeurOptimal has developed EEG sensors designed to assist users with their golf game through "brain training," while Emotiv has developed EEG headphones claiming to monitor attention in the workplace. Just one year ago, Apple patented a design for AirPods that measure and collect brain signals from the wearer, indicating that these technologies are becoming increasingly mainstream.
Commercial brain tracking presents new privacy risks. While brain data in medical settings is protected by HIPAA (the Health Insurance Portability and Accountability Act), these protections do not apply in a commercial context, which is instead governed at the federal level by the FTC (Federal Trade Commission). As a result, consumers' neural data could be collected, stored, and sold with little oversight, leading to unwanted disclosure, increased surveillance, productivity monitoring, and targeted advertising.
Neural data collected commercially may be used, without a consumer's knowledge or understanding, to make inferences about individual health. Such data can reveal conditions like epilepsy, anxiety, and depression, which are linked to neural activity through patterns called biomarkers or neuromarkers. It may also be used to predict future outcomes, including learning styles and substance use, and to guide treatment approaches, all of which pose risks if this sensitive information is disclosed.
Additionally, workplace surveillance poses significant risks. As CDT has explained, the tech company Emotiv promoted its EEG headphones as reading employees' cognitive states and providing data to boost productivity; using neural data to assess employee productivity risks discrimination and erodes trust. Potential harms arising from such practices include the unwanted disclosure of sensitive health information and diagnoses; increased surveillance and productivity monitoring in the workplace; and targeted advertising, with widespread targeting based on an individual's unique responses to stimuli. Combining neural data with screen content in real time would be a problematic extension of commodification, opening the door to manipulation, discrimination, and invasions of privacy.
Recognizing these potential risks, policymakers have begun to respond. In April 2024, Colorado became the first state to pass legislation protecting neural data, expanding the scope of the Colorado Privacy Act by including biologically generated data in its definition of sensitive data. California and Minnesota have introduced similar legislation. These are positive steps toward addressing emergent concerns about consumer neurotechnologies, and the recent American Privacy Rights Act in the House of Representatives also included a definition of neural data.
These policies, while well-intentioned, don't go far enough. Neural privacy advocate Nita Farahany points out that Colorado's law applies only to biological data used for identification purposes. Yet many companies developing these technologies are not aiming to identify individuals; instead, they make inferences about mental state, mood, or productivity, or use neural data to train artificial intelligence systems. This language therefore might make the law inapplicable to a wide swath of the commercial activity it was intended to reach.
Moving forward, it is important to further understand the risks these technologies present and to respond accordingly. Without adequate protections, we risk ceding an essential part of our autonomy.