Future Tense

If Police Have Devices That Can Read Your Mind, How Does the Fifth Amendment Fit In?

Technologically enhanced interrogation? Photo illustration by Slate. Photos by fpphotobank/iStock/Getty Images Plus and Min Jing/iStock/Getty Images Plus.

This article is part of the Policing and Technology Project, a collaboration between Future Tense and the Tech, Law, & Security Program at American University Washington College of Law that examines the relationship between law enforcement, police reform, and technology.

The police show up to your house. It’s the middle of the night, you are disoriented, and they want to know where you were earlier in the day. You have no idea at that moment that your ex-girlfriend was found dead, and some of your fingerprints were found at her house—but you do know you have the right to remain silent. Until the cops bring out the headset.

One of the hallmarks of the U.S. Constitution is the enumerated right of citizens not to be coerced into self-incrimination, or to “take the Fifth.” But new technologies may one day be able to read your mind to varying degrees, rendering your decision to stay silent moot. While current devices merely collect data such as brain activity, labs are working on revolutionary devices that could record thoughts or allow for telepathic communication. They may be years or decades away, but they are worth thinking about now, because the courts seem to be moving toward allowing more and more personal data to be used as evidence. That data may eventually be both a window into the mind and a way to sidestep Fifth Amendment protection.

Brain-computer interface devices are poised to become an integral treatment for diseases of the nervous system by restoring brain function, mapping the brain, and enhancing cognitive function. These devices work through direct communication between the signals from a person’s brain and an external computer. Some BCI devices are already on the market, though they are more quotidian: Muse, for instance, offers a wearable EEG device to aid in meditation. Elon Musk’s Neuralink and Synchron, maker of the Stentrode, both of which aim to return motor function to patients with neuromuscular conditions such as paralysis, are working on more invasive interventions that require surgery for implantation. Neuralink is designing a robot to perform the brain surgery, while Synchron’s device is implanted via the patient’s blood vessels. Both have received breakthrough status from the FDA, meaning they will undergo an accelerated regulatory review process.

Eventually, BCIs could allow paralyzed people to walk, use their arms to get dressed, or communicate verbally. Should that happen, these devices will have unprecedented access to the human mind, and even to an individual’s thoughts themselves. Though there are lots of “ifs” here, and neural data is very noisy and hard to decode, studies have shown that synthetic speech can be generated from brain recordings.

And as we have seen in the past, medical advancements can make their way into the criminal justice system, battering the boundaries of the Fifth Amendment. While it is easy to recognize verbal self-incrimination, things quickly become more complex once your mouth is no longer forming the words. For example, if the police pull you over and ask whether you have been drinking, you may invoke the Fifth Amendment and decline to answer. However, the Fifth Amendment does not protect you from submitting to a field sobriety test or a blood sample, even though these are ostensibly incriminating information gathered from the suspect. In some states, declining such a test can be treated as an admission of guilt. This line of reasoning has now been applied to our cellphones and the ways they can be unlocked. For example, though you cannot be compelled to offer up your password to open your phone in most of the U.S., in many jurisdictions you may be forced to use your fingerprint or facial scan to do so. Why? Courts have reasoned that this is no different from a blood sample or a left-behind fingerprint. The question is reduced to a distinction between what you know and what you have, and that distinction is likely to blur as technology develops.

On one hand, forcing a person to unlock a cellphone with a fingerprint seems vastly more invasive than finding a forgotten fingerprint they’ve left behind. However, if we don’t allow some avenue of entry into the device, we will only spur the creation of powerful technologies for cracking locked devices, which criminals may be able to use as easily as law enforcement. After the San Bernardino shootings in 2015, the FBI had a warrant to search the deceased suspect’s iPhone but no method of entry, as agents had no passcode. Apple has intentionally not developed a backdoor into its devices, which wipe themselves after 10 failed login attempts. If not for federal hackers, that lawful evidence would have been lost forever. The ability to use the Fifth Amendment to intentionally hamstring the Fourth Amendment is also problematic.

To address this concern, the Supreme Court has developed a “foregone conclusion” test to apply to such cases. It says that if the state can demonstrate that it already knows what is on the device, it may compel the owner to provide the password or other means of entry. Unfortunately, this standard is vague. Courts vary in how narrowly or broadly they apply it, leaving behind disjointed rulings. These disjointed opinions often arise in cases implicating both the Fifth Amendment and the Fourth Amendment’s protections against unreasonable search and seizure. The Supreme Court can choose to hear cases that have produced split decisions, but when the chance came to offer clarity on these questions in Jones v. Massachusetts in 2019, it declined to take the case, so it will be some time before we have any resolution.

But the problem goes even deeper, as actual invasion into the mind may not be necessary if the data is being collected in real time by a third party. In many instances this kind of medical data has fallen outside of self-incrimination discussions altogether, being easily categorized as evidence rather than testimony. Recently, a judge ruled pacemaker data admissible to demonstrate a defendant’s heart rate at the time of a crime, reasoning that the human body holds far more sensitive information than heart rate. However, since heart rate is controlled by a nervous system response, one could also argue that this is a rough look into the mind of the defendant. Is your nervous system’s response at the time of a crime as simple as a left-behind fingerprint? The courts have not established a clear demarcation between the mind and the body, which will be paramount for dealing with issues surrounding BCIs and self-incrimination. A basic protection for data collected from thoughts would provide a safeguard for cognitive liberty. Otherwise, we are left with loopholes that allow the state access to our most personal thoughts and motivations, which seems starkly against the spirit of the Fifth Amendment.

Moreover, because these BCI devices will save data, the Fifth Amendment may become less relevant: After all, companies will be holding treasure troves of neural data subject to search under the Fourth Amendment. And if the data is stored by a third party, such as a health app, it is not protected by the Fourth Amendment at all. Under the third-party doctrine, once you voluntarily give your information to, say, a company that provides a service, you waive any expectation of privacy, and the state may gain access. This doctrine is commonly applied to phone records, but is it the standard we want in place for complex neural devices?

We need to create new privacy rules for the 21st century, rules that catch up to technology’s new ability to assess not just our bodies but our minds. We should update the third-party doctrine so that cognitive data receives more protection. Courts will inevitably have to interpret and apply more tests based on it, but they need a floor to work from in order to produce more coherent and unified opinions. If we do nothing, the Fifth Amendment could be weakened until its inevitable death.

Future Tense is a partnership of Slate, New America, and Arizona State University that examines emerging technologies, public policy, and society.