Apple’s latest facial identification technology has set much of the tech world abuzz; while many have lauded Apple’s technological prowess, others have expressed concern over user privacy and the potential for violations of the Fourth and Fifth Amendments. Facial recognition may not be at the forefront of authentication technology for long, however. The next great biometric identifier may be just around the corner, yet it is receiving significantly less public attention. One step past facial recognition lie “passthoughts”: single-step, multi-factor authentication systems based on brain activity detected by electroencephalography (EEG), which could be used as device-unlocking mechanisms as early as 2030.

 

To be considered truly multi-factor, an authentication method must draw from at least two of three main classes: knowledge (something only the encrypting party would know); possession (something only the encrypting party would have); or inherence (something unique to the encrypting party, such as her fingerprints or iris). Multi-factor authentication is appealing because more verification factors generally mean a higher level of security, and thus a passcode that is more difficult to crack. However, more factors also mean more steps for users—for instance, typing in a passcode and then allowing a fingerprint scan—which has deterred some from adopting multi-factor authentication. Passthoughts incorporate both knowledge and inherence, but they do so in one step, making them significantly more accessible (and therefore more appealing) to impatient users.
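
To make the two-of-three rule concrete, here is a minimal Python sketch; the `Factor` enum and the `is_multi_factor` check are hypothetical illustrations, not any vendor’s actual API.

```python
from enum import Enum

class Factor(Enum):
    KNOWLEDGE = "something only the user knows"   # e.g., a passcode
    POSSESSION = "something only the user has"    # e.g., a hardware token
    INHERENCE = "something the user is"           # e.g., a fingerprint or iris

def is_multi_factor(presented_factors: set) -> bool:
    """A method is multi-factor only if it draws from at least
    two distinct factor classes, per the definition above."""
    return len(presented_factors) >= 2

# A passcode plus a fingerprint scan spans two classes, so it qualifies:
print(is_multi_factor({Factor.KNOWLEDGE, Factor.INHERENCE}))  # True
# Two passcodes are still a single class, so they do not:
print(is_multi_factor({Factor.KNOWLEDGE}))                    # False
```

A passthought would present `Factor.KNOWLEDGE` and `Factor.INHERENCE` together in a single user action.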

 

What exactly is a passthought? The name itself is fairly indicative. Essentially, the user thinks of a word, phrase, or image that only he or she knows and uses it as a passcode key. Next, the user allows an EEG device to monitor his or her brain’s electrical activity; when the user thinks of the passcode key, the EEG device reads the neural activity produced by the word, phrase, or image and sends a signal to unlock the passthought-protected device. The result is a single-step, multi-factor authentication process, in which the word, phrase, or image constitutes the knowledge factor and the brain activity itself provides the inherence factor. As neurolaw expert Nita Farahany explained in her presentation at the World Economic Forum in the Swiss Alps, “you could think a song, a little ditty in your head while you are wearing a consumer-based EEG device, and then that, which has a unique neuro signature, can be used as your passcode. It turns out that is an incredibly effective and incredibly safe, almost impossible to replicate, passcode.”
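
As a rough illustration of that flow (not a description of any real EEG product), the verification step can be sketched as feature extraction followed by a similarity check against a template enrolled when the user first thought the passcode key. The feature extractor, the 0.95 threshold, and all names below are assumptions for illustration.

```python
import numpy as np

def extract_features(raw_eeg: np.ndarray) -> np.ndarray:
    """Toy feature extractor: mean signal power per EEG channel.
    Real systems would use richer features, e.g., spectral band power."""
    return np.mean(raw_eeg ** 2, axis=1)

def verify_passthought(template: np.ndarray, attempt_eeg: np.ndarray,
                       threshold: float = 0.95) -> bool:
    """Unlock only if the attempt's neural signature closely matches the
    enrolled template. The secret thought is the knowledge factor; the
    brain's unique response to it is the inherence factor."""
    attempt = extract_features(attempt_eeg)
    cosine = np.dot(template, attempt) / (
        np.linalg.norm(template) * np.linalg.norm(attempt))
    return cosine >= threshold

# Usage with synthetic data: 8 channels, 256 samples of "EEG".
rng = np.random.default_rng(0)
enrollment = rng.normal(size=(8, 256))
template = extract_features(enrollment)
retry = enrollment + rng.normal(scale=0.05, size=(8, 256))  # same thought, slight noise
print(verify_passthought(template, retry))  # True: one thought, two factors
```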

 

While passthoughts may sound like far-fetched futurism, researchers in a 2013 study at the University of California, Berkeley were able to match brain signals to the person thinking them 99 percent of the time. Significantly, the study used a $100 consumer-grade Bluetooth EEG device, suggesting that passthought protection may soon be available to the general population: high levels of accuracy can be achieved without rare or exceedingly expensive technologies.
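
What that 99 percent figure measures is 1-of-N identification accuracy: how often a classifier attributes a signal to the correct person. Here is a toy nearest-template version of that evaluation (my own illustration, not the Berkeley team’s actual method).

```python
import numpy as np

def identify(attempt: np.ndarray, templates: dict) -> str:
    """Attribute a feature vector to the enrolled user whose
    template is nearest in Euclidean distance."""
    return min(templates, key=lambda user: np.linalg.norm(templates[user] - attempt))

def identification_accuracy(labeled_attempts, templates) -> float:
    """Fraction of attempts matched to the right person, i.e. the
    statistic reported as "99 percent" above."""
    hits = sum(identify(features, templates) == true_user
               for true_user, features in labeled_attempts)
    return hits / len(labeled_attempts)

# Toy data: two enrolled users, two labeled attempts.
templates = {"alice": np.array([1.0, 0.2]), "bob": np.array([0.1, 0.9])}
attempts = [("alice", np.array([0.9, 0.25])), ("bob", np.array([0.15, 0.85]))]
print(identification_accuracy(attempts, templates))  # 1.0 on this toy data
```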

 

The thought is both exciting and alarming. Farahany posits that consumers may eventually adapt to wearing EEG devices at all times, logging into bank accounts, phones, computers, and more with a single thought. The problem, Farahany cautions, is that “you are interested in your data wearing an EEG device that reads your brain activity at all times, but you are sharing it with your computer and software and with apps…and it’s all part of the free apps that you are sharing with not good Samaritans,” leaving users open to security breaches or identity theft.

 

Along similarly cautionary lines, a recent study from the University of Alabama at Birmingham found that playing video games while wearing an EEG headset could put users at risk of security and data breaches: malicious software could link gamers’ brain activity with their typing and thereby gain access to any passwords or PINs entered while the user was still wearing the EEG device. The study noted that, after observing a person type a mere 200 characters, “algorithms could make educated guesses at new characters a person entered just by watching the EEG data. That could let a malicious game, say, snoop on someone taking a break to go on the Web [to check an online account or view sensitive data]. It is far from perfect, but it shortens the odds of guessing a four-digit numerical PIN from one in 10,000 to one in 20, and increases the chance of guessing a six-letter password by around 500,000 times, to roughly one in 500.”
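
The quoted odds follow from simple search-space arithmetic. Assuming a six-letter password drawn from the 26 lowercase letters (my assumption; the study may define the space differently), the numbers line up roughly as follows:

```python
# Attacker's odds before vs. after observing ~200 typed characters alongside EEG data.

pin_space = 10 ** 4          # four-digit numerical PIN: 10,000 possibilities
pin_odds_after = 20          # quoted post-attack odds: one in 20
print(pin_space / pin_odds_after)            # 500.0x better for the attacker

password_space = 26 ** 6     # six lowercase letters: 308,915,776 possibilities
password_odds_after = 500    # quoted post-attack odds: roughly one in 500
print(password_space / password_odds_after)  # ~617,832x improvement
```

The ~617,832x figure is consistent with the study’s “around 500,000 times” once rounding in the quoted odds is accounted for.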

 

As noted briefly above, the same Fourth and Fifth Amendment concerns inherent in facial recognition authentication apply to passthoughts. However, the fact that EEG devices can read, learn, and—to an extent—predict human behavior presents additional privacy and security issues, and it raises the question of how government or outside actors may use the technology. In 2002, NASA began to focus on developing brain-monitoring devices for use in airports, the intent being to analyze passengers’ brain activity in order to identify and prevent security risks. Likewise, it has been suggested that EEG devices could be used in police stops to determine an individual’s mental state.

 

In response to these rapid developments in EEG reading and neurotechnologies, lawyers and ethicists have suggested the creation of four new human rights: a right to cognitive liberty, a right to mental privacy, a right to mental integrity, and a right to psychological continuity. This suggestion, however, highlights the different roles courts and the legislature play in the liberty of the average American citizen. It raises the question of whether the protection of one’s mental privacy, as threatened by EEG devices, should fall under the umbrella of existing law and legal interpretation, or whether the (arguably distant) threat of “mind reading” can and will merit the creation of an entirely new set of human rights. While the Fifth Amendment’s broad protections against self-incrimination seem largely applicable, the possibility of new levels of surveillance and brain-activity monitoring raises concerns heretofore unseen by the courts, and the issue is one the legal, philosophical, and technological communities should all continue to monitor closely.

–Sydney Rupe
