Your face is central to your identity. If someone steals your face from an image and puts it on someone else’s body, that action is illegal. However, if someone scrapes your face off social media and runs it through facial recognition technology (FRT), that may not only be legal, it may endanger you and everyone around you. The general assumption is that such activities are either illegal or closely monitored by government agencies. The truth is much murkier, and in today’s environment, that’s a problem.
The Associated Press ran a story this morning (Saturday, March 29) that was chilling for anyone who might have been involved in any kind of protest over the past five or so years. A software developer, Eliyahu Hawila, built a tool that can allegedly identify protesters even if their faces are covered. Hawila’s software was reportedly used to identify a female student who attended a pro-Palestinian protest. She was wearing a mask and a headscarf so that only her eyes were showing. Yet Hawila not only published photos of her full face but continued to dox the young woman in such a way as to cause her to lose her employment.
Hawila is part of an anti-Palestine movement that wants to identify anyone who attends a pro-Palestine protest or march and have them deported. “Please tell everyone you know who is at a university to file complaints about foreign students and faculty who support Hamas,” Elizabeth Rand, president of a group called Mothers Against Campus Antisemitism, said in a Jan. 21 post to more than 60,000 followers on Facebook. It included a link to an ICE tip line.
Another pro-Israel group posted, “Do you know students at Columbia or any other university who are here on a study visa and participated in demonstrations against Israel? If so, now is our time!” The message was posted in Hebrew.
One might think that doxing is flatly illegal. It’s not. In fact, the lack of a federal standard on the technology has produced a Swiss-cheese patchwork of open allowance, government oversight, and outright bans on FRT, depending on the state in which the data is collected.
Faces are becoming easier to capture from remote distances and cheaper to collect and store. Unlike many other forms of data, faces cannot be encrypted. The current volume of data housed in various databases (e.g., driver’s licenses, mugshots, and social media) exacerbates the potential for harm because unauthorized parties can easily “plug and play” numerous data points to reveal a person’s life. Moreover, data breaches involving facial recognition data increase the potential for identity theft, stalking, and harassment because, unlike passwords and credit card information, faces cannot easily be changed.
Given the impracticality of avoiding FRT, privacy experts argue that omnipresent surveillance chills activities protected by the First Amendment to the U.S. Constitution, such as “free democratic participation and … political activism.” Further linking this to privacy, Margot Kaminski, associate professor at Colorado Law and Director of the Privacy Initiative at Silicon Flatirons, explains that “a [government] that’s capable of tracking your face wherever you are [is] capable of tracking your location wherever you are, which means it’s capable of tracking every association you have.”
Having that information in the hands of the government is bad enough. When the same information is in the hands of a special interest group or private company, the danger grows exponentially. Not only is there the risk of one’s name being given to immigration authorities, but the information also gives someone the ability to extort money from, limit the activities of, or otherwise intimidate the person whose image was stolen.
Instead of banning law enforcement FRT use, some jurisdictions have enacted laws imposing government oversight. In Virginia and Pittsburgh, Pennsylvania, prior legislative approval is now required to deploy FRT. Before conducting a facial recognition search, Massachusetts and Utah require law enforcement to submit a written request to the state agency maintaining the database. Similar proposals have been made in Kentucky and Louisiana.
Judicial oversight is imposed in Massachusetts and Washington by requiring law enforcement to obtain a warrant or court order prior to using FRT. Officers in Maine must now meet a probable cause standard prior to making an FRT request, and they are prohibited from using a facial recognition match as the sole basis for a search or arrest.
Notice how much those laws differ. Now consider that several states have no laws regarding FRT at all. In those states, unauthorized surveillance by anyone, law enforcement or otherwise, continues unabated and unchallenged. Frequently, people are not even aware of who has captured their facial data, or when or how it was taken.
It’s not as though the data itself is all that secure, either. Databases in any organization, including the federal government, are subject to hacking. Once the information is stolen, it can easily be sold on the black market to companies for purposes such as debt collection, movement tracking, and more.
One approach that indirectly regulates commercial FRT use is to regulate the collection and use of biometric data. Illinois’s Biometric Information Privacy Act (BIPA) provides that private entities seeking to use consumers’ biometric information, including facial recognition, must first notify them of the collection. Disclosure of collected biometric data is prohibited without consent, and entities cannot profit from the data. By affording consumers a private right of action, BIPA allows them to hold companies like Clearview AI and Facebook accountable.
Such laws are rare, however, and in today’s climate, where turning in one’s neighbors to law enforcement is strongly encouraged, their absence leaves everyone responsible for protecting their own identity.
Calls for a federal law governing the use of FRT have been growing. Some companies have voluntarily limited the sale of their software to law enforcement, while others have closed their FRT departments completely. Yet, independent software engineers such as Eliyahu Hawila demonstrate how futile it can be to attempt to regulate the technology at all.
If technology can identify someone with only their eyes showing, what means of protecting ourselves are left? Are we doomed to wear full face-covering helmets any time we’re in public? We need to find an answer and force the federal government to act in ways that protect individual privacy above everything else.