Is It Time To Crack Down On Facial Recognition?
At least half of American adults have their photo in a facial recognition network that authorities can search without a court order or meaningful privacy protections.
But that finding by Georgetown University researchers, who dubbed the networks a “perpetual line-up” in late 2016, generated little legislative activity.
Alvaro Bedoya, a Georgetown law professor who helped write the report, says he knows of a single lawmaker, Maryland state Del. Charles Sydnor, who took up his call for regulating official use of facial recognition tools.
Sydnor, a Baltimore County Democrat, later gave up on the bill, which would have required police to get a court order before searching databases of driver’s license photos, after it died in committee last year.
Some privacy advocates say, however, that the issue's time has come and that regulations are needed as states write rules for other emerging technologies, such as drones and cell-site simulators.
Hints of authoritarian potential are emerging, they say, as Chinese police don facial recognition sunglasses and Dubai authorities apply facial recognition tools to the emirate’s security cameras.
The line between corporate and governmental use can get blurry, as the authorities can take records from companies, generally without a warrant. And in all but two states, Bedoya said, “the law does not clearly block a company from identifying you to strangers.”
"Picture your son or daughter pumping gas and having a stranger ID them, by name, just by taking a photo,” he said. “This is the reality right now in Russia, and it’s leading to women being harassed and political dissidents being outed as having participated in anti-Putin protests. It gets dark pretty quickly.”
Private-sector fight
One of the public’s most common interactions with facial recognition technology comes from using Facebook, which in the U.S. suggests “friends” to tag in photos. But companies such as Amazon sell similar technology to other businesses, and rapid, highly accurate analysis of video has hit the mass market.
C-SPAN, one of Amazon’s customers, recently introduced automatic tagging of the speakers who appear in the public-interest channel’s video feeds, more than doubling the speed at which it archives content.
Opinions vary on whether the technological advances are shrug-worthy or spooky, but Illinois has emerged as the nation’s battleground for corporate use.
Illinois’ 2008 Biometric Information Privacy Act requires user consent and disclosure from companies about how they use biometric data such as fingerprints, iris scans, and face scans.
Texas has a similar law, but unlike the Texas statute, Illinois’ gives individuals the right to sue. A wave of lawsuits filed in 2016 and 2017 accuses companies of violating the privacy act, with cases against Facebook, Shutterfly, and other companies attracting broad outside interest.
“Is Illinois big enough to make a difference? Absolutely,” said Brenda Leong, senior counsel and director of strategy at the Future of Privacy Forum. “Other states are watching that and drafting bills to emulate that language.”
Leong said one possible outcome is that companies will choose not to use the technology in regions where doing so puts them at risk of being sued.
“The most dramatic is something that affects consumer ability to use a product in the state,” she said. “For example, Facebook does not offer photo tagging in Europe.”
Bedoya said a logical next step in regulating facial recognition is a spread of state laws requiring companies to get user permission before enrolling them in a facial recognition database.
“Right now, the future lies in regulation along the lines of what Illinois and Texas do, requiring consent to collect and share biometric data,” Bedoya said.
At the federal level, efforts at regulation haven’t amounted to much.
The National Telecommunications and Information Administration has twice tried to develop industry best practices for facial recognition. The second attempt ended in 2016 after many privacy groups quit in protest, and the reviews produced only recommendations urging transparency and consideration of voluntary restraint.
A coalition of privacy groups led by Bedoya issued a statement saying the recommendations in the final NTIA report “are not worthy of being described as ‘best practices’” and “make the case for why we need to enact laws and regulations to protect our privacy.”
“For commercial use of face recognition, the multi-stakeholder process was a failure. It became too crowded with industry associations who wanted to water it down into meaninglessness,” Bedoya said.
The NTIA did not respond to a request for comment on any future plans to work toward reining in corporate use.
Time to pressure the authorities?
When Sydnor introduced legislation in Maryland to rein in police use of facial recognition, the effort went nowhere. But he sees an inevitable clash, lumping facial recognition into a broader question of how to apply the Fourth Amendment's protection from warrantless searches to new technology.
“Right now, we’re dealing with a bunch of 20th century laws that need to be brought into the 21st century. And how that looks, to be honest, I’m not certain,” he said.
Peter Asaro, a philosopher of technology at the New School in New York, said it may be best to see how authorities use the tools before getting too specific with regulations.
“I think we need transparency and improved privacy regulations now, and will likely need more as new applications emerge,” Asaro said.
Some preliminary research on facial recognition shows racial bias, in part because darker skin may be more difficult for machines to analyze visually.
“Generally, machine learning will replicate and amplify any bias in the data,” Asaro said. “For many consumer applications, this bias simply means the systems don't work well if you do not look like the people who designed the system. But if that system is going to be used to make a significant decision about you, track you, notify police to stop you, prevent you from boarding a plane, etc., then those error rates really do matter.”
Asaro said some present-day applications could be worrying if more broadly deployed.
“It really depends on the nature of the application,” he said. “Recognizing members of Congress at a public event in Washington is one thing. Trying to track them to hotels and restaurants, and secret meetings, using that technology would be a different thing.”
For now, Sydnor has abandoned his effort to regulate police use of facial recognition technology in favor of legislation that would set up a task force to examine how police use new surveillance technologies. The bill passed the lower house of the state legislature last year, and he feels he can build enough support for it to become law this year.
Sydnor said public and private-sector use of facial recognition can be two sides of the same coin, owing to the current U.S. legal principle that people — with some exceptions, such as medical records — have no expectation of privacy over information voluntarily shared with a business. The so-called Third-Party Doctrine is a flashpoint in many federal-level fights over FBI and intelligence-agency acquisition of records.
“I don’t think people do realize when you’re giving private companies this information, if law enforcement wanted access, all they have to do is ask,” Sydnor said. “When we talk about cellphone data, you only need a subpoena; you don’t need a warrant. It’s a matter of educating the public, and that takes a while.”
Bedoya said the time has come to act, amid a groundswell of legislation for other technologies.
“While a handful of states regulate specific aspects of the technology, not a single state regulates face recognition the way dozens of states regulate police use of geolocation technology, drones, and automated license plate readers,” Bedoya said.
To legislators in other states, Sydnor has a message: “Have the debate. It’s here to stay. The technology is a genie that’s out of the bottle. ... I’m not trying to take it away. They are valuable tools. It’s just: How are we going to use it in a constitutional way?”