San Francisco May Ban Police, City Use of Facial Recognition
Photo: A security camera in the Financial District of San Francisco. (AP Photo/Eric Risberg)
SAN FRANCISCO — San Francisco is on track to become the first U.S. city to ban the use of facial recognition by police and other city agencies, reflecting a growing backlash against a technology that’s creeping into airports, motor vehicle departments, stores, stadiums and home security cameras.
Government agencies around the U.S. have used the technology for more than a decade to scan databases for suspects and prevent identity fraud. But recent advances in artificial intelligence have created more sophisticated computer vision tools, making it easier for police to pinpoint a missing child or protester in a moving crowd or for retailers to analyze shoppers’ facial expressions as they peruse store shelves.
Efforts to restrict its use are getting pushback from law enforcement groups and the tech industry, though it’s far from a united front. Microsoft, while opposed to an outright ban, has urged lawmakers to set limits on the technology, warning that leaving it unchecked could enable an oppressive dystopia reminiscent of George Orwell’s novel “1984.”
“Face recognition is one of those technologies that people get how creepy it is,” said Alvaro Bedoya, who directs Georgetown University’s Center on Privacy and Technology. “It’s not like cookies on a browser. There’s something about this technology that really sets the hairs on the back of people’s heads up.”
Without regulations barring law enforcement from accessing driver’s license databases, people who have never been arrested could be part of virtual police line-ups without their knowledge, skeptics of the technology say.
They worry people will one day not be able to go to a park, store or school without being identified and tracked.
Already, a handful of big-box stores across the U.S. are trying out cameras with facial recognition that can guess customers’ age, gender or mood as they walk by, with the goal of showing them targeted, real-time ads on in-store video screens.
If San Francisco adopts a ban, other cities, states or even Congress could follow, with lawmakers from both parties looking to curtail government surveillance and others hoping to restrict how businesses analyze the faces, emotions and gaits of an unsuspecting public.
The California Legislature is considering a proposal prohibiting the use of facial ID technology on body cameras. A bipartisan bill in the U.S. Senate would exempt police applications but set limits on businesses analyzing people’s faces without their consent.
Legislation similar to San Francisco’s is pending in Oakland, California, and on Thursday another proposed ban was introduced in Somerville, Massachusetts.
Bedoya said a ban in San Francisco, the “most technologically advanced city in our country,” would send a warning to other police departments thinking of trying out the imperfect technology. But Daniel Castro, vice president of the industry-backed Information Technology and Innovation Foundation, said the ordinance is too extreme to serve as a model.
“It might find success in San Francisco, but I will be surprised if it finds success in a lot of other cities,” he said.
San Francisco is home to tech innovators such as Uber, Airbnb and Twitter, but the city’s relationship with the industry is testy. Some supervisors in City Hall are calling for a tax on stock-based compensation in response to a wave of San Francisco companies going public, including Lyft and Pinterest.
At the same time, San Francisco is big on protecting immigrants, civil liberties and privacy. In November, nearly 60% of voters approved a proposition to strengthen data privacy guidelines.
The city’s proposed face-recognition ban is part of broader legislation aimed at regulating the use of surveillance by city departments. The legislation applies only to San Francisco government and would not affect companies or people who want to use the technology. It also would not affect the use of facial recognition at San Francisco International Airport, where security is mostly overseen by federal agencies.
The Board of Supervisors is scheduled to vote on the bill Tuesday.
San Francisco police say they stopped testing face recognition in 2017. Spokesman David Stevenson said in a statement the department looks forward to “developing legislation that addresses the privacy concerns of technology while balancing the public safety concerns of our growing, international city.”
Supervisor Aaron Peskin acknowledges his legislation, called the “Stop Secret Surveillance Ordinance,” isn’t very tech-friendly. But public oversight is critical given the potential for abuse, he said.
The technology often misfires. Studies have shown error rates in facial-analysis systems built by Amazon, IBM and Microsoft were far higher for darker-skinned women than for lighter-skinned men.
Even if facial recognition were perfectly accurate, its use would pose a severe threat to civil rights, especially in a city with a rich history of protest and expression, said Matt Cagle, attorney at the ACLU of Northern California.
“If facial recognition were added to body cameras or public-facing surveillance feeds, it would threaten the ability of people to go to a protest or hang out in Dolores Park without having their identity tracked by the city,” he said, referring to a popular park in San Francisco’s Mission District.
Local critics of San Francisco’s legislation, however, worry about hampering police investigations in a city with a high number of vehicle break-ins and several high-profile annual parades. They want to make sure police can keep using merchants’ and residents’ video surveillance in investigations without bureaucratic hassles.
Joel Engardio, vice president of grassroots group Stop Crime SF, wants the city to be flexible.
“Our point of view is, rather than a blanket ban forever, why not a moratorium so we’re not using problematic technology, but we open the door for when technology improves?” he said.
Such a moratorium is under consideration in the Massachusetts Legislature, where it has the backing of Republican and Democratic senators.
Often, a government’s facial recognition efforts happen in secret or go unnoticed. In Massachusetts, the motor vehicle registry has used the technology since 2006 to prevent driver’s license fraud, and some police agencies have used it as a tool for detectives.
“It is technology we use,” said Massachusetts State Police Lt. Tom Ryan, adding that “we tend not to get too involved in publicizing” that fact. Ryan and the agency declined to answer further questions about how it’s used.
Massachusetts Sen. Cynthia Creem, a Democrat and sponsor of the moratorium bill, said she worries about a lack of standards protecting the public from inaccurate or biased facial recognition technology. Until better guidelines exist, she said, “it shouldn’t be used” by government.
The California Highway Patrol does not use face recognition technology, spokeswoman Fran Clader said.
California Department of Motor Vehicles spokesman Marty Greenstein said facial recognition technology “is specifically not allowed on DMV photos.” State Justice Department spokeswoman Jennifer Molina said her agency does not use face ID technology, and policy states “DOJ and requesters shall not maintain DMV images for the purpose of creating a database” unless authorized.
Legislators also sought a face recognition moratorium this year in Washington, the home state of Microsoft and Amazon, but it was gutted following industry and police opposition. Microsoft instead backed a lighter-touch proposal as part of a broader data privacy bill, but deliberations stalled before lawmakers adjourned late last month.