The CEO of a company that makes facial recognition software has publicly stated that his company will not sell to law enforcement or governments.
(AP) — In recent months, controversies have erupted over tech companies contracting with law enforcement and military agencies. At Google, employees publicly protested the company’s contract to provide the U.S. Department of Defense with artificial intelligence technology; frustration ran so high that some employees quit. Amazon faced similar internal strife when a group of employees circulated a letter to CEO Jeff Bezos (who also owns the Washington Post, a newspaper with close ties to U.S. intelligence agencies) demanding that he stop selling Amazon’s Rekognition facial recognition software to law enforcement.
“Amazon has been heavily marketing this tool—called ‘Rekognition’—to law enforcement, and it’s already being used by agencies in Florida and Oregon,” the EFF wrote. “This system affords the government vast and dangerous surveillance powers, and it poses a threat to the privacy and freedom of communities across the country. That includes many of Amazon’s own customers, who represent more than 75 percent of U.S. online consumers.”
Amazon’s and Google’s partnerships with law enforcement and government have not only sparked resistance from activists and civil liberties groups; they have also prompted the CEO of another tech company to declare that his firm will not do business with such agencies. Brian Brackeen, CEO of facial recognition software maker Kairos, made the announcement in an op-ed for TechCrunch.
“Having the privilege of a comprehensive understanding of how the software works gives me a unique perspective that has shaped my positions about its uses,” Brackeen writes. “As a result, I (and my company) have come to believe that the use of commercial facial recognition in law enforcement or in government surveillance of any kind is wrong — and that it opens the door for gross misconduct by the morally corrupt.”
Brackeen also states that current facial recognition software has a tendency to falsely identify people of color. “To be truly effective, the algorithms powering facial recognition software require a massive amount of information. The more images of people of color it sees, the more likely it is to properly identify them,” Brackeen wrote. “The problem is, existing software has not been exposed to enough images of people of color to be confidently relied upon to identify them. And misidentification could lead to wrongful conviction, or far worse.”
“There is no place in America for facial recognition that supports false arrests and murder.”
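Brackeen’s point about training data can be illustrated with a toy sketch (all numbers here are hypothetical, not drawn from any real system): a face “template” estimated by averaging only a handful of example images is far noisier than one estimated from hundreds, which is one reason groups underrepresented in training data see higher error rates.

```python
import math
import random

def template_error(n_images, dim=32, noise=1.0, trials=100, seed=0):
    """Average distance between a group's true face 'template' (all zeros
    here, for simplicity) and a template estimated by averaging n_images
    noisy example images of that group."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        # Each estimated coordinate is the mean of n_images noisy observations.
        est = [sum(rng.gauss(0.0, noise) for _ in range(n_images)) / n_images
               for _ in range(dim)]
        total += math.sqrt(sum(x * x for x in est))
    return total / trials

# A group the software has "seen" far less often gets a much noisier template.
err_underrepresented = template_error(n_images=5)
err_wellrepresented = template_error(n_images=500)
print(err_underrepresented, err_wellrepresented)
```

Under these made-up settings, estimation error shrinks roughly as 1/sqrt(n), so the group with 100 times fewer images ends up with roughly 10 times the template error: a crude stand-in for the data-imbalance problem Brackeen describes.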
Brackeen told The Wall Street Journal that his company had refused contracts for building a facial recognition program for Axon Enterprise Inc., formerly Taser International, a maker of police body cameras and electric weapons. He also stated that Kairos had refused to build a system to identify people in a crowd from footage collected via drone.
Is Mr. Brackeen exaggerating his fears of placing facial recognition software and AI in the hands of the current government and police? Is there any reason to be concerned about tech companies selling their toys to the “authorities”?
Documents obtained by the ACLU of Northern California recently revealed that Rekognition, Amazon’s facial recognition program, is already used by police in Orlando and Oregon’s Washington County. As with Stingray cellphone surveillance devices, law enforcement agencies must sign nondisclosure agreements that shield their use of the tool from public scrutiny. The EFF is calling on Amazon to “stand up for civil liberties” and “cut law enforcement off from using its face recognition technology.”
Amazon’s own promotional material states that Rekognition can identify people in real-time by “instantaneously searching databases containing tens of millions of faces.” Amazon offers a “person tracking” feature that it says “makes investigation and monitoring of individuals easy and accurate” for “surveillance applications.” Amazon says Rekognition can be used to identify “all faces in group photos, crowded events, and public places such as airports.”
The EFF warns that local police could use Rekognition to identify political protesters recorded by officer body cameras. Rekognition can also track people even when their faces aren’t visible, and it can identify and catalog a person’s gender, what they’re doing, what they’re wearing, and their emotional state. The program can even flag things it considers “unsafe” or “inappropriate.”
It seems fairly obvious that these types of programs will eventually be used to target perfectly legal, legitimate behavior. This entire apparatus is part of the growing police and surveillance state. It is of the utmost importance that consumers choose not to use these companies’ products and services: stop searching for information via Google and use alternative search engines that don’t track you; stop shopping on Amazon and use decentralized marketplaces like OpenBazaar. And finally, as Brian Brackeen illustrates, CEOs, managers, and others in positions of power need to take a moral stand and refuse to do business with governments and companies that support the loss of privacy and freedom.