This Company Claims It Can Tell If You're a Terrorist Simply by Looking at Your Face

(ANTIMEDIA) Can you predict who is a murderer just by looking at their face? What about a pedophile? A software company now says it can, claiming it is able to identify terrorists purely by their facial features. Turning the old idiom that “you can’t judge a book by its cover” on its head, the two-year-old company claims its artificial-intelligence algorithms can look at a face and tell whether the person is likely to be a terrorist, a pedophile, and, wait for it… a professional poker player.

The bold claims are made by Israeli start-up Faception, which boasts that its breakthrough computer-vision and machine-learning technology can analyze a person’s facial image and automatically develop a personality profile. Claiming its technology will enable security companies to detect and apprehend terrorists and criminals before they have the opportunity to do harm, the company has already signed a contract with the Department of Homeland Security, according to the Washington Post. The Mirror reports Faception’s technology correctly identified 9 of the 11 jihadists involved in the Paris massacre with no prior information about their involvement.


“We understand the human much better than other humans understand each other,” says Faception chief executive Shai Gilboa. “Our personality is determined by our DNA and reflected in our face. It’s a kind of signal.”

Faception uses 15 classifiers, including extrovert, genius, professional poker player, pedophile, and terrorist, which it says rest on facial features undetectable to the human eye. Each classifier allegedly represents a persona with a unique personality type or collection of traits and behaviors, and the company’s algorithms score an individual’s face according to how well it fits each one.
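Faception has not disclosed how its scoring actually works, but the description it gives, a face reduced to measurable features and scored against each persona, maps onto a standard pattern-matching setup. Here is a minimal sketch of that idea, assuming a face is represented as a numeric feature vector and each “classifier” is a learned prototype vector; all names, vectors, and dimensions below are hypothetical illustrations, not Faception’s code:

```python
import math

# Hypothetical sketch of persona scoring as described in press accounts:
# a face becomes a feature vector, and each persona "classifier" is a
# prototype the face is scored against. Everything here is invented
# for illustration; it is not Faception's actual method or data.

CLASSIFIERS = {
    # persona -> prototype feature vector (toy 4-dimensional examples)
    "extrovert":                 [0.9, 0.1, 0.3, 0.5],
    "professional_poker_player": [0.2, 0.8, 0.6, 0.1],
    "terrorist":                 [0.4, 0.4, 0.9, 0.2],
}

def cosine_similarity(a, b):
    """Score how closely a face vector fits a persona prototype (0 to 1 here)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def score_face(face_vector):
    """Return each persona's fit score for one face, best match first."""
    scores = {name: cosine_similarity(face_vector, proto)
              for name, proto in CLASSIFIERS.items()}
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

# Example: score a made-up face embedding against the persona prototypes.
print(score_face([0.3, 0.7, 0.5, 0.2]))
```

The sketch also makes the critics’ point plain: the scores are only as meaningful as the labels used to build the prototypes, so any bias or noise in who was tagged a “terrorist” during training is baked directly into every verdict.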

The startup says its claims were validated at a poker tournament, where the technology correctly predicted which four of 50 amateur players would perform best.

That said, Gilboa admitted a worrying gap in accuracy: the system is right only 80% of the time, meaning roughly one in five of its judgments is wrong. And because actual terrorists and pedophiles are vanishingly rare in any screened population, even that error rate implies the innocent people wrongly flagged would vastly outnumber the guilty correctly caught.
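A back-of-the-envelope calculation shows why. The 80% figure is Gilboa’s; the population and base-rate numbers below are assumptions chosen purely to illustrate the arithmetic:

```python
# Hypothetical base-rate illustration: why 80% accuracy is dangerous for
# screening rare traits. Population figures are assumptions for the sake
# of the arithmetic, not Faception's numbers.

population = 1_000_000   # people screened (assumed)
terrorists = 100         # actual terrorists among them (assumed base rate)
accuracy = 0.80          # Gilboa's claimed accuracy, applied for simplicity
                         # to both terrorists and non-terrorists

innocents = population - terrorists

true_positives = terrorists * accuracy        # terrorists correctly flagged
false_positives = innocents * (1 - accuracy)  # innocents wrongly flagged

flagged = true_positives + false_positives
precision = true_positives / flagged

print(f"Flagged in total:      {flagged:,.0f}")
print(f"Actual terrorists hit: {true_positives:,.0f}")
print(f"Innocents flagged:     {false_positives:,.0f}")
print(f"Chance a flagged person is a terrorist: {precision:.4%}")
```

Under these assumptions, the system flags about 200,060 people to catch 80 terrorists; a flagged person is actually a terrorist only about 0.04% of the time.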

Unsurprisingly, experts have raised ethical questions about the technology. Pedro Domingos, a professor of computer science at the University of Washington, told the Washington Post the evidence of accuracy in such judgments is extremely weak, while Princeton psychology professor Alexander Todorov observed that Faception arrives “[j]ust when we thought that physiognomy ended 100 years ago.”

The new technology raises concerns that relying on it will take us down a dark route that promotes dubious preconceptions of who and what constitutes a terrorist. If the creepy profiling really is accurate, however, perhaps the first places it should be rolled out are in the corridors of power and the film industry.


This article (This Company Claims It Can Tell If You’re a Terrorist Simply by Looking at Your Face) is free and open source. You have permission to republish it under a Creative Commons license with attribution to Michaela Whitton and theAntiMedia.org.
