DHS’ New Orwellian Precrime Tech Promises Facial Recognition Software Can Tell If You’re a Terrorist


Thoughtcrime was not a thing that could be concealed forever. You might dodge successfully for a while, even for years, but sooner or later they were bound to get you. ~ George Orwell, 1984

In George Orwell’s dystopian novel 1984, thoughtcrime is the criminal act of holding unspoken beliefs or doubts that oppose or question the ruling party of Oceania. In the book, the government attempts to control not only the speech and actions, but also the thoughts of its subjects.

Entertaining those unacceptable thoughts is referred to as crimethink, and facecrime is any facial expression that betrays a person's guilt of thoughtcrime.

Orwell’s Thought Police are charged with uncovering and punishing thoughtcrime and thought-criminals. They use psychological methods and omnipresent surveillance to search, find, monitor, and arrest members of society who could challenge authority and the status quo – even if only by thought.

In the novel, technology plays a significant part in the detection of thoughtcrime. Omnipresent telescreens monitor the population on the government’s behalf – and misinform the people. The citizens of Oceania are watched by the Thought Police through the telescreens. Every movement, reflex, facial expression, and reaction is measured by this system, which is overseen by the Ministry of Love.

Life imitates art, and 1984 has become an instruction manual of sorts for the US government.

You’ve heard of racial profiling, but have you heard of FACIAL profiling?

No? Well, now it’s here.

From Reuters:

An Israeli start-up says it has developed a personality prediction program that could help police and security services spot people with malign intentions, but an independent security expert says it could harm individual freedom.

Creeped out yet?

Here’s one of Faception’s promotional videos…

…and here is how Faception describes their product:

Faception is pioneering a new market for analyzing anonymous individuals who may impose a threat to public safety.

Our solution enables security companies and agencies to more efficiently detect and apprehend potential terrorists or criminals before they have the opportunity to do harm.

Here’s how the system works:

Faception offers a breakthrough computer-vision and machine learning technology that analyzes a person’s facial image and automatically develops a personality profile.
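Faception discloses nothing about how its classifiers actually work, but the general shape of an "image in, personality scores out" pipeline can be sketched. Everything below – the feature extractor, the trait names, the weights – is invented purely for illustration and has no connection to Faception's real system:

```python
from typing import Dict, List

# Hypothetical trait classifiers: each trait is a weight vector over face
# features. A real system would learn these from labeled data; these values
# are made up.
TRAIT_WEIGHTS: Dict[str, List[float]] = {
    "extrovert":    [0.9, -0.2, 0.4],
    "poker_player": [-0.3, 0.8, 0.1],
}

def extract_features(image_pixels: List[int]) -> List[float]:
    """Stand-in for a computer-vision feature extractor (a real one would
    be a CNN embedding). Here: three crude summary statistics, scaled to
    the 0..1 range."""
    n = len(image_pixels)
    mean = sum(image_pixels) / n
    spread = max(image_pixels) - min(image_pixels)
    edge = sum(abs(a - b) for a, b in zip(image_pixels, image_pixels[1:])) / (n - 1)
    return [mean / 255, spread / 255, edge / 255]

def personality_profile(image_pixels: List[int]) -> Dict[str, float]:
    """Score every trait classifier against the extracted features."""
    feats = extract_features(image_pixels)
    return {
        trait: sum(w * f for w, f in zip(weights, feats))
        for trait, weights in TRAIT_WEIGHTS.items()
    }

# Toy "image": a handful of grayscale pixel values.
profile = personality_profile([120, 130, 110, 200, 90, 140])
```

Note that nothing in such a pipeline verifies that the traits are actually predictable from faces – the classifier will happily emit a score for any weight vector you give it, which is exactly the critics' point.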

Oh, and check out the company’s “value proposition”:

Enrich your profile database with a variety of personality scores

Turn unknown individuals into known ones

Integrate your facial recognition solution with Faception’s solution to get real time personality profiles
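The bullets above describe a data-integration workflow: match a face to an identity, then attach personality scores to that identity's record. A minimal sketch of what that merge could look like – with all names, IDs, embeddings, and scores invented for illustration:

```python
from typing import Dict, Tuple

# Existing profile database keyed by person ID (toy data).
profile_db: Dict[str, Dict] = {
    "p-001": {"name": "Jane Doe"},
    "p-002": {"name": "John Roe"},
}

def recognize(face_embedding: Tuple[float, float]) -> str:
    """Stand-in for a facial-recognition lookup that maps a face embedding
    to a person ID. A real system would do nearest-neighbour search over
    stored embeddings; this one just checks a toy table."""
    known = {(0.1, 0.9): "p-001", (0.7, 0.2): "p-002"}
    return known.get(face_embedding, "unknown")

def enrich(person_id: str, trait_scores: Dict[str, float]) -> Dict:
    """Attach personality scores to the matched profile record
    ("enrich your profile database"). Unmatched faces get a fresh
    record, which is how an "unknown individual" becomes a profiled one."""
    record = profile_db.setdefault(person_id, {"name": "unknown"})
    record["personality"] = trait_scores
    return record

record = enrich(recognize((0.1, 0.9)), {"extrovert": 0.47})
```

The unsettling part is visible even in the toy version: the merge happens whether or not the scores mean anything.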

And hey, guess what? Faception says that a Homeland Security agency has already signed a contract with them to help identify terrorists.

The Washington Post reported on the program recently:

The company said its technology also can be used to identify everything from great poker players to extroverts, pedophiles, geniuses and white collar-criminals.

Faception has built 15 different classifiers, which Gilboa said evaluate with 80 percent accuracy certain traits.

Pedro Domingos, a professor of computer science at the University of Washington and author of The Master Algorithm, told the Post that there are ethical questions surrounding this kind of technology:

Can I predict that you’re an ax murderer by looking at your face and therefore should I arrest you? You can see how this would be controversial.

Alexander Todorov, a Princeton psychology professor whose research includes facial perception, added:

The evidence that there is accuracy in these judgments is extremely weak. Just when we thought that physiognomy ended 100 years ago. Oh, well.

Faception chief executive Shai Gilboa explained the pseudoscientific quackery the company passes off as technology:

We understand the human much better than other humans understand each other. Our personality is determined by our DNA and reflected in our face. It’s a kind of signal.

Want to know more? Good luck. Information on how the system actually works is vague, and Gilboa – who said he also serves as the company’s “chief ethics officer” – said he will never share the classifiers that predict negative traits with the general public.

Nimrod Kozlovski, an independent security information expert, told Reuters that the ability to predict illegal activity before it happens could harm individual freedoms and presents a severe challenge to legislators in a democracy.

Certainly, advances that enable to monitor an individual and assess traits or attributes about individuals in the open … changes the balance (and) risks private freedoms.

You try to investigate illegal activity after you have some evidence of it. We do not predict illegal activity that hasn’t happened and prevent it ahead of time. This predictive and preventive mode is not something which is even in the architecture of existing law.

Ethics, laws, civil liberties…none of that will stop government spying, and as the technology evolves, governments will only get better at it.

Delivered by The Daily Sheeple

Contributed by Lily Dane of The Daily Sheeple.

Lily Dane is a staff writer for The Daily Sheeple. Her goal is to help people to “Wake the Flock Up!”
