The Machines Say You’re Lying

Not because they caught a contradiction in your words, or discovered proof of your deceit. No, they say you’re lying because your microexpressions suggest anxiety. Your eyes flinch, your voice wavers, your posture stiffens. The algorithm has spoken: you're not to be trusted.

But what if you're just nervous? Or neurodivergent? Or from a culture the machine's biased training data doesn't represent?

This is the sinister promise of emotion recognition technology: not that it will help us understand each other, but that it will reduce us to a set of legible signs, categorized and judged by opaque systems built on flawed assumptions about what emotion even is.

The Enlightenment gave us the myth that reason should be pure, unsullied by emotion. That objectivity and rationality meant suppressing feelings, ignoring the body, and abstracting away from lived experience. This view served empires well. It justified hierarchies by painting colonized peoples as “too emotional” to govern themselves. It reduced women, the poor, and the “irrational classes” to caricatures of volatility—too moved by passion to be trusted with power.

This divide—between reason and emotion—was never neutral. It was constructed and weaponized.

What we now understand from cognitive science is that the dichotomy is false. Emotions are not interruptions to thought. They are thought. Or, more precisely, they are the way thought feels when it becomes meaningful.

Gut Feelings and Smart Heuristics

Gerd Gigerenzer, a psychologist and director emeritus of the Max Planck Institute for Human Development, has spent decades studying decision-making under uncertainty. His work disrupts the conventional notion that rationality is purely about statistical calculation and dispassionate logic. Instead, he demonstrates that people rely on gut feelings—and that these aren’t irrational mistakes but smart heuristics.

In Gut Feelings: The Intelligence of the Unconscious, Gigerenzer explains that emotion-laden intuitions are actually the result of evolved shortcuts, allowing us to make rapid, effective decisions in situations where information is incomplete or time is limited.

“Our emotions are not our enemies. They are our allies. Intuition is a form of unconscious intelligence, not a flaw.”

For Gigerenzer, the best decisions often feel right. Emotion is a way the body communicates accumulated experience and tacit knowledge. To treat this as a bug in the system is not just a scientific mistake—it’s a moral one.
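To make the idea of a smart heuristic concrete: one of the fast-and-frugal rules Gigerenzer has studied is "take-the-best," which compares two options on cues in order of validity and stops at the first cue that discriminates, with no weighting or summing. The sketch below is a minimal illustration, not a reproduction of his studies; the cue names and values are hypothetical.

```python
# A minimal sketch of "take-the-best," one of the fast-and-frugal heuristics
# Gigerenzer studies. It decides between two options by checking cues in
# order of validity and stopping at the first cue that discriminates.
# The cue data below are hypothetical.

def take_the_best(option_a, option_b, cues):
    """Return 'A' or 'B' based on the first discriminating cue, else None.

    option_a, option_b: dicts mapping cue name -> 1 (present) or 0 (absent).
    cues: cue names ordered from most to least valid.
    """
    for cue in cues:
        a, b = option_a.get(cue, 0), option_b.get(cue, 0)
        if a == 1 and b != 1:
            return "A"   # first cue that discriminates decides
        if b == 1 and a != 1:
            return "B"
    return None          # no cue discriminates: guess

# Which of two cities is larger? (hypothetical cue values)
cues_by_validity = ["national_capital", "has_major_airport", "has_university"]
city_a = {"national_capital": 0, "has_major_airport": 1, "has_university": 1}
city_b = {"national_capital": 0, "has_major_airport": 0, "has_university": 1}

print(take_the_best(city_a, city_b, cues_by_validity))  # -> "A"
```

The rule ignores most of the available information and still performs well under uncertainty, which is exactly Gigerenzer's point: a fast, feeling-guided shortcut is not a failure of reasoning but a different, often better, way of doing it.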

Our Brains Are Not Logic Machines

George Lakoff, a cognitive linguist, argues that all reasoning is metaphorical, and all metaphors are grounded in the body. His work in Philosophy in the Flesh and The Political Mind makes it clear: our brains are not logic machines. They are embodied systems that think with feelings.

Lakoff’s critique of Western philosophy is devastating in its simplicity: Descartes was wrong. We do not think independently of our bodies, our emotions, or our environments. The idea that logic can be "clean" or abstracted from values is a fantasy—often one used to justify power structures.

Lakoff also shows how emotion shapes political thought. Conservative and liberal worldviews, he argues, are built on moral metaphors rooted in feelings of safety, threat, care, and punishment. That’s not a bug—it’s how cognition works.

“There is no such thing as dispassionate logic. All thought is emotionally structured. That’s how we know what to care about.”

The Colonial History of “Controlling” Emotion

The Western tradition’s suspicion of emotion has never been apolitical. From Aristotle’s belief that some people were “natural slaves,” to Victorian ideals of stoicism and self-mastery, emotion has been coded as inferior, dangerous, and feminine.

This logic was extended to justify imperial domination. Colonized peoples were framed as overly emotional—too impulsive, too expressive, too embodied to be trusted with democracy. In the white supremacist playbook, feelings became a sign of racial inferiority. Even today, this legacy lives on in how Black people are policed for "anger," how Indigenous grief is dismissed as disorder, how neurodivergent people are labeled as unstable or “erratic.”

The control of emotion became a way to enforce social order. And now, with AI, it is being automated.

Emotion recognition technology, far from being neutral, inherits this legacy. It re-enacts these biases through data. It punishes those who deviate from normative standards of expression. It does not understand emotion—it disciplines it.

Surveillance with a Smile (or Else)

Emotion recognition technology is no longer a speculative gimmick—it’s already being deployed in settings where the consequences are real, and often devastating. From classrooms to border checkpoints, these systems are sold as tools of efficiency and objectivity. In reality, they enforce a narrow, discriminatory vision of what feelings should look like, and who is allowed to have them.

Airports and Border Security

At some international airports—including in China, the U.S., and parts of the EU—emotion recognition software is being tested to detect “suspicious” behavior. The promise is seductive: catch bad actors before they act. But the mechanism is broken. These systems scan faces for microexpressions associated with fear or stress, under the assumption that a guilty conscience will show itself. But many travelers experience fear for reasons unrelated to guilt. The system can’t tell the difference—and doesn’t care.
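To see why the mechanism is broken, consider a toy version of the logic such a screener implies. This is not any vendor's actual pipeline; the threshold, scores, and travelers below are invented. The structural problem is that a stress score carries no information about why someone is stressed.

```python
# Toy sketch of the logic a microexpression screener implies -- not any
# vendor's actual pipeline. It scores "stress" from a face and flags anyone
# above a threshold. The numbers are invented; the point is that the score
# cannot distinguish anxiety from guilt.

import random

STRESS_THRESHOLD = 0.7  # assumed cut-off for "suspicious"

def stress_score(traveler):
    # Stand-in for a microexpression model: anxiety raises the score
    # regardless of its cause (fear of flying, grief, a missed connection).
    return min(1.0, traveler["baseline_anxiety"] + random.uniform(-0.1, 0.1))

def screen(traveler):
    return "FLAG" if stress_score(traveler) > STRESS_THRESHOLD else "PASS"

travelers = [
    {"name": "nervous flyer",      "baseline_anxiety": 0.80, "guilty": False},
    {"name": "grieving relative",  "baseline_anxiety": 0.85, "guilty": False},
    {"name": "calm professional",  "baseline_anxiety": 0.20, "guilty": False},
    {"name": "practised smuggler", "baseline_anxiety": 0.30, "guilty": True},
]

for t in travelers:
    print(t["name"], screen(t))
# Typical run: the anxious but innocent travelers are flagged,
# while the composed smuggler passes.
```

Whatever the real models look like, any system that treats visible stress as a proxy for guilt inherits this shape: it sorts the nervous-but-innocent and the composed-but-guilty exactly backwards.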

Hiring and Workplace Monitoring

Companies like HireVue and Retorio have marketed AI video interview tools that analyze facial expressions and voice tone to predict suitability. These platforms penalize emotional variance, monotone speech, or the wrong kind of smile. Rather than reduce bias, they often amplify it—disfavoring neurodivergent candidates, people with accents, or those from non-Western cultures.

Even worse, some workplaces use emotion-sensing surveillance to monitor morale or detect “emotional compliance.” If you’re not smiling enough, you’re underperforming. If you smile too much, you're not serious. If you look neutral, you might be hiding something.
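A toy scoring rule makes the problem visible. The sketch below is not HireVue's or Retorio's model; the "normative" profile and candidate numbers are invented. It scores candidates purely by their distance from one assumed norm of expressiveness, so a flat affect or a culturally different baseline lowers the score before a single answer is evaluated.

```python
# Toy illustration of scoring candidates against a single "normative"
# expression profile. Not any vendor's actual model; all numbers are invented.

NORM = {"smile_rate": 0.6, "vocal_variation": 0.5, "eye_contact": 0.7}

def expressiveness_score(profile):
    # 1 minus the mean absolute deviation from the assumed norm:
    # the further a candidate's baseline sits from the norm, the lower
    # the score, regardless of what they actually said.
    deviation = sum(abs(profile[k] - NORM[k]) for k in NORM) / len(NORM)
    return round(1.0 - deviation, 2)

candidates = {
    "matches the trained norm":     {"smile_rate": 0.6, "vocal_variation": 0.5, "eye_contact": 0.7},
    "flat affect, strong answers":  {"smile_rate": 0.1, "vocal_variation": 0.2, "eye_contact": 0.4},
    "culturally lower eye contact": {"smile_rate": 0.5, "vocal_variation": 0.5, "eye_contact": 0.2},
}

for name, profile in candidates.items():
    print(name, expressiveness_score(profile))
# The rule never measures competence; it only measures conformity.
```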

Predictive Policing and “Pre-Crime”

Startups like Faception have pitched tools that claim to detect criminal intent from facial analysis alone. These technologies combine bad psychology with worse policing—embedding racial profiling and ableism into machine logic.

Emotion recognition becomes a tool for pre-emptive suspicion. Not because of what someone has done, but because of how they appear to feel.

Emotion Surveillance in Schools

Some EdTech platforms now promise to help teachers detect boredom, confusion, or disengagement using facial analysis on students. But these interpretations are often wrong, and the damage is real. Fidgeting is pathologized. Stillness is interpreted as apathy. Students become data points in an automated morality play, constantly judged for feelings they may or may not be having.

Emotion Is Meaning

Emotion is not something to be removed from decision-making—it’s what gives decisions meaning. It tells us what matters, what to protect, what to fight for. In politics, in parenting, in protest—emotion guides judgment. It isn’t irrational. It’s relational. It ties cognition to context.

The danger of emotion recognition is not only that it doesn’t work. It’s that it promotes a false understanding of emotion in the first place. It imagines that feelings can be extracted from behavior and converted into data, then sold as truth.

But emotion is not a performance. It’s a process. And no machine—not yet—has the empathy to understand it.

The future of authority must be grounded in mutual understanding—not automated misreading. We must resist the idea that feelings can or should be made machine-readable. Because what starts as a misdiagnosis quickly becomes a justification for punishment, exclusion, or control.

Rather than train machines to feel, let’s teach ourselves to listen.

Let’s stop treating emotion like a glitch in the system—and start recognizing it for what it is: the code beneath it all.
