Psyche is building the foundational model for emotional cognition. Voice-first. Grounded in physiology. Because the most important signals are the ones you can't fake.
Today's AI can reason, generate, and predict. But it can't feel a racing heartbeat, detect a tremor in someone's voice, or sense the stress that never makes it into words.
Emotional intelligence isn't a software problem. It's a measurement problem. The signals that reveal how someone truly feels — heart rate variability, vocal micro-tremors, electrodermal response — are physiological. They happen inside the body. No model gets access by default.
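To make one of these signals concrete: heart rate variability is often summarized with RMSSD, a standard time-domain statistic over beat-to-beat (RR) intervals. A minimal sketch — purely illustrative, not Psyche's actual pipeline:

```python
import math

def rmssd(rr_intervals_ms):
    """Root mean square of successive differences (RMSSD),
    a common time-domain HRV metric, from RR intervals in milliseconds."""
    diffs = [b - a for a, b in zip(rr_intervals_ms, rr_intervals_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

# Example: five beat-to-beat intervals from a resting recording
print(round(rmssd([812, 845, 790, 830, 815]), 1))  # → 38.5
```

Higher RMSSD generally reflects greater parasympathetic activity; the point is that the quantity is measured from the body, not inferred from words.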
Psyche is building that access layer.
The signal layer is general. Anywhere a person speaks to an interface, there's emotional data being left on the table.
Telehealth
Clinicians on video calls lose the body-language signals they rely on in person. Voice-derived physiology brings them back.
Clinical research
Affective and mental-health studies that need real, in-the-moment signals at scale — not self-report on a five-point scale.
Voice agents
AI assistants and support systems that adapt to frustration, confusion, or distress before it escalates.
Coaching & therapy
Signal-based feedback that gives practitioners a view into what happens when they're not in the room.
Everyone has a phone. No one needs a wearable. Voice carries more emotional information than any other signal you can capture at scale — and it's already being recorded in every telehealth call, support session, and clinical encounter.
Most emotion AI trains on actors, crowdsourced reactions, and performative data. We work with live, non-performative physiological signals — what's actually happening inside someone, not what they're showing the world.
AI capabilities will keep advancing. Reasoning will improve. But no model gains a body. The layer that connects AI to human physiology remains necessary — and becomes more valuable — as everything else gets smarter.
Two-sided architecture
What's happening to the person
What's happening inside the person
Psyche was born out of Carnegie Mellon's Human-Computer Interaction Institute, where our research in wearable sensing and affective computing has been published and peer-reviewed.
A small team in Pittsburgh, building at the intersection of signal processing, machine learning, and human physiology.
We're looking for design partners, clinical collaborators, and people who believe emotional intelligence shouldn't be left out of AI.