Inmates Talk, Aida Listens: The Unexpected Benefits of AI-Driven Interviews in Corrections

By Marjorie Rist, Anna Stewart, and Kellie O’Dowd-Stinchcomb

Submitted for publication in *Journal of Offender Rehabilitation* / *Correctional Mental Health Report*

 

In a justice system where trust is fragile and trauma is common, the idea that incarcerated individuals might be more open with an artificial intelligence (AI) platform than with a human interviewer may sound implausible. But growing research and field data suggest otherwise: AI may elicit more honest and vulnerable responses from people in custody, and that shift is redefining how correctional systems approach assessment, mental health screening, and rehabilitation.

At the center of this shift is Aida, an AI-driven interviewing platform built specifically for jails, prisons, and community supervision settings. Aida doesn't just collect data; it listens, responds with empathy, and creates a nonjudgmental environment that lets people speak freely about trauma, risk, and need.

 

Why Inmates Open Up to AI

Research in clinical and correctional psychology shows that people often feel more comfortable sharing sensitive information with a machine than with a human. A 2014 study published in *Computers in Human Behavior* found that individuals were more likely to disclose personal details, including mental health issues, when interacting with a computer avatar than with a person, largely because they perceived less judgment and social evaluation (Lucas et al., 2014).

A study in the *Journal of Medical Internet Research* similarly found that veterans were far more likely to disclose symptoms of posttraumatic stress disorder (PTSD) to a virtual human interviewer than on traditional assessments (Gustafson et al., 2015). The same effect appears to extend to correctional settings.

In correctional environments—where distrust of authority figures is common, and where many have experienced trauma or abuse while housed in institutions—AI’s neutrality becomes a critical strength. Aida doesn’t show facial expressions, doesn’t signal judgment, and doesn’t cut people off. Instead, it patiently listens, asks follow-up questions, and encourages the individual to tell their story in their own words.

 

A Trustworthy Listener, Not a Threat

Many individuals in custody feel anxious, suspicious, or scared during traditional intake or mental health interviews. That anxiety leads many to underreport suicidal ideation, trauma history, and mental health symptoms, putting both their own safety and the facility's at risk.

Aida changes this dynamic. It provides a psychologically safe space for disclosure, using a calm, conversational tone and motivational interviewing techniques shown to reduce resistance. As one case manager in a pilot program noted, “People say things to Aida they’ve never said out loud before.”

And when they do, lives can be saved. Aida's algorithms are trained to recognize linguistic patterns, key phrases, and behavioral indicators associated with risk of suicide, self-harm, or violence. When a flag is triggered, the platform can promptly escalate the case to mental health staff, bypassing bureaucratic layers that might otherwise delay intervention.
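
As a rough illustration of how such a flag-and-escalate step might work, consider the minimal sketch below. It is not Aida's implementation; the phrase patterns, risk categories, and `escalate` hook are hypothetical stand-ins for trained models and real clinical workflows.

```python
import re
from dataclasses import dataclass

# Hypothetical patterns for illustration only; a production system would
# rely on trained classifiers over full transcripts, not a keyword list.
RISK_PATTERNS = {
    "self_harm": [r"\bhurt myself\b", r"\bend it all\b", r"\bno reason to live\b"],
    "violence": [r"\bget even with\b", r"\bmake (him|her|them) pay\b"],
}

@dataclass
class RiskFlag:
    category: str   # e.g., "self_harm"
    evidence: str   # the utterance that triggered the flag

def scan_utterance(text: str) -> list[RiskFlag]:
    """Return one flag per risk category matched in a single utterance."""
    return [
        RiskFlag(category, text)
        for category, patterns in RISK_PATTERNS.items()
        if any(re.search(p, text, re.IGNORECASE) for p in patterns)
    ]

def escalate(flag: RiskFlag) -> None:
    # Hypothetical hook: route the alert straight to mental health staff
    # rather than leaving it in a routine intake queue.
    print(f"ALERT [{flag.category}]: {flag.evidence!r}")

for flag in scan_utterance("Some days I feel like I should just end it all."):
    escalate(flag)
```

The design point worth noting is the direct path from detection to notification: the alert reaches clinical staff without waiting on routine intake paperwork.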

 

From Disclosure to Healing

Disclosure is only the start. When people honestly share their needs, whether addiction, homelessness, trauma, or anger, they open the door to intervention. But that depends on someone listening and being ready to act. Aida ensures that every interview is recorded, transcribed, and summarized into an actionable report, complete with risk flags and recommended care pathways.
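
To make "actionable report" concrete, the sketch below shows one plausible shape for a structured interview summary. The field names, placeholder URI, and care-pathway labels are assumptions for illustration, not Aida's actual schema.

```python
from dataclasses import dataclass, field

@dataclass
class InterviewReport:
    """Hypothetical structured summary of one AI-led interview."""
    interviewee_id: str
    transcript_uri: str   # pointer to the stored recording/transcript
    summary: str          # narrative synopsis for staff review
    risk_flags: list[str] = field(default_factory=list)     # e.g., ["self_harm"]
    care_pathways: list[str] = field(default_factory=list)  # e.g., ["substance_use_treatment"]

# Example report a case manager might receive (all values invented):
report = InterviewReport(
    interviewee_id="A-1042",
    transcript_uri="storage://interviews/a-1042.json",  # placeholder URI
    summary="Disclosed opioid use history and housing instability; denied suicidal ideation.",
    care_pathways=["substance_use_treatment", "reentry_housing_referral"],
)
```

A structured record like this is what lets a risk flag travel from the interview itself to a clinician's queue without manual re-entry.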

By identifying unmet needs earlier in the correctional process, Aida promotes a trauma-informed approach to rehabilitation. It acknowledges that behavior is often influenced by experience and that healing begins with being listened to.

 

Technology With a Human Purpose

It’s important to stress that Aida doesn’t replace clinicians, counselors, or correctional professionals. Instead, it enhances their ability to connect with people. In many cases, AI-led interviews are followed by clinician reviews or one-on-one sessions, now better informed by more honest and detailed disclosures.

In a system stretched thin by staff shortages and burnout, this model increases capacity while improving care: it makes mental health screening more accurate, risk assessment more thorough, and case planning more personalized.

 

The New Frontier of Correctional Mental Health

As more jurisdictions experiment with AI interviewing tools, one thing is becoming clear: the future of mental health and rehabilitation in corrections is not just about hiring more staff—it’s about working smarter. Aida shows that when people are provided with the right environment, they will engage in conversation. They want to be heard.

And for systems willing to listen, the potential for transformation is immense.

Citations

– Lucas, G. M., Gratch, J., King, A., & Morency, L. P. (2014). It’s only a computer: Virtual humans increase willingness to disclose. *Computers in Human Behavior*, 37, 94–100. https://doi.org/10.1016/j.chb.2014.04.043

– Gustafson, D. H., McTavish, F. M., Chih, M. Y., et al. (2015). A virtual agent to support veterans with posttraumatic stress disorder. *Journal of Medical Internet Research*, 17(10), e228. https://doi.org/10.2196/jmir.3943

– National Institute of Corrections. (2020). *Trauma-Informed Care in Correctional Settings: What the Research Tells Us*. https://nicic.gov/trauma-informed-care-corrections

– Bonta, J., & Andrews, D. A. (2017). *The Psychology of Criminal Conduct* (6th ed.). Routledge.
