Why the psychedelic world suddenly needs an AI patient

If MDMA- and psilocybin-assisted therapies continue on their current regulatory path, they’re likely to move into mainstream medicine in the next few years. But there’s a huge bottleneck: not enough trained clinicians to safely guide people through these non-ordinary states.

Traditional training has a catch:

  • You can’t learn to support someone in a psychedelic state…
  • …without working with someone in a psychedelic state.
  • But no one wants their first deep journey facilitated by a total beginner.

Simulation solves that problem in aviation and surgery. Now, a nonprofit thinks it can help solve it in psychedelics too.

Meet Lucy: The first AI “psychedelic patient”

Lucy was created by Fireside Project, the nonprofit behind the Psychedelic Peer Support Line.

Here’s what makes her different from a generic chatbot:

  • Emotionally intelligent, voice-to-voice
    She’s designed to sound and feel like a real person on a journey, changing tone, pacing, and vulnerability based on how you interact with her.
  • Trained on real psychedelic stories
    Lucy is built on a dataset of more than 7,000 anonymized support calls from people during and after psychedelic experiences, giving her a wide range of believable scenarios, from anxious first-timers to seasoned journeyers processing trauma.
  • Focused on training humans, not replacing them
    Unlike AI tools pitched as digital therapists, Lucy’s sole job is to play the patient so humans can practice the difficult art of holding space.

Think of her as a flight simulator for psychedelic support: a place where new practitioners can make mistakes, refine their listening, and build confidence, without putting real people at risk.

How Lucy actually works

On the surface, Lucy feels simple: you log in, speak to her through your mic, and she responds in real time.

Under the hood, several things are happening (see the sketch after this list):

  1. Scenario selection
    Trainees can choose different types of encounters: preparation, in-journey support, or post-journey integration, each with its own emotional tone and complexity.
  2. Dynamic emotional responsiveness
    Lucy doesn’t just spit out canned lines. Her responses shift depending on your voice, affect, and choice of questions. If you’re dismissive, she may shut down. If you’re attuned, she opens up to deeper material.
  3. Built-in feedback
    Over time, the system is being designed to offer feedback: where you validated her experience, where you missed cues, where you jumped to problem-solving instead of staying present.
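
Fireside hasn’t published Lucy’s internals, but the loop described above can be made concrete with a small sketch. Everything below is a hypothetical illustration: the scenario names, the keyword-based “rapport” heuristic, and the canned replies are placeholders for what a real system would do with a voice pipeline and a language model conditioned on the conversation.

```python
# Hypothetical sketch of an AI-patient training loop. This is NOT Fireside
# Project's implementation: the scenarios, the keyword-based "rapport"
# heuristic, and the canned replies are illustrative placeholders.

import random
from dataclasses import dataclass, field

# 1. Scenario selection: each encounter type sets the emotional starting point.
SCENARIOS = {
    "preparation": "anxious first-timer getting ready for a guided session",
    "in-journey": "mid-journey caller frightened by time distortion",
    "integration": "post-journey caller processing grief that surfaced",
}

# Crude stand-ins for real affect/attunement detection.
ATTUNED_CUES = ("i hear you", "that sounds", "tell me more", "i'm here with you")
DISMISSIVE_CUES = ("calm down", "just relax", "you should", "it's not real")


@dataclass
class PatientState:
    scenario: str
    rapport: float = 0.5                      # 0.0 = shut down, 1.0 = fully open
    feedback: list = field(default_factory=list)


def respond(state: PatientState, trainee_line: str) -> str:
    """2. Dynamic responsiveness: shift rapport based on the trainee's words,
    then pick a reply that reflects how open the simulated patient feels."""
    lowered = trainee_line.lower()
    if any(cue in lowered for cue in ATTUNED_CUES):
        state.rapport = min(1.0, state.rapport + 0.15)
        state.feedback.append("Validated her experience; she opened up.")
    elif any(cue in lowered for cue in DISMISSIVE_CUES):
        state.rapport = max(0.0, state.rapport - 0.25)
        state.feedback.append("Minimized or problem-solved; she withdrew.")

    if state.rapport > 0.7:
        return "Okay... I think I can say it out loud. It's about my father."
    if state.rapport < 0.3:
        return "Never mind. It's fine. Forget I said anything."
    return "I don't know how to describe it. Everything feels... loud."


if __name__ == "__main__":
    state = PatientState(scenario=random.choice(list(SCENARIOS)))
    print("Scenario:", SCENARIOS[state.scenario])
    print("Patient:", respond(state, "I hear you. Tell me more about that."))
    print("Patient:", respond(state, "Just relax, it's not real."))

    # 3. Built-in feedback: a debrief assembled from what happened in the loop.
    print("\nDebrief:")
    for note in state.feedback:
        print("-", note)
```

In a production system, the canned replies would presumably come from a speech-to-speech model conditioned on the scenario and the rapport state, and the debrief from a rubric applied to the whole transcript; the shape of the loop, though, stays roughly the same.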

Lucy is part of a broader wave of AI patient simulators in mental health:

  • Research projects like PATIENT-Ψ use large language models to play CBT patients, helping trainees practice case formulation.
  • Stanford’s TherapyTrainer simulates PTSD patients so clinicians can rehearse exposure-based techniques before trying them with humans.
  • Other teams are building voice-enabled virtual patients for structured depression interviews.

Lucy is the first to specialize in the non-linear, emotionally intense world of psychedelic states.

Why this matters for the future of psychedelic care

If psychedelics continue to move into clinics, we’ll need thousands of practitioners who can:

  • Stay calm when a client thinks time has stopped
  • Navigate spiritual crises, trauma flashbacks, and ego dissolution
  • Recognize when a journey is challenging-but-productive vs. unsafe

Textbooks and weekend trainings can’t fully prepare someone for that.

AI patients like Lucy offer three big advantages:

  1. Scalability
    One “Lucy” can safely train thousands of people worldwide, regardless of local access to legal psychedelics.
  2. Repetition and variety
    Trainees can run many different scenarios (panic, bliss, shame, grief, anger) until the skills become embodied.
  3. Safety for real humans
    Early mistakes happen inside a simulation, not in someone’s most vulnerable moment.

In a best-case scenario, Lucy doesn’t replace human wisdom; she amplifies it, making it more likely that, when real people show up in clinics, their guides are grounded, prepared, and practiced.

The ethical questions we have to ask

Of course, bringing AI this close to altered states raises serious questions.

1. What about people using AI instead of humans?
A recent WIRED feature documented people turning to general AI chatbots as DIY trip sitters, sometimes with positive outcomes, but also with serious risks, since large language models can give confidently wrong or emotionally tone-deaf advice.

Lucy is intentionally different: she trains humans; she doesn’t guide real journeys. But as more AI tools appear, we’ll need clear boundaries and education so people understand what AI can and can’t safely do.

2. Whose experiences does Lucy reflect?
Lucy learns from real calls to Fireside’s support line. That’s powerful, but it also means she inherits the demographics, cultural assumptions, and blind spots of that caller base. If most of those callers are Western, white, and relatively resourced, then Lucy may be less accurate at simulating other cultural experiences with psychedelics.

Designing diverse training data, and being transparent about its limits, will be crucial.

3. Can we really teach empathy with a machine?
There’s growing literature on “empathetic AI,” but most experts agree: real therapeutic safety still depends on human presence, accountability, and ethics.

At its best, Lucy isn’t pretending to be a person forever. She’s a mirror that helps human practitioners hear their own patterns and refine their capacity to care.
