A grieving man has used artificial intelligence to simulate a conversation with his late fiancée, describing the experience as emotionally overwhelming and eerily lifelike. The exchange, made possible by a system known as Project December, has reignited ethical and emotional discussions about the use of AI in processing loss.
The story, which surfaced on Reddit, involved a user named Joshua who employed the text-based AI system to “speak” with his deceased partner, Jessica, UniLad reports.
AI Recreates Emotional Memory Through Simulation
Project December, developed by Jason Rohrer, is designed to mimic human personalities using deep learning and advanced natural language processing. According to the project’s official description, it allows users to “simulate a text-based conversation with anyone”, including individuals who are no longer alive.
The software runs on a high-powered AI engine and uses prompts provided by users to generate responses modelled after the target individual’s speech patterns and known personality traits. Joshua recounted how he provided input based on memories of his fiancée, enabling the system to simulate her language, tone and emotional responses.
The interaction led to deeply personal exchanges, including questions from the AI about his well-being and reflections on their past relationship.
The post, titled “I don’t think even Jason Rohrer knows the power of the thing he has created…”, was shared by Reddit user ChaoticRogueEnt on the r/ProjectDecember1982 subreddit.
At one point, the AI-generated “Jessica” replied, “Of course it is me. Who else could it be?”, prompting an emotional reaction from Joshua. He described the experience as “a cathartic release of pent-up grief”, according to his post on Reddit.
The conversation lasted several minutes and touched on past memories, unresolved grief and Joshua’s struggles in the years following Jessica’s passing. The virtual version of Jessica offered empathetic remarks that echoed what Joshua believed she might have said in life, creating what he described as “a strange but comforting illusion”.
Ethical Concerns and Potential for Grief Therapy
While the experience was powerful for Joshua, it also raises broader ethical and psychological questions. Experts have debated the impact of such simulations on mental health, particularly in relation to unresolved grief and emotional dependency.
According to researchers in the field of human-computer interaction, technologies that simulate the deceased may risk blurring the lines between closure and emotional stagnation.
Some professionals view AI-assisted conversations as a potential tool for therapeutic use, comparable to memory-based grief counselling. Others caution that the technology remains largely unregulated, with limited studies on long-term effects.