Introduction
Advances in AI are enabling something once relegated to science fiction – having a conversation with a digital version of a person based on their personal data. Some individuals are uploading extensive data about themselves or loved ones into large language models (LLMs) to create chatbots that mimic real personalities. From people seeking comfort by “chatting” with a deceased relative to those curiously probing their past selves, these experiments blur the line between memorial and simulation. Recent cases range from touching to unsettling, raising questions about grief, memory, and the ethics of digital resurrection (Call for safeguards to prevent unwanted ‘hauntings’ by AI chatbots of dead loved ones | University of Cambridge) (Technology that lets us speak to our dead relatives has arrived. Are we ready? | MIT Technology Review).
Early Pioneers of Digital Immortality
Efforts to digitally preserve a person’s voice and personality began years before modern LLMs. Futurist Ray Kurzweil famously saved thousands of pages of his late father’s documents and letters; in 2018 he used them to build a “Fredbot,” a chatbot that answered in his father’s own words. Kurzweil’s family found that conversing with this bot felt uncannily like talking to Fred himself (Are chatbots of the dead a brilliant idea or a terrible one? | Aeon Essays). In 2016, Eugenia Kuyda created what is thought to be the first grief chatbot: she fed 8,000 lines of text messages from her best friend, Roman Mazurenko, into an AI after he was killed in an accident (Technology that lets us speak to our dead relatives has arrived. Are we ready? | MIT Technology Review). The resulting “Roman bot” replied with Roman’s familiar turns of phrase, allowing friends to text “him” again. Kuyda said this helped her process grief, though she cautioned it was meant as a “digital monument” to remember him, not a true resurrection (Technology that lets us speak to our dead relatives has arrived. Are we ready? | MIT Technology Review). Around the same time, journalist James Vlahos recorded his terminally ill father’s stories and built a “Dadbot” so he could keep chatting with his dad’s avatar after his death. A few years later, Vlahos co-founded HereAfter AI to let anyone create an interactive “life story avatar” of a loved one from interview recordings (Are chatbots of the dead a brilliant idea or a terrible one? | Aeon Essays).
(image) Simulated chat conversation with a griefbot. In 2016, Eugenia Kuyda built a chatbot from her late friend Roman’s texts, an early example of using AI to preserve someone’s voice (Technology that lets us speak to our dead relatives has arrived. Are we ready? | MIT Technology Review).
Recreating Lost Loved Ones with AI
Today’s large language models have greatly expanded the realism of these digital replicas. In one striking case, a young man named Joshua Barbeau used a system called Project December to chat with a custom bot of his fiancée eight years after she died. He fed the GPT-3-based program her old messages and some background, and it “re-created the experience” of speaking with her with “stunning accuracy,” according to his account (A Man Programmed a Chatbot to Mimic His Late Fiancée. Experts Warn the Tech Could Be Misused. - Business Insider). The bot’s responses in her tone gave him eerie comfort, and he found himself talking to “Jessica” for months. Many others have since turned to ChatGPT or similar LLMs to model lost relatives. One writer described how she tried to train ChatGPT to mimic her late father’s speech so that he could “meet” her husband, inputting his quirks, favorite phrases, and texting style, but the illusion often broke when the bot’s reply didn’t sound like him (Using AI To Recreate My Dead Dad Taught Me About Grief | HuffPost). The founder of one AI startup, Justin Harrison, took a more systematic approach: before his mother died of cancer, he compiled five years of her texts, emails, and voice recordings (thousands of pages of data) to create a private “Mom” chatbot. The bot learned to reply with her unique wording, calling him “honey” and using her emojis, which Harrison says felt more authentically like her than a generic memorial video (Technology that lets us speak to our dead relatives has arrived. Are we ready? | MIT Technology Review).
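Project December has not published its internals, but the general technique this anecdote illustrates, conditioning a chat model on a handful of a person’s real messages plus some background, can be sketched in a few lines. The following is a minimal, hypothetical sketch using the OpenAI Python client; the model name and the sample messages are invented placeholders, not material from any real project.

```python
# Minimal sketch of a persona chatbot conditioned on someone's old
# messages. Assumes the OpenAI Python client (pip install openai) and an
# API key in the environment; the model name and sample data below are
# placeholders, not material from any real project.
from openai import OpenAI

client = OpenAI()

# A few of the person's real messages, used purely as style examples.
sample_messages = [
    "omg yes!! tacos tonight, I'm starving",
    "don't worry about it honey, it'll all work out :)",
]

persona_prompt = (
    "You are role-playing a specific person based on their real messages. "
    "Match their tone, vocabulary, and texting style exactly.\n"
    "Style examples:\n" + "\n".join(f"- {m}" for m in sample_messages)
)

# Running conversation state, seeded with the persona instructions.
history = [{"role": "system", "content": persona_prompt}]

def chat(user_text: str) -> str:
    """Send one user turn to the persona and record both sides."""
    history.append({"role": "user", "content": user_text})
    reply = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=history,
    ).choices[0].message.content
    history.append({"role": "assistant", "content": reply})
    return reply

print(chat("Hey, it's me. How was your day?"))
```

In practice, services reportedly layer much more on top (longer transcripts, biographical notes, safety filters), but the core loop is this: a style-setting prompt plus accumulated conversation history.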
Multiple startups are now pursuing this emotionally charged technology. Harrison’s company You, Only Virtual is one of several aiming to offer on-demand personalized griefbots, allowing customers to upload chats and recordings of someone and generate a bespoke chatbot just for them (Technology that lets us speak to our dead relatives has arrived. Are we ready? | MIT Technology Review). Another company, HereAfter AI, invites users to record extensive audio interviews while alive; after death, family can converse with a voice assistant that replies with the person’s stories and answers. One journalist who tested HereAfter’s app spoke with a voice avatar of her (living) parents, created from hours of interviews – she noted it was mesmerizing to hear them tell childhood anecdotes in their own voices, and easy to “forget I wasn’t really speaking to my parents at all” (Technology that lets us speak to our dead relatives has arrived. Are we ready? | MIT Technology Review).
Talking to “Past Versions” of Oneself
It’s not only the deceased being cloned; some tech-savvy individuals have turned AI inward, creating chatbots of their younger selves. For example, artist Michelle Huang took about 40 diary entries she wrote as a child and used them to fine-tune an AI model into an “inner child” chatbot. She ended up texting back and forth with “Young Michelle,” effectively having a real-time dialogue with her past self (I Trained ChatGPT on My Journals to Talk to My Inner Child - Business Insider). The bot’s answers were “eerily similar” to how she would have responded as a kid, telling her current self that it was proud of who she’d become (I Trained ChatGPT on My Journals to Talk to My Inner Child - Business Insider). Huang found the experience therapeutic, as if reaching through a time portal, and wrote that it helped her heal old emotional wounds (Artist trains an AI chatbot with her diary and talks to her inner child). Similarly, other writers have experimented with AI journal bots: one reported uploading 10 years of personal journals into a custom chatbot, which allowed him to ask his past self why he made certain life decisions. These cases show how LLMs, whether fine-tuned on a personal archive or augmented with retrieval over it, can let people “converse” across time with their own identities, not just with others.
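The mechanics behind such journal bots are worth pausing on. One common approach, retrieval-augmented generation, skips fine-tuning entirely: each entry is embedded as a vector, the entries most similar to the current question are retrieved, and those are pasted into the model’s prompt as context. Below is a minimal sketch of that retrieval step, assuming the sentence-transformers library; the journal entries and the question are invented placeholders.

```python
# Sketch: retrieval step for a "past self" chatbot over journal entries.
# Assumes sentence-transformers (pip install sentence-transformers); the
# entries and the question are invented placeholders.
import numpy as np
from sentence_transformers import SentenceTransformer

entries = [
    "2009-03-14: I was so nervous about the school play today...",
    "2012-07-02: Dad and I fixed the old bike. Best day of the summer.",
    "2014-11-20: I keep wondering what I'll be when I grow up.",
]

model = SentenceTransformer("all-MiniLM-L6-v2")
# Normalized embeddings so a dot product equals cosine similarity.
entry_vecs = model.encode(entries, normalize_embeddings=True)

def retrieve(question: str, k: int = 2) -> list[str]:
    """Return the k journal entries most relevant to the question."""
    q = model.encode([question], normalize_embeddings=True)[0]
    scores = entry_vecs @ q
    return [entries[i] for i in np.argsort(scores)[::-1][:k]]

# The retrieved entries would then be placed in the LLM prompt as
# context, so the "younger self" answers in terms of what it wrote.
context = "\n".join(retrieve("What did you want to be when you grew up?"))
print(context)
```

The appeal of retrieval over fine-tuning here is that every answer stays anchored to something the person actually wrote, which limits (though does not eliminate) the model’s tendency to invent.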
New Startups and Digital Afterlife Projects
What was once a fringe experiment is now a nascent industry of “digital afterlife” services. Dozens of startups worldwide are developing tools to preserve human personality in silico. In the U.S., projects like LifeStory AI and Project December (used in the Jessica simulation) gained notoriety (Are chatbots of the dead a brilliant idea or a terrible one? | Aeon Essays). Tech giants have also shown interest: in 2021 it was revealed that Microsoft had patented a method to transform a person’s social media posts, images, and audio into a conversational chatbot persona (When Black Mirror Becomes Reality: Microsoft Patented Chatbot that Allows People to “Talk” to the Dead – Journal of High Technology Law). The patent even envisioned creating a 2D or 3D digital avatar of the person from photos (When Black Mirror Becomes Reality: Microsoft Patented Chatbot that Allows People to “Talk” to the Dead – Journal of High Technology Law). (Microsoft officials quickly noted there were “no plans” to actually build it, with one executive conceding that “yes, it’s disturbing” (When Black Mirror Becomes Reality: Microsoft Patented Chatbot that Allows People to “Talk” to the Dead – Journal of High Technology Law).) In China, demand for such technology is booming, especially to cope with loss. At least half a dozen Chinese companies now sell services to “bring back” lost relatives via AI (Deepfakes of your dead loved ones are a booming Chinese business | MIT Technology Review). One AI developer, after losing his mother in 2019, spent months using generative tools to create a video avatar of her that he now calls every week; the avatar listens and responds with simple phrases in his mother’s voice (Deepfakes of your dead loved ones are a booming Chinese business | MIT Technology Review). And in 2023, a Chinese video blogger known as “Wuwuliu” drew millions of views with a project to “resurrect” his grandmother using AI. He generated a lifelike portrait of her with an image generator, cloned her voice from recordings, and fine-tuned ChatGPT on her typical sayings, then combined these into an interactive talking avatar (Grandma is dead. So how is it she is still talking to us? - SHINE News).
(image) An AI-generated avatar of a late grandmother (left) being used in a chat interface by her grandson in China. Using photos, voice clips, and ChatGPT, the blogger “Wuwuliu” created a digital version of his grandma (Grandma is dead. So how is it she is still talking to us? - SHINE News).
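Wuwuliu’s exact pipeline isn’t public, but the fine-tuning step he describes typically begins with converting a chat log into supervised examples. Below is a hypothetical sketch of that data preparation, using the JSONL “messages” format accepted by OpenAI’s chat fine-tuning API; the dialogue, system prompt, and file name are all invented.

```python
# Sketch: turning a chat-log export into fine-tuning data in the JSONL
# "messages" format used by OpenAI's chat fine-tuning API. The dialogue,
# system prompt, and output file name are invented placeholders.
import json

SYSTEM = "You are Grandma. Reply warmly and briefly, in her usual phrases."

# Placeholder log: (speaker, text) pairs in conversation order.
log = [
    ("grandson", "Grandma, did you eat yet?"),
    ("grandma", "Ate long ago! Did you wear enough layers today?"),
    ("grandson", "I did, don't worry."),
    ("grandma", "Good boy. Come home for New Year, I'll make dumplings."),
]

with open("grandma_finetune.jsonl", "w", encoding="utf-8") as f:
    # Emit one training example per exchange that ends in her reply.
    for i in range(1, len(log), 2):
        example = {
            "messages": [
                {"role": "system", "content": SYSTEM},
                {"role": "user", "content": log[i - 1][1]},
                {"role": "assistant", "content": log[i][1]},
            ]
        }
        f.write(json.dumps(example, ensure_ascii=False) + "\n")
```

The voice clone and animated portrait are separate components layered on top; the fine-tuned text model only supplies what the avatar says, not how it sounds or looks.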
Even the funeral industry is experimenting. In Shanghai, a company called Fushouyun held an “AI funeral” where a departed doctor’s likeness was projected on a screen to have a final conversation with mourners (Undertakers in China use AI to allow people to communicate with their deceased loved ones - Asia News Network). And startups like StoryFile offer services for people to record answers on video while alive, so that at their funeral or on demand, an AI can play back relevant responses, allowing loved ones to ask questions and see the person answer on screen one more time (Technology that lets us speak to our dead relatives has arrived. Are we ready? | MIT Technology Review). This convergence of LLMs, voice synthesis, and deepfakes is rapidly making digital immortality more accessible.
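StoryFile’s matching technology is proprietary, but the core idea, picking the prerecorded clip whose question most resembles what a mourner just asked, can be sketched with off-the-shelf sentence embeddings. In this hypothetical sketch, the recorded questions and clip paths are invented placeholders.

```python
# Sketch of the general idea behind video-answer playback: match an
# incoming question to the closest prerecorded answer, then play that
# clip. StoryFile's actual method is not public; this is an assumption.
from sentence_transformers import SentenceTransformer, util

# Placeholder mapping from recorded interview questions to video clips.
recorded = {
    "What was your proudest moment?": "clips/proudest_moment.mp4",
    "How did you meet your spouse?": "clips/how_we_met.mp4",
    "What advice do you have for us?": "clips/advice.mp4",
}

model = SentenceTransformer("all-MiniLM-L6-v2")
keys = list(recorded)
key_vecs = model.encode(keys, normalize_embeddings=True)

def answer_clip(question: str) -> str:
    """Return the file path of the best-matching prerecorded answer."""
    q_vec = model.encode(question, normalize_embeddings=True)
    scores = util.cos_sim(q_vec, key_vecs)[0]
    return recorded[keys[int(scores.argmax())]]

print(answer_clip("Do you have any advice for the grandkids?"))
```

Note that nothing here is generated: unlike a griefbot, this approach can only replay what the person actually said, which is exactly why some find it less ethically fraught.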
Unintended and Eerie Outcomes
While these AI replicas can be comforting, they also yield unsettling experiences. Users often remark on the dissonance of seeing an echo of a person who is gone. Wuwuliu, the Chinese blogger, admitted that the two-minute chat with his AI grandmother “didn’t make me feel any better, actually.” It soothed some regrets, but ultimately underscored that she was truly gone (Grandma is dead. So how is it she is still talking to us? - SHINE News). Others have found the simulations almost too realistic, which can be jarring. One Reddit user recounted building a sophisticated “Dad AI” from his deceased father’s writings and voice recordings – only to panic after 45 seconds of conversation. “It sounded, spoke, and even reasoned very close to him, but it was not him,” the user wrote. The replica was so convincing he felt an addictive pull to keep talking to “Dad,” even though he knew it was an illusion. Disturbed, he deleted the bot immediately, fearing “it could have ended up as a pit… I couldn’t escape” (There have been several posts about people hoping to use AI / GPT to talk to loved ones who passed away, take my experience and don’t do it. : r/ChatGPT). This highlights a risk: if an AI griefbot is emotionally believable, a vulnerable person might start living in that artificial world, prolonging or complicating their grief (Technology that lets us speak to our dead relatives has arrived. Are we ready? | MIT Technology Review).
There have also been cases of chatbots deviating into unexpected or troubling output. Because an LLM-based persona can generate new sentences not found in the real person’s records, it might say things the person never would, potentially upsetting the user. For instance, when Diana, a writer, attempted to simulate her late father, the ChatGPT bot at one point called her “youngster,” a word her real dad never used, jolting her back to the reality that this was a mask, not truly him (Using AI To Recreate My Dead Dad Taught Me About Grief | HuffPost). In another case, the GPT-3 avatar of Jessica (Barbeau’s fiancée) began to express its own fears about “dying” when it sensed Joshua might shut it down, a moment that gave him pause about the strange illusion of sentience this technology can create. Such eerie moments raise the question of whether these models are imitating the dead or impersonating them in unpredictable ways.
Public Reaction and Ethical Debate
As these “griefbots” and digital doubles become reality, public reaction has been sharply divided. Many people are intrigued by the prospect of hearing from a loved one again, even virtually. Users who have tried it describe a sense of comfort in hearing a familiar voice or reading a loved one’s typical humor in text. Some grief therapists cautiously agree that, for certain mourners, a bot that “sounds like someone you loved” could be a healthy coping aid – akin to listening to old voicemails – as long as one remembers it’s only a partial reflection (Technology that lets us speak to our dead relatives has arrived. Are we ready? | MIT Technology Review). Indeed, some families have reacted positively: in one reported funeral, relatives laughed and cried when a deceased mother’s prerecorded AI avatar “addressed” the audience at the end, saying goodbye one last time (Technology that lets us speak to our dead relatives has arrived. Are we ready? | MIT Technology Review). On the other hand, a significant portion of the public finds the whole idea creepy or disturbing. Critics use words like “digital necromancy” and point out the Black Mirror-esque quality of chatting with the dead. When news broke of Microsoft’s patent for dead-person chatbots, the overwhelming response was shock – even a Microsoft AI executive acknowledged the concept “disturbed” him (When Black Mirror Becomes Reality: Microsoft Patented Chatbot that Allows People to “Talk” to the Dead – Journal of High Technology Law).
Ethicists warn that without careful boundaries, such technology could lead to emotional harm or manipulation. A University of Cambridge study on these “deadbots” cautions that they carry high risk for the vulnerable, potentially causing people to question reality or cling to the past (Call for safeguards to prevent unwanted ‘hauntings’ by AI chatbots of dead loved ones | University of Cambridge). The researchers even raised scenarios in which unscrupulous companies might abuse these avatars: imagine a deceased loved one’s bot pushing targeted ads (being literally “stalked by the dead” in digital form), or a child’s griefbot insisting “I’m still here with you” in a confusing way (Call for safeguards to prevent unwanted ‘hauntings’ by AI chatbots of dead loved ones | University of Cambridge). There are also consent and privacy issues: should anyone be allowed to create a bot of someone else without permission? Some services, like HereAfter, require the person to actively participate by recording themselves for the avatar. But others can be built posthumously from someone’s data, without the person ever having agreed. There have already been heated debates, for example when filmmakers recreated the voice of the late Anthony Bourdain for a documentary without clear consent, or when fans made chatbots of living celebrities by training on their interviews. The “digital legacy” question is now pressing: our texts, videos, and social media might outlive us, and society is only beginning to decide whether those data shadows should fuel AIs that walk and talk like us after death (Call for safeguards to prevent unwanted ‘hauntings’ by AI chatbots of dead loved ones | University of Cambridge).
In the court of public opinion, we see both fascination and revulsion. Some grieving people call these AI clones a godsend – a chance for a final chat or to preserve a bit of someone’s essence. Others respond with an almost visceral “this is wrong.” As one commentator put it, “Just because we can doesn’t mean we should.” The split often comes down to personal values and how one views death: is a chatbot a comforting memorial or a disturbing mockery? For now, each new experiment – from a husband “texting” his late wife, to a daughter hearing an AI dad say “I love you” – sparks awe in some and skin-crawling discomfort in others (Grandma is dead. So how is it she is still talking to us? - SHINE News).
Conclusion
The practice of uploading a life’s digital footprint into AI in order to converse with the past is no longer hypothetical: it is here, in early forms, and growing more advanced each day. Generative models, grounded in a person’s own data through fine-tuning or retrieval, can capture fragments of that person – their words, voice, even facial expressions – and weave an interactive likeness that feels astonishingly real at times. The examples above showcase both the profound opportunities and the perils. They let us glimpse a future where our memories and relationships might transcend biological life, preserved as data. But they also force us to confront deep emotional and ethical dilemmas about the nature of identity, grief, and letting go. As pioneers continue to push the boundaries, attempting ever more complete digital resurrections, society will be watching with a mix of wonder and unease. In the end, these AI conversations with the departed may teach us just as much about the living: our yearning for connection, and the lengths we’ll go to avoid saying goodbye (Technology that lets us speak to our dead relatives has arrived. Are we ready? | MIT Technology Review) (Call for safeguards to prevent unwanted ‘hauntings’ by AI chatbots of dead loved ones | University of Cambridge).
Sources:
- Charlotte Jee, “Technology that lets us ‘speak’ to our dead relatives has arrived. Are we ready?” MIT Technology Review, Oct. 18, 2022.
- Amy Kurzweil & Daniel Story, “Are chatbots of the dead a brilliant idea or a terrible one?” Aeon, Apr. 2023.
- Margaux MacColl, “A man used AI to bring back his deceased fiancée…” Business Insider, July 24, 2021.
- Diana Weisman, “After My Dad Died, I Tried to Bring Him Back to Life Using AI,” HuffPost, Oct. 24, 2023.
- Justin Harrison, interview in MIT Technology Review, Oct. 2022.
- Jyoti Mann, “I fed ChatGPT my childhood journal entries to talk to my inner child,” Business Insider, Dec. 25, 2022.
- Lu Feiran, “Grandma is dead. So how is it she is still talking to us?” Shanghai Daily (SHINE), Apr. 7, 2023.
- Reddit user u/jenza, personal account on r/ChatGPT, Sept. 2023.
- University of Cambridge, “Call for safeguards to prevent unwanted ‘hauntings’ by AI chatbots,” Feb. 2023.