On the surface, it doesn’t seem all that weird that James Vlahos can call his father, John, whenever he wants. If he has a question about his dad’s favourite sports team, he can ask. If he wants to know what kind of music his father prefers, he can ask that too. Whenever the mood strikes, Vlahos can simply send his dad a message to check in. Whatever the question, his father always has a quick response to offer—sometimes a straight answer, other times a joke.
The thing is, Vlahos’s father is dead. He passed away in February 2017 from stage 4 lung cancer. So the entity Vlahos can chat with on his phone right now isn’t exactly his father: after the family learned that John’s illness was terminal, father and son built an AI chatbot together.
“We knew we were going to lose him and were scrambling to find ways to commemorate him,” Vlahos told The Daily Beast. “I was working on a book about conversational AI at the time, and was learning about all of these ways we can teach machines to talk in human-like ways.”
Vlahos came up with the concept pretty quickly: rather than just recording his father’s memories and anecdotes on audio or video, he could use the same conversational AI technology he was already investigating to preserve John’s recollections and personality. “That’s what gave me the idea to create Dadbot, a chatbot for sharing memories,” Vlahos said.
He later wrote a piece about Dadbot for Wired in 2017, and the story began to circulate. Emails and phone calls started arriving from people all around the world who were dying, or had dying loved ones, and wanted chatbots of their own. Would you build me a Mombot? Would you build me a Dadbot? That demand became the motivation behind HereAfter AI, an app designed by Vlahos that enables you to “preserve valuable memories about your life and interactively share them with the people you love.”
Much as Dadbot did, HereAfter lets users record audio memories of their lives in response to prompts like, “What was your favourite music when you were younger?” The software then builds a conversational AI chatbot of the user from the memories that have been gathered: a sort of virtual doppelganger that, long after the person passes away, can answer questions and share stories about their life through text, voice, and even photographs.
It’s not quite like talking to them, but it isn’t designed to be. The chatbot gives you access to the recollections and stories of your deceased loved one. You can’t, for example, ask it what it thinks of a recently released movie, or have it make up a brand-new bedtime story to read to you at night. But it can talk to you about whatever was captured while the person was still living.
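To make that distinction concrete, here is a minimal sketch, in Python, of how a retrieval-only memory bot of this kind might work. The topics, recordings, and keyword matching below are all invented for illustration, and are far simpler than whatever HereAfter AI actually runs:

```python
# Hypothetical sketch of a retrieval-only "memory chatbot": it can only replay
# answers recorded while the person was alive, and it declines anything it has
# no recording for. All names and data are invented for illustration.

RECORDED_MEMORIES = {
    "favourite music": "Oh, when I was young it was all big band. Glenn Miller, especially.",
    "first job": "I delivered newspapers on a bicycle with a squeaky front wheel.",
    "wedding day": "It rained all morning, and the sun came out just in time.",
}

FALLBACK = "I don't have a memory recorded about that."


def reply(question: str) -> str:
    """Return a recorded memory whose topic appears in the question,
    or a fallback if nothing was ever recorded on that subject."""
    q = question.lower()
    for topic, recording in RECORDED_MEMORIES.items():
        if topic in q:
            return recording
    # No generative model here: the bot cannot invent an answer.
    return FALLBACK


if __name__ == "__main__":
    print(reply("What was your favourite music when you were younger?"))
    print(reply("What do you think of the new Marvel movie?"))  # -> fallback
```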
HereAfter AI is part of a growing sector of grief technology that promises to digitally prolong the lives of our deceased loved ones in exchange for a modest fee. Another AI-driven firm, StoryFile, takes it a step further by letting you converse with video of a deceased person. The business gained notoriety this past summer when Marina Smith, an 87-year-old Holocaust activist and grandmother, used the technology to “speak” to attendees at her own funeral.
She responded to questions from mourners as if she were present, speaking in her usual cadence and recalling events from her life the way any grandparent might. The only difference was that, while she appeared on screen and spoke lucidly about her past, she had recently been cremated, and her ashes were resting in an urn.
You’ve effectively created a deepfake of your loved one.
— Santa Clara University’s Irina Raicu
“Saying, ‘I’m going to create a chatbot of my granny so I can talk to her every day and have a relationship with her,’ is a really utilitarian way of thinking about other people. But you can’t,” Irina Raicu, who directs the internet ethics program at Santa Clara University, told The Daily Beast. “You’ve effectively created a deepfake of your loved one.”
For Raicu, it is problematic even to characterise this experience as a “conversation” with a chatbot. You can’t really talk to a bot; there is no one there, just lines of code. Saying otherwise furthers the serious misconceptions the general public already holds about artificial intelligence. Raicu and others worry that this distinction may be lost once a person begins conversing with a chatbot of the deceased. And as chatbots become more sophisticated, the problem is only growing worse: the line between reality and simulation has blurred to the point that some people are asking whether these bots are sentient.
Add in the trauma of loss, a period when a person is at their most raw and exposed, and these exchanges could go horribly wrong.
“To me, it’s completely ludicrous,” Alan Wolfelt, a grief counsellor and the director of the Center for Loss and Life Transition, told The Daily Beast. “We already have a culture that shuns grief, and adding these technologies to the mix will only make it harder for people to mourn.”
Giving an AI chatbot of a loved one to someone who is grieving could be like giving a drunk a bottle of whiskey.
According to Wolfelt, the trouble is that such a bot can interfere with accepting the death, which is a vital element of grieving and mourning. Rather than facing the fact that your loved one is dead and gone for good, you might use an AI replica of them to escape that fundamental reality. At best, this could obstruct a healthy mourning process; at worst, it could curdle into a dangerous fantasy.
But not all grief specialists agree with Wolfelt. Natalia Skritskaya, a research scientist at Columbia University’s Center for Prolonged Grief, specialises in helping people who experience chronic grief, which she says occurs when a loss affects someone so severely that their life becomes “like Groundhog Day.”
Skritskaya acknowledges that there are dangers. Giving an AI chatbot of their loved one to a person experiencing prolonged grief, she said, might be like giving a drunk a bottle of whiskey.
Skritskaya observed that bereaved people, particularly those experiencing prolonged grief, tend to compartmentalise their feelings and try to ignore the fact that the person has died, or any aspect of reality that calls attention to their absence. “They might deceive themselves into believing the person is still present when they use the chatbot,” she said. “Technology, I believe, is helping us create these alternate realities more and more. And some minds are better at it than others.”
She did emphasise, however, that a lot of this is personal; no two grief processes are the same. One person may use an AI chatbot of a deceased loved one as a way to remember them and reconnect with memories, while another may use it simply as a means of escaping reality.
“If you try to keep someone alive who is dead, you become dead to yourself and everybody around you,” Wolfelt said. “In order for you to survive, you have to allow someone to die.”
You could ask Ruth Bader Ginsburg a question right now if you wanted to.
Ask Ruth Bader Ginsburg, an AI chatbot created by AI21 Labs, lets you do exactly that, despite the Supreme Court Justice’s death in 2020. It was trained on more than three decades of her legal writing. And despite its apparent similarity to HereAfter AI, there is a significant distinction between the two: the late Justice never knew about the RBG-bot before her passing.
“Ruth Bader Ginsburg didn’t consent, yet there is an AI of her,” Raicu remarked. “Consent is a major problem, in my opinion. And I believe most of the humans who are now being turned into AIs did not consent, because it is such a novel concept.”
Consent is one of many thorny ethical issues at the core of this chatbot technology. A veritable Gordian knot of moral quandaries arises when these bots are given access to the memories and experiences of real people.
The AI chatbots offered by HereAfter AI and StoryFile are largely limited to scripted conversations: they can’t answer a question they don’t already have a recorded answer to. But the technology opens the door to generative answers like those of the RBG-bot, or the recent simulated podcast conversation between an AI Steve Jobs and an AI Joe Rogan.
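The difference between the two approaches can be sketched in a few lines of Python. Everything below is invented for illustration: `fake_model` stands in for any text generator, and neither function reflects the actual code of HereAfter AI, StoryFile, or AI21 Labs.

```python
from typing import Callable, Optional

# Invented example data: questions the person actually answered on tape.
RECORDINGS = {"how did you meet mom?": "At a dance hall in 1962."}


def scripted_answer(question: str) -> Optional[str]:
    # Retrieval-only: replay a recording if one exists, otherwise stay silent.
    return RECORDINGS.get(question.lower())


def generative_answer(question: str, persona_text: str,
                      model: Callable[[str], str]) -> str:
    # Generative: invent a reply in the person's voice, even to questions
    # they never answered -- the behaviour behind the consent and misuse worries.
    return model(f"{persona_text}\nQ: {question}\nA:")


if __name__ == "__main__":
    def fake_model(prompt: str) -> str:
        return "(a reply the model made up)"

    print(scripted_answer("What do you think of the new movie?"))  # None
    print(generative_answer("What do you think of the new movie?",
                            "Letters and diaries of the deceased.", fake_model))
```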
It’s not hard to picture a future in which malicious actors weaponize these chatbots to make the deceased say things they would never have uttered in life. Raicu thinks this could be particularly troublesome when it comes to denial of the Holocaust and other genocides.
According to Vlahos, there are no plans at the moment for this kind of generative chatbot experience. He shares Raicu’s distaste for “putting words in other people’s mouths.”
“We do not want an AI to create memories. We don’t want it making up statements that your relative might have made,” he said. While the technology has advanced, Vlahos warns that it is still far from flawless, and could produce embarrassing or harmful situations in which the AI says something the person would never have said while alive.
Eventually, though, Vlahos says he wants HereAfter AI to be able to respond to a considerably wider range of queries than it can now. Instead of merely being able to ask an avatar to “tell me a story from your teenage years,” he wants users to be able to ask, “tell me a story about something embarrassing that happened to you in 10th grade.”
It’s the commodification of the dead.
— University of Minnesota Duluth’s Alexis Elder
“It’s about enhancing the AI’s capacity for natural language,” he explained, “so our avatars genuinely understand what people want to talk about.”
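As a rough illustration of what that might involve under the hood, the toy sketch below scores a user’s question against each recorded story and its prompt using bag-of-words cosine similarity, so that a specific question lands on the closest recording. A production system would presumably use learned sentence embeddings instead, and all prompts and stories here are made up.

```python
import math
import re
from collections import Counter

# Invented recordings: prompt -> what the person actually said on tape.
STORIES = {
    "tell me a story from your teenage years":
        "In 10th grade I tripped on stage during the school play.",
    "what was your favourite music when you were younger":
        "Big band. Always big band.",
    "how did you meet your spouse":
        "We met at a bus stop in the rain.",
}


def bag_of_words(text: str) -> Counter:
    """Lowercase word counts, ignoring punctuation."""
    return Counter(re.findall(r"[a-z0-9']+", text.lower()))


def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[w] * b[w] for w in a)
    norm = math.sqrt(sum(v * v for v in a.values())) \
        * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0


def best_story(question: str) -> str:
    """Pick the recording whose prompt-plus-story is closest to the question."""
    q = bag_of_words(question)
    best = max(STORIES,
               key=lambda p: cosine(q, bag_of_words(p + " " + STORIES[p])))
    return STORIES[best]


if __name__ == "__main__":
    # A more specific question than any recorded prompt still finds its story.
    print(best_story("tell me about something embarrassing that happened in 10th grade"))
```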
Because marketing a product is ultimately what HereAfter AI and its rivals are doing. They aim to provide the best experience they can to help people get through some of the hardest moments of their lives. Much as Uber did with taxis and buses, these companies are disrupting, innovating upon, and significantly complicating the process of grief and loss.
“It’s the commodification of the dead,” said Alexis Elder, an AI ethicist at the University of Minnesota Duluth. She finds it problematic that someone’s memories and personality can be packaged and sold this way, because it “encourages us to consider people as similar to their outputs.”
“People could think: Grandpa passed away, but the important part is that you get to keep conversing with his personality,” she explained. “That’s not actually the case.” According to Elder, a person is far more than the sum of their interactions, their stories, or the legal advice they can dispense from beyond the grave. A person is a real human being, and they can’t simply be packaged and sold as a toy.
To the creators’ credit, neither Dadbot nor HereAfter AI is intended to do that.
“It doesn’t lessen how much I miss him,” Vlahos said. The point is often lost when the subject is sensationalised and reduced to “This is the AI version of your dead relative!” For him, it’s more about keeping his memories of his father alive over time.
“That’s the more sensationalist, sci-fi way of seeing it,” he remarked. “I believe a more meaningful way to see it is as part of the continuum of memory technologies that includes writing, photography, and audio recordings. It just makes better use of AI to store and communicate memories.”
Anyone who has lost a loved one can attest that anguish, grief, and mourning last a lifetime. There are hard times and easier ones. There are moments when you would give anything to hear your grandmother’s voice again, to get a silly text from your husband, or to joke around with your best friend the way you used to. And for a great many people, they can’t. The dead are dead. The fact that the living must deal with that doesn’t make it any easier.
Whether it’s an AI chatbot of your loved one or a simulated podcast with Steve Jobs, the trend of merging technology with mourning strikes a deeply emotional chord. When we’re at our most lost and all we want is a kind word, some sound advice, or a good laugh from someone we love, we’ll do anything to get them back—even if it means resurrecting the dead.