EXPLAINER

‘Never say goodbye’: Can AI bring the dead back to life?

Artificial intelligence is increasingly being used to create digital ‘resurrections’ of the dead, amid debate over whether the technology helps or harms its users.

A visitor interacts with the ‘digital twin’ of the late French poet Arthur Rimbaud, an AI-generated figure that responds to visitors, created by the Alsatian start-up Jumbo Mana and on display at the poet’s house in Charleville-Mezieres, on May 13, 2024 [Francois Nascimbeni/AFP]

Published on 9 Aug 2024

In a world where artificial intelligence can resurrect the dead, grief takes on a new dimension.

From Canadian singer Drake’s use of AI-generated Tupac Shakur vocals to Indian politicians addressing crowds years after their passing, technology is blurring the lines between life and death.

But beyond their uncanny pull in entertainment and politics, AI “zombies” might soon become a reality for people reeling from the loss of loved ones, through a series of pathbreaking, but potentially controversial, initiatives.

So how do AI “resurrections” work, and are they as dystopian as we might imagine?

What are AI ‘resurrections’ of people?

Over the past few years, AI projects around the world have created digital “resurrections” of individuals who have passed away, allowing friends and relatives to converse with them.

Typically, users provide the AI tool with information about the deceased. This could include text messages and emails or simply be answers to personality-based questions.

The AI tool then processes that data to talk to the user as if it were the deceased. One of the most popular projects in this space is Replika – a chatbot that can mimic people’s texting styles.
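
The details differ from product to product, but the basic recipe described above – collect samples of how a person wrote, then have a large language model answer in that voice – can be sketched in a few lines of code. The Python sketch below is an illustration of that idea only, using a general-purpose LLM API; it is not the implementation of Replika or any other service named in this article, and the model name and sample messages are placeholders.

# A minimal sketch of a "grief bot" built on a general-purpose LLM API.
# Illustration only: not how Replika, Project December or Eternos actually work.
from openai import OpenAI

client = OpenAI()  # assumes an API key is already configured in the environment

# Text the user supplies about the deceased: old messages, emails,
# or answers to personality-based questions.
sample_messages = [
    "Morning! Don't forget your umbrella, it's meant to rain today.",  # placeholder
    "Proud of you, kiddo. Call me when you land.",                     # placeholder
]

# Build a persona prompt from the person's own words.
persona_prompt = (
    "You are imitating the texting style of a specific person. "
    "Here are examples of how they wrote:\n"
    + "\n".join(f"- {m}" for m in sample_messages)
    + "\nReply to the user in that voice, briefly and warmly."
)

reply = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model name
    messages=[
        {"role": "system", "content": persona_prompt},
        {"role": "user", "content": "I miss you. How are you?"},
    ],
)

print(reply.choices[0].message.content)

Commercial products layer far more on top of this – voice cloning, video avatars, memory of past conversations – but they build on the same pattern of conditioning a model on a person’s own words.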

Other companies, however, now also allow you to see a video of the dead person as you talk to them.

For example, Los Angeles-based StoryFile uses AI to allow people to talk at their own funerals. Before passing, a person can record a video sharing their life story and thoughts. During the funeral, attendees can ask questions and AI technology will select relevant responses from the prerecorded video.

In June, US-based Eternos also made headlines for creating an AI-powered digital afterlife of a person. Launched earlier this year, the project allowed 83-year-old Michael Bommer to leave behind a digital version of himself that his family could continue to interact with.

Do these projects help people?

When a South Korean mother reunited with an AI recreation of her dead daughter in virtual reality, a video of the emotional encounter in 2020 sparked an intense debate online about whether such technology helps or hurts its users.

Developers of such projects point to users’ agency, and say the technology addresses a deeper suffering.

Jason Rohrer, founder of Project December, which also uses AI to simulate conversations with the dead, said that most users are typically going through an “unusual level of trauma and grief” and see the tool as a way to help cope.

“A lot of these people who want to use Project December in this way are willing to try anything because their grief is so insurmountable and so painful to them.”

The project allows users to chat with AI recreations of public figures, as well as with people they knew personally.

People who choose to use the service to simulate conversation with the dead often discover that it helps them find closure, Rohrer said. The bots allow them to express words left unsaid to loved ones who died unexpectedly, he added.

Eternos’s founder, Robert LoCasio, said he built the company to capture people’s life stories and allow their loved ones to move forward.

Bommer, his former colleague who passed away in June, wanted to leave behind a digital legacy exclusively for his family, said LoCasio.

“I spoke with [Bommer] just days before he passed away and he said, just remember, this was for me. I don’t know if they’d use this in the future, but this was important to me,” said LoCasio.

What are the pitfalls of this technology?

Some experts and observers are more wary of AI resurrections, questioning whether deeply grieving people can really make the informed decision to use it, and warning about its adverse psychological effects.

“The biggest concern that I have as a clinician is that mourning is actually very important. It’s an important part of development that we are able to acknowledge the missing of another person,” said Alessandra Lemma, consultant at the Anna Freud National Centre for Children and Families.

Prolonged use could keep people from coming to terms with the absence of the other person, leaving them in a state of “limbo”, Lemma warned.

Indeed, one AI service has marketed a perpetual connection with the deceased person as a key feature.

“Welcome to YOV (You, Only Virtual), the AI startup pioneering advanced digital communications so that we Never Have to Say Goodbye to those we love,” read the company’s website, before it was recently updated.

Rohrer said that his grief bot has an “in-built” limiting factor: users pay $10 for a limited conversation.

The fee buys time on a supercomputer, and because each response varies in computational cost, $10 does not guarantee a fixed number of responses, but it typically allows for one to two hours of conversation. As the time is about to run out, users are sent a notification and can say their final goodbyes.

Several other AI-generated conversational services also charge a fee for use.

Lemma, who has researched the psychological impact of grief bots, says that while she worries about the prospect of such bots being used outside a therapeutic context, they could be used safely as an adjunct to therapy with a trained professional.

Studies around the world are also observing the potential for AI to deliver mental health counselling, particularly through individualised conversational tools.

Are such tools unnatural?

These services may appear to be straight out of a Black Mirror episode.

But supporters of this technology argue that the digital age is simply ushering in new ways of preserving life stories, and potentially filling a void left by the erosion of traditional family storytelling practices.

“In the olden days, if a parent knew they were dying, they would leave boxes full of things that they might want to pass on to a child or a book,” said Lemma. “So, this might be the 21st-century version of that, which is then passed on and is created by the parents in anticipation of their passing.”

LoCasio at Eternos agrees.

“The ability for a human to tell the stories of their life, and pass those along to their friends and family, is actually the most natural thing,” he said.

Are AI resurrection services safe and private?

Experts and studies alike have expressed concerns that such services may fail to keep data private.

Personal information or data such as text messages shared with these services could potentially be accessed by third parties.

Even if a firm says it will keep data private when someone first signs up, routine revisions to terms and conditions, as well as possible changes in company ownership, mean that privacy cannot be guaranteed, cautioned Renee Richardson Gosline, senior lecturer at the MIT Sloan School of Management.

Both Rohrer and LoCasio insisted that privacy was at the heart of their projects. Rohrer can only view conversations when users file a customer support request, while LoCasio’s Eternos limits access to the digital legacy to authorised relatives.

However, both agreed that such concerns could become real if tech giants or other for-profit companies enter the space.

One big worry is that companies may use AI resurrections to customise how they market themselves to users.

An advertisement in the voice of a loved one, a nudge for a product in their text.

“When you’re doing that with people who are vulnerable, what you’ve created is a pseudo-endorsement based on someone who never agreed to do such a thing. So it really is a problem with regard to agency and asymmetry of power,” said Gosline.

Are there any other concerns over AI chatbots?

That these tools fundamentally cater to a market of people dealing with grief in itself makes them risky, suggested Gosline – especially when Big Tech companies enter the game.

“In a culture of tech companies which is often described as ‘move fast and break things’, we ought to be concerned because what’s typically broken first are the things of the vulnerable people,” said Gosline. “And I’m hard-pressed to think of people who are more vulnerable than those who are grieving.”

Experts have raised concerns about the ethics of creating a digital resurrection of the dead, particularly when the deceased never consented and it is users who feed the AI their data.

The environmental impact of AI-powered tools and chatbots is also a growing concern, particularly when involving large language models (LLMs) – systems trained to understand and generate human-like text, which power applications like chatbots.

These systems need giant data centres that emit high levels of carbon and use large volumes of water for cooling, in addition to creating e-waste due to frequent hardware upgrades.

A report in early July from Google showed that the company was far behind its ambitious net-zero goals, owing to the demand AI was putting on its data centres.

Gosline said that she understands that there is no perfect programme and that many users of such AI chatbots would do anything to reconnect with a deceased loved one. But it’s on leaders and scientists to be more thoughtful about the kind of world they want to create, she said.

Fundamentally, she said, they need to ask themselves one question: “Do we need this?”

Source: Al Jazeera