Wait, what? AI tries to revive the dead and provide humans with closure

Loss brings people together like nothing else can. Almost everyone has experienced the death of a loved one, leaving a void that neither they nor anyone else can ever fully fill. Yet numerous businesses are attempting to do just that by harnessing the possibilities of artificial intelligence (AI). According to a report from AFP, these companies use AI to let you communicate with your deceased loved ones. The technology can be a blessing for someone going through loss-related grief, but it also creates moral dilemmas.

Nobody and nothing can replace a lost loved one. But new technology aims to capture something of their spirit, which may offer a measure of comfort. Numerous startups are providing such services. One of them, DeepBrain AI, offers a program named “Rememory.” According to Joseph Murphy, the company’s head of development, it uses hours of video to create a digital clone of the deceased. “We don’t create new content,” says Murphy. The company says it only tries to replicate what the person actually said while alive.

Replika is a platform that offers highly developed, customized conversational bots

The company’s “Rememory” program adheres to a policy of not generating new content: it will not produce sentences or statements that the deceased never actually said or wrote during their lifetime. “I’ll call it a niche part of our business. It’s not a growth area for us,” Murphy told AFP. Another company, StoryFile, follows the same idea. The company’s head, Stephen Smith, said, “Our approach is to capture the wonder of an individual, then use the AI tools.” “It’s a very fine ethical area that we’re taking with great care,” he added. StoryFile claims several thousand users already use its Life service.

‘Replika’ is another comparable service, designed by the Russian engineer Eugenia Kyuda. In 2015, Kyuda lost her closest friend, Roman, in a car accident. She created a chatbot named “Roman” as a way to process her grief, training it on thousands of text messages her late friend had written to loved ones. Two years later, Kyuda launched Replika, a platform that offers highly developed, customized conversational bots. Unlike its predecessor Roman, Replika “is not a platform made to recreate a lost loved one,” a spokeswoman said.

Other businesses are creating virtual clones. Somnium Space is one such company; it intends to make virtual clones of living individuals that would continue to exist in a virtual universe after the original person dies. In a YouTube video introducing his product, Live Forever, CEO Artur Sychov acknowledged that the idea is “not for everyone.” He noted that there are personal decisions to be made: “Do I want to meet my grandfather who’s in AI? I don’t know. But those who want that will be able to.”

Thanks to generative technology, these AI avatars can say things the person never said in real life

Thanks to generative technology, these AI avatars can say things the person never said in real life. According to Joseph Murphy of DeepBrain AI, “These are philosophical challenges, not technical challenges.” “I would say that is a line right now that we do not plan on crossing, but who knows what the future holds?” he added. An AI clone of a person can be useful for obtaining closure, particularly in “complicated” situations, according to Candi Cann, a Baylor University professor researching the subject in South Korea. “I think it can be helpful to interact with an AI version of a person in order to get closure — particularly in situations where grief was complicated by abuse or trauma,” said Cann.

Mari Dias, a professor of medical psychology at Johnson & Wales University, asked her bereaved patients how they felt about communicating electronically with their deceased loved ones. She says the most typical response is: “I don’t trust AI. I’m worried it’ll say something I won’t be able to accept.” Her patients voice a lack of confidence in the technology; they feel powerless to influence the AI avatar’s behavior and worry that it might say something they find offensive.