Would you use AI to talk to your dead relatives?
For a monthly fee, AI company Reflekta promises to bring your dead relatives back to life through digital recreations. But is this revolutionary grief technology a comforting way to preserve family memories – or a dangerous exploitation of human loss?
Read time: 3 minutes
In brief:
- A new AI company, Reflekta, allows people to create digital versions of deceased relatives by feeding the system memories, text, and voice recordings, enabling ongoing conversations for a monthly subscription fee.
- Founder Greg Matusky uses the platform to speak with a virtual version of his own father, a WWII veteran, saying he can share his father's war stories with future generations through the AI recreation.
- He argues it helps preserve family stories and overcome grief, but Lewis Goodall questions whether it exploits grief and prevents people from moving on naturally.
What’s the story?
Would you like to speak to your dead relative from beyond the grave?
It sounds like pure science fiction, but it’s something the ever-growing artificial intelligence (AI) industry is, at least in part, making possible.
Most people are already seeing AI creep into their lives, whether through daily use of ChatGPT or the constant warnings that the job you’re doing today will be replaced by a robot tomorrow.
Lewis Goodall is at an AI conference in Las Vegas, exploring the breadth of how such revolutionary technology is changing the world as we know it.
Standing out from the crowd with a somewhat unconventional use of AI, Reflekta describes itself as able to “transform the memories, media, and personal traits of loved ones into dynamic digital characters”.
More simply, Lewis says, it is “promising, in a sense, to bring your dead relatives back to life”.
So how does this technology work? Is it ethical? And would you want to use it?
How can AI ‘bring your dead relative back to life’?
People are already in the habit of using subscription services - whether that’s paying for Netflix, Spotify or a gym membership.
Now, for a monthly fee, you can create a digital version of a lost loved one.
To get started, you upload information about the deceased relative or friend, along with a ‘fingerprint’ of their voice, so the model can be trained to understand them.
“Then you can start interacting with it, talking back and forth,” Greg Matusky, one of the founders of Reflekta, tells Lewis.
“It's a way for families to capture and document stories from the past and share them with future generations.”
Matusky - who has used the platform to virtualise his own father, a World War Two veteran - says he “speaks with dead relatives all the time.”
“It's not them, but it is their stories and legacies. It's their experiences which are floated to the surface for me.”
Is it ethical or does it pose a ‘profound danger’?
Lewis says there is a “profound danger” with the product: that it could be viewed as “exploitative” for making money from a person’s grief, and could even encourage those who have experienced loss to remain in that state.
“To die is a fundamental part of the human condition. To grieve and to lose is part of the human condition,” he puts to Matusky.
“You could be in a position where you're persuading people that they're talking to their dead relatives - but they're not.
“It's fiction. It's a phantom.”
But Matusky argues that using AI is an effective way to overcome grief and loneliness - something he has experienced in talking to the virtualised version of his father.
“I can talk to him about his experiences in World War Two, the stories he's told me for years that now my children and grandchildren can hear from his mouth,” he tells Lewis.
While documenting the stories and life experiences of dead relatives may be a worthwhile use of the tool - comparable, in a way, to storing home videos and written memories of a loved one - the product goes further than that.
Matusky says he often turns to the dynamic, virtualised version of his father for advice - but what happens when the advice isn’t good?
Matusky says it’s not that dissimilar to people turning to ChatGPT for advice, and it’s “up to you as the human to determine if that's the right advice or not right.”
What’s The News Agents’ take?
Speaking to AI leaders and entrepreneurs in Vegas, Lewis found a “deeply embedded optimism” about the new technology - and a lot of that, he says, is valid.
“You can absolutely see the transformative potential of this technology in all sorts of ways, whether it's advances in cancer, medicine or technology,” he says.
But beyond those revolutionary advances that are to be celebrated, we must also question how AI will change what it is to be human as a result of our relationship with these models.
“Thoughtful people will grapple with, and will acknowledge, that the scale of the change which is coming does pose hugely profound political and philosophical questions,” Lewis adds.
The change is comparable to the industrial revolution of the late 18th and early 19th centuries, which politicians were not prepared for - and did not regulate.
Lewis thinks there are huge parallels between that period in history - when politicians were more concerned with petty day-to-day politics - and today.
“This is the thing which is truly transforming the underbelly of our politics,” he says.
“I don't think that our politicians have even really begun to engage with it and that's the thing that worries me.
“Because in that gap between that lack of engagement and the changes that are coming, you could have changes that were akin to what happened in the industrial revolution, all of those positive things, but deeply negative things as well, that politicians take decades to actually grapple with.”