Using AI to Bring the Deceased Back to Life

5 Jun 2024

Imagine being able to converse with Socrates, Leonardo da Vinci, Galileo, or Einstein. Or what if we could communicate with former U.S. Presidents George Washington, Abraham Lincoln, or Franklin Roosevelt? Wouldn’t it be incredible to chat with Mary, Queen of Scots, King Henry VIII, or Queen Elizabeth II? Or what if we could simply speak again with our own loved ones?

The idea of using AI to replicate the voice and personality of a deceased person is not new. Some mourners have already begun using it to cope with their grief.

As a recent CNN article notes, “People have wanted to reconnect with deceased loved ones for centuries, whether they’ve visited mediums and spiritualists or leaned on services that preserve their memory. But what’s new now is that AI can make those loved ones say or do things they never said or did in life, raising both ethical concerns and questions around whether this helps or hinders the grieving process.”

Some experts say this type of AI technology “raises important questions about the rights, dignities and consenting power of people who are no longer alive” and “poses ethical concerns about whether a program that caters to the bereaved should be advertising other products on its platform, for example.”

AI ethicists argue that regulation must be put in place: businesses that help people seemingly communicate with their deceased loved ones should operate according to clear guidelines so as not to cause harm.

A study by the University of Cambridge warns that unscrupulous companies and reckless business practices could cause lasting psychological damage and disrespect the rights of the deceased.

Nonetheless, there are many people who seek to use AI to interact with lost loved ones as they mourn and try to heal. The market is particularly strong in China, where several companies now offer such technologies, and thousands of customers have already paid for them.

Many psychologists would likely agree that maintaining a deepfake AI version of a loved one is unhealthy in the long run, though in the short term it may assist the grieving process if used in moderation.

Given that AI and deepfakes are relatively new, it is difficult to gauge where this technology will go. Eventually, a realistic hologram of a loved one could become a possibility, and the ethics of recreating a 3D version of a deceased person will have to be debated and discussed.

Recreating the voice or image of a deceased person could be endearing, amusing, or healing, but it should not be used for exploitative purposes such as advertising. Companies and customers will need to ensure such misuse does not take place, and regulations should safeguard both the safe use of AI deepfakes and the dignity of the people being recreated.

Perhaps deepfake AI technology should focus instead on reviving historical figures with whom we can communicate in an educational way and from whom we can learn history, knowledge, and facts. Schoolchildren around the world could immediately enrich their education if they could talk with their science hero, a famous astronomer or philosopher, or some other historical figure they are studying.

The possibilities are endless – and so are the risks. If we bring the deceased back to life through AI, let’s be sure to do it carefully and with respect.