“My mom didn’t seem very natural, but I still heard the words that she often said: ‘Have you eaten yet?’” Sun recalls of the first interaction. Because generative AI was a nascent technology at the time, the replica of his mom could say only a few pre-written lines. But Sun says that’s what she was like anyway. “She would always repeat those questions over and over again, and it made me very emotional when I heard it,” he says.
There are plenty of people like Sun who want to use AI to preserve, animate, and interact with lost loved ones as they mourn and try to heal. The market is particularly strong in China, where at least half a dozen companies are now offering such technologies and thousands of people have already paid for them. In fact, the avatars are the newest manifestation of a cultural tradition: Chinese people have always taken solace from confiding in the dead.
The technology isn’t perfect—avatars can still be stiff and robotic—but it’s maturing, and more tools are becoming available through more companies. In turn, the price of “resurrecting” someone—also called creating “digital immortality” in the Chinese industry—has dropped significantly. Now this technology is becoming accessible to the general public.
Some people question whether interacting with AI replicas of the dead is actually a healthy way to process grief, and it’s not entirely clear what the legal and ethical implications of this technology may be. For now, the idea still makes a lot of people uncomfortable. But as Silicon Intelligence’s other cofounder, CEO Sima Huapeng, says, “Even if only 1% of Chinese people can accept [AI cloning of the dead], that’s still a huge market.”
AI resurrection
Avatars of the dead are essentially deepfakes: the technologies used to replicate a living person and a dead person aren’t inherently different. Diffusion models generate a realistic avatar that can move and speak. Large language models can be attached to generate conversations. The more data these models ingest about someone’s life—including photos, videos, audio recordings, and texts—the more closely the result will mimic that person, whether dead or alive.
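The conversational half of that pipeline can be sketched minimally. The snippet below is a hypothetical illustration, not any company’s actual system: the function name and fields are assumptions. It shows one plausible way personal data, such as characteristic phrases and life details, might be assembled into a system prompt that asks a large language model to speak in a loved one’s voice.

```python
# Hedged sketch: turning personal data into a persona prompt for a chat
# model. The function name and schema are illustrative assumptions,
# not a real vendor API.

def build_persona_prompt(name, relationship, sample_phrases, facts):
    """Assemble a system prompt asking an LLM to speak as `name`."""
    lines = [
        f"You are {name}, the user's {relationship}.",
        "Stay in character. Speak warmly and informally.",
        "Characteristic phrases you often use:",
    ]
    lines += [f"- {phrase}" for phrase in sample_phrases]
    lines.append("Facts about your life:")
    lines += [f"- {fact}" for fact in facts]
    lines.append("If asked something you would not know, deflect gently.")
    return "\n".join(lines)

prompt = build_persona_prompt(
    name="Mom",
    relationship="mother",
    sample_phrases=["Have you eaten yet?", "Dress warmly, it's cold out."],
    facts=["Lived in Nanjing most of her life.", "Loved growing tomatoes."],
)
# `prompt` would be sent as the system message of a chat-completion call;
# photos, video, and audio are handled separately by the avatar models.
```

In a production system, the "facts" side would be distilled from the photos, recordings, and texts the article mentions, which is why the fidelity of the replica scales with how much data a family can supply.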
China has proved to be a ripe market for all kinds of digital doubles. For example, the country has a robust e-commerce sector, and consumer brands hire many livestreamers to sell products. Initially, these were real people, but as MIT Technology Review reported last fall, many brands are switching to AI-cloned influencers that can stream 24/7.
In just the past three years, the Chinese sector developing AI avatars has matured rapidly, says Shen Yang, a professor studying AI and media at Tsinghua University in Beijing, and replicas have improved from minutes-long rendered videos to 3D “live” avatars that can interact with people.
This year, Sima says, has seen a tipping point, with AI cloning becoming affordable for most individuals. “Last year, it cost about $2,000 to $3,000, but it now only costs a few hundred dollars,” he says. That’s thanks to a price war between Chinese AI companies, which are fighting to meet the thriving demand for digital avatars in other sectors like streaming.
In fact, demand for applications that re-create the dead has also boosted the capabilities of tools that digitally replicate the living.
Silicon Intelligence offers both services. When Sun and Sima launched the company, they were focused on using text-to-speech technologies to create audio and then using those AI-generated voices in applications such as robocalls.
But after the company replicated Sun’s mother, it pivoted to generating realistic avatars. That decision turned the company into one of the leading Chinese players creating AI-powered influencers.
Its technology has generated avatars for hundreds of thousands of TikTok-like videos and streaming channels, but Sima says more recently it’s seen around 1,000 clients use it to replicate someone who’s passed away. “We started our work on ‘resurrection’ in 2019 and 2020,” he says, but at first people were slow to accept it: “No one wanted to be the first adopters.”
The quality of the avatars has improved, he says, which has boosted adoption. When the avatar looks increasingly lifelike and gives fewer out-of-character answers, it’s easier for users to treat it as their deceased family member. Plus, the idea is getting popularized through more depictions on Chinese TV.
Now Silicon Intelligence offers the replication service for a price between several hundred and several thousand dollars. The most basic product comes as an interactive avatar in an app, and the options at the upper end of the range often involve more customization and better hardware components, such as a tablet or a display screen. At least a handful of other Chinese companies are working on the same technology.
A modern twist on tradition
The business in these deepfakes builds on China’s long cultural history of communicating with the dead.
In Chinese homes, it’s common to put up a portrait of a deceased relative for a few years after the death. Zhang Zewei, founder of a Shanghai-based company called Super Brain, says he and his team wanted to revamp that tradition with an “AI photo frame.” They create avatars of deceased loved ones that are pre-loaded onto an Android tablet, which looks like a photo frame when standing up. Clients can choose a moving image that speaks words drawn from an offline database or from an LLM.
“In its essence, it’s not much different from a traditional portrait, except that it’s interactive,” Zhang says.
Zhang says the company has made digital replicas for over 1,000 clients since March 2023 and charges $700 to $1,400, depending on the service purchased. The company plans to release an app-only product soon, so that users can access the avatars on their phones, which could further reduce the cost to around $140.
The purpose of his products, Zhang says, is therapeutic. “When you really miss someone or need consolation during certain holidays, you can talk to the artificial living and heal your inner wounds,” he says.
And even if that conversation is largely one-sided, that’s in keeping with a strong cultural tradition. Every April during the Qingming festival, Chinese people sweep the tombs of their ancestors, burn joss sticks and fake paper money, and tell them what has happened in the past year. Of course, those conversations have always been one-way.
But that’s not the case for all Super Brain services. The company also offers deepfaked video calls in which a company employee or a contract therapist pretends to be the relative who passed away. Using DeepFace, an open-source tool that analyzes facial features, the deceased person’s face is reconstructed in 3D and swapped in for the live person’s face with a real-time filter.
At the other end of the call is usually an elderly family member who may not know that the relative has died—and whose family has arranged the conversation as a ruse.
Jonathan Yang, a Nanjing resident who works in the tech industry, paid for this service in September 2023. His uncle died in a construction accident, but the family hesitated to tell Yang’s grandmother, who is 93 and in poor health. They worried that she wouldn’t survive the devastating news.
So Yang paid $1,350 to commission three deepfaked calls of his dead uncle. He gave Super Brain a handful of photos and videos of his uncle to train the model. Then, on three Chinese holidays, a Super Brain employee video-called Yang’s grandmother and told her, as his uncle, that he was busy working in a faraway city and wouldn’t be able to come back home, even during the Chinese New Year.
“The effect has met my expectations. My grandma didn’t suspect anything,” Yang says. His family did have mixed opinions about the idea, because some relatives thought maybe she would have wanted to see her son’s body before it was cremated. Still, the whole family got on board in the end, believing the ruse would be best for her health. After all, it’s pretty common for Chinese families to tell “necessary” lies to avoid overwhelming seniors, as depicted in the movie The Farewell.
To Yang, a close follower of AI industry trends, creating replicas of the dead is one of the best applications of the technology. “It best represents the warmth [of AI],” he says. His grandmother’s health has improved, and there may come a day when they finally tell her the truth. By that time, Yang says, he may purchase a digital avatar of his uncle for his grandma to talk to whenever she misses him.
Is AI really good for grief?
Even as AI cloning technology improves, there are some significant barriers preventing more people from using it to speak with their dead relatives in China.
On the tech side, there are limitations to what AI models can generate. Most LLMs can handle dominant languages like Mandarin and Cantonese, but they aren’t able to replicate the many niche dialects in China. It’s also challenging—and therefore costly—to replicate body movements and complex facial expressions in 3D models.
Then there’s the issue of training data. Unlike cloning someone who’s still alive, which often involves asking the person to record body movements or say certain things, posthumous AI replications must rely on whatever videos or photos are already available. And many clients don’t have high-quality data, or enough of it, for the end result to be satisfactory.
Complicating these technical challenges are myriad ethical questions. Notably, how can someone who is already dead consent to being digitally replicated? For now, companies like Super Brain and Silicon Intelligence rely on the permission of direct family members. But what if family members disagree? And if a digital avatar generates inappropriate answers, who is responsible?
Similar technology caused controversy earlier this year. A company in Ningbo reportedly used AI tools to create videos of deceased celebrities and posted them on social media to speak to their fans. The videos were generated using public data, but without seeking any approval or permission. The result was intense criticism from the celebrities’ families and fans, and the videos were eventually taken down.
“It’s a new domain that only came about after the popularization of AI: the rights to digital eternity,” says Shen, the Tsinghua professor, who also runs a lab that creates digital replicas of people who have passed away. He believes it should be prohibited to use deepfake technology to replicate living people without their permission. For people who have passed away, all of their immediate living family members must agree beforehand, he says.
There could be negative effects on clients’ mental health, too. While some people, like Sun, find their conversations with avatars to be therapeutic, not everyone thinks it’s a healthy way to grieve. “The controversy lies in the fact that if we replicate our family members because we miss them, we may constantly stay in the state of mourning and can’t withdraw from it to accept that they have truly passed away,” says Shen. A widowed person who’s in constant conversation with the digital version of their partner might be held back from seeking a new relationship, for instance.
“When someone passes away, should we replace our real emotions with fictional ones and linger in that emotional state?” Shen asks. Psychologists and philosophers who talked to MIT Technology Review about the impact of grief tech have warned about the danger of doing so.
Sun Kai, at least, has found the digital avatar of his mom to be a comfort. She’s like a 24/7 confidante on his phone. Even though it’s possible to remake his mother’s avatar with the latest technology, he hasn’t yet done that. “I’m so used to what she looks like and sounds like now,” he says. As years have gone by, the boundary between her avatar and his memory of her has begun to blur. “Sometimes I couldn’t even tell which one is the real her,” he says.
And Sun is still okay with doing most of the talking. “When I’m confiding in her, I’m merely letting off steam. Sometimes you already know the answer to your question, but you still need to say it out loud,” he says. “My conversations with my mom have always been like this throughout the years.”
But now, unlike before, he gets to talk to her whenever he wants to.