Latest news with #HereAfterAI


Irish Times
a day ago
- Irish Times
The victim delivered a searing impact statement. Just one thing felt off
It was a routine enough tableau: a judge, sitting at the bench, watching the victim of a violent attack address a courtroom via video as they forgave their attacker and asked for leniency. The judge held the fate of the perpetrator, already found guilty and awaiting sentencing, in their hands. As the video statement ended, the judge commented that he 'loved' it, that he 'heard the forgiveness'. It was a moving moment. The only issue was that the victim had been dead for three and a half years.

The video was an AI-generated victim impact statement from a murdered man, Christopher Pelkey. This use of synthetically generated video and audio of a murder victim in an Arizona court last month felt like another 'puffer jacket pope' moment. The viral AI-generated image of Pope Francis in a white Balenciaga-style down jacket fooled millions and catapulted image generation tools into the cultural mainstream. Now, along with popes in puffer jackets, we have another watershed moment in 'ghostbots'.

Unlike the people it depicts, the 'digital afterlife industry', as it is more formally known, is alive and kicking. Companies with names such as HereAfter AI and You Only Virtual allow users to create digital archives of themselves so that the people they leave behind can interact with 'them' once they are gone. These apps market themselves to the living, or bypass the person being digitally cloned altogether: the bereaved are now offered the promise of 'regenerating' their deceased relatives and friends. People out there are, at this moment, interacting with virtual renderings of their mothers and spouses on apps with names such as Re:memory and Replika. They don't need the participation or consent of the deceased.

The video used to reanimate Christopher Pelkey was created using widely available tools and a few simple reference points – a YouTube interview and his obituary photo, according to The New York Times. This gives the generated footage the feel of a decent cheapfake rather than a sophisticated deepfake. Watching it, you find yourself in the so-called 'uncanny valley', that feeling you get when interacting with a bot, when your brain knows something is not quite right. This person is too serene, too poreless, too ethereal as they stare into your eyes and talk about their own death.

Pelkey's sister wrote the script, imagining the message she believed her brother would have wanted to deliver. This includes the synthetic version of Pelkey addressing 'his' killer: 'It is a shame we encountered each other that day in those circumstances. In another life, we probably could have been friends. I believe in forgiveness and in God, who forgives. I always have and I still do.'

I do not doubt that the Pelkey family had good intentions. They had a point they wanted to make, saw a tool to let them do it, and were permitted to do so by the court. They also likely believe they know what their lost loved one would have wanted. But should anyone really have the power to put words in the mouth and voice of the deceased? We often fret about AI image and video generation tools being used to mislead us, to trick us as voters or targets of scams. But deception and manipulation are not the same thing. In that Arizona courtroom there was no intention to deceive: no one thought this was the actual murder victim speaking. Yet that does not diminish its emotional impact.
If we can have the murdered plead for peace, does that mean we could also have AI ghosts asking for vengeance, retribution or war? Political actors have embraced generative AI, with its ability to cheaply make persuasive, memorable content. Despite fears it would be used for disinformation, most public use cases are non-deceptive 'soft fakes'. An attack ad against Donald Trump, for example, featured audio of a synthetic version of his voice saying out loud something he had only written in a tweet.

However, the real political AI innovation is happening in India, where last year candidates did things such as create videos of themselves speaking in languages they do not know, and even generate digital 'endorsements' from long-dead figures. One candidate had the voice of his father, who died from Covid in 2020, tell voters: 'Though I died, my soul is still with all of you ... I can assure you that my son, Vijay, will work for the betterment of Kanniyakumari.' Vijay won.

People have long tried to speak for the dead, often to further their own ends. AI turbocharges this into a kind of morbid ventriloquism, rendered in high definition and delivered with reverential sincerity. But the danger isn't that we mistake these digital ghosts for the real thing; it's that we know what they are, and still acquiesce to being emotionally manipulated by them. Maybe now we all need to look into whether we need to write a will with a new kind of DNR: Do Not Regenerate.


Time of India
11-06-2025
- Business
- Time of India
AI Resurrections and Their Place in India's Tech Landscape: Could India Lead the AI Afterlife Market? Digital Resurrection Meets Desi Sentiment.
When the world witnessed Zhang Yiyi, a Chinese father, share a video of his 'conversation with his departed son made possible by an AI-powered avatar,' people on the internet were left in awe and discomfort. The ability to 'resurrect' the departed sounds like a concept straight out of science fiction; with AI, however, it has entered mainstream tech conversations.

The 'grief tech' sector has piqued India's interest due to its potential benefits, especially given the country's profound connections to legacy and ancestry. AI resurrection, also known as grief tech, is a blend of generative AI, deepfake technology, and emotional design that replicates the face, voice, mannerisms, and appearance of the deceased. Startups like South Korea's DeepBrain AI and the U.S.-based HereAfter AI are already creating grief-tech services, such as interactive memorial avatars that simulate conversations with the deceased. These services blend audio cloning, facial synthesis, and memory curation. Closer to home, startups like Deepsync and Resemble AI (which has Indian engineering roots) are laying the technical foundation for voice cloning and emotion AI in India. While India doesn't yet have a fully dedicated grief-tech startup, the building blocks are firmly in place.

India offers a unique cultural context for the digital afterlife, such as its strong ancestral reverence. Indian society places high emotional and ritualistic value on honoring the departed, and services that could preserve or simulate the departed's voice or likeness could appeal to this sentiment. Apart from satisfying sentimental aspects, India has massive digital archival potential: thanks to smartphone penetration, the average urban Indian now leaves behind gigabytes of videos, photos, and voice messages, perfect for AI to digest and generate from. Indian start-ups, supported by a large AI talent pool and lower development costs, can build tools for this market a lot more cost-effectively than most of their global counterparts (Economic Survey 2023-24, Nasscom-Deloitte AI talent report). Startups working in synthetic media, such as Deepsync (voice cloning for podcasting), and academic labs at IIIT-Hyderabad and IIT-Madras have the technical backbone to power such services. Generative AI labs like Sarvam AI are also investing in emotion-sensitive models that could power ethical memory bots. These players might not brand themselves as 'grief tech,' but they're producing the very engines on which India's version of the digital afterlife could run.

Matters related to death are difficult to talk about and are treated with immense sensitivity, making controversy around digital resurrection inevitable. The concept continues to evoke mixed opinions, with concerns over privacy and ethics. Critics warn against psychological dependency, exploitation of grief, and issues of consent, especially when the departed never agreed to being 'digitally revived.' The emergence of this concept challenges conventional understandings of mourning and memory in an increasingly digital world. It has also raised awareness about the significance of digital wills among the living. Looking at it through a culture-centric lens, attitudes may vary sharply across regions in India: while some embrace AI for spiritual continuity, others could see it as an interference with 'karmic cycles' or 'dharma'.
The legal framework in India remains underdeveloped regarding issues such as the data rights of the deceased, regulations surrounding deepfakes, and the management of digital estates. As India continues to define its position in the global AI landscape, grief tech could become a niche where Indian startups innovate with both cultural and emotional intelligence, blending memory, technology, and emotion into a service economy tailored to the afterlife.


New York Times
30-03-2025
- Entertainment
- New York Times
Optimization Culture Comes for Grief
An older Korean man named Mr. Lee, dressed in a blazer and slacks, clutches the arms of his chair and leans toward his wife. 'Sweetheart, it's me,' he says. 'It's been a long time.' 'I never expected this would happen to me,' she replies through tears. 'I'm so happy right now.' Mr. Lee is dead. His widow is speaking to an A.I.-powered likeness of him projected onto a wall. 'Please, never forget that I'm always with you,' the projection says. 'Stay healthy until we meet again.'

This conversation was filmed as part of a promotional campaign for Re;memory, an artificial intelligence tool created by the Korean start-up DeepBrain AI, which offers professional-grade studio and green-screen recording (as well as relatively inexpensive ways of self-recording) to create lifelike representations of the dead. It's part of a growing market of A.I. products that promise users an experience that closely approximates the impossible: communicating and even 'reuniting' with the deceased. Some of the representations, like those offered by HereAfter AI and StoryFile (which also frames its services as being of historical value), can be programmed with the person's memories and voice to produce realistic holograms or chatbots with which family members or others can converse.

The desire to bridge life and death is innately human. For millenniums, religion and mysticism have offered pathways for this, blurring the lines of logic in favor of the belief in eternal life.