
Latest news with #ChristopherPelkey

The victim delivered a searing impact statement. Just one thing felt off

Irish Times

20 hours ago



It was a routine enough tableau: a judge, sitting at the bench, watching the victim of a violent attack address a courtroom via video, forgiving their attacker and asking for leniency. The judge held the fate of the perpetrator, already found guilty and awaiting sentencing, in his hands. As the video statement ended, the judge commented that he 'loved' it, that he 'heard the forgiveness'. It was a moving moment. The only issue was that the victim had been dead for three and a half years.

The video was an AI-generated victim impact statement from a murdered man, Christopher Pelkey. This use of synthetically generated video and audio of a murder victim in an Arizona court last month felt like another 'puffer jacket pope' moment. The viral AI-generated image of Pope Francis in a white Balenciaga-style down jacket fooled millions and catapulted image-generation tools into the cultural mainstream. Now, along with popes in puffer jackets, we have another watershed moment in 'ghostbots'.

Unlike the people it depicts, the 'digital afterlife industry', as it is more formally known, is alive and kicking. Companies with names such as HereAfter AI and You Only Virtual allow users to create digital archives of themselves so that the people they leave behind can interact with 'them' once they are gone. These apps market themselves to the living or bypass the person being digitally cloned altogether. The bereaved are now offered the promise of 'regenerating' their deceased relatives and friends. People out there are, at this moment, interacting with virtual renderings of their mothers and spouses on apps with names such as Re:memory and Replika. They don't need the participation or consent of the deceased.

The video used to reanimate Christopher Pelkey was created using widely available tools and a few simple reference points – a YouTube interview and his obituary photo, according to The New York Times.
This gives the generated footage the feel of a decent cheapfake rather than a sophisticated deepfake. Watching it, you find yourself in the so-called 'uncanny valley', that feeling you get when interacting with a bot, when your brain knows something is not quite right. This person is too serene, too poreless, too ethereal as they stare into your eyes and talk about their own death.

Pelkey's sister wrote the script, imagining the message she believed her brother would have wanted to deliver. This includes the synthetic version of Pelkey addressing 'his' killer: 'It is a shame we encountered each other that day in those circumstances. In another life, we probably could have been friends. I believe in forgiveness and in God, who forgives. I always have and I still do.'

I do not doubt that the Pelkey family had good intentions. They had a point they wanted to make, saw a tool that let them make it, and were permitted to do so by the court. They also likely believe they know what their lost loved one would have wanted. But should anyone really have the power to put words in the mouth and voice of the deceased?

We often fret about AI image and video generation tools being used to mislead us, to trick us as voters or targets of scams. But deception and manipulation are not the same thing. In that Arizona courtroom there was no intention to deceive: no one thought this was the actual murder victim speaking. Yet that does not diminish its emotional impact. If we can have the murdered plead for peace, does that mean we could also have AI ghosts asking for vengeance, retribution or war?

Political actors have embraced generative AI, with its ability to cheaply make persuasive, memorable content. Despite fears it would be used for disinformation, most public use cases are non-deceptive 'soft fakes'.
An attack ad against Donald Trump, for example, featured audio of a synthetic version of his voice saying out loud something he had only written in a tweet. However, the real political AI innovation is happening in India, where last year candidates did things such as create videos of themselves speaking in languages they do not know, and even generate digital 'endorsements' from long-dead figures. One candidate had the voice of his father, who died from Covid in 2020, tell voters: 'Though I died, my soul is still with all of you ... I can assure you that my son, Vijay, will work for the betterment of Kanniyakumari.' Vijay won.

People have long tried to speak for the dead, often to further their own ends. AI turbocharges this into a kind of morbid ventriloquism, rendered in high definition and delivered with reverential sincerity. But the danger isn't that we mistake these digital ghosts for the real thing; it's that we know what they are, and still acquiesce to being emotionally manipulated by them. Maybe now we all need to look into whether we need to write a will with a new kind of DNR: Do Not Regenerate.

Why AI ‘reanimations' of the dead may not be ethical

Fast Company

3 days ago



Christopher Pelkey was shot and killed in a road rage incident in 2021. On May 8, 2025, at the sentencing hearing for his killer, an AI video reconstruction of Pelkey delivered a victim impact statement. The trial judge reported being deeply moved by this performance and issued the maximum sentence for manslaughter.

As part of the ceremonies to mark Israel's 77th year of independence on April 30, 2025, officials had planned to host a concert featuring four iconic Israeli singers. All four had died years earlier. The plan was to conjure them using AI-generated sound and video. The dead performers were supposed to sing alongside Yardena Arazi, a famous and still very much alive artist. In the end Arazi pulled out, citing the political atmosphere, and the event didn't happen.

In April, the BBC created a deepfake version of the famous mystery writer Agatha Christie to teach a 'maestro course on writing.' Fake Agatha would instruct aspiring murder mystery authors and 'inspire' their 'writing journey.'

The use of artificial intelligence to 'reanimate' the dead for a variety of purposes is quickly gaining traction. Over the past few years, we've been studying the moral implications of AI at the Center for Applied Ethics at the University of Massachusetts, Boston, and we find these AI reanimations to be morally problematic.

Before we address the moral challenges the technology raises, it's important to distinguish AI reanimations, or deepfakes, from so-called griefbots. Griefbots are chatbots trained on large swaths of data the dead leave behind—social media posts, texts, emails, videos. These chatbots mimic how the departed used to communicate and are meant to make life easier for surviving relatives. The deepfakes we are discussing here have other aims: they are meant to promote legal, political, and educational causes.
Moral quandaries

The first moral quandary the technology raises has to do with consent: Would the deceased have agreed to do what their likeness is doing? Would the dead Israeli singers have wanted to sing at an independence ceremony organized by the nation's current government? Would Pelkey, the road-rage victim, be comfortable with the script his family wrote for his avatar to recite? What would Christie think about her AI double teaching that class?

The answers to these questions can only be deduced circumstantially, from examining the kinds of things the dead did and the views they expressed when alive. And one could ask if the answers even matter. If those in charge of the estates agree to the reanimations, isn't the question settled? After all, such trustees are the legal representatives of the departed.

But putting aside the question of consent, a more fundamental question remains: What do these reanimations do to the legacy and reputation of the dead? Doesn't their reputation depend, to some extent, on the scarcity of appearance, on the fact that the dead can't show up anymore? Dying can have a salutary effect on the reputation of prominent people; it was good for John F. Kennedy, and it was good for Israeli Prime Minister Yitzhak Rabin.

The fifth-century BC Athenian leader Pericles understood this well. In his famous Funeral Oration, delivered at the end of the first year of the Peloponnesian War, he asserts that a noble death can elevate one's reputation and wash away their petty misdeeds. That is because the dead are beyond reach and their mystique grows postmortem. 'Even extreme virtue will scarcely win you a reputation equal to' that of the dead, he insists.

Do AI reanimations devalue the currency of the dead by forcing them to keep popping up? Do they cheapen and destabilize their reputation by having them comment on events that happened long after their demise?
In addition, these AI representations can be a powerful tool to influence audiences for political or legal purposes. Bringing back a popular dead singer to legitimize a political event and reanimating a dead victim to offer testimony are acts intended to sway an audience's judgment. It's one thing to channel a Churchill or a Roosevelt during a political speech by quoting them or even trying to sound like them. It's another thing to have 'them' speak alongside you. The potential for harnessing nostalgia is supercharged by this technology. Imagine, for example, what the Soviets, who literally worshipped Lenin's dead body, would have done with a deepfake of their old icon.

Good intentions

You could argue that because these reanimations are uniquely engaging, they can be used for virtuous purposes. Consider a reanimated Martin Luther King Jr. speaking to our currently polarized and divided nation, urging moderation and unity. Wouldn't that be grand? Or what about a reanimated Mordechai Anielewicz, the commander of the Warsaw Ghetto uprising, speaking at the trial of a Holocaust denier like David Irving?

But do we know what MLK would have thought about our current political divisions? Do we know what Anielewicz would have thought about restrictions on pernicious speech? Does bravely campaigning for civil rights mean we should call upon the digital ghost of King to comment on the impact of populism? Does fearlessly fighting the Nazis mean we should dredge up the AI shadow of an old hero to comment on free speech in the digital age?

Even if the political projects these AI avatars served were consistent with the deceased's views, the problem of manipulation—of using the psychological power of deepfakes to appeal to emotions—remains. But what about enlisting AI Agatha Christie to teach a writing class? Deepfakes may indeed have salutary uses in educational settings. The likeness of Christie could make students more enthusiastic about writing.
Fake Aristotle could improve the chances that students engage with his austere Nicomachean Ethics. AI Einstein could help those who want to study physics get their heads around general relativity. But producing these fakes comes with a great deal of responsibility. After all, given how engaging they can be, it's possible that the interactions with these representations will be all that students pay attention to, rather than serving as a gateway to exploring the subject further.

Living on in the living

In a poem written in memory of W.B. Yeats, W.H. Auden tells us that after the poet's death Yeats 'became his admirers.' His memory was 'scattered among a hundred cities,' and his work subject to endless interpretation: 'The words of a dead man are modified in the guts of the living.' The dead live on in the many ways we reinterpret their words and works. Auden did that to Yeats, and we're doing it to Auden right here. That's how people stay in touch with those who are gone.

In the end, we believe that using technological prowess to concretely bring them back disrespects them and, perhaps more importantly, is an act of disrespect to ourselves—to our capacity to abstract, think, and imagine.

Is anything real anymore? AI testimonials take over the American justice system

Time of India

3 days ago



Generative AI has been developing at a breakneck pace since the high-profile release of ChatGPT in November 2022. The Large Language Model (LLM) garnered massive media recognition for its ability to write complex and coherent responses to simple prompts. Other LLMs, such as Microsoft's 'Sydney' (now the AI Copilot), also gained media notoriety for the manner in which they seemed to mimic human emotions to an uncanny degree.

Written text is not the only area where AI has had a disruptive effect, with image generation tools such as Midjourney and video generation programs such as Google Veo progressively blurring the line between what's made by humans and what's made by AI. Google Veo, in particular, became infamous for generating short videos resembling viral social media posts that had netizens wondering how convincing they looked. These rapid developments in AI technology have led to increased concerns about their disruptive impact on everyday life, and that impact has now reached the courtrooms of the United States.

AI testimonies are now a part of the US court system

AI video is now being introduced as a kind of posthumous testimony in court trials. During the sentencing hearing for the manslaughter of Christopher Pelkey, an American man shot and killed in a road rage incident, an AI video of Pelkey was played in which he gave an impact statement. In it, the AI says: 'To Gabriel Horcasitas, the man who shot me, it is a shame we encountered each other that day, under those circumstances…I believe in forgiveness, and a God who forgives and I always have. I still do.'

Pelkey's sister, Stacey Wales, had given her own testimony during the sentencing hearing, but didn't feel that her own words alone could properly convey the extent of her grief. Christopher Pelkey was killed in a road rage incident in Chandler in 2021, but last month, artificial intelligence brought him back to life during his killer's sentencing hearing.
At the end of the hearing, Gabriel Horcasitas was sentenced to 10.5 years in prison. The ruling has since been appealed, shining a spotlight on the disruptive impact AI tech is already having on America's court system. Speaking to the Associated Press, AI deepfake expert David Evan Harris said that the technology might end up stacking the deck in favour of the wealthy and privileged: 'I imagine that will be a contested form of evidence, in part because it could be something that advantages parties that have more resources over parties that don't.'

In one of the viral Google Veo videos that took the internet by storm, an AI-generated girl says: 'This is wild. I'm AI generated by Veo 3. Nothing is real anymore.' With the increasing normalization of AI technology in everyday life, as well as in vital civic avenues such as criminal justice, the impacts of such technologies are sure to be dissected and studied for years to come.

Family uses AI to present message in court from loved one no longer alive

Phone Arena

12-05-2025



Artificial intelligence (AI) is all around us and, while some use it to create funny animated pictures of themselves, others use it to "bring back" loved ones who have departed. Maybe we've hit a new frontier in courtroom proceedings – recently, a simulated version of a deceased man created with AI spoke directly to his assailant during a sentencing hearing in Arizona.

The AI-generated avatar of Christopher Pelkey, developed by his family, was presented in Maricopa County Superior Court just before Gabriel Paul Horcasitas was sentenced for fatally shooting Pelkey during a 2021 road-rage incident. The digital recreation of Pelkey appeared on video wearing a green sweatshirt and a full beard, standing against a plain white background. Early in the video, the avatar clarified that it was an AI representation, a point made clear by minor audio irregularities and imperfect synchronization of speech and facial movements. In the message, the avatar expressed a sense of tragic irony about the encounter, suggesting that under different circumstances, they might have become friends.

Pelkey, a 37-year-old US Army veteran, lost his life in the incident. His family chose to create the AI message to honor his memory and articulate their pain. Since the video was not used as evidence, the court allowed greater flexibility in presenting it during the sentencing phase. Horcasitas, who had already been found guilty of manslaughter and endangerment, received a sentence of ten and a half years in state prison.

Stacey Wales, Pelkey's sister, wrote the script for the avatar after finding it difficult to fully express her grief in her own words. Although she admitted she could not forgive Horcasitas, she believed her brother would have taken a more empathetic approach. The video, she explained, was meant to remind the court of her brother's humanity and the lasting impact of the tragedy.
She worked with her husband and a friend, both of whom are in the tech industry, to produce the video.

The use of generative AI in this case introduces a new and emotionally charged way of applying technology in the legal system. While courts have been cautious about AI, especially after incidents where lawyers cited fake cases created by AI, this use outside of evidence adds another layer of complexity. Harry Surden, a law professor at the University of Colorado, pointed out that using generative AI in court raises ethical concerns. He explained that simulated content can skip over careful thinking and appeal directly to emotions, making it more powerful – and possibly more problematic – than regular evidence. He stressed that while these tools may seem real, they are still made up and should be treated as such.

Let's hope nobody ends up in the aforementioned situation, but if you want to give AI videos a try, you could do so by using ChatGPT, for example – it's a straightforward process that combines AI-generated content with video creation. Here's what to do:

Generate a script: Use ChatGPT to create a script for your video.
Choose a text-to-video tool: Pick a text-to-video platform.
Customize and edit: Adjust voiceovers, visuals, and background music.
Export and share: Export the video and upload it to platforms like YouTube.

First, you'll use ChatGPT to generate a script for your video. Then, you can choose a text-to-video platform (you can find those online) to convert the script into a video, adjusting settings such as voiceovers and visuals. Finally, after previewing and editing the video, you can export it and share it across platforms like YouTube or social media.

Murdered Arizona man ‘returns' to address killer in court

Free Malaysia Today

12-05-2025



Christopher Pelkey was shot and killed in a 2021 road-rage incident. (AP pic)

CHANDLER: A simulation of a dead man created by artificial intelligence addressed his killer in an Arizona court this month, in what appears to be one of the first such instances in a US courtroom.

Made by his family, an AI-generated avatar of Christopher Pelkey spoke in Maricopa County Superior Court on May 1, as a judge prepared to sentence Gabriel Paul Horcasitas for shooting and killing Pelkey in a 2021 road-rage incident. 'It is a shame we encountered each other that day in those circumstances,' the Pelkey avatar says in the video. 'In another life, we probably could have been friends.'

The Pelkey avatar appears in the video sporting a long beard and green sweatshirt against a white backdrop. He cautions at the start that he is an AI version of Pelkey, which is apparent through the gaps in audio and the slightly mismatched movement of his mouth. Pelkey, a US army veteran, was 37 at the time of the shooting.

The video marked a novel use of AI in the legal system, which has viewed the rapidly growing technology with a mix of fascination and trepidation. Courts generally have strict rules on the types of information that can be presented in legal proceedings, and several lawyers have been sanctioned after AI systems created fake cases that they cited in legal briefs. Pelkey's relatives were given more leeway to present the AI-generated video to the judge at sentencing, given that it was not evidence in the case. Horcasitas, who was sentenced to 10.5 years in state prison, had already been convicted on manslaughter and endangerment charges.

Pelkey's sister Stacey Wales said she scripted the AI-generated message after struggling to convey years of grief and pain in her own statement. She said she was not ready to forgive Horcasitas, but felt her brother would have a more understanding outlook.
'The goal was to humanise Chris, to reach the judge, and let him know his impact on this world and that he existed,' she told Reuters. Generative AI, Wales said, is 'just another avenue that you can use to reach somebody'. Wales said she worked with her husband and a family friend, who all work in the tech industry, to create it. Harry Surden, a law professor at the University of Colorado, said the use of generative AI material in court raises ethical concerns, as others may seek to use those tools to play on the emotions of judges and juries. The content is a simulation of reality, not the verified evidence that courts typically assess, Surden said. 'What we're seeing is the simulations have gotten so good that it completely bypasses our natural scepticism and goes straight to our emotion,' he said.
