
Latest news with #Pelkey

AI ‘reanimations': Making facsimiles of the dead raises ethical quandaries

Japan Today
2 days ago

By Nir Eisikovits and Daniel J Feldman

Christopher Pelkey was shot and killed in a road rage incident in 2021. On May 8, at the sentencing hearing for his killer, an AI video reconstruction of Pelkey delivered a victim impact statement. The trial judge reported being deeply moved by this performance and issued the maximum sentence for manslaughter.

As part of the ceremonies to mark Israel's 77th year of independence on April 30, 2025, officials had planned to host a concert featuring four iconic Israeli singers. All four had died years earlier. The plan was to conjure them using AI-generated sound and video. The dead performers were supposed to sing alongside Yardena Arazi, a famous and still very much alive artist. In the end Arazi pulled out, citing the political atmosphere, and the event didn't happen.

In April, the BBC created a deepfake version of the famous mystery writer Agatha Christie to teach a 'maestro course on writing.' Fake Agatha would instruct aspiring murder mystery authors and 'inspire' their 'writing journey.'

The use of artificial intelligence to 'reanimate' the dead for a variety of purposes is quickly gaining traction. Over the past few years, we've been studying the moral implications of AI at the Center for Applied Ethics at the University of Massachusetts, Boston, and we find these AI reanimations to be morally problematic.

Before we address the moral challenges the technology raises, it's important to distinguish AI reanimations, or deepfakes, from so-called griefbots. Griefbots are chatbots trained on large swaths of data the dead leave behind – social media posts, texts, emails, videos. These chatbots mimic how the departed used to communicate and are meant to make life easier for surviving relations. The deepfakes we are discussing here have other aims; they are meant to promote legal, political and educational causes.
Moral quandaries

The first moral quandary the technology raises has to do with consent: Would the deceased have agreed to do what their likeness is doing? Would the dead Israeli singers have wanted to sing at an Independence ceremony organized by the nation's current government? Would Pelkey, the road-rage victim, be comfortable with the script his family wrote for his avatar to recite? What would Christie think about her AI double teaching that class?

The answers to these questions can only be deduced circumstantially – from examining the kinds of things the dead did and the views they expressed when alive. And one could ask if the answers even matter. If those in charge of the estates agree to the reanimations, isn't the question settled? After all, such trustees are the legal representatives of the departed.

But putting aside the question of consent, a more fundamental question remains. What do these reanimations do to the legacy and reputation of the dead? Doesn't their reputation depend, to some extent, on the scarcity of appearance, on the fact that the dead can't show up anymore? Dying can have a salutary effect on the reputation of prominent people; it was good for John F. Kennedy, and it was good for Israeli Prime Minister Yitzhak Rabin.

The fifth-century BC Athenian leader Pericles understood this well. In his famous Funeral Oration, delivered at the end of the first year of the Peloponnesian War, he asserts that a noble death can elevate one's reputation and wash away their petty misdeeds. That is because the dead are beyond reach and their mystique grows postmortem. 'Even extreme virtue will scarcely win you a reputation equal to' that of the dead, he insists.

Do AI reanimations devalue the currency of the dead by forcing them to keep popping up? Do they cheapen and destabilize their reputation by having them comment on events that happened long after their demise?
In addition, these AI representations can be a powerful tool to influence audiences for political or legal purposes. Bringing back a popular dead singer to legitimize a political event and reanimating a dead victim to offer testimony are acts intended to sway an audience's judgment. It's one thing to channel a Churchill or a Roosevelt during a political speech by quoting them or even trying to sound like them. It's another thing to have 'them' speak alongside you. The potential of harnessing nostalgia is supercharged by this technology. Imagine, for example, what the Soviets, who literally worshipped Lenin's dead body, would have done with a deepfake of their old icon.

Good intentions

You could argue that because these reanimations are uniquely engaging, they can be used for virtuous purposes. Consider a reanimated Martin Luther King Jr., speaking to our currently polarized and divided nation, urging moderation and unity. Wouldn't that be grand? Or what about a reanimated Mordechai Anielewicz, the commander of the Warsaw Ghetto uprising, speaking at the trial of a Holocaust denier like David Irving?

But do we know what MLK would have thought about our current political divisions? Do we know what Anielewicz would have thought about restrictions on pernicious speech? Does bravely campaigning for civil rights mean we should call upon the digital ghost of King to comment on the impact of populism? Does fearlessly fighting the Nazis mean we should dredge up the AI shadow of an old hero to comment on free speech in the digital age?

Even if the political projects these AI avatars served were consistent with the deceased's views, the problem of manipulation – of using the psychological power of deepfakes to appeal to emotions – remains. But what about enlisting AI Agatha Christie to teach a writing class? Deepfakes may indeed have salutary uses in educational settings. The likeness of Christie could make students more enthusiastic about writing.
Fake Aristotle could improve the chances that students engage with his austere Nicomachean Ethics. AI Einstein could help those who want to study physics get their heads around general relativity. But producing these fakes comes with a great deal of responsibility. After all, given how engaging they can be, it's possible that the interactions with these representations will be all that students pay attention to, rather than serving as a gateway to exploring the subject further.

Living on in the living

In a poem written in memory of W.B. Yeats, W.H. Auden tells us that, after the poet's death, Yeats 'became his admirers.' His memory was now 'scattered among a hundred cities,' and his work subject to endless interpretation: 'the words of a dead man are modified in the guts of the living.' The dead live on in the many ways we reinterpret their words and works. Auden did that to Yeats, and we're doing it to Auden right here. That's how people stay in touch with those who are gone.

In the end, we believe that using technological prowess to concretely bring them back disrespects them and, perhaps more importantly, is an act of disrespect to ourselves – to our capacity to abstract, think and imagine.

Nir Eisikovits is Professor of Philosophy and Director, Applied Ethics Center, UMass Boston. Daniel J Feldman is Senior Research Fellow, Applied Ethics Center, UMass Boston. The Conversation is an independent and nonprofit source of news, analysis and commentary from academic experts.

© The Conversation

AI-generated victim shouldn't have been allowed to speak at killer's sentencing

Yahoo
14-05-2025

Earlier this month in Maricopa County Superior Court, a dead man named Chris Pelkey testified at the sentencing hearing of the man who killed him. Except, of course, he didn't. Except, it looked like he did. Sort of.

Gabriel Horcasitas, 54, was convicted of manslaughter and endangerment in the shooting death of 37-year-old Pelkey. The killing was the end result of a road rage incident that occurred on Nov. 13, 2021. Judge Todd Lang allowed Pelkey's family to play an AI-generated version of him making a statement at the sentencing hearing.

It begins with Pelkey's digital twin telling viewers that he is an AI image. It then shows some video of the actual Pelkey, then goes back to the AI version, who thanks those who spoke on his behalf, then says to the defendant: 'To Gabriel Horcasitas, the man who shot me: It is a shame we encountered each other that day in those circumstances. In another life we probably could have been friends. I believe in forgiveness and in God, who forgives. I always have. And I still do.'

Horcasitas received 12½ years in prison for both charges. Shortly after the sentencing, his attorney, Jason Lamm, said he would appeal. Lamm said, 'While victims have a right to address the court, reincarnating Chris Pelkey through AI, and, frankly, putting words in his mouth because nobody would know what he was actually going to say, it just felt wrong on many levels.'

That's because, as sincere, moving, heartfelt and even conciliatory as the AI video was, it was … wrong. On many levels.

Advances in AI allow the dead to 'speak' to us. But only with words that someone else puts in their mouths. In this case, Pelkey's sister, Stacey Wales, wrote her brother's victim impact statement. She told CNN, 'The only thing that kept entering my head that I kept hearing was Chris and what he would say.
I had to very carefully detach myself in order to write this on behalf of Chris because what he was saying is not necessarily what I believe, but I know it's what he would think.'

For her family and for anyone who loved Pelkey, that is undoubtedly true. And I'd guess that at the funerals or memorial services for some people, an AI visit from the great beyond may afford grieving loved ones a sense of comfort. But a courtroom can't be a place where what someone thinks a deceased person would say is offered up by an AI avatar.

Or, as Gary Marchant, an ASU professor and member of the Arizona Supreme Court's committee on AI, put it, 'Even though in this case it was very well-meaning and honest, it can easily cross over to much more dishonest and much more strategic, much more self-serving, so I think we can't [set] that precedent to allow the fake videos into court.'

Chris Pelkey clearly was much loved, and his loss was deeply felt. That message was conveyed at the sentencing by others. In life, he could speak for himself. In death, he did not need to.

This article originally appeared on Arizona Republic: Chris Pelkey spoke kindly via AI to his killer. It was wrong | Opinion

CSX lost out on $1 million a day in Q1 amid hurricane, tunnel work

Yahoo | Business
14-05-2025

Rebuilding from hurricane damage and a major tunnel project cost CSX a million dollars a day in lost revenue in the first three months of the year, the railroad's top financial executive told an investor conference.

'We got hit in the first quarter; it was a difficult winter,' said Executive Vice President and Chief Financial Officer Sean Pelkey, speaking at the Bank of America conference in New York. 'There were a hundred million dollars in revenue opportunities we missed, a million dollars a day, because of constraints on our network.'

The railroad (NASDAQ: CSX) continues to rebuild its 60-mile line through eastern Tennessee and western North Carolina after Hurricane Helene devastated the region in September 2024. Pelkey said reconstruction is expected to last 'through the better part of this year' before completion in October or November. 'There's a massive amount of work there,' he said.

Rerouting trains around the rebuilding of Baltimore's Howard Street tunnel also hurt network performance. Pelkey said tunnel construction to accommodate doublestack trains is expected to be completed in eight months, slightly ahead of schedule. 'Some clearances need to get done, and there is work on some bridges; we're relying on the state for that. We see reopening the beginning of the fourth quarter.'

After a difficult start to the year, CSX saw a reset in April. 'We needed fewer cars online, sitting in yards and at customers' facilities,' said Pelkey. 'We saw 80% trip length compliance for the fourth week in a row; we were in the 60s earlier this year. We are getting back to scheduled railroading, not that we got away from it, but we were dealing with difficult operating conditions.'

Normal seasonality has seen volumes pick up in unit coal trains, metals and fertilizers, among other commodities. But just as the winter was receding, CSX got hit by more severe weather, from flooding to tornadoes across portions of its territory.
'The team was hunkered down in Jacksonville [Florida operations center], and we were like, 'Crap, we gotta deal with this now?' We did see some effects. For example, one of our interchange partners had a bridge out in a key part of our network.'

The pause in tariffs agreed to this week by China and the United States is expected to have less of an impact on East Coast-based CSX, Pelkey said. 'There is probably a lot of inventory sitting in West Coast warehouses that will come east. If we see a little lull in intermodal across Chicago, there may be an opportunity to pick up some incremental business and combine trains.'

The company recently completed a new contract with the Brotherhood of Locomotive Engineers and Trainmen (BLET), leaving its conductors represented by SMART-TD as the only union without a new deal. Pelkey pointed out that the BLET has a single agreement across the system, while conductors have multiple agreements covering, for example, some employees working east-west trains while others work north-south out of some terminals, a legacy of the carrier's predecessor railroads. Pelkey said the company considers it a priority to have all single-system contracts.

Even with a decrease in health care costs, CSX will see wage inflation of 4.24% this year, above overall labor costs, which are projected to rise less than 3%. 'We can offset that with [freight] pricing gains, for sure, and it comes down over time,' Pelkey said.

While freight volume growth was off 1% in the first quarter, it is up 3% so far in the second quarter, and the railroad is expecting growth for all of 2025, said Pelkey: 'We are encouraged by the demand picture in terms of trade and demand. Aggregates, grain and intermodal are very, very strong so far this year before hitting that 'air pocket' caused by tariffs.'

The cold winter helped CSX move more coal, with carloads up six consecutive quarters to date, accounting for 11% of total carloads.
Pelkey said CSX is moving more export coal after the collapse of the Key Bridge in 2024 hurt that business at the Port of Baltimore, a key loadout for international shipments. CSX handled 44 million tons of export coal in 2024 and forecasts 2025 to be better by 10%. The railroad made some system investments, including at the Curtis Bay Pier in Baltimore, for improved reliability. 'We have moved a total 10% more export coal over the past decade,' said Pelkey.

Among other commodities:

  • Fertilizer shipments are up 12% quarter to date on steady demand after CSX cycled through the effects of a customer fire in 2024.
  • The metals business serving automotive and construction that had a 60%-70% order fill is now at 90%.
  • Automotive volume is up by single digits, against competition with what Pelkey said was a 'below cost trucking' rate at $1.49 per mile.
  • International intermodal is the biggest driver of gains, in double digits year to date, with domestic intermodal flat to slightly up.

Pelkey said intermodal service reliability in the first quarter paid off in volume growth, and he highlighted intermodal provided in cooperation with CPKC (NYSE: CP) and Schneider National (NYSE: SNDR) for cross-border connections linking Mexico, Texas and the U.S. Southeast.

While Pelkey did not offer guidance for the second quarter, he said CSX expects sequential growth from the first to second quarter as volumes pick up and service gets better. Pelkey said that 'we still feel good about' the company's Investors Day guidance from November.

A total of 40-50 industrial projects are slated to start on the CSX network this year, Pelkey said, with 24 having come online since the beginning of the year and 37 in the pipeline. 'Annual run rate will support 1-2% volume growth,' he said, adding that CSX has 600 projects in various stages of development.

The number of freight cars on the network spiked to 140,000 in the first quarter as velocity fell and dwell time increased.
To clear yards, Pelkey said, CSX added 45 locomotives that were out of service, rebuilt 20 others and added weekend time for engineering work. Tactically, Pelkey said CSX combines trains where it can to improve business and is leaning into AI and advanced analytics to improve operations. 'Railroads have not unlocked that capability yet,' he said. 'A lot of decisions are made using visibility tools for what's happening now but not in the future. We want to accelerate our focus on that technology, leverage data, and think we can do that this year.'

While CSX handled 7.5 million carloads a year two decades ago, that total fell to 3.5 million in 2024. Pelkey said the railroad has capacity to grow volumes in the mid-single digits over the next several years, and has made investments to capture customer demand, for example adding sidings along its Southern corridor, making improvements at Cumberland Yard in Maryland to speed processing of cars and spending on its network of Transflo bulk transloading terminals.

The company has no plans to buy new locomotives but will continue to rebuild units, for example AC4600 and SD60 models, as it has for the past five to six years, as well as redeploy power out of storage.

The company employs approximately 23,000, and Pelkey said that number is expected to remain stable. ''Flat' employees allows us to grow, with capacity within that head count level. Employee efficiency is a key indicator for us and the industry. We are looking at ways we can drive efficiency.'

As for stock buybacks, Pelkey said the company would not give a number but planned to be opportunistic at attractive prices. He added CSX does not have a specific leverage target but 'feels good' about its debt ratings and continues to have conversations with rating agencies.
Find more articles by Stuart Chirls. The post CSX lost out on $1 million a day in Q1 amid hurricane, tunnel work appeared first on FreightWaves.

AI helps road rage victim confront 'killer' in court

Arab Times
10-05-2025

CHANDLER, Ariz., May 10, (AP): There were dozens of statements submitted to the court by family and friends of Christopher Pelkey when it came time to sentence the man convicted of fatally shooting him during a road rage incident. They provided glimpses of Pelkey's humor, his character and his military service. But there was nothing quite like hearing from the victim himself - even if it was an AI-generated version.

In what's believed to be a first in US courts, Pelkey's family used artificial intelligence to create a video using his likeness to give him a voice. The AI rendering of Pelkey told the shooter during the sentencing hearing last week that it was a shame they had to meet that day in 2021 under those circumstances - and that the two of them probably could have been friends in another life.

"I believe in forgiveness and in God who forgives. I always have and I still do," Pelkey's avatar told Gabriel Paul Horcasitas. The AI version of Pelkey went on to share advice for people to make the most of each day and to love each other, not knowing how much time one might have left.

While use of artificial intelligence within the court system is expanding, it's typically been reserved for administrative tasks, legal research and case preparation. In Arizona, it's helped inform the public of rulings in significant cases. Using AI to generate victim impact statements marks a new - and legal, at least in Arizona - tool for sharing information with the court outside the evidentiary phases.

Maricopa County Superior Court Judge Todd Lang, who presided over the road rage case, said after watching the video that he imagined Pelkey, who was 37 at the time of his killing, would have felt that way after learning about him. Lang also noted the video said something about Pelkey's family, who had expressed their anger over his death and had asked for Horcasitas to receive the maximum sentence. Horcasitas, 54, was convicted of manslaughter and sentenced to 10.5 years in prison.
"Even though that's what you wanted, you allowed Chris to speak from his heart as you saw it," Lang said. Horcasitas' lawyer, Jason Lamm, told The Associated Press they filed a notice to appeal his sentence within hours of the hearing. Lamm said it's likely that the appellate court will weigh whether the judge improperly relied on the AI video when handing down the sentence. The shooting happened the afternoon of Nov. 13, 2021, as both drivers were stopped at a red light. According to records, Pelkey was shot after getting out of his truck and walking back toward Horcasitas' car. Pelkey's sister, Stacey Wales, raised the idea of her brother speaking for himself. For years, while the case worked its way through the legal system, Wales said she thought about what she would say at the sentencing hearing. She struggled to get words down on paper. But when she thought about what her brother would say to the shooter, knowing he would have forgiven him, the words poured out of her. In Arizona, victims can give their impact statements in any digital format, said victims' rights attorney Jessica Gattuso, who represented the family. Arizona Supreme Court Justice Ann Timmer didn't address the road rage case specifically in an interview Wednesday. But she said the rise in popularity and accessibility to AI in recent years led to the formation of a committee to research best practices in the courts. Gary Marchant, a member of the committee and a law professor at Arizona State University, said he understands why Pelkey's family did it. But he warned the use of this technology could open the door to more people trying to introduce AI-generated evidence into courtrooms.

An AI-generated shooting victim forgave his killer in a U.S. court. Could it happen in Canada?

Hamilton Spectator
10-05-2025

For years, Stacey Wales tried to brainstorm what she would say at the sentencing of her brother's killer. 'I wanted to yell,' Wales told the Star in an interview on Friday. 'I would have these thoughts bubble up, while I was driving or in the shower, often of anger or frustration, and just read them into my phone.'

In 2021, Wales' brother, Christopher Pelkey, was fatally shot while at a red light in Chandler, Arizona. His killer, Gabriel Horcasitas, first faced a jury in 2023, but the case ended in a mistrial. After a retrial in March, he was found guilty of manslaughter.

When it came time for Wales to put pen to paper, all she could hear was Pelkey's voice. So, she began to write in his words. It worked. Then, with the help of her husband, who has experience using generative artificial intelligence, Wales set off to create a video of her brother's likeness, reading the statement in his own voice. The video was the last of 10 statements read out at the May 1 sentencing hearing.

'To Gabriel Horcasitas, the man who shot me, it is a shame we encountered each other that day in those circumstances,' Pelkey's facsimile, donning a grey baseball cap, said in court. 'In another life, we probably could have been friends. I believe in forgiveness, and a God who forgives. I always have and I still do.'

It wasn't a perfect likeness. The recreation of Pelkey jolts unnaturally throughout the nearly four-minute video. But it seemed to leave a favourable impression on Maricopa County Superior Court Judge Todd Lang, who described it as 'genuine.' 'I loved that AI,' Lang said. 'Thank you for that. And as angry as you are, and justifiably angry as the family is, I heard the forgiveness.'

Horcasitas received just over 10.5 years' jail time. The case joins a growing list of U.S. court proceedings in which parties have reached for generative artificial intelligence.
In a high-profile example from 2023, Michael Cohen, former lawyer for President Trump, claimed he'd unwittingly sent his attorney fake AI-generated legal citations. More recently, a plaintiff in a New York court tried to employ an AI-generated avatar to argue on his behalf — an attempt that was quickly swatted down by the judge.

For Ryan Fritsch, policy counsel of the Law Commission of Ontario, the rise in use 'speaks to the interest and enthusiasm out there for new forms of efficiencies in the criminal justice system.' 'There are some considerable promises,' Fritsch told the Star on Friday. 'But at the same time, concerns should arise if there are not sufficient rules, guardrails or governance models in place.'

As it stands, the use of AI in the criminal justice system is more commonly found in policing, often controversially, with services across the country employing technology such as facial recognition systems and automatic licence plate readers. In Canadian courts, AI has been less prevalent – though Fritsch says he's starting to see upticks in its use. Just this week, the conduct of an Ontario lawyer was called into question after a judge suspected ChatGPT had been used to craft a factum submitted in civil proceedings. She has since been ordered to attend a hearing with the judge to explain the discrepancies.

Where AI use is becoming most common, Fritsch says, is in cases where people are self-represented. 'Right now, what we're mostly seeing is an increasing number of self- and un-represented people relying on generalist AI tools like ChatGPT to make their case for them,' he said. 'And the consequence is that they're actually spending more time disavowing the errors than reaping any benefits.'

There are currently no laws specific to the use of artificial intelligence in the Canadian justice system. In the absence of that framework, whether AI-generated material is permitted into a legal case often falls on the individual judge or justice.
As a result, some individual courts, police services and legal associations have started to come up with policies. Toronto police, for example, were the first service in Canada to introduce their own AI policy, in 2022. A patchwork of policies, however, can open the court up to unnecessary litigation, says Fritsch, and worsen backlogs and delays. 'Without a framework, there's going to be a lot of struggle for courts, cops and Crowns to interpret how our existing laws, and our civil rights, are going to apply to the use of AI,' Fritsch said. 'And there's going to be a lot of varying opinions on that.'

Amending laws to regulate AI will take time, plus there's the 'long leg' problem that court cases come months or years after new technology develops, Fritsch said. 'There could be years of misuse in the meantime,' he added.

One of the most significant concerns for Fritsch is whether AI technologies can effectively understand and uphold Canadian standards of law. 'We know that AI is prone to bias,' Fritsch said. 'So if it's going to be used, we really need to make sure we're interpreting its use through the lens of the Charter of Rights and Freedoms and procedural fairness.'

For example, in the U.S., algorithms have long been used to assess risk in bail and release decisions, but Fritsch says they've been known to miss the mark. 'What we've seen from a couple of cases in the U.S. is some really, really harsh recommendations about people who are facing first offences, or who are doing time for minor offences.'

As a result, the need for human oversight remains, whether through the due diligence of staff or the discretion of a judge. For most, the criminal justice system is unfamiliar, and navigating its nuances can be a daunting task. For older citizens or otherwise vulnerable populations, AI, if used properly and transparently, 'could actually increase access and justice for a lot of people,' Fritsch said.
The most common case for the use of AI in the public sector is efficiency, says Shion Guha, assistant professor at the University of Toronto's Faculty of Information – something the courts are not known for. 'A lot of public sector agencies are basically looking towards generative AI as a way to reduce administrative overhead,' Guha told the Star Friday. 'The idea is that this will increase human efficiency and reduce costs.' Those promises, he says, have not been properly vetted, though. 'There hasn't been any formal, finished research on whether or not this evaluative statement is true.'

So could an AI-generated victim statement be admitted in a Canadian court? In the absence of laws governing AI use, it's hard to say; it would come down to the presiding judge or justice, says Fritsch. In the Arizona case, he said, the judge likely admitted the video on the basis it served as an expression of the family's feelings, not as a statement from Pelkey. 'I think the court, in their generosity, likely admitted it as almost a courtesy, and it might not be given a whole lot of weight.'

While Wales wrote the script for her brother's video, Fritsch pointed out that AI could also be used to generate the statements read out by a person's likeness, further complicating the issue. 'AI can be trained on the sum total of all the comments a person may have made on social media or in emails or texts over years, and then used to simulate the person,' Fritsch said. 'There's no doubt it would not be admitted for the truth of its contents — because it's all made up — but might it be allowed for, say, compassionate reasons only, and with no bearing on the sentencing?' he asked. 'Who knows?'
