
Latest news with #Thorn

BREAKING: Judge orders release of Mahmoud Khalil

NBC News

2 days ago

  • Business
  • NBC News


Accounts peddling child abuse content flood some X hashtags as safety partner cuts ties

NBC News

4 days ago

  • Business
  • NBC News

Tech News

Thorn, a nonprofit that provides detection and moderation software related to child safety, said it canceled its contract with X after the platform stopped paying it.

June 18, 2025, 4:57 PM EDT / Updated June 18, 2025, 5:58 PM EDT

By Ben Goggin

When Elon Musk took over Twitter in 2022, he said that addressing the problem of child sexual abuse material on the platform was his 'top priority.' Three years later, the problem appears to be escalating, as anonymous, seemingly automated X accounts flood hashtags with hundreds of posts per hour advertising the sale of the illegal material.

At the same time, Thorn, a California-based nonprofit organization that works with tech companies to provide technology that can detect and address child sexual abuse content, told NBC News that it had terminated its contract with X. Thorn said that X stopped paying recent invoices for its work, though it declined to provide details about its deal with the company, citing legal sensitivities. X said Wednesday that it was moving toward using its own technology to address the spread of child abuse material. Some of Thorn's tools are designed to address the very issue that appears to be growing on the platform.

'We recently terminated our contract with X due to nonpayment,' Cassie Coccaro, head of communications at Thorn, told NBC News. 'And that was after months and months of outreach, flexibility, trying to make it work. And ultimately we had to stop the contract.'

Many aspects of the child exploitation ads issue, which NBC News first reported on in January 2023, remain the same on the platform. Sellers of child sexual abuse material (CSAM) continue to use hashtags based on sexual keywords to advertise to people looking to buy CSAM. Their posts direct prospective buyers to other platforms where users are asked for money in return for the child abuse material.

Other aspects are new: Some accounts now appear to be automated (also known as bots), while others have taken advantage of 'Communities,' a relatively new feature launched in 2021 that encourages X users to congregate in groups 'closer to the discussions they care about most.' Using Communities, CSAM advertisers have been able to post into groups of tens of thousands of people devoted to topics like incest, seemingly without much scrutiny.

The Canadian Centre for Child Protection (C3P), an independent online CSAM watchdog group, reviewed several X accounts and hashtags flagged by NBC News that were promoting the sale of CSAM, and followed links promoted by several of the accounts. The organization said that, within minutes, it was able to identify accounts that posted images of previously identified CSAM victims who were as young as 7. It also found apparent images of CSAM in thumbnail previews populated on X and in links to Telegram channels where CSAM videos were posted. One such channel showed a video of a boy estimated to be as young as 4 being sexually assaulted. NBC News did not view or have in its possession any of the abuse material.

Lloyd Richardson, director of information technology at C3P, said the behavior exhibited by the X users was 'a bit old hat' at this point, and that X's response 'has been woefully insufficient.'

'It seems to be a little bit of a game of Whac-A-Mole that goes on,' he said. 'There doesn't seem to be a particular push to really get to the root cause of the issue.'

X says it has a zero-tolerance policy 'towards any material that features or promotes child sexual exploitation.' A spokesperson for X directed NBC News to a post from its @Safety account detailing what the company says are new efforts to find and remove child abuse material.

'At X, we have zero tolerance for child sexual exploitation in any form. Until recently, we leveraged partnerships that helped us along the way,' the company said in the post. 'We are proud to provide an important update on our continuous work detecting Child Sexual Abuse Material (CSAM) content, announcing today that we have launched additional CSAM hash matching efforts.

'This system allows X to hash and match media content quickly and securely, keeping the platform safer without sacrificing user privacy,' the post continued. 'This is enabled by the incredible work of our safety engineering team, who have built state of the art systems to further strengthen our enforcement capabilities.'

The company said the system would allow it to automatically detect known CSAM and remove it, though it was not clear how it differs from existing hashing technology. The spokesperson did not respond to questions about Thorn's allegations regarding the payments.
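X's post does not describe the system's internals. As a rough sketch of how hash matching of known material generally works (the names and details below are illustrative, not X's implementation), a platform computes a digest of each uploaded file and checks it against a database of hashes of previously identified abuse material before the file is published:

```python
import hashlib

# Illustrative only: real deployments typically use perceptual hashes
# (which survive resizing and re-encoding) supplied by clearinghouses
# such as NCMEC; a plain cryptographic hash is shown for simplicity.
KNOWN_ABUSE_HASHES: set[str] = set()  # placeholder hash database

def media_digest(file_bytes: bytes) -> str:
    """Hash the raw media bytes; only the digest is compared, so the
    matching service never needs to see the image itself."""
    return hashlib.sha256(file_bytes).hexdigest()

def screen_upload(file_bytes: bytes) -> str:
    """Return 'block' for known material, else 'publish'."""
    if media_digest(file_bytes) in KNOWN_ABUSE_HASHES:
        # Matched previously identified CSAM: remove and report
        # rather than publish.
        return "block"
    return "publish"
```

Because only digests are compared, this style of matching can flag known files without inspecting other user content, which is presumably what X's post means by not sacrificing user privacy.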
A review of many hashtags with terms known to be associated with CSAM shows that the problem is, if anything, worse than when Musk initially took over. What was previously a trickle of posts of fewer than a dozen per hour is now a torrent propelled by accounts that appear to be automated — some posting several times a minute. Despite the continued flood of posts and sporadic bans of individual accounts, the hashtags observed by NBC News over several weeks remained open and viewable as of Wednesday. And some of the hashtags that were identified in 2023 by NBC News as hosting the child exploitation advertisements are still being used for the same purpose today.

Historically, Twitter and then X have attempted to block certain hashtags associated with child exploitation. When NBC News first reported on the use of X to market CSAM, X's head of trust and safety said the company knew it had work to do and would be making changes, including the development of automated systems to detect and block hashtags.

In January 2024, X CEO Linda Yaccarino testified to the Senate Judiciary Committee that the company had strengthened its enforcement 'with more tools and technology to prevent bad actors from distributing, searching for, or engaging with [child sexual exploitation] content across all forms of media.'

In May 2024, X said it helped Thorn test a tool to 'proactively detect text-based child sexual exploitation.' The 'self-hosted solution was deployed seamlessly into our detection mechanisms, allowing us to hone in on high-risk accounts and expand child sexual exploitation text detection coverage,' X said.
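Neither X nor Thorn has published how that text-based detection works. As a purely hypothetical sketch of the general approach (the patterns and weights below are invented for illustration and are not Thorn's or X's rules), such a system scores post text against signals seen in CSAM-ad posts, like coded hashtags combined with off-platform links, and routes high scores to reviewers:

```python
import re

# Hypothetical illustration of text-based risk scoring; these patterns
# and weights are invented for this sketch, not any vendor's rules.
SUSPICIOUS_PATTERNS = {
    r"#\w*cp\w*": 2.0,                  # coded hashtag fragments (example)
    r"t\.me/\w+": 1.5,                  # off-platform link to Telegram
    r"\b(?:selling|folder|dm)\b": 1.0,  # sales language seen in the ads
}

def risk_score(post_text: str) -> float:
    """Sum the weights of every pattern that appears in the post."""
    text = post_text.lower()
    return sum(
        weight
        for pattern, weight in SUSPICIOUS_PATTERNS.items()
        if re.search(pattern, text)
    )

def needs_human_review(post_text: str, threshold: float = 2.5) -> bool:
    # Rule-based flagging over- and under-matches (as C3P notes below),
    # so flagged posts would go to trained reviewers, not auto-bans.
    return risk_score(post_text) >= threshold
```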
Pailes Halai, Thorn's senior manager of accounts and partnerships, who oversaw the X contract, said that some of Thorn's software was designed to address issues like those posed by the hashtag CSAM posts, but that it wasn't clear whether X ever fully implemented it.

'They took part in the beta with us last year,' he said. 'So they helped us test and refine, etc, and essentially be an early adopter of the product. They then subsequently did move on to being a full customer of the product, but it's not very clear to us at this point how and if they used it.'

Without Thorn, it's not entirely clear what child safety mechanisms X is currently employing.

'Our technology is designed with safety in mind,' Halai said. 'It's up to the platform to enforce and use the technology appropriately … What we do know on our side is it's designed to catch the very harms that you're talking about.'

Halai said Thorn didn't take the termination of its contract with X lightly. 'It was very much a last-resort decision for us to make,' he said. 'We provided the services to them. We did it for as long as we possibly could, exhausted all possible avenues and had to terminate, ultimately, because, as a nonprofit, we're not exactly in the business of helping to sustain something for a company like X, where we're actually incurring huge costs.'

Currently, some hashtags, like #childporn, are blocked when using X's search function, but other hashtags are open to browse and are filled with posts advertising CSAM for sale. NBC News found posts appearing to peddle CSAM in 23 hashtags that are oftentimes used together in the posts. NBC News identified only two hashtags that were blocked by X.

The hashtags that were available to be posted to and viewed during NBC News' review of the platform ranged from references to incest and teenagers to slightly more coded terms, like combinations of words with the name of the defunct video chat platform Omegle, which shut down in 2023 after a child sex exploitation lawsuit. Some hashtags contained jumbled letters and only contained posts advertising CSAM, indicating that they were created with the exclusive purpose of housing the advertisements.

Some usernames of accounts posting the ads were simply a jumble of words associated with CSAM content on the platform, mixing names of social media platforms with other keywords. Many of the users linked directly to Telegram channels in their posts or their account bios and included explicit references to CSAM. Some posts linked to Discord channels or solicited direct messages to secure Discord links.

Telegram and Discord have distinct positions in the internet's child exploitation ecosystem, offering semiprivate and private venues for people looking to sell or buy child exploitation material. NBC News previously reported on 35 cases in which adults were prosecuted on charges of kidnapping, grooming or sexual assault that allegedly involved communications on Discord.

A Discord representative said, 'Discord has zero tolerance for child sexual abuse material, and we take immediate action when we become aware of it, including removing content, banning users, and reporting to the National Center for Missing and Exploited Children (NCMEC).' The company said in response to NBC News' outreach that it removed multiple servers 'for policy violations unrelated to the sale of CSAM.'

A representative for Telegram said 'CSAM is explicitly forbidden by Telegram's terms of service and such content is removed whenever discovered.' The representative pointed to the company's partnership with the U.K.-based Internet Watch Foundation, which maintains a database of known CSAM and provides tools to detect and remove it.

While some of the X accounts posted publicly, others solicited and offered CSAM through X's Communities feature, where users create groups based on specific topics. NBC News observed groups with tens of thousands of members in which CSAM was solicited or offered for sale. In a group with over 70,000 members devoted to 'incest confessions,' multiple users posted multiple times linking to Telegram channels, explicitly referencing CSAM.

'I'm selling 6cp folder for only 90$,' one user wrote, linking to a Telegram account. CP is a common online abbreviation for 'child pornography.'

CSAM has been a perpetual problem on the internet and social media, with many companies employing specialized teams and building automated systems to identify and remove abuse content and those spreading it. But Musk also instituted drastic cuts to the company's trust and safety teams and disbanded the company's Trust and Safety Council. In 2023, the company said that it was detecting more CSAM than in previous years and that it had increased staffing devoted to the issue despite larger trust and safety layoffs.

Richardson, C3P's director of information technology, said that while X will sometimes remove accounts that are flagged to it for violating rules around CSAM, 'a new account pops up in two seconds, so there's not a lot of in-depth remediation to the problem. That's just sort of the bare minimum that we're looking at here.'

He said an increasing reliance on artificial intelligence systems for moderation, if X is using them, could be in part to blame for such oversights. According to Richardson, AI systems are good at sorting through large datasets and flagging potential issues, but, currently, such systems will inevitably over- or under-moderate without human judgment at the end.

'There should be an actual incident response when someone is selling child sexual abuse material on your service, right? We've become completely desensitized to that. We're dealing with the sale of children being raped,' Richardson said. 'You can't automate your way out of this problem.'

Ben Goggin is the deputy tech editor for NBC News.

Fate of Yankee trade ship Tonquin brought to life in play at Tofino's Village Green

Hamilton Spectator

6 days ago

  • Entertainment
  • Hamilton Spectator

By Nora O'Malley, Local Journalism Initiative Reporter

Tofino, B.C. – Children playing 'Tla-o-qui-aht warriors' paddled in cardboard cutouts of dugout canoes around the wooden pirate ship play structure at Tofino's Village Green to recount the fate of the Tonquin. The 269-ton American trade ship sank to the bottom of Clayoquot Sound in 1811 after being overwhelmed by the warriors — and blowing up.

As told by Tla-o-qui-aht First Nation's Gisele Martin and her father Joe Martin on June 11, the Tonquin's goal was to establish a trade post and claim the region as part of the United States of America. The Tonquin's captain, Jonathan Thorn, played by Tofino resident Hugo Hall, was brash and not well liked by his crew. Thorn wanted to trade for sea otter furs with Gisele's great-great-grandfather Nookmis. But when Nookmis told him the price for one pelt was three blankets, 30 beads, 30 buckets and three knives, Thorn scoffed and shoved the otter pelt in Nookmis' face. In Astoria, American writer Washington Irving's account chronicling the entire journey of the Tonquin, Thorn is said to have 'slapped' the chief in the face.

The next day, angry Tla-o-qui-aht warriors boarded the ship and threw the captain overboard. 'The captain got clubbed by the women and disappeared under water,' Gisele regaled the audience on the sunny June 11 afternoon.

One crew member, James Lewis, played by Clayoquot Action's Dan Lewis, allegedly scuttled to the bottom of the ship and lit five tons of gunpowder. 'KA-BOOM!' Joe exclaimed as the children ran around the mock Tonquin ship with sparklers. 'Sparks flew and Nookmis got thrown overboard.' The Tonquin's crew and roughly 100 brave Tla-o-qui-aht warriors perished in the sea. Martin says Lewis became the first 'suicide bomber' of Clayoquot Sound.

'People in Opitsaht could see the mast of the ship for three years poking out of the water. During that time, Tla-o-qui-aht became very diligent about protecting this coast,' said Gisele. It wasn't until 20 years later that Tla-o-qui-aht started having a relationship with some of the British trading companies.

'That's why Tofino is here today and that's also why this is not part of the United States today. We've never sold this land. We've never ceded it; we've never signed it away in a treaty,' said Gisele, noting Tla-o-qui-aht's fight to protect Meares Island from old-growth logging, preserving the source of Tofino's drinking water. '[I]n 1984 Tla-o-qui-aht took the government all the way to the Supreme Court of Canada. In their own courts, the government could not prove that they owned this land.'

Forty-one years ago, Tla-o-qui-aht First Nation, with support from the Nuu-chah-nulth Tribal Council (NTC), famously declared Meares Island the 'Wanachis Hilth-huu-is Tribal Park' under Nuu-chah-nulth law. The Meares Declaration protected the old-growth forest from being logged and is recognized as one of the largest demonstrations of civil disobedience in North America. Prior to the conservation stance, there was no 'tribal park' in existence under provincial or federal legislation.

The wreck of the Tonquin was never found. But one day in the spring of 2000, a local crab fisherman found his trap hooked on the end of an old, old anchor. That anchor, encrusted with blue trading beads, is believed to be the Tonquin's. It is on display in the gazebo at the Village Green to this day and belongs to the Tla-o-qui-aht.

-30-

Asia Morning Briefing: Risk of Escalating Israel-Iran Conflict Keeps BTC Around 105K Says QCP

Yahoo

6 days ago

  • Business
  • Yahoo

Welcome to Asia Morning Briefing, a daily summary of top stories during U.S. hours and an overview of market moves and analysis. For a detailed overview of U.S. markets, see CoinDesk's Crypto Daybook Americas.

As Asia opens the trading week, BTC is changing hands at around $105,000, stuck in this range due to market uncertainty about whether the Israel-Iran conflict will escalate into a broader regional war, according to a recent note from trading firm QCP.

QCP wrote in a Friday note published on Telegram that risk reversals have "flipped decisively," with front-end BTC puts now commanding premiums of up to 5 volatility points over equivalent calls, a clear indicator of heightened investor anxiety and increased hedging against downside risks.

The firm said that despite this defensive shift in positioning, BTC has demonstrated notable resilience. Even amid recent volatility, which saw over $1 billion in long positions liquidated across major crypto assets, on-chain data shows that institutional buying continues to provide meaningful support. QCP emphasizes that markets remain "stuck in a bind," awaiting clarity on geopolitical outcomes, and warns that the digital asset complex will likely remain tightly linked to headline-driven sentiment shifts for the foreseeable future.
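For readers unfamiliar with the jargon, here is a minimal worked example of the skew QCP is describing; the volatility figures are invented for illustration and are not from the note. A risk reversal compares implied volatility (IV) on out-of-the-money calls and puts at the same delta:

```python
# Hypothetical numbers, for illustration only (not QCP's data).
put_iv_25d = 52.0   # implied vol of a 25-delta BTC put, in vol points
call_iv_25d = 47.0  # implied vol of a 25-delta BTC call, in vol points

# Risk reversal = call IV minus put IV at matched delta.
risk_reversal = call_iv_25d - put_iv_25d

print(f"25-delta risk reversal: {risk_reversal:+.1f} vol points")
# Prints -5.0: puts command a 5-point premium over equivalent calls,
# i.e., traders are paying up for downside protection — the "flip"
# toward hedging that the note describes.
```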
With all that in mind, however, Glassnode data provides some reassurance to investors concerned about longer-term directionality. Although recent volatility underscores short-term anxiety, bitcoin's current cycle gain of 656%, while lower than previous bull markets, is notably impressive given its significantly larger market capitalization today. Previous cycles returned 1076% (2015–2018) and 1007% (2018–2022), suggesting investor demand is still pacing closely with BTC's maturation, even as near-term macro jitters dominate market sentiment.

The OP_Return debate was less important than what a "loud but small group of critics" wanted everyone to think, Galaxy Research's Alex Thorn wrote in a recent note. Thorn described critics' reactions as "wild accusations of the 'death of Bitcoin'" and argued that such hyperbole was misplaced given historically low mempool congestion. On-chain data shows that the mempool is virtually empty compared to a year ago, and the notion that a congested blockchain is suffocating BTC, as was the prevailing narrative in 2023, now appears significantly overstated.

In the note, Thorn further highlighted the irony of labeling arbitrary data as "spam," reminding observers that Bitcoin's creator, Satoshi Nakamoto, famously included arbitrary text — the "chancellor on brink of second bailout" headline — in the Bitcoin blockchain's very first block.

Instead, Thorn argued, the Bitcoin community's attention would be better focused on potential upgrades like CheckTemplateVerify (CTV), a proposed opcode enabling strict spending conditions ("covenants"). "We continue to believe [CTV] is a conservative but powerful opcode that would greatly enhance the ability to build better, safer methods of custody," he wrote, noting that around 20% of Bitcoin's hashrate already signaled support for the upgrade. Bitcoin upgrades require extensive consensus-building, reflecting the network's open-source ethos, and Thorn emphasized that cautious, deliberate evolution remains critical for broader adoption and scalability.

Bybit is entering the decentralized exchange space with Byreal, an on-chain trading platform built on Solana, Ben Zhou, Bybit's CEO, announced via X over the weekend. Byreal's testnet is scheduled to launch on June 30, with the mainnet rollout expected later this year. Zhou said that Byreal is designed to combine centralized exchange features, such as high liquidity and fast execution, with the transparency and composability of DeFi. The platform will also include a fair launchpad system and curated yield vaults linked to Solana-native assets like bbSOL.

  • BTC: Bitcoin held near $105,000 after more than $1 billion in leveraged positions were liquidated, led by a $200 million long on Binance, as rising Israel-Iran tensions triggered a sharp selloff, a flight from altcoins, and a brief but intense bout of volatility.
  • ETH: Ethereum rose 2% to around $2,550 after finding strong support at $2,510, showing resilience amid Israel-Iran tensions and broader market volatility, with continued institutional inflows supporting the uptrend.
  • Nikkei 225: Asia-Pacific markets rose Monday, led by Japan's Nikkei 225 gaining 0.87 percent, as investors weighed escalating Israel-Iran tensions, while oil and gold prices surged on safe-haven demand.
  • Gold: Gold climbed to $3,447 in early Asian trading Monday, hitting a one-month high as Middle East tensions and rising expectations of a September Fed rate cut outweighed strong U.S. consumer sentiment data.
  • Chart of the Week: Bitcoin's Summer Lull Still Offers 'Inexpensive' Trading Opportunity (CoinDesk)
  • Trump Strategist Outlines How Bitcoin Helped Republicans Win the 2024 Election (Decrypt)
  • Will the Cardano Foundation Buy BTC? (CoinDesk)

Florida beefs up laws against deepfake porn, luring children, sex trafficking. What to know

Yahoo

11-06-2025

  • Politics
  • Yahoo

Florida has taken some big steps in the fight against sexually explicit deepfakes. On June 10, Gov. Ron DeSantis signed five bills designed to protect children against sexual crimes. The bills expand Florida laws against luring or enticing children; add more registration and reporting requirements for sexual predators and offenders and mandate minimum terms for subsequent offenses; add harsher penalties — including the death penalty — for anyone convicted of human trafficking for sexual exploitation of children under 12 or individuals who are mentally incapacitated; and provide an enforceable framework to remove deepfake material from online platforms.

The bill against deepfakes is called 'Brooke's Law' after Brooke Curry, the daughter of former Jacksonville Mayor Lenny Curry. She was 16 when a teenage boy she didn't know used a picture from her Instagram account to create an image she later described as 'embarrassing, vulgar, rude and against everything I stand for.' Curry, now 18, testified at a Florida House of Representatives committee hearing for passage of House Bill 1161 and was on hand when DeSantis signed the bill.

'Florida has zero tolerance for criminals who exploit children,' DeSantis said. 'Throughout my time in office, we've worked with the legislature to strengthen penalties for child abuse, hold predators accountable, and ensure that Florida remains a safe place to raise a family.'

Here's what to know.

'Deepfakes' are fake images or videos of real people created through graphics software or AI generators. Faces taken from social media posts or other pictures available online are photoshopped onto adult movie actresses or models, or used to generate explicit AI-generated content without consent, which is then shared around school, sent to family members or employers, or uploaded to websites for millions to see.

Software to create convincing deepfake nudes has become increasingly accessible, and the content it produces is often indistinguishable from real images and video. It's used as a tool of abuse, humiliation and harassment that disproportionately targets teenage girls and women. Celebrities such as Taylor Swift, Jenna Ortega and Megan Thee Stallion have seen false sexual imagery of themselves spread across the web, and women politicians are frequent targets.

One in eight teens age 13 to 17 personally know someone who has been victimized by deepfake nudes, according to a report from Thorn, a nonprofit focused on childhood safety online. One in 17 said they were a direct victim.

Deepfake penalties: Students used AI to create nude photos of their classmates. For some, arrests came next.

Law enforcement was slow to address the problem as it worked out how to handle the ever-changing, murky world of cybercrime, and many websites refused to remove images or video after they were reported. If they were removed, offenders would share them somewhere else, further traumatizing the victim.

In May, President Donald Trump signed the bipartisan Take It Down Act to federally criminalize publication of non-consensual intimate imagery, also known as NCII. The law requires social media platforms and similar websites to remove nonconsensual intimate imagery — defined as including realistic, computer-generated pornographic images and videos that depict identifiable, real people — within 48 hours of notice from a victim.
Florida's HB 1161, Removal of Altered Sexual Depictions Posted without Consent, provides victims with a legal mechanism to fight deepfakes by requiring specified websites and online services to establish a process for victims to request removals, with a clear and conspicuous notice of the process in easy-to-understand language. Once a victim makes a written request for removal, the platform must remove the content and any copies within 48 hours. Failure to reasonably comply will be considered an unfair or deceptive act or practice under the Florida Deceptive and Unfair Trade Practices Act, subject to cease and desist orders and civil penalties of up to $10,000 for each violation, plus actual damages and attorney's fees and costs.

The bill provides liability protections for platforms that act in good faith. The platforms are required to establish a clear and prominent process for reporting such content. Email providers, information services and websites whose content is not user-generated are not included. The bill is effective immediately.

HB 777 expands the laws on luring and enticing children:

  • Expands the age of the victim involved to any child under 14 (previously a child under 12)
  • Prohibits a person 18 years of age or older from intentionally luring or enticing, or attempting to lure or entice, a child under the age of 14 into or out of a structure, dwelling or conveyance for other than a lawful purpose
  • 1st violation: bumped up from a first-degree misdemeanor to a third-degree felony
  • 2nd or subsequent violation: bumped up from a third-degree felony to a second-degree felony
  • If committed by an offender with a previous violation of certain offenses: bumped up from a third-degree felony to a second-degree felony
  • Expands the scope of the offense by including luring 'or out of' buildings and vehicles, not just into one
  • Prohibits ignorance of the victim's age, misrepresentation of the victim's age by another person, or what the defendant sincerely believed the victim's age to be as a legal defense

This bill takes effect Oct. 1, 2025.

HB 1351 adds sexual predator and offender reporting requirements to block some reporting loopholes, including:

  • Requires registrants to report their occupation, business name, employment address and employment phone number
  • Requires sexual offenders and predators to report in-state travel residences within 48 hours, either online through the Florida Department of Law Enforcement's (FDLE) online system or in person with the sheriff's office, and removes a requirement to report it to the Department of Highway Safety and Motor Vehicles
  • Requires local law enforcement agencies to conduct address verifications of sexual offenders at least once per calendar year, and of sexual predators four times per calendar year, to ensure the accuracy of the information
  • Clarifies that 'permanent residence,' for sexual predator and sexual offender registration and reporting purposes, means the person's home or other place where the person primarily lives

This bill takes effect Oct. 1, 2025.
HB 1455 provides mandatory minimum sentences for certain sexual offenses when committed by registered offenders or predators. It requires a court to impose a mandatory minimum sentence if a person who has previously been convicted of a specified sexual offense is convicted of committing a subsequent specified sexual offense, even if the mandatory minimum exceeds the maximum authorized sentence:

  • 10 years for: lewd or lascivious molestation of a victim under 16 years of age; lewd or lascivious molestation of an elderly or disabled person; online solicitation of a minor, traveling to meet a minor, or prohibited computer usage; possession or transmitting of child pornography
  • 15 years for possession of child pornography with the intent to promote
  • 20 years for: use of a child in a sexual performance; promoting a sexual performance by a child; buying or selling minors

The bill specifies that, except in the case of a pardon or conditional medical release, a person sentenced must serve the full minimum sentence. This bill takes effect Oct. 1, 2025.

SB 1804 establishes a new felony offense, 'Capital Human Trafficking of Vulnerable Persons for Sexual Exploitation,' and makes it a capital offense punishable by life imprisonment without the possibility of parole or death. Under the law, the offense is committed by a person 18 years or older who knowingly initiates, organizes, plans, finances, directs, manages or supervises a venture that has subjected a child less than 12 years of age, or a person who is mentally defective or mentally incapacitated, to human trafficking for sexual exploitation.

The U.S. Supreme Court has held that death sentences are limited to murder cases with at least one aggravating factor. 'No one has been executed for a non-murder offense in this country since 1964,' an analysis of the bill states, and life imprisonment without the possibility of parole is the current maximum sentence for capital sexual battery due to a string of court cases in both the U.S. and Florida Supreme Courts. Six other states — Georgia, Louisiana, Montana, Oklahoma, South Carolina and Texas — and the U.S. military allow for death sentences for various specifications of underage rape. There have been no executions under those laws so far, and two people sentenced in Louisiana had their sentences overturned by the U.S. Supreme Court. This bill takes effect Oct. 1, 2025.

Steve Patterson, Florida Times-Union, contributed to this story.

This article originally appeared on Tallahassee Democrat: Florida strengthens laws against deepfake nudes, sex trafficking
