Donald Trump is saving California from itself


Telegraph · a day ago

Gavin Newsom has changed direction once again. After a brief feint as a Maga-whispering moderate, California's governor has 'woken up' in the wake of the LA immigration riots to become the self-anointed leader of the anti-Trump #Resistance.
Just weeks ago, Newsom launched a podcast, inviting Right-wing firebrands like Charlie Kirk, Michael Savage, and Steve Bannon as his initial guests. Progressives detested this shift. But now that he has effectively denounced Donald Trump as a 'dictator', The Daily Beast and MSNBC have been quick to celebrate his reinvention. The progressive clerisy's homepage, The Atlantic, recently dubbed Newsom 'the nation's foremost Trump foil'.
Although changing colour might help with this chameleon's bid for the 2028 Democrat presidential nomination, it's not good news for the long-suffering people of California. The working and middle classes don't benefit from his performative talk of avoiding tariffs and ignoring federal immigration law. What Newsom should be looking at is how to bolster California's struggling economy, which lags far behind rivals such as Texas and Florida in crucial areas like job creation.
And that would mean making peace with at least some of Donald Trump's agenda.
To be sure, the president's tariffs appear to be hurting California's ports and tech companies dependent on overseas manufacturing, but the state clearly needs some sort of economic paradigm change. Virtually every high wage sector has lost jobs since 2022, including business services and information, the supposed linchpins of the state's economy. All this occurred before Trump's chaotic tariff barrage.
Trump's commitment to investment in new military technology and space exploration, as well as reshoring manufacturing more generally, also opens enormous opportunities for California's heavily Latino blue collar workers. Should Newsom choose to embrace the president's policies, that is.
Consider space. Boosted by a huge surge of investment, global space industry revenues more than doubled between 2005 and 2017, from $175 billion (£130.4 billion) to almost $385 billion (£286.9 billion). By 2040, the industry's global annual revenues are projected to surpass a trillion dollars. California holds a 19 per cent share of the international sector, as well as 40 per cent of the industry in the US.
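A rough back-of-the-envelope sketch of those figures, in Python: the 2005 and 2017 revenues, the 2040 projection, and the 19 per cent share are the numbers cited above, while holding California's share constant out to 2040 is purely an illustrative assumption, not a forecast.

```python
# Rough arithmetic on the space-industry figures cited above (illustrative only).
revenue_2005 = 175e9    # global space industry revenue, 2005 (USD)
revenue_2017 = 385e9    # global space industry revenue, 2017 (USD)
projected_2040 = 1e12   # projected annual global revenue by 2040 (USD)

growth_multiple = revenue_2017 / revenue_2005
print(f"2005-2017 growth: {growth_multiple:.1f}x")  # ~2.2x

# California's cited share of the global sector.
ca_global_share = 0.19
ca_slice_2017 = ca_global_share * revenue_2017
ca_slice_2040 = ca_global_share * projected_2040    # assumes the share holds constant
print(f"Implied California slice, 2017: ${ca_slice_2017 / 1e9:.0f}bn")
print(f"Implied California slice, 2040 (share held constant): ${ca_slice_2040 / 1e9:.0f}bn")
```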
With Trump's backing, that could grow even further. California already enjoys by far the country's largest cohort of aerospace engineers, typically earning salaries around three times the national average. Many are employed by large contractors, but the most exciting developments can be seen in places like El Segundo, which calls itself 'the aerospace capital of the world', and Douglas Park, next to the Long Beach airport.
If Newsom were to wake up from his dogmatic slumbers, he would realise that 'deep tech' firms in space and aerospace likely have a far better future than traditional consumer and media-oriented firms like Salesforce, Meta, and Google, all of which have announced major cutbacks, in part due to artificial intelligence. Even many 'creative jobs' – actors, writers, journalists – could be threatened by AI-generated content.
In contrast, hardware engineers, skilled machinists, and the builders and designers of spacecraft, drones, space mining operations and new engine systems could share an expansive future. The aerospace boom is being driven by more than just a few brilliant geeks backed by H-1B visa indentured servants. Aerospace firms have their share of PhDs, but they also employ welders and other production workers. In a state that has been very hard on blue collar workers, this should be embraced, even if it reflects Trumpian priorities.
There are further opportunities for California among Trump's policy objectives. The president wants to revive the US shipbuilding industry, and California was once critical to constructing America's 'arsenal of democracy'. One place that could benefit is Solano County in the Bay Area, which once was home to Liberty ship production.
Even virulently anti-Trump Hollywood could see advantages. This Newsom-aligned industry is now shedding jobs at a fearful rate, with employment down more than one-third over the past 10 years and 18,000 full-time positions disappearing in just the past three. Tariffs may not be what the industry needs – it is already too dependent on cheaper, highly subsidised foreign productions – but the people who work in it would benefit if California and the Trump White House devised an incentive package to reverse the offshoring of production.
And then there is housing, a prime concern for most Californians. The federal government is the nation's biggest landowner and owns roughly half of California. Republicans have floated selling federal lands as one way to close the deficit. In selected places, federal lands adjacent to the state's large urban areas could also provide an opening for new housing that dodges many of California's currently stifling regulations.
But perhaps Trump's biggest gift would be to push California politics back towards the centre, including on immigration. Due to Trumpian cutbacks, Newsom is being forced to abandon his dream of providing free health services to all undocumented immigrants. Now that the state is suffering a severe deficit, Washington is unlikely to send money to preserve Newsom's dreamscape.
Of course, Newsom blames the current budget deficit on Trump, although he does not explain why many other states, including archrivals Texas and Florida, enjoy surpluses. California would do far better if its governor focused on how to take advantage of Trump's initiatives. After all, Maga will be in office at least until 2028. Californians can enjoy the fruits of Trump's policies even as they grumble darkly about him.





