Black spots, satellites and Elon: new technology could fill Australia's mobile gaps – but can it be relied on?

The Guardian, 01-03-2025

Bernie Byrnes, a farmer from the southern tablelands of New South Wales, can tell when he is about to hit a mobile black spot on a highway.
'There's spots on the side of the highway where reception falls out. And you can see it if you're familiar with the vehicles that travel. People will pull over in the same spot to finish a conversation, finish a meeting. And you know that they're familiar with the reception there,' he says.
Successive governments have poured millions of taxpayer dollars into improving mobile coverage across Australia. But the perennial problem is the country's sheer size relative to its population, and whether it is cost-effective to build towers in sparsely populated areas or in places people only pass through, such as highways.
The universal outdoor mobile obligation, to be introduced by the Labor government as legislation if it wins the next election, would expand existing obligations for triple zero access across the nation to include outdoor voice and SMS coverage, and also aim to improve the availability of mobile services during disasters and power outages.
The arrival of new satellite services could help fill the gap.
Low-earth orbit (LEO) satellites fly between 500 and 2,000 kilometres above sea level and orbit the Earth multiple times a day.
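A rough back-of-the-envelope calculation shows why these satellites pass overhead so often. Assuming a circular orbit at about 550 km (the lower end of the band described above, and roughly where Starlink operates) and standard values for Earth's radius and gravitational parameter, Kepler's third law gives an orbital period of around 96 minutes, or about 15 orbits per day:

$$
T = 2\pi\sqrt{\frac{a^{3}}{\mu}} = 2\pi\sqrt{\frac{(6371\,\text{km} + 550\,\text{km})^{3}}{398{,}600\ \text{km}^{3}/\text{s}^{2}}} \approx 5{,}730\ \text{s} \approx 96\ \text{min}
$$

By contrast, a geostationary satellite sits at about 35,786 km, which is why LEO constellations can offer much lower latency but need many satellites to maintain continuous coverage.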
While the world's craving for constant connection has driven the development and global use of these technologies, the proliferation of smaller satellites and service providers has caused concern among astronomers over issues such as light pollution and space junk.
But they are quickly filling the gap in telecommunications services for remote places where rolling out fixed networks or mobile towers is cost-prohibitive, and they offer better quality service than geostationary satellites such as the National Broadband Network's (NBN) Sky Muster satellites.
NBN Co is in the advanced stages of selecting an LEO provider to eventually replace the ageing satellite service in operation now – but meanwhile, more than 200,000 people in Australia have signed up to Elon Musk's Starlink service as an alternative.
Satellite services were initially limited to fixed installations. But the technology is advancing to allow direct-to-device access, meaning that – provided the person is standing outside – they could make calls and send texts over satellite on their mobile phone.
In the recent Los Angeles wildfires, T-Mobile used its direct-to-device service in partnership with Starlink to allow customers using existing 4G handsets to make calls and send text messages despite traditional mobile network outages.
The Albanese government is banking on this advancement to plug black spots across Australia and make networks more resilient in the event of power outages or natural disasters.
There is scant detail on how much it would cost. Officials in Senate estimates hearings last week could not put a price tag on the new universal outdoor mobile obligation, saying cost was 'a matter for future budget consideration by government'. Current funding for the existing universal service obligations is $270m a year.
The government plans to introduce legislation in 2025 after consultation – putting the timeline beyond the next election – and it would not be in place until late 2027.
The government has flagged that the policy needs to include support for 'public interest objectives and competition outcomes'.
Competition is a major concern. The only commercial LEO operator in Australia right now is Starlink. While others, such as Amazon's Project Kuiper, are planned for Australia, concerns were raised in Senate estimates by the Greens senator Sarah Hanson-Young as to whether sovereign risk analysis had been done.
'If Starlink is the only company, US-owned, what does that mean if somebody – Elon Musk, somebody else, I don't know – decides that it's not a service to be offered to Australia?' Hanson-Young asked.
James Chisholm, the infrastructure department deputy secretary, said extensive analysis had been conducted, and that despite Starlink playing an important role, the policy 'is sending a clear signal' on welcoming other entrants into the market.
David Howell, a resident of Mount Wilson in the Blue Mountains in NSW, estimates that about half of the roughly 70 properties in his town do not get mobile coverage at home.
'Any improvement we can get in mobile reception would be fantastic,' he says. What's especially important, he adds, is for the community to be able to contact one another during emergency situations, when mobile towers may be down and landlines aren't working.
'Anything will help because it's very important we be able to contact the community [about] what's going on, and whether to evacuate or not.'
Byrnes says improved coverage would give peace of mind, and make work more efficient.
'The more that you can rely on technology, the more efficient that you can become,' he says. 'If we're waiting for a carrier to come and pick up stock or wool or whatever, if they're running late, we rely on them to give us a heads up, or vice versa.
'If we're sitting around, it's dead time for everyone. And so if we've got reliable phone reception, we can let people know.'

