Researchers taking the wind out of cyclone devastation
Although Alfred was an ex-tropical cyclone by the time it reached Queensland in March, it still managed to cause over a billion dollars' damage.
For most, the toll it could have taken had it made landfall as a Category 5 system, generating winds in excess of 250km/h, doesn't bear contemplating.
Yet for an elite team of Australian National University researchers, it's the kind of scenario that lives rent free in their heads.
The group is on course to establish how aerosols might hold the key to stopping destructive cyclones in their tracks.
The small airborne particles have been shown to stunt storm development, according to the study's lead author Associate Professor Roslyn Prinsley.
With climate change making cyclones more dangerous, she is convinced innovative solutions have become crucial.
"Others have looked at the impact of aerosols on a fully grown cyclone, when it might be about to hit land," she explained.
"We thought, it may be easier to stop them before they start."
Prof Prinsley and her colleagues have already shown it's possible.
The trick lies in understanding the complex physics of how clouds form, including how tiny particles interact, how heat is released and how these processes impact one another.
Past efforts to modify storms have failed because researchers couldn't reliably predict their behaviour. Without accurate forecasting models, attempts to alter cloud formation have largely proved to be guesswork.
However, understanding how aerosols of different sizes disrupt extreme weather systems at the formation stage has provided the way forward.
"We found coarse aerosols initially dampen vortex acceleration, while fine or ultrafine aerosols boost it first but later weaken it more than coarse aerosols," Prof Prinsley said.
"Getting these aerosols to where they're needed is another challenge we're looking at - it would require several aircraft to disperse the aerosols over a few hours."
She is confident Australia will become a global leader in the somewhat obscure scientific space, with the coastline off Western Australia providing a ripe testing ground.
Cyclones that form there but will never make landfall are the best candidates for testing.
The ANU team is collaborating with a Silicon Valley start-up also aiming to weaken cyclones before they threaten lives.
"The Australian research is the only long-term solution," according to Aeolus co-founder Koki Mashita.
"In many parts around the world, the intensification of these events due to climate change has already led to significant increases in insurance premiums.
"As we look into the next few decades, properties will truly become uninsurable and we will need to intervene."
Related Articles
Your AI use could have a hidden environmental cost
Whether it's answering work emails or drafting wedding vows, generative artificial intelligence tools have become a trusty copilot in many people's lives. But a growing body of research shows that for every problem AI solves, hidden environmental costs are racking up.

Each word in an AI prompt is broken down into clusters of numbers called 'token IDs' and sent to massive data centers — some larger than football fields — powered by coal or natural gas plants. There, stacks of large computers generate responses through dozens of rapid calculations. The whole process can take up to 10 times more energy to complete than a regular Google search, according to a frequently cited estimation by the Electric Power Research Institute.

So, for each prompt you give AI, what's the damage? To find out, researchers in Germany tested 14 large language model (LLM) AI systems by asking them both free-response and multiple-choice questions. Complex questions produced up to six times more carbon dioxide emissions than questions with concise answers. In addition, 'smarter' LLMs with more reasoning abilities produced up to 50 times more carbon emissions than simpler systems to answer the same question, the study reported.

'This shows us the tradeoff between energy consumption and the accuracy of model performance,' said Maximilian Dauner, a doctoral student at Hochschule München University of Applied Sciences and first author of the Frontiers in Communication study published Wednesday.

Typically, these smarter, more energy intensive LLMs have tens of billions more parameters — the biases used for processing token IDs — than smaller, more concise models. 'You can think of it like a neural network in the brain. The more neuron connections, the more thinking you can do to answer a question,' Dauner said.
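The 'token IDs' described above can be illustrated with a toy example. This word-level tokenizer is a deliberate simplification — production LLMs use subword vocabularies with tens of thousands of entries — but it shows the basic idea of mapping a prompt's words to integer IDs before any computation happens:

```python
def tokenize(prompt, vocab):
    """Map each word in a prompt to an integer token ID.

    New words are assigned the next unused ID; repeated words
    reuse the ID they were given the first time.
    """
    ids = []
    for word in prompt.lower().split():
        if word not in vocab:
            vocab[word] = len(vocab)  # assign the next unused ID
        ids.append(vocab[word])
    return ids

vocab = {}
print(tokenize("please answer my work emails please", vocab))
# The repeated word "please" maps to the same ID both times.
```

Each extra token in a prompt or response means extra calculations in the data center, which is why longer, more polite exchanges consume more energy.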
Complex questions require more energy in part because of the lengthy explanations many AI models are trained to provide, Dauner said. If you ask an AI chatbot to solve an algebra question for you, it may take you through the steps it took to find the answer, he said. 'AI expends a lot of energy being polite, especially if the user is polite, saying 'please' and 'thank you,'' Dauner explained. 'But this just makes their responses even longer, expending more energy to generate each word.' For this reason, Dauner suggests users be more straightforward when communicating with AI models. Specify the length of the answer you want and limit it to one or two sentences, or say you don't need an explanation at all.

Most important, Dauner's study highlights that not all AI models are created equal, said Sasha Luccioni, the climate lead at AI company Hugging Face, in an email. Users looking to reduce their carbon footprint can be more intentional about which model they choose for which task. 'Task-specific models are often much smaller and more efficient, and just as good at any context-specific task,' Luccioni explained. If you are a software engineer who solves complex coding problems every day, an AI model suited for coding may be necessary. But for the average high school student who wants help with homework, relying on powerful AI tools is like using a nuclear-powered digital calculator. Even within the same AI company, different model offerings can vary in their reasoning power, so research what capabilities best suit your needs, Dauner said. When possible, Luccioni recommends going back to basic sources — online encyclopedias and phone calculators — to accomplish simple tasks.

Putting a number on the environmental impact of AI has proved challenging.
The study noted that energy consumption can vary based on the user's proximity to local energy grids and the hardware used to run AI, which is partly why the researchers chose to represent carbon emissions within a range, Dauner said. Furthermore, many AI companies don't share information about their energy consumption — or details like server size or optimization techniques that could help researchers estimate energy consumption, said Shaolei Ren, an associate professor of electrical and computer engineering at the University of California, Riverside who studies AI's water consumption. 'You can't really say AI consumes this much energy or water on average — that's just not meaningful. We need to look at each individual model and then (examine what it uses) for each task,' Ren said.

One way AI companies could be more transparent is by disclosing the amount of carbon emissions associated with each prompt, Dauner suggested. 'Generally, if people were more informed about the average (environmental) cost of generating a response, people would maybe start thinking, 'Is it really necessary to turn myself into an action figure just because I'm bored?' Or 'do I have to tell ChatGPT jokes because I have nothing to do?'' Dauner said.

Additionally, as more companies push to add generative AI tools to their systems, people may not have much choice how or when they use the technology, Luccioni said. 'We don't need generative AI in web search. Nobody asked for AI chatbots in (messaging apps) or on social media,' Luccioni said. 'This race to stuff them into every single existing technology is truly infuriating, since it comes with real consequences to our planet.'

With less available information about AI's resource usage, consumers have less choice, Ren said, adding that regulatory pressures for more transparency are unlikely to come to the United States anytime soon. Instead, the best hope for more energy-efficient AI may lie in the cost efficacy of using less energy.
'Overall, I'm still positive about (the future). There are many software engineers working hard to improve resource efficiency,' Ren said. 'Other industries consume a lot of energy too, but it's not a reason to suggest AI's environmental impact is not a problem. We should definitely pay attention.'
Meet the tiny Australian moth that travels 1,000 km and navigates using the stars
An Australian moth follows the stars during its yearly migration, using the night sky as a guiding compass, according to a new study. When temperatures heat up, nocturnal Bogong moths fly about 1,000 kilometres to cool down in caves by the Australian Alps. They later return home to breed and die.

Birds routinely navigate by starlight, but the moths are the first known invertebrates, or creatures without a backbone, to find their way across such long distances using the stars.

Scientists have long wondered how the moths travel to a place they've never been. A previous study hinted that Earth's magnetic field might help steer them in the right direction, along with some kind of visual landmark as a guide. Since stars appear in predictable patterns each night, scientists suspected they might help lead the way.

They placed moths in a flight simulator that mimicked the night sky above them and blocked out the Earth's magnetic field, noting where they flew. Then they scrambled the stars and saw how the moths reacted. When the stars were as they should be, the moths flapped in the right direction. But when the stars were in random places, the moths were disoriented. Their brain cells also got excited in response to specific orientations of the night sky.

The findings were published Wednesday in the journal Nature. It 'was a very clean, impressive demonstration that the moths really are using a view of the night sky to guide their movements,' said Kenneth Lohmann, who studies animal navigation at the University of North Carolina at Chapel Hill and was not involved with the new research. Researchers don't know what features of the night sky the moths use to find their way. It could be a stripe of light from the Milky Way, a colourful nebula or something else entirely.
Whatever it is, the insects seem to rely on that, along with Earth's magnetic field, to make their journey.

Other animals harness the stars as a guide. Birds take celestial cues as they soar through the skies, and dung beetles roll their dung short distances while using the Milky Way to stay on course.

It's an impressive feat for Bogong moths, whose brains are smaller than a grain of rice, to rely on the night sky for their odyssey, said study author David Dreyer with Lund University in Sweden. 'It's remarkable that an animal with such a tiny brain can actually do this,' Dreyer said.

