Scientists 'strike gold' in shocking discovery from Hawaiian volcanic rocks
Researchers at Göttingen University in Germany have, in their words, "struck gold" with recent findings from volcanic rocks.
A new study found that volcanic rocks from Hawaii, which rose from deep beneath the lithosphere, contain various precious metals.
Dr. Nils Messling of Göttingen University's Department of Geochemistry said in a news release that the team was surprised when the test results came in.
"When the first results came in, we realized that we had literally struck gold! Our data confirmed that material from the core, including gold and other precious metals, is leaking into the Earth's mantle above," Messling said.
Approximately 99% of the Earth's gold is buried deep in the planet's metallic core, far out of humankind's reach.
That gold sits about 1,800 miles below the surface.
The rocks also contained ruthenium, a metal that, like gold, was locked into the core as the Earth formed. Its presence may be a telling sign that these volcanic rocks come from deep within the Earth.
"Our findings not only show that the Earth's core is not as isolated as previously assumed. We can now also prove that huge volumes of super-heated mantle material – several hundreds of quadrillion metric tonnes of rock – originate at the core-mantle boundary and rise to the Earth's surface to form ocean islands like Hawaii," said Professor Matthias Willbold in a news release.
Ruthenium isotopes in the Earth's core differ slightly from those found at the surface, variations that were long too small to detect. However, new procedures developed by researchers at the University of Göttingen have made it possible to measure them.
"Whether these processes that we observe today have also been operating in the past remains to be proven. Our findings open up an entirely new perspective on the evolution of the inner dynamics of our home planet," Messling said in a statement.
With these precious metals leaking toward the Earth's surface, the findings suggest that supplies of gold and other metals important for renewable energy may have originated in the Earth's core.

Original article source: Scientists 'strike gold' in shocking discovery from Hawaiian volcanic rocks
Related Articles
Yahoo, an hour ago
Your AI use could have a hidden environmental cost
Whether it's answering work emails or drafting wedding vows, generative artificial intelligence tools have become a trusty copilot in many people's lives. But a growing body of research shows that for every problem AI solves, hidden environmental costs are racking up.

Each word in an AI prompt is broken down into clusters of numbers called "token IDs" and sent to massive data centers — some larger than football fields — powered by coal or natural gas plants. There, stacks of large computers generate responses through dozens of rapid calculations. The whole process can take up to 10 times more energy to complete than a regular Google search, according to a frequently cited estimation by the Electric Power Research Institute.

So, for each prompt you give AI, what's the damage? To find out, researchers in Germany tested 14 large language model (LLM) AI systems by asking them both free-response and multiple-choice questions. Complex questions produced up to six times more carbon dioxide emissions than questions with concise answers. In addition, "smarter" LLMs with more reasoning abilities produced up to 50 times more carbon emissions than simpler systems to answer the same question, the study reported.

"This shows us the tradeoff between energy consumption and the accuracy of model performance," said Maximilian Dauner, a doctoral student at Hochschule München University of Applied Sciences and first author of the Frontiers in Communication study published Wednesday.

Typically, these smarter, more energy-intensive LLMs have tens of billions more parameters — the biases used for processing token IDs — than smaller, more concise models. "You can think of it like a neural network in the brain. The more neuron connections, the more thinking you can do to answer a question," Dauner said.
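As a rough illustration of the tokenization step described above, the sketch below maps a prompt to integer token IDs. This is a toy example with a hand-made vocabulary, not any real model's tokenizer; production systems use learned subword vocabularies with tens of thousands of entries.

```python
# Toy illustration of tokenization: a prompt is split into pieces and
# each piece is mapped to an integer "token ID" from a vocabulary.
TOY_VOCAB = {"how": 0, "does": 1, "ai": 2, "use": 3, "energy": 4, "?": 5}

def toy_tokenize(prompt):
    """Lowercase the prompt, split it into words, and look up token IDs."""
    pieces = prompt.lower().replace("?", " ?").split()
    return [TOY_VOCAB[p] for p in pieces if p in TOY_VOCAB]

ids = toy_tokenize("How does AI use energy?")  # [0, 1, 2, 3, 4, 5]
```

The list of IDs, not the raw text, is what the data center's hardware actually processes; longer prompts and longer answers mean more IDs, and therefore more computation per response.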
Complex questions require more energy in part because of the lengthy explanations many AI models are trained to provide, Dauner said. If you ask an AI chatbot to solve an algebra question for you, it may take you through the steps it took to find the answer, he said.

"AI expends a lot of energy being polite, especially if the user is polite, saying 'please' and 'thank you,'" Dauner explained. "But this just makes their responses even longer, expending more energy to generate each word."

For this reason, Dauner suggests users be more straightforward when communicating with AI models: specify the length of the answer you want and limit it to one or two sentences, or say you don't need an explanation at all.

Most important, Dauner's study highlights that not all AI models are created equal, said Sasha Luccioni, the climate lead at AI company Hugging Face, in an email. Users looking to reduce their carbon footprint can be more intentional about which model they choose for which task. "Task-specific models are often much smaller and more efficient, and just as good at any context-specific task," Luccioni explained.

If you are a software engineer who solves complex coding problems every day, an AI model suited for coding may be necessary. But for the average high school student who wants help with homework, relying on powerful AI tools is like using a nuclear-powered digital calculator. Even within the same AI company, different model offerings can vary in their reasoning power, so research which capabilities best suit your needs, Dauner said. When possible, Luccioni recommends going back to basic sources — online encyclopedias and phone calculators — to accomplish simple tasks.

Putting a number on the environmental impact of AI has proved challenging.
The study noted that energy consumption can vary based on the user's proximity to local energy grids and the hardware used to run AI models, which is partly why the researchers chose to represent carbon emissions within a range, Dauner said.

Furthermore, many AI companies don't share information about their energy consumption — or details like server size or optimization techniques that could help researchers estimate it — said Shaolei Ren, an associate professor of electrical and computer engineering at the University of California, Riverside, who studies AI's water consumption. "You can't really say AI consumes this much energy or water on average — that's just not meaningful. We need to look at each individual model and then (examine what it uses) for each task," Ren said.

One way AI companies could be more transparent is by disclosing the amount of carbon emissions associated with each prompt, Dauner suggested. "Generally, if people were more informed about the average (environmental) cost of generating a response, people would maybe start thinking, 'Is it really necessary to turn myself into an action figure just because I'm bored?' Or 'do I have to tell ChatGPT jokes because I have nothing to do?'" Dauner said.

Additionally, as more companies push to add generative AI tools to their systems, people may not have much choice in how or when they use the technology, Luccioni said. "We don't need generative AI in web search. Nobody asked for AI chatbots in (messaging apps) or on social media," Luccioni said. "This race to stuff them into every single existing technology is truly infuriating, since it comes with real consequences to our planet."

With less available information about AI's resource usage, consumers have less choice, Ren said, adding that regulatory pressure for more transparency is unlikely to come to the United States anytime soon. Instead, the best hope for more energy-efficient AI may lie in the cost efficiency of using less energy.
'Overall, I'm still positive about (the future). There are many software engineers working hard to improve resource efficiency,' Ren said. 'Other industries consume a lot of energy too, but it's not a reason to suggest AI's environmental impact is not a problem. We should definitely pay attention.'


Yahoo, 6 hours ago
Scientists may have found evidence of a fifth ‘force of nature'
Every action in our world is powered by a "force of nature." Currently, scientists recognize four fundamental forces: gravity, electromagnetism, the weak interaction, and the strong interaction. The latter two are nuclear forces. However, some scientists believe a fifth force of nature may exist, and a new paper claims to have found evidence of it.

A group of researchers from Switzerland, Australia, and Germany believe that this fifth force could be hiding deep within the hearts of atoms. While the Standard Model of physics has evolved over the years to explain both quantum and cosmic phenomena, there are still massive gaps that leave physicists baffled. Dark matter is a big one, of course, and even gravity hasn't been fully solved, despite being one of the primary forces of nature. Introducing a fifth force of nature, along with other fields and particles, could broaden our understanding of the universe in important ways. But finding evidence that these forces actually exist is the difficult part.

That's why the researchers involved in this new study started small. Instead of trying to work at a cosmic scale, they looked at things on an atomic level, focusing their attention on the nuclei of four different isotopes of calcium. Typically, electrons are confined by the attraction between their own charge and the positively charged particles in the center of the atom. But if you give them a little kick, they can jump to a higher orbit. This phenomenon is known as atomic transition.
The exact timing of the jump depends heavily on the structure of the nucleus, which means an element can have multiple atomic transitions depending on the number of neutrons found within it. The researchers believe that a fifth force of nature could be the driving engine behind these small interactions. Their experiments found a small gap between the atomic transitions — just enough room for a particle with a mass believed to be somewhere between 10 and 10 million electronvolts. Determining whether that gap is indeed evidence of another force of nature will require additional experimentation and improved calculations, though.
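For a sense of scale, particle masses quoted in electronvolts can be converted to kilograms with the standard relation m = E / c². The sketch below does this for the mass range reported above; it uses only the SI-defined values of the electronvolt and the speed of light, and is an illustration rather than part of the study's calculations.

```python
# Convert a particle mass quoted in electronvolts (eV/c^2) to kilograms,
# using m = E / c^2 with SI-defined constants.
EV_IN_JOULES = 1.602176634e-19  # joules per electronvolt (exact, SI)
C = 299_792_458.0               # speed of light in m/s (exact, SI)

def ev_to_kg(mass_ev):
    """Return the mass in kilograms for a mass given in eV/c^2."""
    return mass_ev * EV_IN_JOULES / C**2

low = ev_to_kg(10)       # lower end of the reported range, ~1.8e-35 kg
high = ev_to_kg(10e6)    # upper end of the reported range, ~1.8e-29 kg
```

Even at the upper end, such a particle would be roughly a hundred thousand times lighter than an electron (about 9.1e-31 kg), which is why detecting its influence requires such precise measurements of atomic transitions.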