
Latest news with #NatureHumanBehaviour

Ice Age cave find upends what we know about Australia's first people

The Independent

5 days ago

  • Science
  • The Independent


Archaeologists have uncovered rare artefacts dating to the last ice age at a cave in Australia's Blue Mountains, providing definitive proof that the rugged ranges were once occupied by the continent's first people. Researchers working with First Nations community members found that Dargan Shelter, a frigid site at an elevation of about 1,073 m (3,520 ft) west of Sydney, was occupied by early humans 20,000 years ago.

The findings, published in the journal Nature Human Behaviour, provide the oldest evidence yet of occupation above 700 metres in Australia. They upend previous beliefs that the Blue Mountains ranges were too difficult to occupy during the last ice age, and also hint that such icy landscapes may not have been a hurdle for early human migration. The research also raises further questions about the ingenuity of early Indigenous Australians that enabled them to adapt to these inhospitable conditions.

During the last ice age, frigid conditions extended to the upper reaches of the Blue Mountains above 600 metres, with temperatures at least 8.2 degrees cooler than today and vegetation much sparser than in modern times. Little firewood would have been available in this region during the ice age, and water sources would have been frozen through winter, scientists say.

'Until now, we thought the Australian high country was too difficult to occupy during the last ice age,' said archaeologist Wayne Brennan from the University of Sydney. 'Yet, despite the harsh conditions, our research demonstrates people were moving in and through this high elevation landscape, which is approximately 400m above the treeline,' Dr Brennan said.

In the latest excavations, archaeologists unearthed nearly 700 artefacts at the cave site dating to the last ice age, including features of a hearth. Many of these were prehistoric tools likely used by Australia's first people for cutting or scraping, researchers say. 'It was the excellent state of preservation that enabled us to construct such a robust chronology for Dargan Cave spanning the last 20,000 years,' said Philip Piper, another author of the study.

Most of the claystone tools unearthed were made locally, but one seems to have come from the Jenolan Caves area, about 50 km (31 miles) from the Dargan Shelter site, indicating ancient people were travelling from both the north and the south.

While the Blue Mountains range is a UNESCO World Heritage-listed site recognised for its plant and animal diversity, there have been no safeguards to protect the cultural heritage of its Indigenous people, researchers say. 'Our people have walked, lived and thrived in the Blue Mountains for thousands of years and we knew the cave was there,' said study author and Dharug woman Leanne Watson Redpath. 'It is not only a tangible connection to our ancestors who used it as a meeting place for sharing, storytelling and survival, but is a part of our cultural identity. We need to respect and protect our heritage for the benefit of all Australians,' she said.

Scientists are still unsure which early people accessed the mountains during the last ice age. They suspect multiple Indigenous groups may have been connected to the region. 'We hope that by combining our traditional knowledge with scientific research, we can protect these invaluable storehouses of our history for generations to come,' Dr Brennan said.

AI art can't match human creativity, yet — researchers

DW

11-06-2025

  • Science
  • DW


Generative AI models are bad at representing things that require human senses, like smell and touch. Their creativity is 'hollow and shallow,' say experts.

Anyone can sit down with an artificial intelligence (AI) program, such as ChatGPT, to write a poem, a children's story, or a screenplay. It's uncanny: the results can seem quite "human" at first glance. But don't expect anything with much depth or sensory "richness", as researchers explain in a new study. They found that the Large Language Models (LLMs) that currently power generative AI tools are unable to represent the concept of a flower in the same way that humans do. In fact, the researchers suggest that LLMs aren't very good at representing any 'thing' that has a sensory or motor component, because they lack a body and any organic human experience.

"A large language model can't smell a rose, touch the petals of a daisy or walk through a field of wildflowers. Without those sensory and motor experiences, it can't truly represent what a flower is in all its richness. The same is true of some other human concepts," said Qihui Xu, lead author of the study at Ohio State University, US.

The study suggests that AI's poor ability to represent sensory concepts like flowers might also explain why it lacks human-style creativity. "AI doesn't have rich sensory experiences, which is why AI frequently produces things that satisfy a kind of minimal definition of creativity, but it's hollow and shallow," said Mark Runco, a cognitive scientist at Southern Oregon University, US, who was not involved in the study. The study was published in the journal Nature Human Behaviour on June 4, 2025.

AI poor at representing sensory concepts

The more scientists probe the inner workings of AI models, the more they are finding just how different their 'thinking' is compared to that of humans. Some say AIs are so different that they are more like alien forms of intelligence. Yet objectively testing the conceptual understanding of AI is tricky. If computer scientists open up an LLM and look inside, they won't necessarily understand what the millions of numbers changing every second really mean.

Xu and colleagues aimed to test how well LLMs can 'understand' things based on sensory characteristics. They did this by testing how well LLMs represent words with complex sensory meanings, measuring factors such as how emotionally arousing a thing is, whether you can mentally visualize it, and how strongly it is tied to movement or action. For example, they analyzed the extent to which humans experience flowers by smelling them, or experience them using actions from the torso, such as reaching out to touch a petal. These ideas are easy for us to grasp, since we have intimate knowledge of our noses and bodies, but they are harder for LLMs, which lack a body.

Overall, LLMs represent words well when those words have no connection to the senses or motor actions that we experience or feel as humans. But when it comes to words tied to things we see, taste or interact with using our bodies, that is where AI fails to convincingly capture human concepts.

What's meant by 'AI art is hollow'

AI creates representations of concepts and words by analyzing patterns from the dataset used to train it. This idea underlies every algorithm or task, from writing a poem to predicting whether an image of a face is you or your neighbor. Most LLMs are trained on text data scraped from the internet, but some LLMs are also trained visually, on still images and videos.

Xu and colleagues found that LLMs with visual learning exhibited some similarity with human representations in visual-related dimensions, and those LLMs beat other LLMs trained just on text. But this test was limited to visual learning; it excluded other human sensations, like touch or hearing. This suggests that the more sensory information an AI model receives as training data, the better it can represent sensory aspects.

AI keeps learning and improving

The authors noted that LLMs are continually improving and said it was likely that AI will get better at capturing human concepts in the future. Xu said that when future LLMs are augmented with sensor data and robotics, they may be able to actively make inferences about and act upon the physical world. But independent experts DW spoke to suggested the future of sensory AI remained unclear. "It's possible an AI trained on multisensory information could deal with multimodal sensory aspects without any problem," said Mirco Musolesi, a computer scientist at University College London, UK, who was not involved in the study.

However, Runco said that even with more advanced sensory capabilities, AI will still understand things like flowers completely differently to humans. Our human experience and memory are tightly linked with our senses; it's a brain-body interaction that stretches beyond the moment. The smell of a rose or the silky feel of its petals, for example, can trigger joyous memories of your childhood or lustful excitement in adulthood. AI programs do not have a body, memories or a 'self'. They lack the ability to experience the world or interact with it as animals, humans included, do, which, said Runco, means "the creative output of AI will still be hollow and shallow."

Edited by: Zulfikar Abbany
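As an illustration of the kind of comparison described above, here is a minimal, hypothetical sketch: a stand-in function plays the role of a language model rating words on a 'smell' dimension, and its ratings are correlated with human norm ratings. The norm values, the canned model scores and the rate_with_llm helper are all invented for illustration; this is not the study's data or code.

```python
# Minimal sketch (not the study's actual pipeline): compare made-up "LLM" ratings
# of how strongly a word is experienced through smell against made-up human norms.
from scipy.stats import spearmanr

# Hypothetical human norm ratings (0-5) for the "smell" salience of each word.
human_norms = {"flower": 4.6, "justice": 0.3, "bread": 4.1, "theory": 0.2}

def rate_with_llm(word: str, dimension: str = "smell") -> float:
    """Stand-in for a real model API call; returns a canned score so the sketch runs offline."""
    fake_llm_scores = {"flower": 3.9, "justice": 0.5, "bread": 3.2, "theory": 0.4}
    return fake_llm_scores[word]

words = list(human_norms)
llm_scores = [rate_with_llm(w) for w in words]
rho, p = spearmanr([human_norms[w] for w in words], llm_scores)
print(f"Spearman correlation between LLM and human 'smell' ratings: {rho:.2f} (p = {p:.3f})")
```

A real test would swap the stand-in function for an actual model call and use published sensorimotor norms across thousands of words rather than four toy entries.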

Study finds genes that influence one's sensitivity to the environment and the symptoms of mental disorders one is likely to express

Time of India

11-06-2025

  • Health
  • Time of India


New Delhi: A study has uncovered genes that govern how sensitive a person is to their environment, and that level of sensitivity can influence the symptoms of mental disorders they present. An international team of researchers, led by those at King's College London, UK, analysed nearly 10,900 pairs of identical twins from 11 studies and examined how changes in sensitivity to the environment can influence one's chances of presenting symptoms of ADHD, autism, anxiety and depression, psychosis and neuroticism.

The findings, published in the journal Nature Human Behaviour, show that genes related to molecules important for neurodevelopment, immune function and the central nervous system were associated with autistic traits. Genes that influence how one reacts to stress were found to be linked with depressive symptoms. Further, genes involved in regulating catecholamines -- hormones, such as dopamine and adrenaline, involved in responding to stress -- were linked to psychotic-like experiences, the researchers found.

"Differences in individuals' sensitivity to life experiences can explain why the same negative or positive experiences may have varying effects on people's mental health, depending on their genetic make-up," first author Elham Assary, a postdoctoral researcher at King's College London, said.

The interaction between one's genes and their environment is thought to shape a diverse range of traits across species. The 'nature vs nurture' debate in psychology concerns how much of an individual's characteristics is due to genetics (nature) and how much is due to environment (nurture). "Our findings suggest that specific genetic variants influence how environmental exposures impact psychiatric and neurodevelopmental symptoms," Assary said.

Studies often look at identical twins, as they carry almost entirely identical genetic material -- meaning that differences in their characteristics are more likely due to the environments they experience. "Some people are more sensitive to their circumstances, and this can be positive in good circumstances but can make life more challenging than for others in stressful circumstances," senior author Thalia Eley, professor of developmental behavioural genetics at King's College London, UK, said.

However, discerning which genes are involved in determining what characteristics and symptoms one expresses has proved challenging, especially for complex psychological traits, the team said. "We identified 13 genome-wide significant associations, including genes related to stress reactivity for depression, growth factor-related genes for autistic traits and catecholamine uptake-related genes for psychotic-like experiences," the authors wrote.

Results from the study "provide an important step forward in disentangling gene-environment interactions for psychiatric traits and provide a framework for similar investigations in other traits," senior author Patricia Munroe, professor of molecular medicine at Queen Mary University of London, UK, said.
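The twin design described above can be made concrete with a small, purely illustrative sketch. It uses invented data and column names: because identical twins share their DNA, larger within-pair differences in a symptom score among pairs carrying a hypothetical 'sensitivity' allele would hint at a gene-environment interaction. The study's actual genome-wide analysis is far more involved than this.

```python
# Hypothetical sketch of the twin logic only; data and column names are invented.
import pandas as pd
from scipy.stats import linregress

# Six made-up identical-twin pairs: allele dosage is shared within a pair,
# but symptom scores can differ because each twin has their own environment.
pairs = pd.DataFrame({
    "sensitivity_allele_dosage": [0, 0, 1, 1, 2, 2],
    "twin1_symptom": [3.0, 2.5, 4.0, 1.0, 5.5, 2.0],
    "twin2_symptom": [2.8, 2.6, 2.5, 3.2, 1.5, 6.0],
})

# Absolute within-pair difference: a crude proxy for how strongly environment shows through.
pairs["within_pair_diff"] = (pairs["twin1_symptom"] - pairs["twin2_symptom"]).abs()

# Do pairs carrying more copies of the allele show larger within-pair differences?
fit = linregress(pairs["sensitivity_allele_dosage"], pairs["within_pair_diff"])
print(f"slope = {fit.slope:.2f}, p = {fit.pvalue:.3f}")
```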

Twins study shows how genetic response to environment impacts health

Euronews

10-06-2025

  • Health
  • Euronews


It's an age-old question: is nature or nurture more responsible for how we turn out in life? Scientists have long believed that some combination of our genes and environment – our diets, lifestyles, traumatic events, and much more – shapes our personalities and health outcomes. Now, new research indicates there's another step involved, with our genes affecting how we respond to our life experiences – and these pathways making it more or less likely that we will grapple with a slew of psychological conditions.

The study, which was published in the journal Nature Human Behaviour, analysed data from nearly 22,000 identical twins across 11 studies, in what researchers said was the largest study yet to map the entire DNA of identical twins. They identified genetic-environmental links to conditions as wide-ranging as anxiety, depression, psychotic experiences, neuroticism, autism, and attention deficit hyperactivity disorder (ADHD).

'These findings confirm that genes influence psychiatric and neurodevelopmental traits partly through affecting how people respond to the world around them,' Thalia Eley, one of the study's authors and a professor of developmental behavioural genetics at King's College London, said in a statement.

The researchers looked at identical twins because they have almost the exact same genetic code, making it possible to zero in on how people's DNA and lived experiences interact – and what that overlap means for our well-being. For example, if identical twins had genes that made them more sensitive to environmental factors, they were expected to differ from each other, because they each had unique life experiences that set them on different paths, mental health-wise. But if identical twins had genes that made them less sensitive to outside factors, they were expected to have more similar traits to one another.

Knowing this, the researchers identified particular genes that seemed to carry more weight than others. Growth-related genes were associated with autistic traits, the study found, while genes related to stress reactivity were tied to depression, and genes that help regulate stress hormones were tied to psychotic-like experiences.

The researchers said the findings could be used to better understand how the DNA-lifestyle nexus shapes people's health outcomes, and what that means for people struggling with serious mental health or neurological issues. 'Some people are more sensitive to their circumstances, and this can be positive in good circumstances, but can make life more challenging than for others in stressful circumstances,' Eley said.

Things Humans Still Do Better Than AI: Understanding Flowers

Gizmodo

05-06-2025

  • General
  • Gizmodo


While it might feel as though artificial intelligence is getting dangerously smart, there are still some basic concepts that AI doesn't comprehend as well as humans do. Back in March, we reported that popular large language models (LLMs) struggle to tell time and interpret calendars. Now, a study published earlier this week in Nature Human Behaviour reveals that AI tools like ChatGPT are also incapable of understanding familiar concepts, such as flowers, as well as humans do.

According to the paper, accurately representing physical concepts is challenging for machine learning models trained solely on text, and sometimes images. 'A large language model can't smell a rose, touch the petals of a daisy or walk through a field of wildflowers,' Qihui Xu, lead author of the study and a postdoctoral researcher in psychology at Ohio State University, said in a university statement. 'Without those sensory and motor experiences, it can't truly represent what a flower is in all its richness. The same is true of some other human concepts.'

The team tested humans and four AI models—OpenAI's GPT-3.5 and GPT-4, and Google's PaLM and Gemini—on their conceptual understanding of 4,442 words, including terms like flower, hoof, humorous, and swing. Xu and her colleagues compared the outcomes to two standard psycholinguistic ratings: the Glasgow Norms (ratings of words based on feelings such as arousal, dominance, and familiarity) and the Lancaster Norms (ratings of words based on sensory perceptions and bodily actions). The Glasgow Norms approach saw the researchers asking questions like how emotionally arousing a flower is, and how easy it is to imagine one. The Lancaster Norms, on the other hand, involved questions including how much one can experience a flower through smell, and how much a person can experience a flower with their torso.

In comparison to humans, LLMs demonstrated a strong understanding of words without sensorimotor associations (concepts like 'justice'), but they struggled with words linked to physical concepts (like 'flower,' which we can see, smell, touch, etc.). The reason for this is rather straightforward: ChatGPT doesn't have eyes, a nose, or sensory neurons (yet), so it can't learn through those senses. The best it can do is approximate, even though these models train on more text than a person encounters in an entire lifetime, Xu explained.

'From the intense aroma of a flower, the vivid silky touch when we caress petals, to the profound visual aesthetic sensation, human representation of 'flower' binds these diverse experiences and interactions into a coherent category,' the researchers wrote in the study. 'This type of associative perceptual learning, where a concept becomes a nexus of interconnected meanings and sensation strengths, may be difficult to achieve through language alone.'

In fact, the LLMs trained on both text and images demonstrated a better understanding of visual concepts than their text-only counterparts. That's not to say, however, that AI will forever be limited to language and visual information. LLMs are constantly improving, and they might one day be able to better represent physical concepts via sensorimotor data and/or robotics, according to Xu. She and her colleagues' research carries important implications for AI-human interactions, which are becoming increasingly (and, let's be honest, worryingly) intimate. For now, however, one thing is certain: 'The human experience is far richer than words alone can hold,' Xu concluded.
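To show the shape of the result the article describes, the following toy sketch, with entirely made-up ratings, compares the gap between human and LLM ratings for abstract words versus sensorimotor-heavy words; in the study's reported findings, models tracked humans closely on abstract words but diverged on physically grounded ones.

```python
# Toy ratings on a 0-5 scale; every number below is invented for illustration.
# Tuple: (human sensorimotor strength, human overall rating, LLM overall rating)
words = {
    "justice":  (0.4, 2.1, 2.0),
    "humorous": (0.9, 3.0, 2.8),
    "flower":   (4.5, 4.2, 2.9),
    "hoof":     (4.1, 3.6, 2.4),
}

def mean_abs_gap(items):
    """Average |human rating - LLM rating| over a group of words."""
    return sum(abs(h - m) for _, h, m in items) / len(items)

abstract = [v for v in words.values() if v[0] < 2.0]       # weak sensorimotor grounding
sensorimotor = [v for v in words.values() if v[0] >= 2.0]  # strong sensorimotor grounding

print("mean |human - LLM| gap, abstract words:    ", round(mean_abs_gap(abstract), 2))
print("mean |human - LLM| gap, sensorimotor words:", round(mean_abs_gap(sensorimotor), 2))
```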
