
This is your brain on ChatGPT

Yahoo · 13 hours ago · Science

Sizzle. Sizzle. That's the sound of your neurons frying over the heat of a thousand GPUs as your generative AI tool of choice cheerfully churns through your workload. As it turns out, offloading all of that cognitive effort to a robot while you look on in luxury may be turning your brain into a couch potato.

That's what a recently published (and yet to be peer-reviewed) paper from some of MIT's brightest minds suggests, anyway. The study examines the "neural and behavioral consequences" of using large language models (LLMs) like ChatGPT for, in this instance, essay writing. The findings raise serious questions about how long-term use of AI might affect learning, thinking, and memory. More worryingly, we recently witnessed it play out in real life.

The study, titled "Your Brain on ChatGPT: Accumulation of Cognitive Debt when Using an AI Assistant for Essay Writing Task," involved 54 participants split into three groups:

  • LLM group: Instructed to complete assignments using only ChatGPT, and no other websites or tools.
  • Search engine group: Allowed to use any website except LLMs; even AI-enhanced search results were forbidden.
  • Brain-only group: Relying only on their own knowledge.

Across three sessions, these groups were tasked with writing an essay on one of three rotating topics. An example question for the topic of "Art" was: "Do works of art have the power to change people's lives?" Participants then had 20 minutes to answer the question for their chosen topic in essay form, all while wearing an Enobio headset that collected EEG signals from their brains. In a fourth session, the LLM and Brain-only groups were swapped to measure any lasting impact of the prior sessions. The results?
Across the first three sessions, Brain-only writers showed the most active, widespread brain engagement during the task, while LLM-assisted writers showed the lowest levels of brain activity across the board (though they routinely completed the task fastest). Search engine users generally fell somewhere in between.

In short, Brain-only writers actively engaged with the assignment, producing more creative and distinctive writing while actually learning. They could quote their essays afterwards and felt strong ownership of their work. By contrast, LLM users engaged less with each session, came to rely on ChatGPT more uncritically as the study went on, and felt less ownership of the results. Their work was judged to be less unique, and participants often failed to accurately quote from their own essays, suggesting reduced long-term memory formation.

Researchers referred to this phenomenon as "metacognitive laziness": not just a great name for a prog-rock band, but also a perfect label for the hazy distance between autopilot and Copilot, where participants disengage and let the AI do the thinking for them.

But it was the fourth session that yielded the most worrying results. According to the study, when the LLM and Brain-only groups traded places, the group that had previously relied on AI failed to bounce back to the engagement levels it showed before using the LLM. To put it simply, sustained use of AI tools like ChatGPT to "help" with tasks that require critical thinking, creativity, and cognitive engagement may erode our natural ability to access those processes later on.

But we didn't need a 206-page study to tell us that. On June 10, an outage lasting over 10 hours cut ChatGPT users off from their AI assistant, and it provoked a disturbing trend of people openly admitting, without any hint of self-awareness, that without access to OpenAI's chatbot they'd suddenly forgotten how to work, write, or function.
This study may have used EEG caps and grading algorithms to prove it, but most of us may already be living its findings. Faced with an easy path and a hard one, many of us would assume that only a particularly smooth-brained individual would willingly take the more difficult, obtuse route. Yet, as this study claims, the so-called easy path may be quietly sanding down our frontal lobes in a lasting way, at least when it comes to our use of AI.

That's especially frightening when you think of students, who are adopting these tools en masse, with OpenAI itself pushing for a wider embrace of ChatGPT in education as part of its mission to build "an AI-Ready Workforce." A 2023 study found that a third of U.S. college students surveyed had used ChatGPT for schoolwork during the 2022/23 academic year. In 2024, a survey from the Digital Education Council claimed that 86% of students across 16 countries use artificial intelligence in their studies to some degree.

AI's big sell is productivity: the promise that we can get more done, faster. And yes, MIT researchers have previously concluded that AI tools can boost worker productivity by up to 15%, but the long-term impact suggests codependency over competency. And that sounds a lot like regression. At least for the one in front of the computer.

Sizzle. Sizzle.
