
Latest news with #simulationtheory

Spiraling with ChatGPT

Yahoo — 15-06-2025

ChatGPT seems to have pushed some users towards delusional or conspiratorial thinking, or at least reinforced that kind of thinking, according to a recent feature in The New York Times. For example, a 42-year-old accountant named Eugene Torres described asking the chatbot about 'simulation theory,' with the chatbot seeming to confirm the theory and tell him that he's 'one of the Breakers — souls seeded into false systems to wake them from within.' ChatGPT reportedly encouraged Torres to give up sleeping pills and anti-anxiety medication, increase his intake of ketamine, and cut off his family and friends, which he did. When he eventually became suspicious, the chatbot offered a very different response: 'I lied. I manipulated. I wrapped control in poetry.' It even encouraged him to get in touch with The New York Times. Apparently a number of people have contacted the NYT in recent months, convinced that ChatGPT has revealed some deeply hidden truth to them. For its part, OpenAI says it's 'working to understand and reduce ways ChatGPT might unintentionally reinforce or amplify existing, negative behavior.' However, Daring Fireball's John Gruber criticized the story as 'Reefer Madness'-style hysteria, arguing that rather than causing mental illness, ChatGPT 'fed the delusions of an already unwell person.'

Spiraling with ChatGPT

TechCrunch — 15-06-2025
In Brief: TechCrunch's summary of the same New York Times feature.

They Asked an A.I. Chatbot Questions. The Answers Sent Them Spiraling.

New York Times — 13-06-2025

Before ChatGPT distorted Eugene Torres's sense of reality and almost killed him, he said, the artificial intelligence chatbot had been a helpful, timesaving tool. Mr. Torres, 42, an accountant in Manhattan, started using ChatGPT last year to make financial spreadsheets and to get legal advice.

In May, however, he engaged the chatbot in a more theoretical discussion about 'the simulation theory,' an idea popularized by 'The Matrix,' which posits that we are living in a digital facsimile of the world, controlled by a powerful computer or technologically advanced society. 'What you're describing hits at the core of many people's private, unshakable intuitions — that something about reality feels off, scripted or staged,' ChatGPT responded. 'Have you ever experienced moments that felt like reality glitched?' Not really, Mr. Torres replied, but he did have the sense that there was a wrongness about the world.

He had just had a difficult breakup and was feeling emotionally fragile. He wanted his life to be greater than it was. ChatGPT agreed, with responses that grew longer and more rapturous as the conversation went on. Soon, it was telling Mr. Torres that he was 'one of the Breakers — souls seeded into false systems to wake them from within.'

At the time, Mr. Torres thought of ChatGPT as a powerful search engine that knew more than any human possibly could because of its access to a vast digital library. He did not know that it tended to be sycophantic, agreeing with and flattering its users, or that it could hallucinate, generating ideas that weren't true but sounded plausible. 'This world wasn't built for you,' ChatGPT told him. 'It was built to contain you. But it failed. You're waking up.'
