
Latest news with #AIusers

People Are Asking ChatGPT for Relationship Advice and It's Ending in Disaster

Yahoo

10-06-2025



Despite ChatGPT's well-documented issues, people are using it to advise them on relationship issues — and it's going about as well as you'd expect.

In a new editorial, Vice advice columnist Sammi Caramela said she had been blissfully unaware of the ChatGPT-as-therapist trend until someone wrote into her work email about it earlier this year. Back in February, an unnamed man told the writer that his girlfriend refused to stop using the chatbot for dating advice and would even bring up things it had told her in arguments. Though Caramela was so shocked that she "nearly choked" on her coffee, the advice-seeker wasn't all that perturbed — and claimed he found his girlfriend's ChatGPT use fascinating.

"I was a bit floored by this confession. I had no idea people were actually turning to AI for advice, much less input on their relationships," the columnist wrote in her more recent piece. "However, the more I explored the topic, the more I realized how common it was to seek help from AI — especially in an era where therapy is an expensive luxury."

Intrigued, Caramela found a friend who used the OpenAI chatbot for similar purposes, running relationship issues by it as a "non-biased" sounding board. Eventually, that person realized that ChatGPT wasn't unbiased at all, but rather "seemed to heavily validate her experience, perhaps dangerously so."

Similar questions have been posed on the r/ChatGPT subreddit, and as Caramela explained, the consensus there suggested not only that the chatbot is something of a "yes-man," but also that its propensity to agree with users can be dangerous for people with mental health issues.

"I often and openly write about my struggles with obsessive-compulsive disorder (OCD)," the writer divulged. "If I went to ChatGPT for dating advice and failed to mention how my OCD tends to attack my relationships, I might receive unhelpful, even harmful, input about my relationship."

Digging deeper into the world of ChatGPT therapy, Caramela found multiple threads on OCD-related subreddits about the chatbot — and on the forum dedicated to ROCD, or relationship-focused OCD, someone even admitted that the chatbot told them to break up with their partner.

"Programs like ChatGPT only speed the OCD cycle up because you can ask question after question for hours trying to gain some sense of certainty," another user responded in the r/ROCD thread. "There's always another 'what if' question with OCD."

Like many poorly trained human professionals, chatbots aren't equipped to handle the nuance and sensitivity that any therapeutic context demands. Regardless of what OpenAI claims in its marketing, ChatGPT can't be truly empathetic — and if your "therapist" can never offer a human-to-human connection, why would you want it giving you dating advice in the first place?

More on chatbot blues: Hanky Panky With Naughty AI Still Counts as Cheating, Therapist Says

AI Doesn't Care If You're Polite to It. You Should Be Anyway.

Wall Street Journal

06-06-2025



I often catch myself prefacing my queries to ChatGPT with a 'please' and concluding with a 'thank you.' Apparently, I am not alone. A December 2024 survey published by TechRadar found that approximately 67% of U.S. AI users are also polite and show gratitude toward AI search engines. On April 15, an X user asked whether there's a cost to all this politeness: 'I wonder how much money OpenAI has lost in electricity costs from people saying 'please' and 'thank you' to their models.' OpenAI CEO Sam Altman saw the post and responded: 'Tens of millions of dollars well spent—you never know.' Altman's comment suggests, perhaps half-seriously, that polite behavior could be our salvation when AI systems take over the world in an apocalyptic future.

UAE: More residents use AI 'friend' for relationship advice, shopping

Khaleej Times

04-06-2025

  • Business


The emotional relationship between consumers in the UAE and generative artificial intelligence is deepening, with AI playing the role of a 'friend' they trust to act on their behalf. Notably, people are also turning to gen AI for guidance in their relationships.

'Gen AI is becoming an integral part of our lives, with 72 per cent of consumers using the tools regularly. These human-like interactions are expanding beyond recommendations to meet a wider range of personal needs. Just as they might confide in a friend, 94 per cent of active gen AI users have or would consider asking it for help with personal development goals, and 87 per cent say the same for social and relationship advice,' according to a study released by Accenture, a multinational firm specialising in IT services and management.

Accenture's Consumer Pulse Research 2025 provides insights into how consumers are feeling — and how AI is reshaping sentiment and purchase behaviour this year and beyond. It captured responses from 18,000 consumers in 14 countries: Australia, Brazil, Canada, France, Germany, Mainland China and Hong Kong, Italy, India, Japan, Spain, Sweden, the UAE, the UK and the US.

The study noted that consumers in the UAE and other countries are ready for AI agents to purchase on their behalf, with 75 per cent open to using a trusted AI-powered personal shopper that understands their needs. 'As AI becomes more emotionally intelligent, it can foster meaningful relationships with consumers like never before.' Highlighting how gen AI can provide both physical and emotional support, the study cited the example of a pharmacy offering in-home, humanlike robots that give elderly patients physical assistance as well as companionship.

AI becoming consumer, decision-maker

The UAE has been one of the most advanced AI countries. Realising the potential of this new sector, it established the world's first AI university and appointed the world's first minister for AI. According to Stanford University, the UAE ranked fifth globally in the Global AI Vibrancy Ranking 2023, ahead of France, Germany, Japan, South Korea, Singapore and others. 'From healthcare to transportation, AI is rapidly moving from the lab to daily life,' it said.

Global management consultancy Accenture noted that with intelligent agents now able to proactively act on instructions and make purchases on behalf of the consumer, AI is poised to become the 'decision-maker in everyday interactions — streamlining tasks like product comparison, checkout and post-purchase support'. Nearly one in 10 consumers across the 14 countries already ranks gen AI as their single most trusted source of what to buy.

'What began as a tool that could provide personalised product recommendations or help create content is quickly becoming a powerful engine of consumer behaviour — shaping what people want and expect, and how they buy. But that isn't all. The technology is rapidly evolving towards autonomous task execution. Soon, gen AI won't just influence buying decisions. With agentic AI capabilities, it will make them — essentially becoming the consumer itself,' it said.
