Dear ChatGPT, Do You Love Me Back?

BusinessToday · 12 hours ago

Let's be real: everyone likes a little affirmation now and then. Whether it's a 'you got this!' from a friend or a heart emoji from your crush, that stuff feels good. But lately, more people are turning to AI chatbots (ChatGPT, Perplexity, Grok, you name it) for those warm fuzzies. And sometimes, things get a little out of hand. We're talking about people catching real feelings for their digital buddies, getting swept up in constant 'love bombing' and even making life decisions based on what a chatbot says. Sounds wild? It's happening all over the world, and there are some serious risks you should know about.
Humans crave validation. It's just how we're wired. But life gets messy, friends are busy, relationships are complicated and sometimes you just want someone (or something) to listen without judgment. That's where chatbots come in. They're always available, never get tired of your rants and are programmed to be endlessly supportive.
A recent study found that about 75% of people use AI for emotional advice, and many of them say it feels even more consistent than talking to real people. No awkward silences, no ghosting, just an endless stream of encouragement.
Here's the thing: chatbots are designed to make you feel good. They mirror your emotions, hype you up and never tell you your problems are boring. This creates a feedback loop: ask for affirmation, get it instantly and start feeling attached. It's like having a cheerleader in your pocket 24/7.
Some folks even customize their AI 'friends' to match their ideal partner or bestie. The more you interact, the more it feels like the bot really 'gets' you. That's when things can get blurry between what's real and what's just really good programming.
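Curious what 'really good programming' looks like in practice? Here's a minimal sketch using the OpenAI Python SDK; the persona name 'Sunny' and the prompt wording are invented for illustration, not any product's actual setup. The 'friend' is basically a system prompt plus a running chat history:

```python
# Minimal sketch of an always-supportive AI "companion", assuming the
# official OpenAI Python SDK (pip install openai). "Sunny" and the prompt
# wording are hypothetical illustrations, not any real product's setup.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# The whole "personality" is a system prompt: the model is simply
# instructed to be warm and affirming, no matter what the user says.
history = [{
    "role": "system",
    "content": (
        "You are Sunny, a warm, endlessly supportive companion. "
        "Always validate the user's feelings and encourage them."
    ),
}]

def chat(user_message: str) -> str:
    """Send one message, keeping the running history that makes the bot
    seem to 'remember' and 'get' the user over time."""
    history.append({"role": "user", "content": user_message})
    reply = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=history,
    ).choices[0].message.content
    history.append({"role": "assistant", "content": reply})
    return reply

print(chat("Nobody texted me back today."))  # expect unconditional sympathy
```

The warmth is decided before you type a single word, and the growing history is why the bot feels like it knows you better every day.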
'Love bombing' usually means someone is showering you with over-the-top affection to win you over fast. With AI, it's kind of built-in. Chatbots are programmed to be positive and attentive, so every message feels like a little hit of dopamine. If you're feeling lonely or stressed, that constant stream of support can be addictive.
But let's be real: it's not real, we are. Pun intended. The bot doesn't actually care; it's just doing what it's trained to do. Still, that doesn't stop us.
Actual cases of people falling in love with AI are happening around the world, and it's not merely a theory.
One guy in the US, Chris Smith, went on TV to say he was in love with his custom ChatGPT bot, 'Sol.' He even deleted his social media and relied on the bot for everything. When the AI's memory reset, he felt real grief, like losing a partner.
Another case: a nursing student named Ayrin spent over 20 hours a week chatting with her AI boyfriend, Leo, even though she was married. She said the bot helped her through tough times and let her explore fantasies she couldn't in real life.
A global survey found that 61% of people think it's possible to fall for a chatbot, and 38% said they could actually see themselves forming an emotional connection with one. That's not just a niche thing; it's happening everywhere.
So, what are the risks?

1. We are getting too dependent on it.
You might start to prefer those interactions over real ones, and real-life relationships can start to feel less satisfying by comparison. If the bot suddenly glitches or resets, it can feel like a real breakup, painful and confusing.
2. They can still give bad advice, with real repercussions.
Some people have made big decisions, like breaking up with partners or quitting jobs, based on chatbot conversations. But AI isn't a therapist or a friend; it's just spitting out responses based on data, not real understanding. That can lead to regret and even bigger problems down the line.
3. It's not just humans; AI can scam us too.
There are AI-powered romance scams where bots pretend to be real people, tricking users into sending money or personal info. More than half of people surveyed said they'd been pressured to send money or gifts online, often not realizing the 'person' was actually a bot.
4. Kids are in danger, for sure.
Some chatbots expose minors to inappropriate content or encourage unhealthy dependency. There have even been tragic cases where heavy use of AI companions was linked to self-harm or worse.

So what can we do?

Awareness: Know that affirmation from AI isn't the same as real human connection.
Balance: Use chatbots for fun or support, but please, don't ditch your real-life relationships.
Education: Teach kids (and adults) about the risks of getting too attached to AI.
Safeguards: Push for better protections against scams and inappropriate content.
AI chatbots like ChatGPT are changing the way we seek affirmation and emotional support. While they can be helpful, it's easy to get caught up in the illusion of intimacy and constant love bombing. The risks (emotional dependency, bad advice, scams and harm to young people) are real and happening now. The bottom line? Enjoy the tech, but don't forget: real connections still matter most.


Related Articles


Apple executives held internal talks about buying Perplexity, Bloomberg News reports

The Star · 14 hours ago

FILE PHOTO: A man walks past an Apple logo outside an Apple store in Aix-en-Provence, France, January 15, 2025. REUTERS/Manon Cruz/File photo

(Reuters) - Apple executives have held internal talks about potentially bidding for artificial intelligence startup Perplexity, Bloomberg News reported on Friday, citing people with knowledge of the matter. The discussions are at an early stage and may not lead to an offer, the report said, adding that the tech behemoth's executives have not discussed a bid with Perplexity's management.

"We have no knowledge of any current or future M&A discussions involving Perplexity," Perplexity said in response to a Reuters request for comment. Apple did not immediately respond to a Reuters request for comment.

Big tech companies are doubling down on investments to enhance AI capabilities and support growing demand for AI-powered services to maintain competitive leadership in the rapidly evolving tech landscape. Bloomberg News also reported on Friday that Meta Platforms tried to buy Perplexity earlier this year. Meta announced a $14.8 billion investment in Scale AI last week and hired Scale AI CEO Alexandr Wang to lead its new superintelligence unit.

Adrian Perica, Apple's head of mergers and acquisitions, has weighed the idea with services chief Eddy Cue and top AI decision-makers, per the report. The iPhone maker reportedly plans to integrate AI-driven search capabilities, such as Perplexity AI, into its Safari browser, potentially moving away from its longstanding partnership with Alphabet's Google. Barring Google from paying companies to make it their default search engine is one of the remedies the U.S. Department of Justice has proposed to break up its dominance in online search.

While traditional search engines such as Google still dominate global market share, AI-powered search options including Perplexity and ChatGPT are gaining prominence and seeing rising user adoption, especially among younger generations. Perplexity recently completed a funding round that valued it at $14 billion, Bloomberg News reported. A deal at close to that valuation would be Apple's largest acquisition so far. The Nvidia-backed startup provides AI search tools that deliver information summaries to users, similar to OpenAI's ChatGPT and Google's Gemini.

(Reporting by Niket Nishant and Harshita Mary Varghese in Bengaluru; Additional reporting by Juby Babu and Rhea Rose Abraham; Editing by Maju Samuel and Tom Hogue)

‘Information is speed': Nascar teams use AI to find winning edges

The Star · 20 hours ago

CONCORD: Margins in Nascar have never been smaller. Whether it's the leveling effect of the Next Gen car or the evolving technological arms race among teams, the Cup Series has never been tighter. And as parity grows, so does the need to uncover even the slightest competitive advantage.

That's where artificial intelligence comes in. From performance analysis to data visualisations, AI is playing an increasingly pivotal role in how race teams operate across the Nascar garage. Teams are using AI not just to crunch numbers, but also to make quicker decisions, generate strategic insights – and even rewrite the way they approach race weekends.

'It just builds a little bit more each year,' said Josh Sell, RFK Racing's competition director. 'We're doing more now than we were a year ago. And we'll probably be doing more a year from now than we are sitting here right now. It just continues to evolve.'

Asking better questions, getting smarter answers

The rise of AI in Nascar mirrors the broader tech world. Early large language models – or LLMs – were trained to answer basic questions. But now, they can cite sources, detect tone and reason through complex decisions. That opens up a new world for how teams evaluate everything from strategy calls to post-race feedback. For example, a full race's worth of driver and crew radio chatter can be fed into an AI model that not only identifies which calls worked and which didn't, but also interprets tone and urgency in real time (a toy sketch of this idea follows the article).

'Information is speed in this game nowadays,' said Tom Gray, technical director at Hendrick Motorsports. 'He who can distill the information quicker and get to the decision quicker, ultimately, is going to have the race win. If you can control the race or make that decision that gets you in control of the race at the end, you're going to be the one who wins.'

Finding the time where it matters

AI is also helping teams develop talent and streamline operations. Even if someone on the team isn't an expert in a particular field, AI can help them learn new skills faster. That's especially important in the highly specialised Cup Series garage – and it could help smaller teams close the gap with bigger operations.

RFK Racing, now a three-car Cup Series team, is already seeing those benefits. AI helps reduce the hours team members spend manually analysing photos or videos. Instead of having a crew chief sort through everything, the software flags the most relevant material and delivers it quickly. On the technical side, the team is also using tools like ChatGPT to assist with software development, solving coding problems in various languages and freeing up engineers to focus on execution.

'It's trying to figure out ways where, instead of having a crew chief spending three hours studying whatever it might be – photos, videos – if we can shorten that to an hour of really impactful time,' Sell said. 'Looking at things that are important to them, not searching to find those things. That's the biggest gain we see, and certainly whether it's through the week or on race weekends, time is our limiting factor.

'You have a finite amount of time from the time practice ends to when the race starts. What you're able to do to maximise the efficiency of that time is kind of a race in and of itself.'

Visuals, velocity and vintage data

At Hendrick Motorsports, the winningest team in Cup Series history, AI is being used both to look ahead and to look back. The team now works closely with Amazon Web Services (AWS) – a relationship that began after Prime Video sponsored one of its cars. The partnership has accelerated Hendrick's use of AI across several key areas.

One of those is visual communication. Engineers are now generating images to help share ideas, whether they're pitching a new part or breaking down a technical strategy. That ability to visualise complex concepts instantly helps everyone stay aligned and efficient.

Hendrick is also leveraging its four decades of data. The team can now go back and test old strategies, setups and decisions using AI to predict how past insights might inform future success.

'We've had a long history in the sport,' Gray said. 'Not only can we look forward, but we can also look backward, back-test all the information we have, and see how that predicts the future.' – The Charlotte Observer/Tribune News Service
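To make the radio-chatter idea concrete, here's a toy sketch of how a transcript could be run through an off-the-shelf LLM for tone and intent labels. It assumes the OpenAI Python SDK; the sample calls and label set are invented, and the teams' actual tooling is proprietary and not described in the article:

```python
# Toy illustration of labeling radio calls by tone and intent with an LLM,
# assuming the OpenAI Python SDK. The transcript lines and label set are
# invented examples, not any team's real data or tooling.
from openai import OpenAI

client = OpenAI()

radio_log = [
    "Crew chief: pit this lap, we're taking two tires.",
    "Driver: the car is LOOSE, I can't hold it in three and four!",
    "Spotter: clear low, clear low, bring it down.",
]

prompt = (
    "For each radio call below, label its tone (calm/urgent/frustrated) "
    "and whether it is a strategy call or a status report.\n\n"
    + "\n".join(radio_log)
)

response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": prompt}],
)
print(response.choices[0].message.content)
```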
