
How AI Is Supercharging The New Wave Of AI Scams
A recent survey finds 72% of respondents say AI is making scams more sophisticated.
By the time Daniel realized he'd been scammed, he'd already lost more than half a million dollars – and nearly everything else.
A 63-year-old deaf former competitive swimmer with a government job and a tidy nest egg, Daniel (a pseudonym) was just looking for a better return than his 401(k). What he found instead was a sophisticated AI-enabled scam that wiped out his savings, shattered his marriage and left him owing hundreds of thousands in back taxes and fees for lump-sum withdrawals from retirement accounts.
'I just knew I had been pig butchered,' Daniel wrote in an interview. 'The scam caused the loss of my marriage, my family, my share of assets and my ability to retire. The beautiful life I had came to a sudden halt. Completely destroyed.'
Daniel's story isn't unique. It's becoming alarmingly common, according to Chris Groshong and Joseph Albiñana, co-founders of CoinStructive, a firm that helps victims of digital asset fraud. And the reason these scams are proliferating so quickly? AI.
'These scammers have leveled up,' Groshong said in an interview. 'They're using AI to create hyperrealistic personas. Video chats. Lip-synced conversations. It's not like five years ago when you could spot a fake. AI has erased that line.'
He cited one client – an AI professional – who was tricked by a convincing video call with a scammer impersonating a well-known investor. 'He lost $1.3 million,' Groshong said. 'It was his entire nest egg. The scammers used AI against him, and he still can't believe he fell for it.'
Johanna Cabildo is CEO of the Data Guardian Network (DGN), a project committed to AI privacy and security that builds community around ethical data use and gamification. She confirms that AI tools are making these attacks faster and cheaper to launch, supercharging social engineering and enabling scammers to reach a wider audience of digital asset owners.
'AI is also fueling automated threats like wallet-draining scripts that adapt to evade detection, smart contract exploits refined through reinforcement learning and airdrop scams that mimic legitimate distributions. One campaign used AI to scan Discord chats, identify users asking for help and reply with malicious links. It was posing as "support staff" using human-like responses,' Cabildo wrote in a text response.
Daniel's experience followed a now-familiar pattern. A stranger messaged him on Facebook after a casual comment about golf. She claimed to be an investor and wholesaler of successful skin care products.
'I checked her online,' Daniel said. 'She had photos with dermatologists, images with boxes of her product, a Houston business address – everything seemed legit.'
She introduced him to a platform called PawnFi (later renamed Polarise), and even deposited $30,000 into his account to earn his trust. She sent photos, shared personal details and warned him to keep everything secret.
He began trading on PawnFi. Made small withdrawals. Saw big seven-figure returns. Then came the hook: he was told he'd exceeded the platform's profit threshold and had to pay a $250,000 tax bill from his own pocket to access his funds.
'That's when it clicked,' he said. 'It was too late.'
Groshong said that tax-payment ruse is a common pressure tactic: 'They let you withdraw small amounts to build trust. Then, when the amount gets big, they lock it down and demand more money.' Adding insult to injury, scammers often refer their own victims to fake 'crypto recovery' services – just to squeeze out a few more dollars.
A recent survey published on Statista found that 72% of respondents believe AI is making scams more sophisticated. Albiñana confirms those findings, saying AI is accelerating both the scale and the precision of these scams.
'They're saturating people with outreach,' he explained. 'Texts, emails, popups, phone calls. All it takes is one weak moment. And now with AI, that "pretty face" or "video call" looks and sounds completely real.'
Pig butchering schemes – long cons where scammers build trust over weeks or months – are the most common. But CoinStructive also sees imposter scams, fake firmware updates, romance cons and task scams where victims are 'hired' to perform meaningless digital tasks before being asked to front money for fees.
Even smart, cautious people fall for it.
'There is a poison for every single person,' Albiñana said. 'Crypto scams aren't about intelligence. They're about trust. And AI helps scammers mimic that trust better than ever.'
Daniel said the hardest part now isn't the debt – it's the shame.
'I used to have good faith and trust in most people,' he wrote. 'It's no longer there. I hate to say it, but my naiveté got the best of me and ruined my life.'
For those just entering the digital asset space, Daniel had a simple warning: 'Don't place trust in anyone whether you know them or not. The criminals are very sharp and very good at what they do.'
Groshong and Albiñana also shared practical tips to help protect yourself.
Cabildo noted that it's not just about spotting AI fakes; it's about confirming verifiable trust, especially when dealing with digital assets.
'For individuals, this means treating every direct message and "urgent opportunity" as suspicious. Always check URLs for accuracy – this one is incredibly simple, but it's a common mistake. Projects should implement multi-signature wallets, which require multiple approvals for transactions, and adopt AI threat tools that flag on-chain anomalies like sudden bot-driven token movements,' she concluded.
If you've been scammed or suspect a fraud, report it immediately to the FTC, IC3 and your local authorities. And if you need help navigating the murky waters of digital asset recovery, firms like CoinStructive are doing their part – though even they admit successful recoveries are rare.
'Out of hundreds of cases,' Groshong said, 'only two have ever gotten their money back. And one of them is still waiting.'
For Daniel, the only thing left is to keep going and piece his life back together in the wake of the AI scam. Some concerned individuals helped him set up a fundraising page to keep hope alive. 'I'm living one day at a time,' he said. 'Very differently. Very cautiously.'