
Latest news with #digitalrelationships

Is surveillance culture fuelling child cyberstalking?

BBC News

20 hours ago


Children being drawn into a world of cyberstalking need to be educated about healthy relationships in the digital age, says Safeguarding Minister Jess Phillips.

Her comments came in response to a BBC investigation that found some children as young as 10 and 11 had been reported to police forces in England for suspected cyberstalking. Experts say constant monitoring online is becoming normalised from a young age.

Phillips told the BBC: "We really need to be out there educating young people on what healthy relationships look like and that will be part of the government's violence against women and girls strategy."

Cyberstalking is defined as using digital tools to harass, send threats or spread false information. Like physical stalking, it is fixated, obsessive, unwanted and repetitive behaviour that causes fear, distress or alarm in the victim.

"Young people are told they should be flattered by this sort of behaviour, but it's very serious and can really control lives, making them anxious and nervous," said Phillips.

'My heart sank'

Charlotte Hooper, who works for The Cyber Helpline, which supports victims of online abuse, knows first-hand how psychologically damaging cyberstalking can be.

When she was 19, pictures from her social media profiles were posted across pornographic websites and other forums filled with explicit comments.

"My heart sank," she recalled. "I didn't really know what was going on or who had done this."

But Charlotte had first become a victim of cyberstalking when she was much younger. As a teen, she had tens of thousands of followers on X - many of them older men. But one of them became disturbingly persistent.

"He messaged me daily: 'Hi,' 'How are you?' 'I wish we could talk more'," she said.

Later, she discovered he was behind the posts on the pornographic sites. The man was cautioned by the police for malicious communications and the messages stopped. But the experience left Charlotte anxious and hyper-aware, especially in public spaces.
The Crime Survey for England and Wales found people aged 16 to 19 were most likely to be victims of stalking in its latest reporting year, ending in March. But the survey does not gather data on under-16s, and new police figures suggest stalking is also affecting younger children.

Charlotte believes the "normalisation of digital surveillance" - especially among young people - is fuelling concerning behaviours.

"Sharing locations, checking online activity, and constant messaging are often seen as signs of love and care - especially when their parents are doing it for safety," she said. "But it also sets precedents for their other relationships."

In Kent, the national charity Protection Against Stalking has expanded its workshops in schools to meet demand.

"We've got so many younger people now being referred in from schools, with the youngest being 13," said operations manager Alison Bird. "It's quite concerning that we are getting referrals from children that age and the perpetrators themselves are equally just as young."

The Suzy Lamplugh Trust - which runs the National Stalking Helpline - said cyberstalking among under-16s remained "significantly under-researched" and underfunded, despite its growing relevance.

At Mascalls Academy secondary school in Kent, students said Snapchat was their most-used app. Its Snap Map feature lets users constantly share their live location with friends.

"When I first got with my girlfriend, pretty quickly we both had each other on Snap Map," one student told the BBC. "It wasn't really a big deal - I already had it with all my friends, so why not her as well?"

Snapchat shared its safety features with the BBC, which include allowing teenagers to set location-sharing to private as the default, and restricting messaging.
Collett Smart, family psychologist and partner in tracking app Life360, says "location sharing can be a valuable tool for both kids and parents but even well-intentioned digital tools should be introduced and managed with care".

She stressed the importance of being clear about meaningful consent, adding: "Teach your child that location sharing should always be a choice, never a condition of trust or friendship, whether with parents, friends, or future partners."

'Risk of exploitation'

For Jo Brooks, principal of Mascalls Academy, one of the biggest challenges is the disconnect between students' online behaviour and their behaviour in the classroom.

"Some young people feel confident online and see the internet as a shield," she said. "It makes them braver and sometimes more hurtful with their words."

Emma Short, professor of cyberpsychology at London Metropolitan University, agrees anonymity can be both protective and harmful.

"It lets people explore identities they might not feel safe expressing in real life," she said. "But it also carries the risk of exploitation."

In November 2022, the National Stalking Consortium submitted a super-complaint to the Independent Office for Police Conduct and the College of Policing, raising concerns about how stalking was being handled. In response, the College of Policing has urged better tracking of online offences.

"Every force now has an action plan to properly record all stalking - including online," said Assistant Chief Constable Tom Harding. "That's really important, because we need to be able to track and monitor these offences."

If you have been affected by the issues raised in this article, help is available from BBC Action Line.
The BBC contacted 46 police forces across the UK. Among the 27 that responded, 8,365 cyberstalking offences had been recorded.

Eight forces were able to provide an age breakdown. The youngest alleged victim recorded was an eight-year-old boy in Wiltshire in 2024, and the youngest suspect was a 10-year-old in Cheshire. The Metropolitan Police had also recorded two victims under the age of 10, but did not specify how old they were.

Anonymity is a common feature in cyberstalking cases, where perpetrators can create multiple accounts to evade detection.

To tackle this, the government introduced the Right to Know statutory guidance in December, allowing victims to learn their stalker's identity as quickly as possible. New measures have also expanded the use of Stalking Protection Orders (SPOs), which can restrict alleged stalkers from contacting their victims. But charities warn court delays are limiting their effectiveness.

"Delays are a big concern," said Phillips. "We're working to strengthen SPOs so victims stay protected - even after sentencing."

AI companion apps such as Replika need more effective safety controls, experts say

ABC News

11-06-2025


The idea of having an emotional bond with a digital character was once a foreign concept. Now, "companions" powered by artificial intelligence (AI) are increasingly acting as friends, romantic partners, or confidantes for millions of people.

With woolly definitions of "companionship" and "use" (some people use ChatGPT as a partner, for instance), it's difficult to tell exactly how widespread the phenomenon is. But AI companion apps such as Replika and Chai each have 10 million downloads on the Google app store alone, while in 2018 Microsoft boasted its China-based chatbot XiaoIce had 660 million users.

These apps allow users to build characters, complete with names and avatars, which they can text or even hold voice and video calls with.

But do these apps fight loneliness, or are they supercharging isolation? And is there any way to tip the balance in the right direction?

Romance and sexuality are big drawcards of the AI companion market, but people can have a range of other reasons for setting up a chatbot. They may be seeking non-judgemental listening, tutoring (particularly in their language skills), advice or therapy.

Bethanie Drake-Maples, a researcher at Stanford University who studies AI companions, says some people also use the apps to reflect their own persona.

"Some people will create a digital twin and just have a relationship with an externalised version of themselves," she tells ABC Radio National's series Brain Rot.

Ms Drake-Maples published a study based on interviews with more than 1,000 students who used the AI companion app Replika. She and her colleagues found there were important benefits for some users. Most significantly, 30 of the interviewees said using the app had prevented them from attempting suicide.

Many participants also reported the app helped them forge connections with other people: by giving advice on their relationships, helping them overcome inhibitions about connecting with others, or teaching them empathy.
But other users reported no benefits, or negative experiences. And outside Ms Drake-Maples' study, AI companions have also been implicated in deaths.

Ms Drake-Maples points out the study was a self-selecting cohort, and not necessarily representative of all Replika users. Her team is carrying out a longer-term study to see if it can glean more insights. But she believes it's possible these apps are, on the whole, beneficial for users.

"We specifically wanted to understand whether or not Replika was displacing human relationship or whether it was stimulating human relationship," she says.

But this social promotion can't be taken for granted. Ms Drake-Maples is concerned that companion apps could replace people's interactions with other humans, making loneliness worse. The participants in her study were much lonelier than the general population, although this isn't necessarily unusual for young college students.

She believes governments should regulate AI companion technology to prevent this isolation.

"There's absolutely money to be made by isolating people," she says. "There absolutely does need to be some kind of ethical or policy guidelines around these agents being programmed to promote social use, and not being programmed to try to isolate people."

Replika says it has introduced a number of controls to make its app safer, including a "Get Help" button that directs people to professional helplines or scripts based on cognitive behavioural therapy, and a message coding system that flags "unsafe" messages and responds in kind.

Ms Drake-Maples thinks this is a good example for other apps to follow. "These things need to be mandated across the board," she says.

Raffaele Ciriello, a researcher at the University of Sydney, is more sceptical of Replika's safety controls, calling them "superficial, cosmetic fixes".
He points out the controls were introduced only months after the Italian government ruled, in early 2023, that the app had to stop using the data of Italian citizens, citing concerns about age verification. "They were fearing a regulatory backlash."

Dr Ciriello has also been interviewing and surveying AI companion users, and while he says some users have found benefits, the apps are largely designed to foster emotional dependence.

"If you look at the way [Replika is] making money, they have all the incentives to get users hooked and dependent on their products," he says.

Replika operates on a "freemium" model: a free base app, with more features (including the romantic partner option) available by paid subscription. Other companion apps follow the same model.

"Replika and their kin have Silicon Valley values embedded in them. And we know what these look like: data, data, data, profit, profit, profit," Dr Ciriello says.

Nevertheless, he also believes it's possible to build AI companion technology more safely and ethically. Companies that consult vulnerable stakeholders, embed crisis response protocols, and advertise their products responsibly are likely to create safer AI companions.

Dr Ciriello says Replika fails on several of these fronts. For instance, he calls its advertising "deceptive": the company badges its product as "the AI companion who cares".

"[But] it's not conscious, it's not actually empathetic, it's not actually caring," Dr Ciriello says.

A Replika spokesperson said the tagline "the AI companion who cares" was "not a claim of sentience or consciousness".

"The phrase reflects the emotionally supportive experience many users report, and speaks to our commitment to thoughtful, respectful design," they said. "In this regard, we are also working with institutions like the Harvard Human Flourishing Program and Stanford University to better understand how Replika impacts wellbeing and to help shape responsible AI development."
Dr Ciriello says the women-centred Australian app Jaimee is an example of an AI companion with better ethical design - although it faces the "same commercial pressures" as bigger apps in the market.

The California Senate last week passed a bill regulating AI chatbots. If the bill continues through the legislature to become law, it will - among other things - require companions to regularly remind users they're not human, and enforce transparency on suicide and crisis data.

The bill is promising, Dr Ciriello says. "If the history of social media taught us anything, I would rather have a national strategy in Australia where we have some degree of control over how these technologies are designed and what their incentives are and how their algorithms work."

But, he adds, research on these apps is still in its infancy, and it will take years to understand their full impact. "It's going to take some time for that research to come out and then to inform sensible legislation."

Listen to the full episode about the rise and risks of AI companions, and subscribe to the podcast for more.
