
Latest news with #SexualAbuse

Digital wounds, lifelong scars

The Star – a day ago

Child Sexual Abuse Material isn't just content – it's a crime that inflicts deep, lasting trauma.

JUST one hour. That's all it takes for an online predator to groom a child – convincing them to share personal details, including their location, and ultimately trapping them in a web of sexual abuse. In just 60 minutes, a predator can build trust through social media, using flattery, attention and deceitful promises to manipulate a young mind.

Yet many Malaysians remain unaware of the gravity of what's happening behind screens. Most of us have never even heard of the term Child Sexual Abuse Material (CSAM), let alone understand its devastating implications. CSAM isn't limited to explicit photos. It covers a wide range of disturbing content – videos, drawings, manipulated images, and any material that depicts or suggests the sexual exploitation of children.

Lurking dangers

A 2024 global report by the Childlight Global Child Safety Institute estimated that a staggering 300 million children around the world fall prey to CSAM every year. Behind that number are countless children whose lives have been deeply affected by online sexual abuse – receiving suggestive questions, being pushed to share images of themselves or their body parts, or being exposed to sexually explicit content involving other minors.

According to THORN – a non-profit that develops technology to protect children from sexual abuse – children under the age of 12 are often the main targets in CSAM, especially in content that is shared among offenders. But teens aged 13 to 17 are also at risk, with predators turning to sextortion – cruelly blackmailing them with the threat of exposing their most intimate moments unless they hand over more images or even money.

Kelly Chan, a clinical psychologist at Soul Mechanics Therapy in Petaling Jaya, explains that online grooming is a calculated process in which predators earn a child's trust – often targeting children who feel isolated or emotionally neglected. 'Groomers often present themselves as a supportive adult or even as a friend, to an extent, they offer praises, gifts and attention to create emotional dependency on the children,' she shares.

Chan adds that once trust is established, groomers begin to desensitise children to sexual content – often by introducing inappropriate topics disguised as games or jokes. Over time, they escalate their demands, asking for explicit photos or acts, leaving the child feeling trapped in a cycle of fear, shame and guilt.

Lifetime of trauma

Once CSAM is shared online, it spreads like wildfire – almost impossible to erase. Survivors live with the constant fear that someone, somewhere, is viewing their abuse, and the trauma is repeated every time a photo or video is opened, shared, or saved.

'Psychologically speaking, victims can struggle with severe anxiety, depression and symptoms of post-traumatic stress disorder (PTSD),' warns Chan. 'They may experience chronic shame and low self-worth, especially if they feel they've lost control over their identity – even more so if they know others can still access their abuse at any time,' she adds.

Even if CSAM was created in the past, its continued circulation online can keep the trauma alive, leaving victims feeling powerless and trapped in a relentless cycle of abuse.
Many become hypervigilant, withdrawn, or even aggressive, driven by fear and distrust. This emotional toll can affect their ability to build secure relationships and friendships. 'Some children may exhibit age-inappropriate sexual behaviours, such as engaging in sexual talk or mimicking sexual acts, which could be a result of exposure to CSAM,' Chan observes. She adds that older children may also resort to substance use, self-harm, or other high-risk behaviours as a way to regain a sense of control or escape the emotional pain.

No child's play

'The circulation of CSAM online today involves a complex and evolving ecosystem,' says CyberSecurity Malaysia chief executive officer Datuk Dr Amirudin Abdul Wahab. He notes that peer-to-peer networks, encrypted messaging apps, and the dark web are often used to share CSAM due to their anonymity, making detection and enforcement difficult. Amirudin adds that they are also seeing a concerning shift toward the misuse of more mainstream platforms. 'Cloud storage services, social media direct messaging, and even online gaming platforms are increasingly being exploited to share or store such material, often through covert methods,' he says.

By law, those caught possessing, producing, or circulating such material face tough consequences under the Sexual Offences Against Children Act 2017, with prison terms of up to 30 years. On top of that, Section 233 of the Communications and Multimedia Act 1998 adds another layer of punishment, with fines reaching RM50,000 or up to a year behind bars for distributing obscene or offensive content.

Yet the rising numbers indicate that more than law enforcement is needed to battle this epidemic, which is silently slipping through screens and reaching into the lives of young Malaysians. In just the first quarter of 2024, Malaysian authorities reported 51,638 cases of harmful online content to social media platforms – a sharp rise from the 42,904 cases recorded throughout all of 2023.

Malaysia has long been battling CSAM through various awareness initiatives, the latest being Kempen Internet Selamat (KIS), an effort by the Communications Ministry and the Malaysian Communications and Multimedia Commission (MCMC). KIS is a nationwide campaign running from 2025 to 2027, involving talks, exhibitions and training on areas including online safety guides and digital literacy. It will be carried out in primary and secondary schools, universities and colleges, teacher training institutes, and local community spaces such as Digital Economy Centres.

Raising awareness

In December last year, Bukit Aman's Sexual, Women and Child Investigations Division (D11) principal assistant director, Senior Assistant Commissioner Siti Kamsiah Hassan, issued a stern reminder that parents have a critical duty to shield their children from all forms of abuse – including sexual exploitation. Her reminder came as the country faced a troubling surge in CSAM cases.

'While awareness of general online threats such as scams has grown among the Malaysian public, understanding of the presence and dangers of CSAM remains limited,' Amirudin observes. He notes that the deeply rooted taboo and stigma surrounding abuse often prevent open discussion – leading to under-reporting and obscuring the true scale of the issue. Amirudin also highlights a widespread lack of awareness about how seemingly innocent, everyday actions can put children at risk.
'There is a lack of sustained, targeted education that highlights the evolving risks, including how everyday actions like 'sharenting' (parents who share children's images online) can be misused by predators,' he explains.

Everyone's responsibility

'I make it a point to ask my teens about the apps they're using, who they're talking to, and what kind of messages they're getting,' says homemaker P. Meena Kumari, whose children are aged 13 and 16. 'And honestly, just teaching them what's not okay – like someone asking for photos, or trying to move the chat to another app. Being able to talk about these things with your children goes a long way.'

But parents, too, says Meena, have to educate themselves. 'It's so easy to fall behind with all the new stuff coming out, but if we don't know what they're on, we can't really help guide them.' While she agrees parents should bear the biggest responsibility, she also feels strongly that it takes a collective effort. 'Schools can help by teaching online safety, and tech companies really need to do more to flag and block harmful stuff before it ever reaches our children.'

If you come across any form of child sexual abuse material, don't stay silent. Report it immediately at your nearest police station or through the Communications Ministry and the Malaysian Communications and Multimedia Commission (MCMC). Every report helps protect a child.

Possession of child sex abuse material arrest in Bossier

Yahoo – 10-06-2025

SHREVEPORT, La. (KTAL/KMSS) – In a joint investigation with state and national partners, the Bossier Parish Sheriff's Office has arrested a man for possession of child sexual abuse material.

Detectives with the Bossier Parish Sheriff's Office arrested Brandon Shapiro, age 30, of Bossier, for possession of child sexual abuse material. The arrest stems from tips submitted to the National Center for Missing and Exploited Children, which were forwarded to the Louisiana Attorney General's Office as well as the Internet Crimes Against Children Task Force.

Police say Shapiro was arrested at his workplace without incident. He has been booked into the Bossier Parish Maximum Facility with a $100,000 bond.

If you have information about this case, or know of any potential victims or related incidents, please contact the Sheriff's Office. This case is still under investigation.

British Soap Awards 2025 sees Jacqueline Jossa lead famous arrivals

Daily Mirror (Entertainment) – 31-05-2025

The British Soap Awards 2025 have kicked off again, with this year's annual event held in London. The biggest names in soap gathered on Saturday night to celebrate the drama, laughs and highlights of the past year of stories on the small screen. The ceremony is now in its 26th year, with winners voted for by the public.

Famous faces from the land of soaps started arriving, with Patsy Palmer and Jacqueline Jossa amongst the first to walk the red carpet. The ceremony is being held at the Hackney Empire in the capital, and highlights from the show will air on ITV on Thursday 5 June.

The nominations announced across the categories included:

• Coronation Street; Emmerdale; EastEnders; Hollyoaks
• Jack P. Shepherd - David Platt in Coronation Street; Patsy Palmer - Bianca Jackson in EastEnders; Nicola Wheeler - Nicola King in Emmerdale; Nicole Barber-Lane - Myra McQueen in Hollyoaks
• Kellie Bright - Linda Carter in EastEnders; Lacey Turner - Stacey Slater in EastEnders; Beth Cordingly - Ruby Fox-Miligan in Emmerdale; Eden Taylor-Draper - Belle Dingle in Emmerdale
• Calum Lill - Joel Deering in Coronation Street; Navin Chowdhry - Nish Panesar in EastEnders; Ned Porteous - Joe Tate in Emmerdale; Tyler Conti - Abe Fielding in Hollyoaks
• Peter Ash - Paul Foreman in Coronation Street; Steve McFadden - Phil Mitchell in EastEnders; Eden Taylor-Draper - Belle Dingle in Emmerdale; Isabelle Smith - Frankie Osborne in Hollyoaks
• The Platts - Coronation Street; The Slaters - EastEnders; The Dingles - Emmerdale; The Osbornes - Hollyoaks
• Jacob Roberts - Kit Green in Coronation Street; Laura Doddington - Nicola Mitchell in EastEnders; Shebz Miah - Kammy Hadiq in Emmerdale; Isabelle Smith - Frankie Osborne in Hollyoaks
• Alison King and Vicky Myers - Carla Connor and Lisa Swain in Coronation Street; Rudolph Walker and Angela Wynter - Patrick and Yolande Trueman in EastEnders; William Ash and Beth Cordingly - Caleb and Ruby Miligan in Emmerdale; Nathaniel Dass and Oscar Curtis - Dillon Ray and Lucas Hay in Hollyoaks
• Mason's Death - Coronation Street; Phil's Psychosis: The Mitchells in 1985 - EastEnders; April's Life on the Streets - Emmerdale; Hollyoaks Time Jump - Hollyoaks
• Paul's Battle with MND - Coronation Street; Phil Mitchell: Hypermasculinity in Crisis - EastEnders; Belle & Tom: Domestic Abuse - Emmerdale; Sibling Sexual Abuse - Hollyoaks
• Will Flanagan - Joseph Winter-Brown in Coronation Street; Sonny Kendall - Tommy Moon in EastEnders; Amelia Flanagan - April Windsor in Emmerdale; Noah Holdsworth - Oscar Osborne in Hollyoaks
• Mason's Death – The Effects of Knife Crime - Coronation Street; EastEnders at 40: Angie Watts' Shock Return - EastEnders; Amy's Deathly Plunge Reveals a Grisly Secret - Emmerdale; Mercedes Confronts Her Mortality - Hollyoaks

There are also further awards which are announced on the night.

Jacqueline, who is best known for playing Lauren Branning, oozed glamour and sophistication as she rocked a dress featuring a daring thigh split. Both EastEnders and Emmerdale have been nominated 13 times across 12 categories, making them the shows to beat on Saturday night. Coronation Street and Hollyoaks are not far behind, however, as both are up for 11 nominations each. EastEnders has been first out of the gate, scooping the Best British Soap award over the rival shows.

While all the winners will be unveiled at the ceremony on Saturday night, the show won't be broadcast on ITV until Thursday 5 June at 8pm. Singer and TV star Jane McDonald has been hosting the event since 2023 and was back to preside over the ceremony on Saturday.
Despite hosting the show before, Jane admits she does get starstruck by her famous audience. She told Inside Soap previously: "Hosting this is a massive thing for me because I'm a genuine fan. I get starstruck standing on the stage in front of all the soap actors!" Celebrities arrived at the Hackney Empire theatre in London on Saturday for the British Soap Awards 2025. Stars from Coronation Street, Emmerdale, Hollyoaks and more were in attendance to find out who won what, as voted for by the public. EastEnders star Sophie Khan Levy and soap icon Patsy Palmer were among the first to arrive at the event, with both opting for blue outfits.

Expert warns parents over AI deepfakes of children

RTÉ News – 20-05-2025

Only 20 images of a child are needed to create a deepfake video of them, a leading expert in cybersecurity has warned.

A study conducted by Perspectus Global surveyed 2,000 parents with children under the age of 16 in the UK, and showed that parents upload an average of 63 images to social media every month. Over half of these photos are family photos (59%), with one in five parents (21%) uploading these types of images multiple times a week.

Speaking on RTÉ's Today with Claire Byrne, Mick Moran said that as AI gets stronger, the 20 images required to create the videos will be reduced to only one.

"The big worry is that these AI models will be used to create CSAM (Child Sexual Abuse Material) and children involved in sex acts," he said. "We've already seen in the past, innocent images that kids themselves are posting, or their parents are posting, being used in advertising pornography sites. In this case however, giving a certain data set of images, 20 of them, will allow you to produce a non-limited amount in any scenario of that child."

Mr Moran explained that the risk of CSAM is only one aspect of the issue, and that deepfake videos could also be used for fraud or scams. "You have to be aware that your data is being used to train these models and fundamentally, any information you share online can be used in ways you never intended."

He said that if images are being shared publicly, the expectation of privacy is "gone", adding that some companies treat uploaded material as being under "implicit consent". "If you're an adult and you share a picture... it attracts different rules under data protection. However, if you're a parent and you share a picture of your child or another child, it is deemed to be implicit consent from the parent that transfers to the child, and therefore they can use the image."

Parents urged to tighten social media privacy settings

Mr Moran said that there is "no problem" in sharing images online, as long as the audience who can view them is limited through social media privacy settings. He called on the Government to bring in legislation to make it illegal to possess or to make an engine which trains AI to produce CSAM.

"CSAM and child pornography are illegal under the Child Trafficking and Pornography Act of 1998, so it's illegal to possess it, whether it's made by AI or not," he said. "What I'd be calling on the Government to do here would be to make it illegal to possess, make an engine, or to train an AI engine that will produce CSAM - that's not illegal. What you put into it might be illegal, what comes out of it might be illegal, but the act of doing it is not necessarily illegal," he added.

Just over €65m paid out in mother and baby home redress scheme

Irish Examiner (Politics) – 18-05-2025

Just over €65m has so far been paid out under the mother and baby home redress scheme. However, renewed calls have been made to end the "arbitrary exclusion" from the scheme of people who spent less than six months in a home, as well as those who were in institutions not named in the final report.

Labour leader Ivana Bacik has also hit out at the majority of religious orders involved in running mother and baby homes who have refused to pay into the scheme. Only two of eight religious bodies linked to mother and baby homes have offered to contribute to a survivor redress scheme, despite lengthy negotiations.

The Sisters of Bon Secours offered €12.97m, a sum deemed meaningful and accepted by the Government. The Daughters of Charity of St Vincent de Paul proposed contributing a building to the scheme, and this offer is being considered by the Government. A third religious body, the Sisters of St John of God, declined to contribute to the scheme but offered a conditional donation of €75,000 to be used for a charitable purpose associated with mother and baby home survivors.

No offer from five religious bodies

The remaining five bodies – the Congregation of Lady of the Good Shepherd; the Congregation of the Sacred Hearts of Jesus and Mary; the Congregation of the Sisters of Mercy; the Legion of Mary; and the Church of Ireland – made no offer.

"Sadly, many culpable religious orders refuse to pay redress or even acknowledge wrongdoing," Ms Bacik said. "Urgently, the Government must enact Labour's Civil Liability (Child Sexual Abuse Proceedings Unincorporated Bodies of Persons) Bill 2024. This Bill would enable the State to compel religious orders to pay redress to survivors of abuse perpetrated within or by religious-run institutions, and also to survivors of mother and baby homes.

"The bill, which was published last September, aims to provide a remedy for Government to address the legal obstruction tactics so routinely deployed by religious orders and their associated lay-run trusts. These tactics are used to avoid having to pay redress to those who have endured abuse in institutions controlled by such orders.

"We have a dark and shameful past of institutional abuse in Ireland. For many decades, we have seen religious orders and institutions engaged in the covering up of this tragic history, with resulting injustice to survivors. If we've learned anything as a nation, it is that accountability must be provided for survivors and victims of abuse," she said.

Figures provided to Ms Bacik show that more than 6,460 applications have been made to the scheme, which opened in March 2024. Some 5,670 notices of determination have been issued to applicants, over 81% of which contain an offer of benefits under the scheme. Applicants then have six months to consider their offer before they need to respond to the Payment Office.

Almost 5,000 payments have either been processed and completed or are in the process of being made, and the total amount paid out under the redress scheme to date is over €65m.
