
Counterfeits, dangerous products: AliExpress threatened with EU fine
The European Commission on Wednesday took a significant step towards imposing a substantial fine on Chinese e-commerce giant AliExpress. The Commission preliminarily found that, despite numerous improvements, AliExpress has not adequately managed the risks tied to the sale of illegal products on its platform.
The Brussels-based regulator, acting as the EU's digital watchdog, believes AliExpress breached its obligation to assess and mitigate risks associated with the distribution of illegal products—ranging from counterfeits to items that fail to meet European safety standards.
This marks the first time the Commission has targeted this Alibaba subsidiary under the EU's new Digital Services Act (DSA), which came fully into force last year to strengthen protections for internet users.
In its statement, the Commission highlighted that AliExpress underestimated these risks because of the limited resources allocated to its moderation system, and that the platform failed to correctly enforce its sanctions policy against sellers who repeatedly post illegal content. The regulator pointed to systemic failures that rendered moderation efforts ineffective and easily circumvented by malicious sellers.
AliExpress now has access to the case file and may respond in writing to the preliminary findings. Should the Commission's accusations be confirmed, the platform could face a fine of up to six percent of its annual global turnover and be placed under enhanced supervision until corrective measures are implemented.
The formal challenge announced Wednesday follows an investigation the Commission launched in March 2024. The regulator nevertheless acknowledged progress made over the past year, validating a series of improvements proposed by AliExpress.
The regulator specifically noted that AliExpress addressed concerns related to monitoring and detecting illegal products—such as medicines, food supplements, and adult content that could harm users' health and minors' well-being. The platform's reporting mechanisms and complaint handling systems were deemed satisfactory.
Brussels also confirmed that AliExpress complies with legal requirements regarding advertisement transparency, recommendation systems, seller traceability, and data access for researchers.
'The measures taken today demonstrate the strength of the Digital Services Act in creating a safer online environment,' said European Commissioner for Technological Sovereignty Henna Virkkunen. She welcomed AliExpress's commitment to becoming a safer platform for its users.
Related Articles


NBC News
Tech News

Accounts peddling child abuse content flood some X hashtags as safety partner cuts ties

Thorn, a nonprofit that provides detection and moderation software related to child safety, said it canceled its contract with X after the platform stopped paying it.

June 18, 2025, 4:57 PM EDT / Updated June 18, 2025, 5:58 PM EDT

By Ben Goggin

When Elon Musk took over Twitter in 2022, he said that addressing the problem of child sexual abuse material on the platform was his 'top priority.' Three years later, the problem appears to be escalating, as anonymous, seemingly automated X accounts flood hashtags with hundreds of posts per hour advertising the sale of the illegal material.

At the same time, Thorn, a California-based nonprofit organization that works with tech companies to provide technology that can detect and address child sexual abuse content, told NBC News that it had terminated its contract with X. Thorn said that X stopped paying recent invoices for its work, though it declined to provide details about its deal with the company, citing legal sensitivities. X said Wednesday that it was moving toward using its own technology to address the spread of child abuse material. Some of Thorn's tools are designed to address the very issue that appears to be growing on the platform.

'We recently terminated our contract with X due to nonpayment,' Cassie Coccaro, head of communications at Thorn, told NBC News. 'And that was after months and months of outreach, flexibility, trying to make it work. And ultimately we had to stop the contract.'

Many aspects of the child exploitation ads issue, which NBC News first reported on in January 2023, remain the same on the platform. Sellers of child sexual abuse material (CSAM) continue to use hashtags based on sexual keywords to advertise to people looking to buy CSAM. Their posts direct prospective buyers to other platforms where users are asked for money in return for the child abuse material.

Other aspects are new: Some accounts now appear to be automated (also known as bots), while others have taken advantage of 'Communities,' a relatively new feature launched in 2021 that encourages X users to congregate in groups 'closer to the discussions they care about most.' Using Communities, CSAM advertisers have been able to post into groups of tens of thousands of people devoted to topics like incest, seemingly without much scrutiny.

The Canadian Centre for Child Protection (C3P), an independent online CSAM watchdog group, reviewed several X accounts and hashtags flagged by NBC News that were promoting the sale of CSAM, and followed links promoted by several of the accounts. The organization said that, within minutes, it was able to identify accounts that posted images of previously identified CSAM victims who were as young as 7. It also found apparent images of CSAM in thumbnail previews populated on X and in links to Telegram channels where CSAM videos were posted. One such channel showed a video of a boy estimated to be as young as 4 being sexually assaulted. NBC News did not view or have in its possession any of the abuse material.

Lloyd Richardson, director of information technology at C3P, said the behavior being exhibited by the X users was 'a bit old hat' at this point, and that X's response 'has been woefully insufficient.'

'It seems to be a little bit of a game of Whac-A-Mole that goes on,' he said. 'There doesn't seem to be a particular push to really get to the root cause of the issue.'
X says it has a zero-tolerance policy 'towards any material that features or promotes child sexual exploitation.' A spokesperson for X directed NBC News to a post from its @Safety account detailing what the company says are new efforts to find and remove child abuse material.

'At X, we have zero tolerance for child sexual exploitation in any form. Until recently, we leveraged partnerships that helped us along the way,' the company said in the post. 'We are proud to provide an important update on our continuous work detecting Child Sexual Abuse Material (CSAM) content, announcing today that we have launched additional CSAM hash matching efforts.

'This system allows X to hash and match media content quickly and securely, keeping the platform safer without sacrificing user privacy,' the post continued. 'This is enabled by the incredible work of our safety engineering team, who have built state of the art systems to further strengthen our enforcement capabilities.'

The company said that the system would allow it to automatically detect known CSAM and remove it, though it was not clear how it differs from existing hashing technology. The spokesperson did not respond to questions about Thorn's allegations regarding the payments.

A review of many hashtags with terms known to be associated with CSAM shows that the problem is, if anything, worse than when Musk initially took over. What was previously a trickle of fewer than a dozen posts per hour is now a torrent propelled by accounts that appear to be automated, some posting several times a minute. Despite the continued flood of posts and sporadic bans of individual accounts, the hashtags observed by NBC News over several weeks remained open and viewable as of Wednesday. And some of the hashtags that NBC News identified in 2023 as hosting the child exploitation advertisements are still being used for the same purpose today.

Historically, Twitter and then X have attempted to block certain hashtags associated with child exploitation. When NBC News first reported on the use of X to market CSAM, X's head of trust and safety said the company knew it had work to do and would be making changes, including the development of automated systems to detect and block hashtags.

In January 2024, X CEO Linda Yaccarino testified to the Senate Judiciary Committee that the company had strengthened its enforcement 'with more tools and technology to prevent bad actors from distributing, searching for, or engaging with [child sexual exploitation] content across all forms of media.' In May 2024, X said it helped Thorn test a tool to 'proactively detect text-based child sexual exploitation.' The 'self-hosted solution was deployed seamlessly into our detection mechanisms, allowing us to hone in on high-risk accounts and expand child sexual exploitation text detection coverage,' X said.

Pailes Halai, Thorn's senior manager of accounts and partnerships, who oversaw the X contract, said that some of Thorn's software was designed to address issues like those posed by the hashtag CSAM posts, but that it wasn't clear whether X ever fully implemented it.

'They took part in the beta with us last year,' he said. 'So they helped us test and refine, etc, and essentially be an early adopter of the product. They then subsequently did move on to being a full customer of the product, but it's not very clear to us at this point how and if they used it.'

Without Thorn, it's not entirely clear what child safety mechanisms X is currently employing.
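X's post does not spell out how its hash matching works. In general terms, such a system computes a compact fingerprint of each uploaded file and checks it against a database of fingerprints of already-identified abuse material, so known files can be flagged without anyone viewing the upload. The sketch below is a minimal illustration in Python using exact SHA-256 digests; the hash set and handler names are hypothetical, and real deployments typically use perceptual hashes (such as PhotoDNA or PDQ) so that resized or re-encoded copies of a known image still match.

```python
import hashlib

# Hypothetical set of fingerprints of known abuse material. In practice this
# would come from a vetted industry hash list, and the fingerprints would be
# perceptual hashes rather than cryptographic digests.
KNOWN_HASHES: set[str] = set()

def sha256_of_file(path: str) -> str:
    """Compute the SHA-256 hex digest of a file, reading it in 1 MB chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

def is_known_match(path: str) -> bool:
    """Return True if the file's digest appears in the known-hash set."""
    return sha256_of_file(path) in KNOWN_HASHES

# A matched upload would be blocked and reported, not merely deleted:
# if is_known_match("upload.jpg"):
#     quarantine_and_report("upload.jpg")  # hypothetical handler
```

Exact digests like these catch only byte-identical copies; a single re-compression changes the SHA-256 value entirely, which is why perceptual hashing is the industry norm for this task.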
'Our technology is designed with safety in mind,' Halai said. 'It's up to the platform to enforce and use the technology appropriately … What we do know on our side is it's designed to catch the very harms that you're talking about.'

Halai said Thorn didn't take the termination of its contract with X lightly.

'It was very much a last-resort decision for us to make,' he said. 'We provided the services to them. We did it for as long as we possibly could, exhausted all possible avenues and had to terminate, ultimately, because, as a nonprofit, we're not exactly in the business of helping to sustain something for a company like X, where we're actually incurring huge costs.'

Currently, some hashtags, like #childporn, are blocked when using X's search function, but other hashtags are open to browse and are filled with posts advertising CSAM for sale. NBC News found posts appearing to peddle CSAM in 23 hashtags that are often used together in the posts; only two hashtags it identified were blocked by X.

The hashtags that were available to be posted to and viewed during an NBC News review of the platform ranged from references to incest and teenagers to slightly more coded terms, like combinations of words with the name of the defunct video chat platform Omegle, which shut down in 2023 after a child sex exploitation lawsuit. Some hashtags were jumbles of letters containing only posts advertising CSAM, indicating that they were created for the exclusive purpose of housing the advertisements. Some usernames of accounts posting the ads were simply a jumble of words associated with CSAM content on the platform, mixing names of social media platforms with other keywords.

Many of the users linked directly to Telegram channels in their posts or their account bios and included explicit references to CSAM. Some posts linked to Discord channels or solicited direct messages to secure Discord links. Telegram and Discord have distinct positions in the internet's child exploitation ecosystem, offering semiprivate and private venues for people looking to sell or buy child exploitation material. NBC News previously reported on 35 cases in which adults were prosecuted on charges of kidnapping, grooming or sexual assault that allegedly involved communications on Discord.

A Discord representative said, 'Discord has zero tolerance for child sexual abuse material, and we take immediate action when we become aware of it, including removing content, banning users, and reporting to the National Center for Missing and Exploited Children (NCMEC).' The company said in response to NBC News' outreach that it removed multiple servers 'for policy violations unrelated to the sale of CSAM.'

A representative for Telegram said 'CSAM is explicitly forbidden by Telegram's terms of service and such content is removed whenever discovered.' The representative pointed to the company's partnership with the U.K.-based Internet Watch Foundation, which maintains a database of known CSAM and provides tools to detect and remove it.

While some of the X accounts posted publicly, others solicited and offered CSAM through X's Communities feature, where users create groups based on specific topics. NBC News observed groups with tens of thousands of members in which CSAM was solicited or offered for sale. In a group of more than 70,000 members devoted to 'incest confessions,' multiple users posted repeatedly, linking to Telegram channels with explicit references to CSAM.
'I'm selling 6cp folder for only 90$,' one user wrote, linking to a Telegram account. CP is a common online abbreviation for 'child pornography.'

CSAM has been a perpetual problem on the internet and social media, and many companies employ specialized teams and automated systems to identify and remove abuse content and those spreading it. But Musk instituted drastic cuts to X's trust and safety teams and disbanded the company's Trust and Safety Council. In 2023, the company said that it was detecting more CSAM than in previous years and that it had increased staffing devoted to the issue despite the larger trust and safety layoffs.

Richardson, C3P's director of information technology, said that while X will sometimes remove accounts that are flagged to it for violating rules around CSAM, 'a new account pops up in two seconds, so there's not a lot of in-depth remediation to the problem. That's just sort of the bare minimum that we're looking at here.'

He said an increasing reliance on artificial intelligence systems for moderation, if X is using them, could be partly to blame for such oversights. According to Richardson, AI systems are good at sorting through large datasets and flagging potential issues, but without human judgment at the end of the process they will inevitably over- or under-moderate.

'There should be an actual incident response when someone is selling child sexual abuse material on your service, right? We've become completely desensitized to that. We're dealing with the sale of children being raped,' Richardson said. 'You can't automate your way out of this problem.'

Ben Goggin is the deputy tech editor for NBC News.


Edinburgh Reporter
How MOT Check is empowering first-time drivers across the UK
MOT Check, a London-based online MOT information service founded by Connor Evans in 2024, is a fresh player in the UK automotive market that has already attracted thousands of daily users thanks to its comprehensive toolkit. The company has been especially successful in helping first-time drivers get on the road safely by conveying the importance of a timely background MOT check to those looking to purchase their first vehicle.

'The UK is sadly known for its bad vehicle sales practices, with many sellers providing insufficient or downright incorrect MOT information to get a better deal. This isn't something we can overlook — in some cases, people buy cars that are heavily damaged without ever realizing it, and that can lead to very costly repairs at best and terrible road accidents at worst. Inexperienced drivers often fall victim to these scams, which is why we find this audience highly important to protect — hence, our detailed and convenient database,' says Connor Evans, CEO of MOT Check.

MOT Check functions as a website where drivers enter their vehicle's registration number to quickly get an in-depth report on its MOT history, technical information, taxes, and more. This way, any driver can quickly assess whether the vehicle they are about to purchase is indeed in good condition and whether it had any issues in the past that need to be addressed.

'There are lots of areas that newcomers aren't properly aware of. For example, a vehicle that is known to have been in use by a company is much more likely to have a significantly higher mileage. Its condition is generally worse as well. There are many things that people who are only starting their journey as licensed drivers need to understand to ensure safety for everyone and avoid extra costs or fines,' says Connor Evans.

This is why MOT Check doesn't simply analyze multiple official sources and reliable third-party services: the resulting report also contains helpful tips that enable drivers to get a full picture of what their car's MOT history means.

'Raw data is only as good as you can understand it, and our goal was to ensure that we don't just provide it with no context. It is easier, sure, but modern users are accustomed to an entirely different level of convenience in every other industry, so we believe that we have to attain it as well. Our reports aren't just comprehensive: they are analytical, and we ensure that anyone can understand the true meaning behind the data, regardless of their expertise in the field. We are proud to say that our systems provide the most inclusive and accessible results in the local market,' says Connor Evans.

As Evans adds, the team behind MOT Check is currently working hard to increase its algorithms' capabilities in order to present even more information in a convenient way. The company is also focused on delivering new helpful features that will be particularly useful for first-time drivers, such as SMS notifications and email reminders for timely checks.

'We are a new and highly motivated team, so our hands are always busy, but we are happy to use all of our expertise to ensure that the UK's roads and the automotive market in general become friendlier and more understandable for everyone. We have gathered all of this information and spent years working in the niche so that everyone else can get our knowledge in a matter of seconds — this is what the future holds,' says Evans.
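The article does not describe MOT Check's internals, but a lookup of this kind generally amounts to querying an MOT history data source by registration number and summarising the returned records (in the UK, the DVSA makes official MOT history data available to approved services). Below is a minimal sketch in Python, in which the endpoint, API key, and response fields are hypothetical placeholders rather than MOT Check's actual service.

```python
import requests  # third-party HTTP client: pip install requests

# Hypothetical endpoint and credentials; MOT Check's real backend and any
# official data source it integrates with are not described in the article.
API_URL = "https://api.example-mot-service.co.uk/v1/vehicles"
API_KEY = "your-api-key"

def fetch_mot_report(registration: str) -> dict:
    """Fetch a vehicle's MOT report by registration number."""
    response = requests.get(
        f"{API_URL}/{registration}",
        headers={"x-api-key": API_KEY},
        timeout=10,
    )
    response.raise_for_status()  # surface HTTP errors instead of bad data
    return response.json()

def summarise(report: dict) -> None:
    """Print a buyer-friendly summary of the report (hypothetical fields)."""
    print(f"Make/model: {report.get('make')} {report.get('model')}")
    for test in report.get("motTests", []):
        print(f"  {test['completedDate']}: {test['testResult']} "
              f"at {test['odometerValue']} {test['odometerUnit']}")

if __name__ == "__main__":
    summarise(fetch_mot_report("AB12CDE"))
```

The value a service like MOT Check adds on top of such raw records, per the article, is the contextual guidance that helps a first-time buyer interpret them.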

Rhyl Journal
Nigerian communities set to have oil pollution High Court claims tried in 2027
Members of the Bille and Ogale communities in the Niger Delta, which have a combined population of around 50,000, are suing Shell plc and the company's Nigeria-based subsidiary, the Shell Petroleum Development Company of Nigeria, now the Renaissance Africa Energy Company.

The two communities began legal action in 2015, claiming they have suffered systemic and ongoing oil pollution for years due to the companies' operations in the country, including pollution of drinking water. They are seeking compensation and asking the companies to clean up the damage caused by the spills. The companies are defending the claims, saying that the majority of spills are caused by criminal acts of third parties or illegal oil refining, for which they are not liable.

On Friday, Mrs Justice May ruled on more than 20 preliminary issues in the claims, following a hearing held in London over four weeks in February and March. She said that 'some 85 spills have, so far, been identified', but added that the case was 'still at a very early stage'.

Her findings included that Shell could be sued for damage from pipeline spills caused by third parties, such as vandals attempting to steal oil, a process known as bunkering. She also said that while there was a five-year limitation period on bringing legal claims, a 'new cause of action will arise each day that oil remains' on land affected by the spills. The cases are due to be tried over four months, starting in March 2027.

Reacting to the ruling, the leader of the Ogale community, King Bebe Okpabi, said: 'It has been 10 years now since we started this case, we hope that now Shell will stop these shenanigans and sit down with us to sort this out.

'People in Ogale are dying; Shell need to bring a remedy.

'We thank the judicial system of the UK for this judgment.'

Matthew Renshaw, international development partner at law firm Leigh Day, which represents the claimants, said: 'This outcome opens the door to Shell being held responsible for their legacy pollution as well as their negligence in failing to take reasonable steps to prevent pollution from oil theft or local refining.'

He continued: 'Our clients reiterate, as they have repeatedly for 10 years, that they simply want Shell to clean up their pollution and compensate them for their loss of livelihood.

'It is high time that Shell stop their legal filibuster and do the right thing.'

A Shell spokesperson said that the company welcomed the judgment. They said: 'For many years, the vast majority of spills in the Niger Delta have been caused by third parties acting unlawfully, such as oil thieves who drill holes in pipelines, or saboteurs.

'This criminality is the cause of the majority of spills in the Bille and Ogale claims, and we maintain that Shell is not liable for the criminal acts of third parties or illegal refining.

'These challenges are managed by a joint venture which Shell's former subsidiary operated, using its expertise in spill response and clean-up.

'The spills referenced in this litigation were cleaned up by the joint venture regardless of the cause, as required by Nigerian law, working closely with government-owned partner NNPC Ltd, Nigerian government agencies and local communities.

'Clean-up certificates were issued by the Nigerian regulator NOSDRA.'
The High Court and the Court of Appeal ruled in 2017 and 2018, respectively, that there was no arguable case that Shell owed the claimants a duty of care, but the Supreme Court ruled in 2021 that there was a 'real issue to be tried'.