
Latest news with #parentalcontrols

Key admission in social media ban update

Yahoo

11 hours ago

  • Business
  • Yahoo

The brains tasked with finding a way to enforce Labor's world-leading social media ban for under 16s say it is possible but that there is no 'silver bullet'. The preliminary findings of the Age Assurance Technology Trial (AATT) were released on Friday, just six months before the ban was set to come into force.

Project chief Tony Allen said his team found 'there isn't a one solution fits all' but rather a range of options that parties could use. 'There isn't like a silver bullet that will solve everything,' Mr Allen told Sky News. 'And different providers of social media services, for instance, will need to explore exactly what will work for them and their users, and that's really for them to assess their risk and to consider what they might want to implement.'

In terms of what it might look like in practice, he suggested 'successive validation' – a series of tests designed to firm up a user's age. Mr Allen said it could start with 'something which is fairly simple, like holding your hand up or showing your face or talking'. 'And then that might not give you sufficient level of confidence, so then move on to maybe age inference techniques, or ultimately, they may need to move on to age verification where you need some sort of record or document,' he said.

The trial uncovered some challenges. It found parental control and consent systems could be effective when first rolled out but could not 'cope with the evolving capacity of children' or properly protect a 'child's digital footprint'. It also warned that 'service providers were over-anticipating the eventual needs of regulators' and over-collecting user data. This consequently 'increased risk of privacy breaches', according to the findings.

But Mr Allen said the 'clear conclusion' was that age limits could be enforced safely. He held back from putting a figure on their efficacy, noting the measures were not 'foolproof'. 'There are ways that they (children) can get around them,' Mr Allen said. 'But then we've had tobacco laws for 100 years to stop children accessing tobacco, and it doesn't stop some children from accessing tobacco. So you have to try and work on how you reduce the risk and reduce the instance. You'll never completely eliminate it.'

NewsWire understands the full findings will be handed to the government later this month.
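
The 'successive validation' approach Mr Allen describes, escalating from cheap signals to document checks only when confidence is too low, can be sketched as follows. This is an illustrative model only: the function names, confidence scores, and 0.9 threshold are hypothetical assumptions, not part of the trial's findings.

```python
def estimate_age_simple(user):
    """Cheap first-pass signal, e.g. facial age estimation. Returns (age, confidence)."""
    return user.get("face_age"), user.get("face_conf", 0.0)

def estimate_age_inference(user):
    """Age inference from behavioural or account signals."""
    return user.get("inferred_age"), user.get("inferred_conf", 0.0)

def verify_age_document(user):
    """Hard verification against a record or document; fully confident if present."""
    return user.get("document_age"), 1.0 if "document_age" in user else 0.0

def assure_age(user, minimum_age=16, threshold=0.9):
    """Run checks in order of intrusiveness; stop at the first confident answer."""
    for check in (estimate_age_simple, estimate_age_inference, verify_age_document):
        age, confidence = check(user)
        if age is not None and confidence >= threshold:
            return age >= minimum_age
    return False  # no confident signal at any stage: deny by default
```

The point of the cascade is that most users never reach the document step, which limits how much sensitive data any one provider has to collect.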

It's 1PM. Do you know where your children are scrolling?

The Verge

21 hours ago

  • General
  • The Verge

Adi Robertson

Maybe, argues longtime internet law scholar Danielle Citron, sometimes you shouldn't. We've got a slow holiday Thursday here at The Verge, so it's time for me to finally read this paper from early June about alternatives to the 'parental control model' of children's privacy online — a topic that's not going away any time soon.

'The parental control model is a wolf in sheep's clothing. It is an empowering façade that leaves parents unable to protect children and undermines the intimate privacy that youth need to thrive. It is bad for parents, children, and parent-child relationships. And it is bad for the pursuit of equality.'

Apple's latest online safety tools: What parents need to know

Gulf Business

a day ago

  • Gulf Business

Apple is expanding its suite of parental controls and privacy features to offer families new tools for managing children's digital experiences across its platforms. The company announced the updates on June 11 as part of its upcoming software releases, including iOS 26, iPadOS 26, macOS Tahoe 26, watchOS 26, visionOS 26, and tvOS 26.

The features, many of which were previously previewed, are designed to support age-appropriate usage from the moment a child sets up their device, without compromising privacy or security. They build on existing features such as Screen Time and App Store age controls, reinforcing Apple's commitment to a safer and more private digital environment for young users.

Image credit: Apple/Website

Simplifying child account setup

Apple has long supported Child Accounts: Apple IDs designed for children under 13 and available for users up to age 18 when managed by a parent or guardian within a Family Sharing group. With the latest updates, the setup process for Child Accounts has been streamlined. Parents can now defer parts of the setup process while ensuring that age-appropriate settings are automatically enabled from the start. These features are already available on iOS 18.4, iPadOS 18.4, and macOS Sequoia 15.4.

Additionally, Apple is making it easier for parents to confirm the age associated with their child's account. If the child is under 13, the system prompts parents to connect the account to their Family Sharing group. Once verified, the account is converted into a Child Account, unlocking Apple's full suite of parental control tools with default safety settings already in place.

New age range sharing with apps

A major privacy-forward update allows parents to share only their child's age range, not their full birthdate, with apps, enabling developers to tailor experiences without collecting sensitive data. Using Apple's new Declared Age Range API, developers can request access to a user's age range in order to offer age-appropriate experiences. Parents can control how this information is shared: always, per request, or never. By default, children cannot alter these settings, but parents can grant them the ability to do so via Content & Privacy Restrictions.

Apple emphasizes that this approach allows apps such as weather or sports apps to function for children without requiring developers to gather unnecessary personal data, helping protect kids' identities while still enabling relevant functionality.

Extending protections to teens

Until now, Apple required that children under 13 use Child Accounts, which automatically include safety features like web content filters and app restrictions. With the upcoming OS updates, similar protections will also be applied automatically to users aged 13 to 17, regardless of their account type. These protections include Communication Safety features and web filters, all powered by enhanced age categorisation in the App Store. The changes ensure that teens receive more consistent protections, even if their Apple Account was set up independently of Family Sharing.

Granular age ratings coming to the App Store

Apple is also refining its App Store age rating system. While developers have long self-assigned age ratings for apps, a more detailed system is being introduced by year's end. The revised framework includes five categories, adding three new distinctions for adolescents: 13+, 16+, and 18+. The change gives users and parents clearer insight into app appropriateness and allows developers to fine-tune how their apps are rated for various age groups. The new system will also integrate tightly with parental control settings such as Ask to Buy and Screen Time.

Communication limits expanded with PermissionKit

Apple's existing Communication Limits feature, which manages how and when kids can communicate via Phone, FaceTime, Messages, and iCloud, is being expanded to give parents more oversight. With the upcoming update, children will need to send a request to their parent before initiating contact with a new phone number. Parents can approve or deny these requests directly within Messages.

In addition, Apple is introducing a new PermissionKit framework for developers. This allows kids to request parental approval to initiate chats, follows, or friend requests inside third-party apps. When implemented by developers, the framework offers another layer of control and safety for online interactions.

App Store updates for transparency and control

Apple is enhancing App Store transparency by updating product pages to show whether an app contains user-generated content, messaging capabilities, or in-app advertisements. Pages will also indicate whether an app includes built-in parental controls or age-assurance features. When content restrictions are in place, apps that exceed a child's allowed age range will no longer appear in areas like the Today tab, Games, or editorial content, minimizing exposure to inappropriate content.

The Ask to Buy feature is also gaining flexibility. Parents can now approve one-time exceptions for apps that exceed a child's set age range, and just as easily revoke access through Screen Time if needed.

Communication safety now extends to FaceTime and Photos

Building on its existing Communication Safety tools, which warn children when sending or receiving explicit content, Apple is adding new capabilities:
  • FaceTime calls: the system will now intervene if nudity is detected during a video call.
  • Shared Albums in Photos: nudity in shared images will be automatically blurred, and children will be warned before viewing.
These additions further Apple's mission to prevent unwanted exposure to explicit content while maintaining user privacy and device control.

Enhanced tools for parents and developers

The new updates are backed by a robust ecosystem of tools Apple already offers to safeguard children:
  • Screen Time and Ask to Buy: allow parents to manage screen usage and approve purchases.
  • Find My: helps locate family members.
  • Made for Kids section: a curated set of age-appropriate apps held to Apple's highest privacy standards.
  • Limits on Apple Ads: blocks ads for children under 13 and restricts personalized ads for teens.
  • No ad tracking: developers cannot track or request tracking of child user behavior.

In addition, developers have access to several frameworks:
  • ScreenTime framework: enables supervision of a child's app usage.
  • Device Activity and Family Controls APIs: help customize parental control experiences.
  • SensitiveContentAnalysis: identifies and blurs sensitive imagery in apps.
  • Media ratings: allow developers to incorporate parents' film and TV restrictions.

Looking ahead

With the launch of iOS 26 and its accompanying OS updates this fall, Apple is aiming to deliver a safer, more controlled digital experience for families, without compromising its strict privacy standards. By giving parents smarter tools and giving developers better ways to engage responsibly with young users, the tech giant continues to position itself as a leader in digital wellbeing.
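
The age-range sharing behaviour described above can be sketched in a few lines. This is a hypothetical Python model, not Apple's actual Declared Age Range API: the bucket boundaries, policy names, and function names are all illustrative assumptions.

```python
# Coarse buckets an app is allowed to see, mirroring the article's idea that
# apps receive an age range rather than a birthdate. Boundaries are assumed.
AGE_BUCKETS = [(0, 12, "under 13"), (13, 15, "13-15"), (16, 17, "16-17"), (18, 200, "18+")]

def age_range(age):
    """Map an exact age to the coarse bucket an app may be shown."""
    for low, high, label in AGE_BUCKETS:
        if low <= age <= high:
            return label
    raise ValueError("invalid age")

def share_age_range(age, policy, parent_approves=False):
    """Return the shareable range, or None, under a parent-set sharing policy.

    policy is one of "always", "ask", or "never", mirroring the article's
    always / per request / never options.
    """
    if policy == "always":
        return age_range(age)
    if policy == "ask" and parent_approves:
        return age_range(age)
    return None  # "never", or an unapproved per-request ask: share nothing
```

The key property is that the exact age never leaves the function: an app only ever sees a bucket label, or nothing at all.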

Apple's updated parental controls will require kids to get permission to text new numbers

The Verge

11-06-2025

  • The Verge

Apple is introducing a suite of updated child safety features, including one that will give parents more control over who their kids can communicate with. The features are set to arrive with iOS 26, iPadOS 26, macOS Tahoe 26, watchOS 26, visionOS 26, and tvOS 26, which will launch this fall.

Children will now be required to get parental approval when they want to communicate with a new phone number. Requests will appear in the Messages app, and parents can tap a button to approve or decline. Apple is also launching a 'PermissionKit' that will let developers fold a similar feature into their apps so that kids can 'send requests to their parents to chat, follow, or friend users.'

The company's parental controls automatically enforce protections such as web content filters and app restrictions for kids under 13, and Apple will enable 'similar age-appropriate protections' for kids between the ages of 13 and 17. Apple's Communication Safety tool is being updated to 'intervene' when it detects nudity in FaceTime calls and to blur nudity in shared albums in the Photos app, the company says. App Store age ratings will also expand to include more granular categories, including 13 plus, 16 plus, and 18 plus.

Additionally, Apple will let parents share a child's age range with apps, the company says, without disclosing their specific date of birth. Developers can request age range information with a new 'Declared Age Range API.'

Companies like Meta, Snap, and X and a coalition of adult content companies have advocated for legislation that would require app store operators to verify their users' age, a requirement that Apple has pushed back on over privacy concerns. Utah and Texas have already passed app store age verification bills.
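
The request-and-approve flow described here can be modelled in a few lines: a child's attempt to message a new number becomes a pending request that a parent later approves or declines. The class and method names below are hypothetical illustrations, not Apple's PermissionKit API.

```python
class ContactApprovals:
    """Toy model of parent-gated contacts, per the article's description."""

    def __init__(self):
        self.approved = set()  # numbers a parent has already allowed
        self.pending = set()   # requests awaiting a parental decision

    def request_contact(self, number):
        """Child tries to message a number; queue it unless already approved."""
        if number in self.approved:
            return "allowed"
        self.pending.add(number)
        return "pending"

    def decide(self, number, approve):
        """Parent approves or declines a pending request (e.g. from Messages)."""
        self.pending.discard(number)
        if approve:
            self.approved.add(number)
        return "approved" if approve else "declined"
```

In this model, a declined number simply stays outside the approved set, so any later attempt to contact it re-enters the pending queue for the parent to see.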
