Latest news with #Snap


Web Release
6 hours ago
- Web Release
Snapchat Relaunches Family Hub to Support Safer Digital Experiences for Teens and Parents
Snapchat has relaunched its Family Safety Hub, a refreshed and more inclusive platform designed to help families navigate the app confidently and safely. The updated Hub offers clearer guidance, accessible resources, and new tools that reflect the evolving needs of both parents and teens. In an effort to educate parents, creators and press, Snapchat hosted an educational session in collaboration with the Abu Dhabi Early Childhood Authority and life educational coach Hala Kazim.

The revamped Family Center section includes updated guidance on how to use in-app features that enable parents to see who their teen is communicating with, without viewing the content of conversations, helping to strike a balance between safety and autonomy. To better represent the shared role of both parents and teens in building safer digital habits, the platform now uses more inclusive language, shifting from its previous name of 'Parents Site' to 'Family Hub'.

Fatima Al Melhi, Director of Special Projects at Abu Dhabi Early Childhood Authority, said, 'Protecting children and boosting their digital quality of life is a priority to us. And we know that protecting them takes all of us. We are working together with Snapchat and the rest of the Children's Digital Wellbeing Pact members to ensure that we provide a space that balances freedom of access to information with ensuring the safety of children from electronic risks. With the revamped Family Safety Hub, Snap is proactively equipping parents, guardians, and teens with the essential tools needed to support their safety and digital well-being.'

Jawaher Abdelhamid, Head of Public Policy, MEA at Snap Inc., said, 'From the start, Snapchat was designed as a safe and private platform, making user safety a fundamental priority. Our mission is to create a safer, more supportive experience for teens on Snapchat. The Family Safety Hub reflects our commitment to empowering families across the region with the tools they require to make what they believe are the right choices for their teens based on their age and family values, all while still respecting young Snapchatters' privacy.'

New additions to the Hub include a dedicated FAQ section and a reorganized overview of Snapchat's features, providing a tab-by-tab explanation of the platform and offering practical tips for families. The site now hosts downloadable tools and resources that were previously only available at in-person Snap events. These will be continuously updated as Snapchat's product features and safety offerings evolve. In addition, relevant videos from Snap's YouTube channel have been integrated across the platform and will be refreshed quarterly, ensuring the content remains engaging and up to date. The Family Safety Hub presents content in clear, digestible formats to enhance understanding and make it easier for families to have meaningful conversations about digital wellbeing.

Earlier this year, Snapchat was also named as a leading member of The Pact, the UAE's new Digital Wellbeing committee led by the Digital Wellbeing Council and the Abu Dhabi Early Childhood Authority. The Pact brings together government bodies, tech platforms, and telecom providers to support a safer, more age-appropriate digital experience for young people across the UAE. To explore the Family Safety Hub, visit


Bloomberg
15 hours ago
- Business
- Bloomberg
Teen Social Media Ban Moves Closer in Australia After Tech Trial
Australia's world-first social media ban for under-16s moved closer to implementation after a key trial found that checking a user's age is technologically possible and can be integrated into existing services. The conclusions are a blow to Facebook owner Meta Platforms Inc., TikTok and Snap Inc., which opposed the controversial legislation. Some platform operators had questioned whether a user's age could be reliably established using current technology.
Yahoo
2 days ago
- Business
- Yahoo
Meta, social media, stablecoin-related stocks: Trending Tickers
Meta (META) is offering $100 million bonuses to OpenAI developers as the tech giant builds out its artificial intelligence (AI) team. Social media stocks, like Meta, Snap (SNAP), and Pinterest (PINS), are in focus after the White House confirmed that President Trump will sign an executive order this week to extend the TikTok delay, allowing the ByteDance-owned platform to avoid a US ban. The Senate passed the GENIUS stablecoin bill, sending stablecoin-related stocks like Coinbase (COIN), Robinhood (HOOD), and Circle (CRCL) higher. To watch more expert insights and analysis on the latest market action, check out more Morning Brief here.


India Today
2 days ago
- Business
- India Today
Google rolls out budget-friendly Gemini 2.5 Flash Lite, opens 2.5 Flash and Pro to all
Google has introduced a new addition to its Gemini AI model line-up: the Gemini 2.5 Flash-Lite. According to Google, this new AI model can deliver high performance at the lowest cost and fastest speeds yet. Alongside the new model, the company has announced the general availability of the Gemini 2.5 Flash and Pro models to all users.

Google says that Gemini 2.5 Flash-Lite is its most affordable and fastest model in the 2.5 family. It has been built to handle large volumes of latency-sensitive tasks such as translation, classification, and reasoning at a lower computational cost. Compared to its predecessor, 2.0 Flash-Lite, the new model is said to deliver improved accuracy and quality across coding, maths, science, reasoning, and multimodal benchmarks. 'It excels at high-volume, latency-sensitive tasks like translation and classification, with lower latency than 2.0 Flash-Lite and 2.0 Flash on a broad sample of prompts,' says Google.

Google highlights that despite being lightweight, 2.5 Flash-Lite comes with a full suite of advanced capabilities. These include support for multimodal inputs, a 1 million-token context window, integration with tools like Google Search and code execution, and the flexibility to adjust how much computational 'thinking' the model performs based on a set budget. According to the company, these features make the Gemini 2.5 Flash-Lite ideal for developers looking to balance efficiency with robust AI capabilities.

Gemini 2.5 Flash-Lite availability
The Gemini 2.5 Flash-Lite model is currently available in preview via Google AI Studio and Vertex AI. Google has also integrated customised versions of 2.5 Flash-Lite and Flash into its core products like Search, expanding their reach beyond developers to everyday users.

Gemini 2.5 Flash and Pro models now available to all
In addition to introducing Flash-Lite, Google has also announced that its Gemini 2.5 Flash and Gemini 2.5 Pro models are now stable and generally available. These models were previously accessible to a select group of developers and organisations for early production use. According to Google, companies like Snap, SmartBear, and creative tools provider Spline have already integrated these models into their workflows with encouraging results. Now that Flash and Pro are fully open, developers can use them in production-grade applications with greater confidence. Both the stable and preview models can be accessed through Google AI Studio, Vertex AI, and the Gemini app.
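For readers who want to see what trying the preview looks like in practice, here is a minimal sketch of a call through Google AI Studio using the google-genai Python SDK. The model ID (gemini-2.5-flash-lite-preview-06-17), the sample classification prompt, and the thinking-budget setting are illustrative assumptions rather than details from the announcement; check Google's documentation for the current identifiers.

    # Minimal sketch (assumptions: the google-genai Python SDK is installed via
    # `pip install google-genai`, you have an API key from Google AI Studio, and
    # the preview model ID below is still current).
    from google import genai
    from google.genai import types

    client = genai.Client(api_key="YOUR_API_KEY")

    # A high-volume, latency-sensitive task of the kind the article describes:
    # short-text classification, with the model's "thinking" budget dialled down
    # to keep latency and cost low.
    response = client.models.generate_content(
        model="gemini-2.5-flash-lite-preview-06-17",
        contents=(
            "Classify the sentiment of this review as positive, negative or "
            "neutral: 'Setup was painless and the battery lasts all day.'"
        ),
        config=types.GenerateContentConfig(
            thinking_config=types.ThinkingConfig(thinking_budget=0)
        ),
    )
    print(response.text)

The same request can be pointed at the stable Flash or Pro models simply by swapping the model ID, which is how the article describes companies such as Snap and SmartBear moving from early access to production use.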


Tom's Guide
3 days ago
- Business
- Tom's Guide
Exclusive: I asked Snap's hardware chief about the company's next-gen Specs — here's what I found out
So as we found out last week, Snap is finally launching Specs to the public in 2026, after an exhaustive developer program that spans four years since its first Spectacles AR glasses. It's been a helluva journey, and with Meta's Project Orion on the horizon and Apple being 'hellbent' on delivering smart glasses, this is becoming a very competitive space. So far, Snap CEO Evan Spiegel has said they will be smaller, lighter, and fully standalone with 'no puck' required. But there's a lot we don't know yet. What has been the story that's led to this point where Snap is ready to go for a full public release? What tech can we expect inside these future contenders for best smart glasses? What's the price? And is society ready for true AR glasses like this? I had a chance to sit down with Snap's VP of Hardware, Scott Myers, and put these questions to him.

I have to be a little careful with what I say, because we haven't, like, fully announced everything. Yeah. But what he said was that it's substantially smaller. So we have been in this area for 11 years, and we have been building these things for a very long time. It's public information that we have made some acquisitions, so that like our entire optical engine is our own custom thing. We build it ourselves. We design it ourselves, which gives us a pretty unique position where we know exactly how these things are going. We have road maps for development and I really like where we're going. And because we're not just a bunch of companies strung together, we're all like one group working all toward the same goal. I can have the team designing, say, the waveguide, talking to the same team that's working on the rendering engine and SnapOS. And that like synthesis is how we end up still confident about where we're at.

We've been getting feedback in a lot of different forms about the hardware. We've gotten some phenomenal feedback from the community, but also feedback like 'we wish the field of view was bigger,' you know, or that 'the device was lighter.' There's a joke with the team that like, this is what I want [points towards his reading glasses]. It's a question of what I want. It's how we get there. It's the trade-offs we make to go make these like the dream of true augmented reality, something people can wear and walk around in. The social acceptability element is so critically important.

Context: Since 2020, Snap has been on a bit of a shopping spree, acquiring AI Factory in 2020 (computer vision AI company), WaveOptics in 2021 (the company designing its waveguide displays), Fit Analytics in 2022 (shopping and ecommerce), a 3D-scanning startup called Th3rd in 2023, and GrAI Matter Labs in 2024 (an edge AI chip company), alongside many more.

Well, I think this is one of the reasons we're standalone. I don't want to see people wearing a wire coming out of the back of their head. It makes people look like U.S. Government officials, and that's not how I want to see the world. The form factor obviously matters, but the fit and finish of these things also matters when you make that jump. Like they need to be robust, but like all of those are pulling the product in different directions. So like, I think one of our strengths is, like, the balance of all of these things. You can make a giant field of view. Some companies have, but you also need really high pixel count or pixels per degree, because it's important for text legibility.
You need the ability to make it work indoors and outdoors. Why? Because I don't want to spend all my time inside. As I'm moving through my day, like some of that's inside, some of it's outside. It needs to work in both. So you can't just have a static tint like sunglasses, nor can you just make them clear, because neither works in both environments. So because we've been building these things for so long, we learned these things. We've learned how to solve those problems, what works and what doesn't, but it's all in that trade-off and exactly how you balance all those things. Like, obviously, I'd want the battery to last for days, but then you end up with this giant battery pack that's directionally incorrect, too.

This has been a multi-year, multi-generation arc. We launched a pair of 26-degree field-of-view augmented reality glasses to developers in 2021. With that, we learned a ton, and it drove the way our development tool Lens Studio is constructed. So we've been just iterating and iterating and iterating. And what we learned is that like the breadth of feedback, the depth of feedback, it's not like you release the product once, you get a long written document, and that's it. It's an active conversation with the community. We even iterate in public in collaboration with our Spectacles subreddit. We want to learn. And what we find is like, as the community grows, as people get better and better at building lenses, they start answering each other's questions. It's a back and forth, like, I personally know developers. That's what a successful community looks like, and we're building this together. And that's very, very intentional. It's in the way our pricing is structured. It's in the way our community is growing. We don't just sell it to anybody, because we want the people who are really going to move the platform forward. It's all very, very, very intentional, and we're very happy with the results as well. As we've had the product out a little bit longer, the lenses have been getting more and more engaging and we're learning together how different UI elements work. I think we are really here to build this with the community because it's an entirely new paradigm. There's a lot of things that will take time for people to understand and figure out. It's not just going to be like, 'oh, here you go, developers, come build for this!' That's not going to work, in my opinion. It is very, very much a community-led discussion. And I couldn't be happier with that.

I think what Evan shared was more than Ray-Ban Metas, and less than an Apple Vision Pro. I recognize that's a huge scale! Obviously we want to make it as low cost as possible. Yeah. But it's also pretty, as you pointed out, pretty advanced technology. And so there's a balance there. One of the things that may not be super intuitive is there's a lot of technology that there is not a ton of world capacity for. Like, we have to go off and work with our suppliers to create these new technologies. Then we have to build the capacity to actually produce them. It's a fun challenge, but there's certainly a ton of work to do. Like, this isn't a Snap-specific problem. This is industry-wide.

This is an area where Snap is in a very good spot. Trust matters, privacy matters. And the way we're constructing all of this is in a privacy-centric way. Like, I want to personalize it. But, this is the most personal possible device. It is literally seeing exactly what I'm seeing.
And so, of course, we're going to bring in all the personalization that AI kind of already has, like memory. That's an element here, but like I'm actually more worried about how we do it in a privacy-centric way. Back to your previous question, I'm very happy with our direction there. And we've shared a little bit about it, but like having built these for a while, having lived with them, like, it's very much one thing to say, like, hey, but what is this use case? Which I personally don't think is that valuable. It's more about that responsiveness: when I want it, I can go as deep as I want on any topic with it. But do so in a way that maintains my privacy for the times when I don't really need it. But I think that's maybe an undervalued, underexpected problem. Like, you don't want to just share camera images of your entire day!

I like that you said battery life, and not just battery capacity. Like, it's all about the way you use it smartly. I used to work on smartphones for a very long time. And yeah, the battery capacity has grown pretty consistently, to be honest, X percent per year. But really, software has gotten much better in how it's being used. This is one of the reasons we built Snap OS, so that we have complete control of exactly how every little bit of energy is consumed across the entire device. It also goes to the way we design the displays, how we make them just super duper efficient, how we do the processing and how we distribute the heat. All of these things have to be balanced, and that's why it's so important to build these, again, where engineers can talk to engineers, and really look at everything as precisely as I can.

The other thing I would say is, I think if you were to have, like in the limit, a full display including everything in your world all the time, that would probably be visually overwhelming. I don't personally want a world where I'm walking around and everything's an ad all the time. That would be terrible. So like, I think it'll be about like, what is shown and when, how it's used, and then just generally technology progressing. You know, if you look at some of the initial talk times of very early phones, we're not that long in our developer models. But I think we have some good strategies to increase the battery life now, and it'll just get better and better over time.