Latest news with #SnapSpectacles


Tom's Guide
12-06-2025
Best of AWE 2025: The top 7 XR gadgets that caught our eye
Augmented World Expo (AWE) is a show focused on the world of virtual reality headsets and smart glasses, and how those devices are changing the future. The 2025 edition was a leap forward from previous years, with a massive presence from well-known tech companies like Qualcomm, Sony and Meta. Smart glasses are getting better and better, and headsets like the Meta Quest 3 are receiving more ways to play and work. And we haven't even mentioned the various wearables that can connect with your phone or these devices. Much of the show is focused on the future of headsets and glasses, but there were a number of products that are coming soon or are available now. We were able to go hands- and face-on with several products. Here are our picks for the best of AWE 2025 that you need to know about.

Yes, these are the AR glasses that Viture has been teasing for a while now. While I can't tell you much about my hands-on time until they are announced, the fact that I've immediately given them the "best of show" trophy is hopefully enough of a green flag for what you'll get here. What you do get is the best screen I've ever seen, with a massive 60-degree field of view, all with that same great color reproduction that Viture is known for. On top of that, the glasses don't look or feel significantly larger in order to pull off such a huge field of view, and there's zero fringing around the outer edges of the display. Put simply, if you've been waiting for the best external display for your eyes on long journeys, I recommend you wait for these.

Snapchat had a massive presence at AWE 2025 with multiple demos of its current Snap Spectacles, including AI-enabled object recognition and linked glasses for multi-person experiences. The biggest news was that its rebranded Specs will launch in 2026. Developers have had access to prototype versions of the new smart glasses since late 2024, with a ton of expected "Lenses," or apps, already in development. Snap CEO Evan Spiegel announced that the new glasses would be lighter and a "much smaller form factor" than the current Spectacles and the dev kits we've seen.

AWE 2025 was filled with VR accessories that ranged from haptic gloves to giant mech-suit-esque apparatuses. Unfortunately, many were either prototypes, meant for businesses, or too big for most people to feasibly use in their homes (looking at you, MEK). bHaptics showed off its TactSuit, a series of wearable VR accessories that add haptic feedback to your VR gaming, and we got to try some, including a vest, gloves and sleeves. The accessories work with the Meta Quest 3 headset and were a lot of fun, even during simple tech demos. They're a bit spendy, but if you're invested in VR gaming, they're worth the cost.

Controlling AR content on glasses has been a bit of a minefield. Either you've got to use a secondary device like a wand (as with the Xreal Beam Pro), or it's a whole hand-tracking situation that doesn't really work without more raw computation. That's where the KiWear Smart Ring comes in, accurately capturing pinch and hand movements to a degree that it all feels like spatial computing without the need for an Apple Vision Pro on your face. Whether it's pinching to select, swirling your finger to change the volume, or turning your hand palm-up for additional interactions, it's all here with this ring. It could usher in a new wave of controlling AR content.
We all know that AI goes hand-in-hand with smart glasses to deliver an immeasurably better experience; take a look at the Ray-Ban Metas, for example. But it can all be a bit impersonal. How do you make that AI more personalized to you? With a lot of sensors, and that's what Emteq is doing. Simply put, this company has delivered a fitness and wellness tracker better than any smart watch or smart ring ever could be. With nine optical sensors, it's able to measure your facial muscles to a near-microscopic level. That can be used to create an avatar for talking in video calls, but the real immediate benefit I saw is in healthcare. Not only can you use the cameras to take a picture of your food and have ChatGPT give you a caloric breakdown of what you're about to eat, but you can also get a reading on whether you're chewing too fast, which may cause digestion problems. The subconscious muscle twitches in your face can give it a read on your emotional well-being, too. This is true personalized AI, and a look at what smart glasses could be as real assistants.

The Wizpr ring caught us by surprise as we wandered the AWE 2025 show floor. It's an AI-enabled smart ring with a microphone you can use to speak with AI. We tested it, and as the name implies, you can just about whisper into the ring to give it commands or prompts. On the loud show floor, we were able to ask questions like "What's the weather like?" or "How far away is the nearest Starbucks?" and the interface appeared to hear and understand the prompts. It can also be used to control some smart home devices like lights, or media in your AirPods.

Snapdragon AR1+ is a turbo-boosted version of the chip you'll find in the Ray-Ban Metas, but it's so much more than that. One of the common obstacles with AI in smart glasses is latency: the time it takes to receive a response from the cloud. This chip can run a 1-billion-parameter model entirely locally, which is great for both privacy and speed. On top of that, there are improvements to camera quality, display quality and energy efficiency. It puts Qualcomm on a path toward smart glasses that cut the cord to any phone or additional computing puck, and toward a future where your smart glasses could replace whatever you may be reading this on right now.


CNET
10-06-2025
Snap's CEO Told Me About Its New AR Glasses, Coming in 2026
Last year I played around with the latest AR glasses from Snap, the company behind the popular social media app Snapchat. The glasses let me play multiplayer games overlaid on the real world; they worked outdoors and used AI. The ones I tested were meant for developers, not everyday people. Snap's AR Spectacles are going on sale for real in 2026, and I spoke to Snap's CEO Evan Spiegel about what to expect. He promised a smaller design, better battery life and more AI features. They'll work without needing a phone, but they'll be arriving in a more crowded glasses landscape than ever.

A lot of tech companies are working on getting smart glasses on your face. These devices promise to integrate with the real world to augment reality with information, games and more. Meta's doing it, Google's doing it, Apple might be doing it and Snap's been doing it for years, releasing its Snap Spectacles back in 2020. The 2026 version of Snap's glasses could arrive ahead of its competitors in the AR glasses race. The aim is to bring games and 3D collaborative experiences to a larger audience, but at an unknown price and in an unknown design. Here's what we know so far.

Smaller Snap AR Spectacles, better battery life

I've tried Snap's AR Spectacles several times since last year. They float 3D graphics into the real world through transparent lenses, using waveguides and internal projectors, and they include built-in hand tracking via cameras, feeling almost like a hybrid between VR devices like the Quest 3 and future AR glasses like Meta's Orion. But their battery life was extremely short: only around 45 minutes. And they look a lot bigger and thicker than any everyday pair of glasses, or even other smart glasses like Meta Ray-Bans.

Spiegel says that AR apps (which Snap calls Lenses) made for the developer Spectacles will work on the upcoming glasses, but that the new pair will be "a ton more capable at a fraction of the weight, and in a smaller form." I mentioned my concern about the battery life. "Battery life will be dramatically improved," Spiegel replied. "And, you know, again, it'll depend on the task. If you're doing something really heavy-duty and immersive in AR versus more passive browsing or streaming, you'll see differences in battery life." It also sounds like these upcoming glasses will have better displays, or at least more compact ones. "There'll be some pretty meaningful improvements to the waveguide and the optical engine in this next generation," Spiegel said. "That's part of what enables the form factor and the capability improvement."

Standalone but not quite 'everyday' glasses

Much like Snap's existing AR Spectacles, released last year, these new glasses are designed to work on their own. "They'll be fully standalone," said Spiegel. "But if you want to use it with your phone, with the phone as a controller or something like that, we offer a lot of those sorts of services to developers." Snap's glasses currently use phones as either handheld remotes or motion controllers if apps build in support; otherwise, a pinch-and-gesture hand-tracking system controls the OS.

Don't expect these to truly replace your everyday glasses. No one has made full augmented reality glasses that can last a full day yet, and Snap doesn't sound like it has done it either. "I wouldn't say that they're necessarily designed for all-day wear, although someone could use them that way if they wanted," said Spiegel. "Where we focus more is on the capability side to enable true computing experiences. I wouldn't think of it as a smartphone peripheral. I would think about this as a full-featured computer."

Snap AI, including Gemini, that can see your 3D world

One of the new AR tools Snap has announced is a way to share the scanned depth map of your surroundings on the glasses with onboard AI services. Spiegel calls it "spatial intelligence," and it's not something I've seen discussed by any other glasses- or headset-makers yet. It could allow large language model AI to get an understanding of your own space, and possibly combine it with other data. Spiegel said these tools could start "labeling objects around you or helping instruct you how to do something, not through a text or voice based interface, but actually in the world with you." Snap has also brought support for Google's Gemini AI to its Spectacles, which can already connect with ChatGPT. Supporting different AI services is part of Spiegel's pitch to make Snap's AR glasses work as an AI platform, too. Snap's Spectacles aren't part of Google's Android XR ecosystem yet, but Spiegel is open to it: "We would love to make sure Spectacles interoperate as tightly as possible with other platforms and services. That's part of delivering a great customer experience, and certainly something we're thinking about."

Camera-enabled AI also brings its own privacy questions. A new tool called SmartGate aims to wall off camera data and keep it from leaking outside the glasses, an increasingly important feature as more AI apps start being able to access home depth maps and camera data. How will Snap strike that balance? It remains to be seen. "Trying to continue to architect our system in a way that developers and users can get the full power of AI without compromising on privacy is something that's really important to us," said Spiegel. It's territory every maker of AI-enabled glasses is going to have to navigate.

Multiplayer games are a big part of Snap's pitch

A number of augmented-reality game apps, such as Pool Assist by Studio ANRK, already live on Snap's Spectacles, and we can expect more with multiplayer features. I played a lot of short game experiences on Spectacles the last time I tried them, and Snap's glasses can share AR collaboratively with other people in real-world spaces. Think collaborative Lego brick-building, or outdoor virtual laser tag. Games sound like a big part of Snap's approach to its next glasses, too, especially multiplayer. "[Gaming is] one of the things that's most exciting to me, because it really brings people together in a shared space," Spiegel said. "I remember when I was a kid playing N64 and having kids over your house and playing together, but I think this takes sort of that concept of shared play to a whole new level, because you can run around outside and share these experiences." That collaborative AR play is unique to Spectacles at the moment; it could spark some fun game ideas and pave some of the way toward the outdoor collaborative AR apps to come.

Expect real-world installations and demos

Spiegel didn't give a specific answer when I asked how these Spectacles will be sold, but it sounds like the goal is to build out experiential demo opportunities. While Meta Ray-Bans and Google's upcoming AI glasses will be sold in optical shops, Snap used to sell its Spectacles out of whimsical pop-up vending machines. "We've had so much fun thinking about different ways to distribute Specs," said Spiegel.
"In the case of this product, it's just not something you can really believe until you try it. And so a lot of our efforts are just going to be focused on helping people all over the world try and experience [them]." Installations at museums, art exhibits, pop-up events or maybe even theme parks could be part of it. "We've seen a lot of demand for location-based experiences, and that's why we're releasing some new tools specifically for folks who are trying to design experiences around museums or monuments, or these sort of shared game experiences," said Spiegel. Maybe Snap's AR glasses will emerge in some real-world experience you get to try before you even think about buying. And maybe that's the best way for any of these future AR glasses to really prove what they can do.


Bloomberg
17-03-2025
Snap CEO Evan Spiegel Bets Meta Can't Copy High-Tech Glasses
Evan Spiegel sees something others don't: a neon-red flag, digitally projected on the lenses of the augmented-reality glasses he's wearing. The co-founder and chief executive officer of Snap Inc. is at Clover Park, next to the company's headquarters in Santa Monica, California, sporting the new Snap Spectacles. Spiegel was originally supposed to demo a basic chess app, but he decides instead that he wants to challenge me to a more complex game of capture the flag. He guides me through the process, telling me to tap a button that appears through my own Spectacles lenses on my hand—visible to me, invisible to others—and then, from the menu now floating before my eyes, to select the app by pinching the air. We stake our 3D flags near a tree and a picnic table, set up rival bases and prepare to sprint around, using our palms to fire blasts of light at each other like Iron Man.