Latest news with #Spectacles


Tom's Guide
5 days ago
- Business
- Tom's Guide
Exclusive: I asked Snap's hardware chief about the company's next-gen Specs — here's what I found out
So as we found out last week, Snap is finally launching Specs to the public in 2026 — after an exhaustive developer program that spans the four years since its first Spectacles AR glasses. It's been a helluva journey, and with Meta's Project Orion on the horizon and Apple 'hellbent' on delivering smart glasses, this is becoming a very competitive space. So far, Snap CEO Evan Spiegel has said they will be smaller, lighter, and fully standalone with 'no puck' required. But there's a lot we don't know yet. What's the story that's led to this point, where Snap is ready for a full public release? What tech can we expect inside these future contenders for best smart glasses? What's the price? And is society ready for true AR glasses like this? I had a chance to sit down with Snap's VP of Hardware, Scott Myers, and put these questions to him.

I have to be a little careful with what I say, because we haven't fully announced everything. Yeah. But what I can say is that it's substantially smaller. We have been in this area for 11 years, and we have been building these things for a very long time. It's public information that we have made some acquisitions, and our entire optical engine is our own custom thing. We build it ourselves. We design it ourselves, which gives us a pretty unique position where we know exactly how these things are going. We have road maps for development, and I really like where we're going. And because we're not just a bunch of companies strung together, we're all one group working toward the same goal. I can have the team designing, say, the waveguide talking to the same team that's working on the rendering engine and SnapOS. And that synthesis is why we're still confident about where we're at. We've been getting feedback in a lot of different forms about the hardware.
We've gotten some phenomenal feedback from the community, but also feedback like 'we wish the field of view was bigger,' you know, or that 'the device was lighter.' There's a joke with the team that this is what I want [points towards his reading glasses]. It's not a question of what I want; it's how we get there. It's the trade-offs we make to turn the dream of true augmented reality into something people can wear and walk around in. The social acceptability element is so critically important.

Context: Since 2020, Snap has been on a bit of a shopping spree — acquiring AI Factory in 2020 (a computer vision AI company), WaveOptics in 2021 (the company designing its waveguide displays), Fit Analytics in 2022 (shopping and ecommerce), a 3D-scanning startup called Th3rd in 2023, and GrAI Matter Labs in 2024 (an edge AI chip company), alongside many more.

Well, I think this is one of the reasons we're standalone. I don't want to see people wearing a wire coming out of the back of their head. It makes people look like U.S. Government officials, and that's not how I want to see the world. The form factor obviously matters, but the fit and finish of these things also matters when you make that jump. They need to be robust, but all of those requirements are pulling the product in different directions. So I think one of our strengths is the balance of all of these things. You can make a giant field of view — some companies have — but you also need a really high pixel count, or pixels per degree, because it's important for text legibility. You need the ability to make it work indoors and outdoors. Why? Because I don't want to spend all my time inside. As I'm moving through my day, some of that's inside, some of it's outside. It needs to work in both. So you can't just have a static tint like sunglasses, nor can you just make them clear, because they don't work in both environments.
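Myers' point about field of view versus pixel count can be made concrete with a quick back-of-envelope sketch. The numbers below are illustrative assumptions, not Snap's specs: spreading the same display panel across a wider field of view lowers the pixels per degree (PPD), which is what text legibility actually depends on (roughly 60 PPD approximates 20/20 human acuity).

```python
# Back-of-envelope sketch (assumed numbers, not Snap hardware): why a
# bigger field of view demands a higher pixel count for legible text.
# Angular resolution is roughly pixels-per-degree (PPD); ~60 PPD is
# about 20/20 acuity, and text starts to look soft well below ~30 PPD.

def pixels_per_degree(horizontal_pixels: int, fov_degrees: float) -> float:
    """Approximate angular resolution across the horizontal field of view."""
    return horizontal_pixels / fov_degrees

# The same hypothetical 1,280-pixel-wide panel spread over wider FOVs:
for fov in (26, 46, 70):
    print(f"{fov:>3} deg FOV -> {pixels_per_degree(1280, fov):.1f} PPD")
```

The point of the sketch is the direction of the trade-off: to widen the field of view without text turning to mush, the pixel count has to grow with it.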
So because we've been building these things for so long, we've learned these things. We've learned how to solve those problems — what works and what doesn't — but it's all in that trade-off and exactly how you balance all those things. Obviously, I'd want the battery to last for days, but then you end up with this giant battery pack that's directionally incorrect, too. This has been a multi-year, multi-generation arc. We launched a pair of 26-degree field-of-view augmented reality glasses to developers in 2021. With that, we learned a ton, and it drove the way our development tool Lens Studio is constructed. So we've been just iterating and iterating and iterating. And what we learned is that the breadth of feedback, the depth of feedback — it's not like you release the product once, you get a long written document, and that's it. It's an active conversation with the community. We even iterate in public in collaboration with our Spectacles subreddit. We want to learn. And what we find is that, as the community grows, as people get better and better at building lenses, they start answering each other's questions. It's a back and forth; I personally know developers. That's what a successful community looks like, and we're building this together. And that's very, very intentional. It's in the way our pricing is structured. It's in the way our community is growing. We don't just sell it to anybody, because we want the people who are really going to move the platform forward. It's all very, very intentional, and we're very happy with the results as well. As we've had the product out a little bit longer, the lenses have been getting more and more engaging, and we're learning together how different UI elements work. I think we are really here to build this with the community because it's an entirely new paradigm. There's a lot of things that will take time for people to understand and figure out.
It's not just going to be like, 'oh, here you go, developers — come build for this!' That's not going to work, in my opinion. It is very, very much a community-led discussion. And I couldn't be happier with that. I think what Evan shared was more than Ray-Ban Metas, and less than an Apple Vision Pro. I recognize that's a huge scale! Obviously we want to make it as low cost as possible. But it's also, as you pointed out, pretty advanced technology. And so there's a balance there. One of the things that may not be super intuitive is that there's a lot of technology for which there is not a ton of world capacity. We have to go off and work with our suppliers to create these new technologies. Then we have to build the capacity to actually produce them. It's a fun challenge, but there's certainly a ton of work to do. This isn't a Snap-specific problem; this is industry-wide. This is an area where Snap is in a very good spot. Trust matters, privacy matters. And we're constructing all of this in a privacy-centric way. I want to personalize it. But this is the most personal possible device. It is literally seeing exactly what I'm seeing. And so, of course, we're going to bring in all the personalization that AI kind of already has, like memory. That's an element here, but I'm actually more worried about how we do it in a privacy-centric way. Back to your previous question, I'm very happy with our direction there. And we've shared a little bit about it, but having built these for a while, having lived with them, it's one thing to ask, 'hey, what is this use case?', which I personally don't think is that valuable a question. It's more about that responsiveness: when I want it, I can go as deep as I want on any topic with it, but do so in a way that maintains my privacy for the times when I don't really need it. I think that's maybe an undervalued, unexpected problem.
You don't want to just share camera images of your entire day! I like that you said battery life, and not just battery capacity. It's all about the way you use it smartly. I used to work on smartphones for a very long time. And yeah, the battery capacity has grown pretty consistently, to be honest — X percent per year. But really, software has gotten much better in how the battery is used. This is one of the reasons we built Snap OS, so that we have complete control of exactly how every little bit of energy is consumed across the entire device. It also goes to the way we design the displays, how we make them super duper efficient, how we do the processing, and how we distribute the heat. All of these things have to be balanced, and that's why it's so important to build these, again, where engineers can talk to engineers and really look at everything as precisely as we can. The other thing I would say is that, in the limit, you'd have a full display covering everything in your world all the time. That would probably be visually overwhelming. I don't personally want a world where I'm walking around and everything's an ad all the time. That would be terrible. So I think it'll be about what is shown and when, how it's used, and then just generally technology progressing. You know, if you look at some of the initial talk times of very early phones, we're not that far off those in our developer models. But I think we have some good strategies to increase the battery life now, and it'll just get better and better over time.
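Myers' distinction between battery life and battery capacity is easy to see with rough arithmetic. All numbers below are hypothetical, chosen only to illustrate the point: runtime is capacity divided by average draw, and software duty-cycling (running the display, sensors, and radios only when needed) shrinks the average draw far more cheaply than a bigger cell would.

```python
# Illustrative sketch (assumed numbers, not Snap specs): battery *life*
# depends on average draw, not just capacity.

def battery_life_hours(capacity_mwh: float, avg_draw_mw: float) -> float:
    """Runtime in hours given stored energy and average power draw."""
    return capacity_mwh / avg_draw_mw

def duty_cycled_draw(peak_mw: float, idle_mw: float, active_fraction: float) -> float:
    """Average draw when the device runs at peak only part of the time."""
    return active_fraction * peak_mw + (1 - active_fraction) * idle_mw

CAPACITY_MWH = 1000   # hypothetical ~270 mAh cell at 3.7 V
PEAK_DRAW_MW = 2000   # everything on: display, cameras, SoC, radios
IDLE_DRAW_MW = 100    # low-power floor when nothing is actively shown

always_on = battery_life_hours(CAPACITY_MWH, PEAK_DRAW_MW)
duty_cycled = battery_life_hours(
    CAPACITY_MWH, duty_cycled_draw(PEAK_DRAW_MW, IDLE_DRAW_MW, 0.10)
)
print(f"always-on: {always_on:.2f} h, 10% duty cycle: {duty_cycled:.2f} h")
```

With these made-up figures, a 10% duty cycle stretches a half-hour always-on runtime to several hours from the same cell, which is the kind of leverage OS-level power control gives.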


Egypt Independent
7 days ago
- Business
- Egypt Independent
Google, Meta and Snap think this tech is the next big thing
New York CNN — Silicon Valley thinks it's finally found the next big thing in tech: smart glasses – the same thing Google tried (and failed at) more than a decade ago. But Google Glass may simply have been ahead of its time. Now tech companies believe technology has finally caught up, thanks in part to artificial intelligence—and they're going all-in on truly 'smart' glasses that can see and answer questions about the world around you. The latest example: Snap announced this past week it's building AI-equipped eyewear to be released in 2026. The renewed buzz around smart glasses is likely the combination of two trends: a realization that smartphones are no longer exciting enough to entice users to upgrade often, and a desire to capitalize on AI by building new hardware around it. That's why, although smart glasses aren't entirely new, advancements in AI could make them far more useful than the first time around. Emerging AI models can process images, video and speech simultaneously, answer complicated requests and respond conversationally. And that could make smart glasses finally worth wearing. 'AI is making these devices a lot easier to use, and it's also introducing new ways people can use them,' said Jitesh Ubrani, a research manager covering wearable devices for market research firm The International Data Corporation.

Meet the new class of smart glasses

Google, Snap, Meta and Amazon have previously released glasses with cameras, speakers and voice assistants. But the Google Glass of a decade ago never caught on. The screen was tiny, the battery life was short and the 'glasses' themselves were expensive and unfashionable. More modern glasses like Amazon's Echo Frames, Meta's original Ray-Ban Stories and early versions of Snap's Spectacles made it easier to listen to music or take photos hands-free. Yet these still didn't do anything you couldn't already do with a smartphone. This newer crop of smart glasses is far more sophisticated.
For example, when I tried prototype glasses based on Google's software last year, I asked Google's Gemini assistant to provide cocktail ideas based on liquor bottles I had been looking at on a shelf. The glasses will also remember what you've seen and answer questions based on that: during its I/O developers conference in May, a Google employee asked Gemini for the name of a coffee shop printed on a cup she had looked at earlier. With the Ray-Ban Meta AI glasses, users can perform tasks like asking whether a pepper they're looking at in a grocery store is spicy or translating conversations between languages in real time. Two million pairs have been sold since their 2023 debut, Ray-Ban parent company EssilorLuxottica said in February.

Attendees wear Google Glass while posing for a group photo during the Google I/O developer conference on May 17, 2013 in San Francisco, California.

'There's been several years of various failed attempts,' said Andrew Zignani, senior research director of ABI Research's Strategic Technologies team. 'But there's finally now some good concepts of what's working.' And market research indicates the interest will be there this time. The smart glasses market is estimated to grow from 3.3 million units shipped in 2024 to nearly 13 million by 2026, according to ABI Research. The International Data Corporation projects the market for smart glasses like those made by Meta will grow from 8.8 million in 2025 to nearly 14 million in 2026.

What's coming next

Snap didn't reveal many details about its forthcoming 'Specs' glasses but did say they will 'understand the world around you.' 'The tiny smartphone limited our imagination,' Snap wrote in a blog post announcing the glasses. 'It forced us to look down at a screen, instead of up at the world.' Apple is also said to be working on smart glasses to be released next year that would compete directly with Meta's, according to Bloomberg.
Amazon's head of devices and services Panos Panay also didn't rule out the possibility of camera-equipped Alexa glasses similar to those offered by Meta in a February CNN interview. 'But I think you can imagine, there's going to be a whole slew of AI devices that are coming,' he said.

Demonstration of prototype glasses that can display information in the user's field of vision at the Google I/O developer conference on May 20 in Mountain View, CA.

AI assistant apps, like OpenAI's ChatGPT and Google's Search and Gemini apps, are already laying the foundation for smart glasses by using your phone's camera to answer questions about your surroundings. OpenAI is putting its tech in everything from a mysterious new gadget co-designed by Apple veteran Jony Ive to future Mattel toys. Google said last month that it would bring more camera use to its search app, a sign that it sees this technology as being key to the way people find information in the future. Apple this past week announced updates to its Visual Intelligence tool that let users ask questions about content on their iPhone's screen, in addition to their surroundings, by using its camera. Meta CEO Mark Zuckerberg recently reiterated his belief that smart glasses could become critical to how people use technology during testimony in a federal antitrust case. 'A big bet that we have at the company is that a lot of the way that people interact with content in the future is going to be increasingly through different AI mediums, and eventually through smart glasses and holograms,' he said in April.

Do people actually want smart glasses?

Still, tech giants need to get regular people to buy in. Among the hurdles are potential privacy concerns, which played a big role in Google Glass' demise.
Recording video with camera-equipped glasses is more subtle than holding up your phone, although Meta and Google's glasses have a light on the front to let other people know when a wearer is capturing content. Perhaps the biggest challenge will be convincing consumers that they need yet another tech device in their life, particularly those who don't need prescription glasses. The products need to be worth wearing on people's faces all day.

Meta CEO Mark Zuckerberg presents Orion AR glasses at the Meta Connect annual event at the company's headquarters in Menlo Park, California, on September 25, 2024.

And these devices likely won't come cheap. Meta's Ray-Bans usually cost around $300, roughly the price of a smartwatch. While that's not nearly as expensive as the $3,500 Apple Vision Pro headset, it still may be a tough sell as people spend less on ancillary tech products. Global smartwatch shipments fell for the first time in March, according to Counterpoint Research, perhaps a sign that customers aren't spending as much on devices they may not view as essential. Yet tech firms are willing to make that bet to avoid missing out on what could be the next blockbuster tech product. 'Many in the industry believe that the smartphone will eventually be replaced by glasses or something similar to it,' said Ubrani, the IDC analyst. 'It's not going to happen today. It's going to happen many years from now, and all these companies want to make sure that they're not going to miss out on that change.'
Yahoo
12-06-2025
- Business
- Yahoo
Why Snap Stock Was a Winner on Wednesday
The company formally introduced its latest product: Specs, the new iteration of its smart glasses line. Social media company Snap (NYSE: SNAP) saw its share price creep higher on Hump Day, thanks mainly to the announcement of a new product. The company will face some stiff competition, however, so the market's bullishness was guarded; the stock only rose by 1.2% on the news. That was good enough to beat the S&P 500 (SNPINDEX: ^GSPC), though, as that index fell by 0.3% on the day. Toward the end of Tuesday's trading session, Snap announced that it is launching a new line of tech-enhanced eyeglasses called Specs. In the announcement, made at this year's annual Augmented World Expo, the company said the rollout would occur next year. It did not get more specific. It did promise several attractive features of the upcoming augmented reality (AR) products, including artificial intelligence (AI) assistance, social connectivity, and a virtual workstation in case users feel like being productive rather than playful. In its official press release touting the eyewear, Snap co-founder and CEO Evan Spiegel said, "We believe the time is right for a revolution in computing that naturally integrates our digital experiences with the physical world." "We couldn't be more excited about the extraordinary progress in artificial intelligence and augmented reality that is enabling new, human-centered computing experiences," he added. Specs is the continuation of the company's digitally enhanced glasses product line, Spectacles, the first of which it introduced in 2016. There's a big mountain to climb here, though, and it belongs to Meta Platforms (NASDAQ: META). Nearly two years ago, Snap's social media rival introduced its Ray-Ban Meta smart glasses, with the product earning generally positive reviews, especially for its feature set. Snap will have to stay on its toes to carve out meaningful share in this still-limited market.
Why Snap Stock Was a Winner on Wednesday was originally published by The Motley Fool.


Time of India
12-06-2025
- Business
- Time of India
Snap to launch smart glasses for users in 2026 in challenge to Meta
Highlights
- Snap Inc. will launch its first-ever smart glasses for consumers, called Specs, next year, intensifying competition with Meta Platforms in the wearable technology market.
- Snap has invested over $3 billion in developing augmented reality glasses over the past 11 years, underscoring its commitment to wearable technology.
- Snap plans to collaborate with Niantic Spatial to enhance Lens Studio, the application creators use to design and publish augmented reality lenses for Snapchat.

Snap will launch its first-ever smart glasses for all consumers next year, ratcheting up competition with bigger rival Meta in the wearable technology market. The augmented reality smart glasses, called Specs, will be lightweight, the social media company said on Tuesday. Long known for its messaging app Snapchat and animated filters, Snap has been doubling down on AR, which can overlay digital effects onto photos or videos of real-life surroundings through a camera or lens. Integrating technology into wearable products can open up new lucrative markets and diversify revenue streams for Snap amid an uncertain digital ad market due to changing U.S. trade policies. The company had launched its 5th generation of Spectacles glasses in September, but these were only available to developers. The company has invested more than $3 billion over 11 years developing its augmented reality glasses, Snap co-founder and CEO Evan Spiegel said at the Augmented World Expo 2025 on Tuesday. "Before Snapchat had chat, we were building glasses." The popularity of Meta's Ray-Ban Meta smart glasses, developed in partnership with EssilorLuxottica, has prompted companies such as Google to explore similar investments. Meta continues to add AI features to its glasses to attract more consumers.
Snap said it would partner with augmented reality and geospatial technology platform Niantic Spatial to enhance Lens Studio, an application used by creators to design, animate and publish AR lenses for the Snapchat camera and Specs.
Yahoo
11-06-2025
- Business
- Yahoo
Snap to introduce new immersive and lightweight glasses in 2026
US-based technology company Snap has unveiled plans to launch a new augmented reality (AR) product, known as Specs, in 2026. Specs is a lightweight wearable computer integrated into glasses with see-through lenses; the company has invested more than $3bn over the past 11 years to develop the technology, which integrates digital experiences into the wearer's view. Specs are designed to enhance the physical environment through advanced machine learning (ML) and AI, allowing users to engage in shared experiences, gaming, and productivity tasks. Snap's vision for Specs is to overcome the limitations of smartphones, which often require users to focus on small screens rather than their surroundings. The company believes that as AI evolves, existing devices and interfaces must also adapt to fully harness AI's capabilities. Specs are intended to facilitate immersive experiences that allow users to interact with their environment in real time. Snap is also collaborating with Niantic Spatial to integrate its Visual Positioning System into Specs, and plans to introduce WebXR support for browser-based experiences in the near future. Developers are currently 'building new experiences' for Spectacles, the fifth generation of Snap's glasses, which was released in 2024 as a precursor to Specs. Snap's announcement showcased the possibilities for developers using the new Specs. Early experiences include Gowaaa's Super Travel, a tool designed to help travellers with translation and currency conversion, and Paradiddle's Drum Kit, which supports users in mastering drumming techniques. Other applications include Pool Assist from Studio ANRK, Cookmate from Headraft, and Wisp World from Liquid City, each designed to enhance user experiences in various ways. In addition to Specs, Snap has announced significant updates to Snap OS, incorporating feedback from its developer community.
Key updates include deep integrations with OpenAI and Google Cloud's Gemini, enabling developers to create AI-powered Lenses. The new Depth Module API allows for accurate anchoring of augmented reality information in three dimensions, while the Automated Speech Recognition API supports real-time transcription in over 40 languages. New tools for developers will facilitate the creation of location-based experiences, including a Fleet Management app for monitoring Specs and Guided Navigation for AR tours. These enhancements aim to support developers such as Enklu, which operates immersive holographic theatres across the US. In 2024, Snap announced plans to cut its workforce by about 10%, reflecting ongoing job reductions in the tech sector amid economic uncertainties since 2023. "Snap to introduce new immersive and lightweight glasses in 2026" was originally created and published by Verdict, a GlobalData owned brand.
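To give a sense of why a depth signal matters for anchoring AR content in three dimensions, here is a generic pinhole-camera sketch. This is illustrative math only, not Snap's actual Depth Module API, whose interface is not described in the article: given a pixel, a metric depth value, and the camera intrinsics, the 3-D point to anchor content at falls out directly.

```python
# Generic pinhole-camera math (illustrative; not Snap's Depth Module API):
# recover the 3-D camera-space point behind a pixel once depth is known,
# which is what lets AR content be anchored at real-world positions.

def unproject(u: float, v: float, depth_m: float,
              fx: float, fy: float, cx: float, cy: float):
    """Map pixel (u, v) at metric depth to a camera-space (x, y, z) point.

    fx, fy are focal lengths in pixels; (cx, cy) is the principal point.
    """
    x = (u - cx) * depth_m / fx
    y = (v - cy) * depth_m / fy
    return (x, y, depth_m)

# A point at the image centre, 2 m away, lands on the optical axis:
print(unproject(640, 360, 2.0, fx=800, fy=800, cx=640, cy=360))
# -> (0.0, 0.0, 2.0)
```

Without the depth term, the same pixel only defines a ray; the depth value is what pins content to a single point on that ray, which is why a dedicated depth API is significant for AR anchoring.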