Latest news with #WWDC


Phone Arena
11 hours ago
Did Apple just rip off Google's Pixel AI?
At WWDC 2025, Apple took the wraps off iOS 26 and its brand-new AI features. If you watched the keynote with even one eye on what Google's been doing with Pixel phones and Gemini, you probably had the same reaction I did: "Wait… haven't I seen this all before?" Let's take a quick look:

Live Voicemail — Google Pixel's "Call Screening": When you get a call from an unknown number, the AI assistant picks up for you and asks the caller the purpose of the call. Their answer is relayed to you, in real time, via text that shows up on screen, so you can decide whether to answer or not. Pixel's Call Screening was introduced in 2018 with the Pixel 3.

Personal Voice Assistant on Calls — Pixel's "Hold for Me": When calling a support line that puts you on hold, you can set your phone down. The AI identifies when an actual person answers your call and notifies you via ringtone. First introduced on the Pixel 5 in 2020 as "Hold for Me."

Visual Look Up — launched on Samsung and Google phones as "Circle to Search": Takes a screenshot of the screen and lets you tap or circle objects to perform an image search (to find a product you really like in a video, for example, or look up details on an event poster). Introduced in early 2024, first with the Galaxy S24 series, then the Pixel 8 phones.

As it says on the tin, Live Translation will translate between different languages right within a call. Both parties will hear the translator voice, so they know what is being said, when the sentence is over and when to expect a reply. First introduced by Samsung in 2024 with the Galaxy S24 series, then announced by Google at I/O 2025. Quite honestly, the tech is still a bit clunky and not quite there; I am curious to see if Apple manages to add some polish to it.

Thankfully, there is a twist: it's all about how it works. Where Apple is leaning in is privacy and on-device processing. The pitch is clear: most of these AI features will run locally on your iPhone (assuming you've got an A17 Pro or newer chip), with Apple's so-called "Private Cloud Compute" handling tasks when off-device work is needed.

So, what's the difference? Apple's Private Cloud Compute only sends data to servers when needed, and even then, the servers don't retain any personal information. Apple also promised open verifiability, meaning, in theory, security researchers will be able to audit how the system works.

Google's Gemini/Assistant, by contrast, is primarily cloud-based. That's because Google wants to bring most of its AI features to all of Android, which means it can't rely on specific hardware; we all know how many flavors of Android phones there are out there. Some tasks do run locally, but for full-featured Gemini, your data often goes to Google's servers. Yes, Google promises strong privacy protections, but historically it's a more data-hungry model (because that's Google's business model).

At the end of the day, though, from a user experience point of view, Apple is now shipping the same features that Pixel owners have enjoyed for years. If you're a die-hard iPhone user, it's great to finally get these tools. If you've been on a Pixel, you're probably thinking: "Welcome to 2023, Apple." Me? I'm just happy we are all getting cool stuff.
As mentioned above, Apple's big advantage is integration: These AI features are coming to all iPhones that support iOS 26 (with some limitations on older chips), so you will feel a certain level of polish across the ecosystem, whereas Android has that inherent unpredictability that comes with multiple manufacturers putting their own spin on hardware.

So, did Apple just rip off Google's Pixel AI? Not exactly. If anything, I think it became perfectly clear (as if it wasn't already) that Google is well ahead in the AI game when it comes to useful smartphone tools and implementation. But this does mark the start of a new phase: Apple went back to doing things that others have already done, but with a distinct Apple shine applied on top. If history teaches us anything, it's that Cupertino might use this solid base as a launching pad to make something truly unique and game-changing. At the very least, competition is about to heat up. Very, very fast.
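If you want a mental model of that on-device-first pitch, here is a minimal sketch. Everything in it is hypothetical: Apple has not published a routing API for these features, so the type names and the cost heuristic below are purely illustrative.

```swift
// A hypothetical sketch of an on-device-first AI router. None of these
// types exist in Apple's SDKs; the cost heuristic is invented purely to
// illustrate the "local unless necessary" idea behind the pitch.
enum AIBackend {
    case onDevice      // runs locally on the phone's neural hardware
    case privateCloud  // stateless server that retains no personal data
}

struct AIRequest {
    let prompt: String
    let estimatedCost: Int  // rough compute estimate (a made-up metric)
}

func route(_ request: AIRequest, localBudget: Int = 100) -> AIBackend {
    // Prefer local processing; escalate only when the task exceeds
    // what the device can comfortably handle.
    request.estimatedCost <= localBudget ? .onDevice : .privateCloud
}

// A light task stays on-device; a heavy one falls back to the cloud.
print(route(AIRequest(prompt: "Summarize this note", estimatedCost: 40)))           // onDevice
print(route(AIRequest(prompt: "Index my entire photo library", estimatedCost: 400))) // privateCloud
```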


CNET
15 hours ago
I Tried the Future of Smart Glasses at WWDC. They Weren't Made by Apple
On a bright sunny day in Cupertino, California, I crammed into my seat, unlocked my laptop, connected to Wi-Fi and checked in on Slack. Apple's WWDC keynote was about to begin. This time, however, I added a new step to my live event coverage routine: I plugged the Xreal One Pro glasses into my MacBook and activated the dimmer. They became my smart display-enabled sunglasses.

For the next two hours, I covered Apple's announcements wearing Xreal's display glasses, and they worked better than I expected. The One Pros projected my laptop monitor clearly, removed glare that would have overwhelmed my laptop screen and let me watch the stage presentation at the same time. And it got better: By activating Xreal's auto transparency mode, the glasses dimmed the world when I looked at my virtual screen, then became transparent again when I looked at the stage to follow the action. The future of truly useful everyday AR glasses isn't here yet, but wow, with things like the Xreal One Pro, it's getting close.

The Xreal One Pros tether with USB-C, but can project a larger display than before. (Shown here with the separate Eye camera plugged in below the bridge.) Scott Stein/CNET

A floating display on demand

Xreal's glasses, like all glasses in this product category, use a USB-C cable to tether to whatever device you're plugging into. Essentially, they're a tiny wearable monitor with speakers in glasses form. Any device that supports USB-C video out will work with these glasses, either to mirror your screen or to act as a second monitor. While I've used Xreal's glasses to watch movies on planes (really fun and portable) and do work on my laptop and iPad (helpful on planes too, since space can be cramped and my laptop lid doesn't always open fully in economy), the idea of covering a whole live event where I needed to be fast, effective and multitasking, and not screw up, was a whole different story.

I'm happy to say the experiment worked, largely because of the auto transparency mode I never realized existed before -- thank you, Norm Chan of Tested, who told me about it as we sat down at the keynote. Xreal's glasses have three dimmable lens settings that turn the outer glass either transparent, semi-dark or close to opaque. That makes them instant sunglasses and also helps the display show up better in bright sunlight. These glasses won't block outside light completely -- light bleeds a bit through the dimmed lenses unless you're sitting in a completely dark place -- but the image is still extremely viewable and looks good. The transparency mode really made looking at the stage and my own laptop keyboard (and my phone) easier. Xreal's glasses aren't like normal glasses: They have layers of lenses, including the prescription inserts I stacked on top. But they can be used to look around, check messages, even (as I did) shoot some on-the-fly social videos and share them with CNET's social team. Beyond transparency mode, other adjustments include screen size, projection distance, location of the screen and whether it's anchored or floating in my field of view.

The Xreal One (left) next to the One Pro with Eye camera attached (right). They work and look nearly the same. Scott Stein/CNET

One Pro vs. One: subtle differences

I reviewed the non-Pro Xreal Ones earlier this year.
Compared with previous Xreal display glasses, they have better built-in audio and the ability to pin the really sharp 1080p microOLED display in space to anchor it, making them work a lot better as plug-in monitors for tablets, phones or laptops (or handheld game systems like the Steam Deck).

Xreal's Pro version of the One glasses costs $100 more ($599, going up to $649 after June 30) but has a few advantages. The microOLED projection system still projects down from the top of the glasses into thick angled lenses (called birdbath displays), but the One Pro's lenses are flatter, smaller and reflect less light from my surroundings. The display area is a bit wider -- a 57-degree field of view, versus 50 degrees for the Ones -- but that really just makes the 1080p display feel a bit bigger and more clearly visible at the edges of the large virtual screen. Prescription lens inserts like the ones I use rest flat against the lenses: It's chunky but better than before. I don't think you need the Pros, but their slightly better performance could be worth the difference to avoid fatigue.

The small Xreal Eye camera plugs under the bridge of the glasses. Scott Stein/CNET

Optional camera isn't necessary

I tried a tiny plug-in camera, too, called the Eye (sold separately for $99), that slots into the bridge of the One series glasses. It's designed for future use with AI apps, potentially, but right now it can capture photos and video clips to the glasses' small 2GB of storage. Images can be offloaded to an iPhone by switching to a "transfer" mode in the glasses settings that turns the glasses into a USB camera; the Photos app was able to just find the glasses and import the images. It's clunky, but it works, although you need to tether the glasses via USB-C like you do in regular display mode. These Xreal glasses don't work wirelessly on their own.

Test photo out of the NJT train. Scott Stein/CNET

The camera takes passable photos and videos, but they're not as good as what the Meta Ray-Bans capture. I think the camera's here to flex another feature: a full six-degrees-of-freedom mode that can pin a display in space and keep it there as you walk around the room. It's not necessary for most things I do, but it shows how these glasses could, in future versions, evolve into something more like 3D augmented reality. Xreal's work with Spacetop, a software suite that can float arrays of apps from laptops, shows where things could go. Xreal's future Android XR developer hardware, called Project Aura, may take things further next year. I'd skip the camera for now and just get the glasses, but I'm really curious where Xreal takes these functions next.

These glasses are coming with me on work trips. Scott Stein/CNET

When they're good, they're great

Like I said in my Xreal One review, these glasses and their microOLED displays are excellent for movie watching. They're surprisingly effective for doing work, too, since they can pin a display (or a semi-curved wide-angle monitor, thanks to an included setting) in place. I do notice the 1080p resolution limit a bit more now that these glasses can give an even larger display size, and it's something I expect future glasses to address in the next year or two with higher-resolution microOLED chips. At a show where Apple announced new Vision Pro software updates but no word on any glasses of its own, I couldn't help but think about the Xreals on my face. The future is arriving in bits and pieces, but lots of smart glasses are already here and changing fast. And, yes, they're actually useful.
The year 2026 may be massive for new smart glasses and AR, and my WWDC 2025 experience with Xreal One Pros proves that the evolution is well underway. Now it's your turn, Apple.


International Business Times
17 hours ago
Google Mocks Apple's iOS 26 With Humorous Podcast Video, Samsung Joins Roast
Apple announced iOS 26, the newest version of its operating system, on June 9 as one of a slew of announcements at the Worldwide Developers Conference (WWDC). Apple also unveiled a visual revamp, which it refers to as the "Liquid Glass" design. The new look extends to core apps and the entire platform, representing a major change in the operating system's aesthetics. However, it seems these new features and announcements have not impressed tech giants like Samsung and Google at all.

Google has posted a playful video mocking a bundle of new features arriving in iOS 26, some of which Android users, especially Pixel owners, have been enjoying for years. Titled "Best Phones Forever: Responding to MORE Rumors," the video was uploaded to Google's YouTube account on June 17 and has since generated a heap of laughter online. The animated video is a playful podcast-style exchange between an Apple iPhone and a Google Pixel. As they chat, the Pixel indirectly accuses iOS 26 of copying some of its older features, which it sarcastically dismisses as "just some crazy coincidences" because "they're not out yet!" Features like live text translation, hold assist and call screening, newly added in iOS 26, have been available on Pixel devices since as early as 2018. To add insult to injury, the video winds down with the iPhone asking anxiously, "Sooo... What are you working on for Pixel 10?", a not-too-subtle reminder of Google's forthcoming launch, which has been speculated for August.

However, Google wasn't the only one to mock the Tim Cook-led smartphone maker. Samsung did it days earlier, during Apple's WWDC25. Apple's toughest competitor poked fun at iOS 26 and macOS 26 with a series of snarky posts on X (formerly Twitter). One post read, "Customizable apps? Floating bars? That sleek glass UI? Looks... familiar." Another jab stated, "Apple brings Live Translation. For a fuller account, check the Galaxy S24 launch 16 months ago." Samsung also flaunted its Galaxy AI improvements and subtly suggested that Apple's updates are more reactive than innovative. "AI is coming to your watch? Cute! Ours already knows you're too tired to care #GalaxyAI," the Android giant said, pulling no punches.

What was noteworthy about Apple's keynote wasn't just the lack of any specific emphasis on AI; it was how the competition gleefully filled that silence. With Apple focused on privacy and evolutionary innovation, competitors like Google and Samsung took the opportunity to remind users that they have been ahead on AI-powered features for some time.


CNET
18 hours ago
Everything You Should Know About Enhanced Visual Search on Your iPhone
Apple announced at its Worldwide Developers Conference on June 9 that the next version of the iPhone operating system is called iOS 26. The tech giant said iOS 26 will bring a transparent glass design to icons and menus, redesigned interfaces for the Camera and Photos apps and much more to iPhones. But when Apple released iOS 18 in September 2024, it included a feature called Enhanced Visual Search, and some people online have voiced privacy concerns about it.

Read more: Everything You Need to Know About iOS 18

Enhanced Visual Search sends photo data from your iPhone to Apple servers to help you better find pictures in your photo library. It's turned on by default, but you can turn the feature off in a few easy steps if you're worried about privacy. Here's what you need to know about Enhanced Visual Search and how to disable the feature if you want.

Enhanced Visual Search privacy measures

"Enhanced Visual Search in Photos allows you to search for photos using landmarks or points of interest," Apple wrote online. "Your device privately matches places in your photos to a global index Apple maintains on our servers."

According to an Apple research post, parts of a photo that might contain a landmark are encrypted and then sent to an Apple server, so the whole picture is never transmitted. Your encrypted data is also just one point among other pieces of junk data that aren't associated with any of your, or anyone else's, images. Once an Apple server receives the data, it doesn't decrypt it; the server works only with the encrypted data. After the server establishes whether the encrypted photo data matches a landmark, it sends your device an encrypted response, which your device then decrypts. The company also wrote in the post that it uses a third-party-operated Oblivious HTTP relay to hide the IP address of your data, and each time your iPhone sends photo data to a server, the request is given a new IP address. (For a rough illustration of this flow, see the sketch at the end of this article.)

But some people have questioned whether this is enough to protect your data. "If my computer sends data to the manufacturer of the computer, then it's not private, or at least not entirely private," developer Jeff Johnson wrote online. "A software bug would be sufficient to make users vulnerable, and Apple can't guarantee that their software includes no bugs."

If you are concerned about Enhanced Visual Search sending your photo data to Apple's servers, here's how to turn the feature off. It's important to note that if you use an iCloud account to store photos or back up your iPhone data, your photos are still going to Apple's servers; turning this feature off won't stop that.

How to disable Enhanced Visual Search

1. Open Settings.
2. Tap Apps.
3. Tap Photos.
4. Tap the toggle next to Enhanced Visual Search.

Now your iPhone won't send encrypted photo data to an Apple server to help you find pictures on your iPhone. That also means searching for pictures on your iPhone may not work as well. If you want to turn the feature back on, follow the steps above.

For more on iOS 18, here's what you need to know about iOS 18.5 and iOS 18.4. You can also check out our iOS 18 cheat sheet and everything to know about iOS 26.
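To make the flow described above a little more concrete, here is a toy sketch of the client-side steps: encrypt locally, pad the real query with junk queries, send everything through a relay and decrypt the reply on-device. Every name in it is hypothetical, and the XOR "cipher" is a placeholder, not real cryptography; Apple's actual pipeline computes on ciphertext the server cannot read, and none of this is public API.

```swift
import Foundation

// Stand-in for on-device encryption (XOR is its own inverse,
// so the same function also "decrypts"). Not real cryptography.
func cipher(_ data: Data) -> Data {
    Data(data.map { $0 ^ 0x5A })
}

// Junk queries that look like real ones, so the server can't tell
// which requests correspond to actual photos.
func makeDecoys(count: Int) -> [Data] {
    (0..<count).map { _ in Data((0..<16).map { _ in UInt8.random(in: 0...255) }) }
}

// Stand-in for a round trip through a third-party OHTTP relay: the
// relay hides the device's IP, and the server only ever receives and
// returns encrypted bytes.
func sendViaRelay(_ batch: [Data]) -> Data {
    cipher(Data("Golden Gate Bridge".utf8))  // a pretend encrypted match
}

func lookUpLandmark(in photoRegion: Data) -> String? {
    let query = cipher(photoRegion)               // 1. encrypt on-device
    let batch = [query] + makeDecoys(count: 3)    // 2. pad with junk data
    let response = sendViaRelay(batch)            // 3. relay hides your IP
    return String(data: cipher(response), encoding: .utf8)  // 4. decrypt locally
}

print(lookUpLandmark(in: Data("photo-region-bytes".utf8)) ?? "no match")
```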


CNET
a day ago
What Apple Could Bring to Your iPhone With iOS 18.6 Before iOS 26
Apple released the first public beta of iOS 18.6 on June 18, just over a week after the company announced iOS 26 at its Worldwide Developers Conference. While the iOS 26 update will bring a major redesign to iPhones this fall, the latest beta mostly consists of bug and security fixes.

Read more: An Expert's Guide to iOS 18

Since this is a beta, I recommend downloading it only on a device other than your primary one. Because this isn't the final version of iOS 18.6, the update might be buggy and battery life may be affected, so it's best to keep those troubles off your primary device. Note that the beta is not the final version of iOS 18.6, so more features could land on your iPhone when it is released. It's unclear when Apple will release iOS 18.6 to the general public, but it will likely be the last significant iOS update the company releases before it makes iOS 26 available this fall. Here's what to expect from iOS 18.6 when it lands on your iPhone.

This beta is all about security patches and squashing bugs

If you're a developer or beta tester, don't expect much from this beta other than bug and security fixes. After downloading and looking into iOS 18.6 beta 1, I found no new features or noticeable changes. And that's not surprising considering Apple's iOS release schedule over the years.

Apple announced iOS 18 at WWDC 2024 and released iOS 17.6 more than a month later in July. That update was filled with more than 30 important bug fixes and security patches, but no new features. Apple recommended that everyone download the update at the time, and it was the last major iOS 17 update before the release of iOS 18. Apple did the same thing in July 2023 when it released iOS 16.6: That update was focused on bug fixes and security patches, and the next major iOS release was iOS 17.

Apple is likely shifting gears and focusing more on iOS 26. In fact, the company has already released two developer betas of that software in preparation for its fall release. There will be more betas before iOS 18.6 is released to the public, so there's plenty of time for Apple to add features or change others -- but I wouldn't count on new features. Apple has not announced when it will release iOS 18.6, but since iOS 17.6 and iOS 16.6 were released in July 2024 and July 2023, respectively, I expect Apple to release iOS 18.6 next month.

For more on iOS 18, here's what you need to know about iOS 18.5 and iOS 18.4. You can also check out our iOS 18 cheat sheet and everything to know about iOS 26.