
Latest news with #VisualIntelligence

Find hidden discounts with brainy new phone trick that instantly slashes cost of shopping – I've already tried it out

The Sun – 14-06-2025

Sean Keach, Head of Technology and Science

WHO doesn't love a discount? Sadly they're hard to find – but a new iPhone trick has you covered.

Apple has just announced a clever upgrade for millions of iPhones that may be able to help you bag a top deal, and I've already seen it in action.

I took a trip to Apple Park HQ in California this week, where I was able to check out some of the new iPhone upgrades coming in iOS 26 later this year.

One of the big changes was an improvement to Visual Intelligence, an iPhone feature that 'scans' what you're looking at through the camera. With the new update, you can now 'scan' what you're looking at on screen. That means you can take a screenshot of an object you like the look of, and have it 'scanned' by Apple Intelligence AI tech.

HOW NEW VISUAL INTELLIGENCE WORKS

Imagine you've seen a lovely lamp in the background of a picture. You can screenshot it, and then use Visual Intelligence by scrubbing your finger over the lamp (or circling it works too, I found).

Your iPhone will then surface that exact lamp and/or ones that look just like it. It'll list them alongside their prices on Google, Etsy, and other supported shopping apps on your phone. So you can then find the exact type of lamp you're looking for, and nab it from wherever is offering the best price.

It won't just work with lamps: you could do this with a pair of trousers, or a drinks coaster, or even a sofa. And even if you don't want the exact item, you can find similar ones – making it the ultimate "dupe" hunter.

I tried it out at Apple HQ and it managed to track down a specific bird feeder almost instantly. And it offered very similar alternatives – all at varying prices. It was quick and easy, and took me from seeing the item to finding it on sale in a matter of seconds.

You can imagine this being a godsend for trying to find a nice jacket that you saw a celeb wear, or snapping up a nice bowl that you saw at a hotel on holiday.

HOW TO ACCESS VISUAL INTELLIGENCE TODAY

So when can you use it? Well, Visual Intelligence is already available now, but it's only for telling you about items you've snapped a pic of – like identifying a dog breed, for example. The new screenshot-scanning feature is coming in iOS 26.

IOS 26 SUPPORTED DEVICES – THE FULL LIST

Here are the iOS 26 supported devices...

  • iPhone 16e
  • iPhone 16
  • iPhone 16 Plus
  • iPhone 16 Pro
  • iPhone 16 Pro Max
  • iPhone 15
  • iPhone 15 Plus
  • iPhone 15 Pro
  • iPhone 15 Pro Max
  • iPhone 14
  • iPhone 14 Plus
  • iPhone 14 Pro
  • iPhone 14 Pro Max
  • iPhone 13
  • iPhone 13 mini
  • iPhone 13 Pro
  • iPhone 13 Pro Max
  • iPhone 12
  • iPhone 12 mini
  • iPhone 12 Pro
  • iPhone 12 Pro Max
  • iPhone 11
  • iPhone 11 Pro
  • iPhone 11 Pro Max
  • iPhone SE (3rd gen)
  • iPhone SE (2nd gen)

Picture Credit: Apple

If you want to use Visual Intelligence with your camera today, you'll need to press the Camera Control button. That's the button on the side of the iPhone 16, iPhone 16 Plus, iPhone 16 Pro, and iPhone 16 Pro Max.

If your iPhone doesn't have that, you can customise the Action Button or Lock Screen to launch Visual Intelligence instead – or you could add it to your Control Centre. That's what you'll need to do if you've got an iPhone 16e, iPhone 15 Pro, or iPhone 15 Pro Max. Then just snap a pic and you'll be able to get info about what you're seeing.

If you have an older iPhone, this feature won't work – as it relies on Apple Intelligence.

When iOS 26 lands later this year (likely in September), you'll just need to take a screenshot of an image, rub your finger on the item, and then search it using the built-in tool.

Apple's Visual Intelligence Is Getting Smarter—But It's Still Missing the Feature I Really Want

Yahoo – 14-06-2025

PCMag editors select and review products independently. If you buy through affiliate links, we may earn commissions, which help support our testing.

When Apple's senior vice president of software engineering, Craig Federighi, started talking about the Visual Intelligence feature in iOS 26 at WWDC 2025, I hoped for significant changes beyond its existing ability to tell you information about the places and objects you point your camera at on recent iPhones. Instead, we got the somewhat underwhelming news that Visual Intelligence options would soon be available directly in the iOS screenshot interface.

I can't deny that these capabilities are practical (if a bit unexciting). But Visual Intelligence still falls short of Google's Gemini Live and Microsoft's Copilot Vision in that it can't converse with you out loud about what you see. This sort of live interactivity isn't necessarily vital, but it does feel exciting and natural to use. The foundation of Visual Intelligence is solid, but I still want Apple to push things forward in a way that aligns with its measured approach to AI.

Like many of iOS's best features, Visual Intelligence is a core part of the OS and works seamlessly with its default apps. That means you don't need to open a separate app and upload an image to have the AI analyze it. And the new ability to access the tool whenever you snap a screenshot certainly extends its usefulness. Related options appear along the bottom of the screenshot interface: Ask, which sends the image out to ChatGPT for analysis, or Search, which keeps scans on-device. With the latter, Visual Intelligence can, for example, look for information about an event and create a calendar entry with all the important details. You can also draw over a part of the image to identify it, such as an article of clothing that catches your eye. Visual Intelligence can recognize it and either search it on Google or take you directly to its product page on a shopping app, such as Etsy. Apple is also making an API available to app developers so Visual Intelligence can open dedicated apps when it detects relevant content or products (a rough sketch of what that could look like follows below).

All that said, I still feel like Visual Intelligence is missing a level of interactivity I can get with other tools. On either my Android phone or iPhone, I can converse back and forth with Copilot Vision or Gemini Live about what I'm looking at via the camera app. When I pointed my phone's camera out a motel window recently, for example, Gemini Live identified the tree in the courtyard as an olive tree. I could then continue to ask related questions, such as where the tree species is native. This ability to point my camera at something and simply chat with an AI about it feels orders of magnitude cooler than anything Visual Intelligence currently does. And more importantly, it feels like something I expect an AI assistant to be able to do.

I understand that Apple is prioritizing on-device AI, which isn't yet capable of such feats, but it seems like it should be able to develop a similar feature given how much emphasis it puts on its Private Cloud Compute tech. We can only hope the company catches up with its competitors before their AI tools take an even greater leap ahead.
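Apple hasn't spelled out that developer hook here, and app integrations of this sort are typically exposed through the existing App Intents framework, so the Swift sketch below is purely illustrative: the intent name, its parameter, and any Visual Intelligence-specific wiring are assumptions rather than Apple's documented API. It simply shows the kind of entry point a shopping app could declare so the system can deep-link straight to a product page.

```swift
import AppIntents

// Illustrative only: a minimal App Intent a shopping app might expose so the
// system can jump straight to a product page. The type name, parameter, and
// any Visual Intelligence-specific wiring are assumptions, not Apple's API.
struct OpenProductPageIntent: AppIntent {
    static var title: LocalizedStringResource = "Open Product Page"
    static var openAppWhenRun: Bool = true // bring the app's UI forward when invoked

    @Parameter(title: "Product ID")
    var productID: String

    @MainActor
    func perform() async throws -> some IntentResult {
        // A real app would route its navigation stack to the product screen
        // for `productID` here.
        return .result()
    }
}
```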

Here are Apple's top AI announcements from WWDC 2025

Yahoo – 12-06-2025

Last year, Apple's WWDC keynote highlighted the company's ambitious strides in AI. This year, the company toned down its emphasis on Apple Intelligence and concentrated on updates to its operating systems, services, and software, introducing a new aesthetic it calls 'Liquid Glass' along with a new naming convention. Nevertheless, Apple still attempted to appease the crowd with a few AI-related announcements, such as an image analysis tool, a workout coach, a live translation feature, and more.

Visual Intelligence is Apple's AI-powered image analysis technology that lets you gather information about your surroundings. For example, it can identify a plant in a garden, tell you about a restaurant, or recognize a jacket someone is wearing. Now, the feature will also be able to interact with the information on your iPhone's screen. For instance, if you come across a post on a social media app, Visual Intelligence can conduct an image search related to what you see while browsing. The tool performs the search using Google Search, ChatGPT, and similar apps. To access Visual Intelligence, open the Control Center or customize the Action button; the new on-screen search is triggered when you take a screenshot. The feature becomes available with iOS 26 when it launches later this year. Read more.

Apple integrated ChatGPT into Image Playground, its AI-powered image generation tool. With ChatGPT, the app can now generate images in new styles, such as 'anime,' 'oil painting,' and 'watercolor.' There will also be an option to send a prompt to ChatGPT to let it create additional images. Read more.

Apple's latest AI-driven workout coach is exactly what it sounds like — it uses a text-to-speech model to deliver encouragement while you exercise, mimicking a personal trainer's voice. When you begin a run, the AI within the Workout app gives you a motivational talk, highlighting key moments such as when you ran your fastest mile and your average heart rate. After you've completed the workout, the AI summarizes your average pace, heart rate, and whether you achieved any milestones. Read more.

Apple Intelligence is powering a new live translation feature for Messages, FaceTime, and phone calls. This technology automatically translates text or spoken words into the user's preferred language in real time. During FaceTime calls, users will see live captions, whereas for phone calls, Apple will translate the conversation aloud. Read more.

Apple has introduced two new AI-powered features for phone calls. The first, call screening, automatically answers calls from unknown numbers in the background, letting users hear the caller's name and the reason for the call before deciding whether to answer. The second, hold assist, automatically detects hold music when waiting for a call center agent. Users can choose to stay connected while on hold, freeing them to use their iPhone for other tasks, and a notification will alert them when a live agent becomes available. Read more.

Apple also introduced a new feature that allows users to create polls within the Messages app. This feature uses Apple Intelligence to suggest polls based on the context of your conversations. For instance, if people in a group chat are having trouble deciding where to eat, Apple Intelligence will recommend starting a poll to help land on a decision. Read more.

The Shortcuts app is becoming more useful with Apple Intelligence. The company explained that when building a shortcut, users will be able to select an AI model to enable features like AI summarization. Read more.

A minor update is coming to Spotlight, the on-device search feature for Mac. It will now incorporate Apple Intelligence to improve its contextual awareness, providing suggestions for actions users typically perform, tailored to their current tasks. Read more.

Apple is now allowing developers to access its AI models even when offline. The company introduced the Foundation Models framework, which lets developers build more AI capabilities into their third-party apps using Apple's existing systems (a rough usage sketch follows at the end of this article). This is likely intended to encourage more developers to create new AI features as Apple competes with other AI companies. Read more.

The most disappointing news to emerge from the event was that the much-anticipated developments for Siri aren't ready yet. Attendees were eager for a glimpse of the promised AI-powered features that were expected to debut. However, Craig Federighi, Apple's SVP of Software Engineering, said the company won't have more to share until next year. This delay may raise questions about Apple's strategy for the voice assistant in an increasingly competitive market. Read more.
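As a rough illustration of what the Foundation Models framework offers developers, here is a minimal Swift sketch based on the API Apple previewed at WWDC 2025. The type and method names (SystemLanguageModel, LanguageModelSession, respond(to:)) follow that preview, but exact signatures may differ in the shipping SDK, so treat this as a provisional sketch rather than final API.

```swift
import FoundationModels

// Minimal sketch of a "summarize" helper built on Apple's on-device model,
// of the kind a Shortcuts-style AI summarization action might use. Names
// follow Apple's WWDC 2025 preview; details may change before release.
func summarize(_ text: String) async throws -> String {
    // The on-device model is only available on Apple Intelligence devices.
    guard case .available = SystemLanguageModel.default.availability else {
        return text // fall back to the untouched text on unsupported devices
    }

    // A session carries the instructions for this task; no network call or
    // API key is involved, which is why the offline angle matters.
    let session = LanguageModelSession(
        instructions: "Summarize the user's text in two sentences."
    )
    let response = try await session.respond(to: text)
    return response.content
}
```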

Apple's Circle to Search competitor for iOS 26 looks awful

Android Authority – 11-06-2025

Earlier this week, during the WWDC keynote, Apple showed off its new iOS 26. For the first time since iOS 7 in 2013, Apple is revamping the operating system's look and feel, introducing a very Windows Aero-esque design language called 'Liquid Glass' (RIP Windows Vista), and since this was the flashy new thing at the keynote, it's been the week's hot topic. However, we also saw teasers of other new features that aren't getting the same level of attention.

Within the segment on iOS, for example, Billy Sorrentino showed off a new capability of Apple's AI-powered Visual Intelligence, which is called, pretty simply, Image Search. The way it works is that you take a screenshot of anything you see on your iPhone's screen. Once you have the screenshot, you can hit the Image Search button in the lower right. Using AI, Visual Intelligence will scan the screenshot and search for things it sees or create calendar events for dates and times revealed in the image.

If this sounds familiar, it's because Google's Circle to Search does the exact same thing and has been available for over a year now. However, I'm not bringing this up to do the usual 'LOL, Apple stealing from Android!' reaction. I'm bringing it up because, based on what we saw in the video, Image Search within iOS 26 seems uncharacteristically bad.

Visual Intelligence in iOS 26: Circle to Search, but bad

During the keynote (starts at 38:27 in the video embedded at the top), Sorrentino makes Image Search seem so easy and powerful. In his first demo, he pulls up a social media feed. There are multiple posts that are only text, and then one image. He takes a screenshot, initiates Image Search, and tells us, the audience, that he's interested in the jacket the model is wearing in the social media post.

Apple's own demo of this Circle to Search-esque feature was plagued with bad answers and a poor UI.

Image Search does its thing and pulls up a collection of images that share similarities with the social media post. Note that it doesn't search for the jacket. The software doesn't even know that Sorrentino is interested in the jacket because he never indicated that. All the software does is find images that look similar to the one in his screenshot, and Sorrentino acts like this is a marvel. Sir, I've been using TinEye to do that since 2008.

Also, note that Image Search ignored everything else going on in the screenshot. It didn't search for the emoji that appears in one of the posts, nor did it search for anything related to the numerous avatar images. Somehow it knew to only search through that one image, which seems like something that won't ever happen in real life.

In the next demo, Sorrentino finds an image of a room with a mushroom-shaped lamp. He initiates Image Search again, but this time tells the system to investigate the lamp specifically. He does this by scribbling over the lamp with his finger. Note that he doesn't circle the lamp, because that would be a dead giveaway of Apple's intention here, but whatever.

Once he scribbles on the lamp (not circles to search, of course), he sees another list of images. Notice anything weird, though? None of the lamps on the visible list are the one from the original photo! Even the first result, the one he chooses, is very clearly not the lamp he was looking for, but Sorrentino moves forward with adding it to his Etsy favorites as if this were a big success. My guy, that is not the lamp. The system failed, and you're pretending it succeeded.

You need to use your hands? That's like a baby's toy!

In Sorrentino's final demo, he uses Visual Intelligence to deduce what a photo depicts and ask a question about it. In the example, the photo is of a small stringed instrument. He captures the screenshot and types out a question to ChatGPT. He finds out that the photo is of a mandolin and that this instrument has been used in many popular rock songs.

The glaring thing here is that Sorrentino types out his question. That doesn't seem very convenient. With Circle to Search, I can just ask my question verbally. Even during the demo, it's awkward as we watch him thumb out the message about which rock songs use the instrument.

Ultimately, that's what was so alarming about this whole segment. This is a pre-recorded Apple keynote demo, so you know it will work better here than in real life. But even the demo shows that it is woefully behind Circle to Search in both form and function. I shudder to think how well it will work when it actually lands.

This whole demo was another example of Apple being woefully behind the curve when it comes to helpful implementations of AI tools. This is just another thing to throw on the pile when it comes to Apple dropping the AI ball. It was late to the game, and everything it's tried to do has either been a direct lift from Google, Android, or other Android OEMs, or relied on OpenAI to do the real work. Watching this Image Search demo was like watching an overconfident football player stumble through the big game and still try to act like they nailed it.

If nothing else, though, the segment proved a hundred times over that Circle to Search is one of Google's biggest successes in years. How many times has Google made something that Apple then tried to riff on and failed this hard?

Granted, I'll give Apple the benefit of the doubt for now. It's possible Image Search could be a lot better when it goes stable in September with the iPhone 17 series. But based on today's demo, its Circle to Search clone is a dud.

Apple's hidden AirPods Pro 3 code in iOS 26 Beta tells us when to expect it to arrive

Phone Arena – 11-06-2025

The next version of the AirPods Pro could feature new health features such as the ability to track the user's heart rate and temperature. The one question everyone wants answered is this: with Apple reportedly looking at adding cameras to its AirPods, could we see that in the AirPods Pro 3? Unfortunately, if the latter model is released this year, it will probably not include a camera. In fact, TF International's Hall of Fame Apple analyst Ming-Chi Kuo says not to expect major changes to AirPods until next year.

Steve Moser discovered code about the AirPods Pro 3 in the iOS 26 Beta 1 release. | Image credit: X

Kuo says that in 2026 Apple will start building its wireless earbuds with an infrared camera similar to what is used for Face ID. The camera could enable some kind of in-air gesture controls, allowing users to control the device through hand gestures. The camera could also allow AirPods to give users an AI experience similar to Visual Intelligence. This could allow users to hear more about a restaurant they are standing in front of, or hear about a landmark that they are looking at. In other words, the AirPods camera could help give a user contextual information through the earbud.

I really can't see Apple releasing an AirPods Pro 3 this year and coming back next year with an AirPods Pro 4 with a camera. The AirPods Pro 2 was released in 2022, so if we do get a sequel this year, a version with a camera might not show up until 2028. Perhaps that's not a bad thing for Apple. Even though a Visual Intelligence feature for AirPods Pro could be dynamite, the extra time could help the tech giant perfect it.

Considering the delays and underwhelming response to Apple Intelligence, Apple's best bet is to roll out the AirPods Pro 3 this year with no camera and come back in 2028 with the AirPods Pro 4 featuring a camera and Visual Intelligence. Unlike those guys Tim pays millions of dollars each year for advice like this, he's getting my plan for nothing.
