
Latest news with #Pixel

Gemini on Android can now identify songs — but there's a catch

Tom's Guide

4 hours ago

  • Entertainment
  • Tom's Guide

Gemini on Android can now identify songs — but there's a catch

Just days after launching 'Search Live,' Google continues to evolve Gemini into a capable AI assistant on Android, and the latest update brings back a long-missing fan favorite: song identification. Ask Gemini, 'What song is this?' and the chatbot triggers Google's Song Search interface, the same listening tool familiar to Assistant users. It can recognize music playing in your environment or from a playlist, and it even works if you hum the tune yourself. I'm curious to test just how accurate the humming needs to be.

Song Search isn't fully native to Gemini yet, though. When you type (or speak) 'What song is this?', Gemini hands off to a full-screen listening interface in the Google app rather than listening from within Gemini itself, which feels clunky compared to how seamlessly Pixel's Now Playing handles the same job. The feature also doesn't stay inside Gemini Live's conversational experience, which feels like a miss. And once a match is found, the result appears in Google Search, not as an inline Gemini response.

For now, this feature works only on Android; there's no Gemini song identification on iOS yet. You'll also need to manually restart the full-screen Song Search interface if you want to identify multiple songs in a row.

By comparison, Google Assistant's built-in Now Playing feature is still more seamless. You can trigger it with a voice command or a lock screen shortcut, it shows results inline (often with album art), and it even works offline. Gemini's new Song Search, in contrast, relies on the Google app's listening interface and requires switching apps. It's accurate, pulling from the same song database as Assistant, but the experience is naturally less smooth, since it doesn't stay inside Gemini itself. Another key difference: while Now Playing comes pre-installed on Pixel phones, Gemini's Song Search is Android-only for now and not yet available on iOS.

So yes, Gemini can now identify songs, but the experience isn't yet as smooth as Google Assistant's or Pixel's Now Playing feature. Still, if you're using Gemini daily, this brings back a key Assistant-era capability, which is a good sign that Google is listening to user feedback as Gemini evolves into a more complete voice assistant.

Tired Of Overedited iPhone Photos? Adobe Launches Free Camera App For iPhone—Built By Pixel Camera Creators

India.com

7 hours ago

  • India.com

Tired Of Overedited iPhone Photos? Adobe Launches Free Camera App For iPhone—Built By Pixel Camera Creators

New Delhi: Have you ever felt like your iPhone photos look a bit too bright or overly edited? Adobe has launched a new iPhone-only camera app called Project Indigo, built by the same team behind Google's Pixel camera. Unlike typical smartphone camera apps, it offers more manual control and aims to deliver a DSLR-style photo experience. It's free to download on the App Store for now.

More Natural, True-to-Life Photos

Adobe says Indigo aims to deliver more natural, true-to-life images, closer to what you'd get from a DSLR. There's less smoothing, less over-sharpening, and the colour adjustments are subtle, avoiding that overly edited 'HDR' look common in regular phone cameras.

Full Manual Camera Controls

Indigo gives you full manual control over camera settings like focus, shutter speed, ISO, and white balance. You can choose to shoot in JPEG or RAW (DNG), and even decide how many frames the app captures for each shot. Why does that matter? Because Indigo blends up to 32 images into one, helping reduce noise and keep details sharp.

Night Mode & Long Exposure for Creative Shots

Indigo also includes a Night mode that suggests longer exposures in low light, helping you get clearer shots in the dark. There's even a Long Exposure setting to create smooth motion-blur effects, perfect for capturing waterfalls, flowing traffic, or glowing city lights.

Clearer Zoomed-In Shots

Adobe says zoomed-in photos will look much clearer and less blurry with Indigo. Instead of guessing what the image should look like using AI, the app uses a technique called multi-frame super-resolution: when you zoom, it quickly snaps several shots and blends them together to give you a sharper, more detailed photo.

Adobe is also working on a live preview feature, which will let you see how your edited photo will look right in the viewfinder, before you even press the shutter. That could totally change the way people frame and shoot photos on their phones.
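The multi-frame blending described above rests on a simple statistical fact: averaging N independently noisy captures of the same scene shrinks random noise by roughly the square root of N. Here is a minimal NumPy sketch with synthetic Gaussian noise standing in for sensor noise; the numbers are illustrative and not Adobe's actual pipeline:

```python
import numpy as np

rng = np.random.default_rng(0)
scene = np.full((64, 64), 0.5)                            # flat grey "scene"
frames = scene + rng.normal(0.0, 0.1, size=(32, 64, 64))  # 32 noisy captures

single_noise = np.std(frames[0] - scene)  # noise in one frame, about 0.1
merged = frames.mean(axis=0)              # blend all 32 frames into one
merged_noise = np.std(merged - scene)     # roughly 0.1 / sqrt(32)
```

With 32 frames the noise drops by a factor of roughly 5.7, which is why the app lets you choose how many frames to capture: more frames, cleaner shadows, but a longer wait after the shutter.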

Adobe launches free camera app for iPhone users, it is made by same team that made Google Pixel camera

India Today

11 hours ago

  • India Today

Adobe launches free camera app for iPhone users, it is made by same team that made Google Pixel camera

If you've ever felt your iPhone photos looked a bit too bright, too smooth, or just too 'smartphone-y,' Adobe may have just created your new favourite camera app. Project Indigo, now available as a free download on the App Store, is a new camera app from Adobe Labs, built by the same team that helped create the iconic Pixel camera at Google. This time, the goal is different: give iPhone users more manual control and a more realistic, DSLR-style photo experience. For now, Indigo is free to try and available only on iPhone.

Here's what iPhone users need to know: smartphone cameras today heavily process your photos. They brighten the shadows, smooth your skin, sharpen edges, and boost colours to make things pop on a small screen. While this can make pictures look good at a glance, they often feel artificial, especially when viewed on a bigger display. Adobe says Indigo is designed to produce a more natural, true-to-life image, closer to what you'd get from a DSLR. It applies less smoothing and sharpening, and its colour enhancements are subtle. The app avoids the common 'HDR-ish' or overly edited style that's typical of most default camera apps.

Indigo offers full manual camera controls, including focus, shutter speed, ISO, and white balance. You can shoot in JPEG or raw (DNG), and even control how many frames are captured for each photo. This matters because Indigo uses computational photography to combine up to 32 images to reduce noise and preserve detail. There's also a Night mode that automatically suggests longer exposures in dark scenes, and even a Long Exposure setting to capture dreamy motion blur, perfect for waterfalls or city lights.

Adobe also promises that with the Indigo app, your zoomed-in pictures won't be blurry or noisy anymore. According to the Project Indigo blog post, when you pinch to zoom in the app, it uses a feature called multi-frame super-resolution that quietly captures several photos and blends them for sharper results. No AI guessing, just smarter shooting.

And, because Indigo is by Adobe, it also integrates seamlessly with Lightroom Mobile. When you review photos in Indigo's gallery, you can launch Lightroom with a single tap to start editing right away, whether it is a JPEG or a raw DNG file. If you're already using Adobe's editing tools, this makes your workflow smoother than ever.

Adobe says it is also working on a live preview system, where you will be able to see the final edited look of your photo right in the viewfinder before you take the shot. This could dramatically change how people compose photos on their phones.
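The principle behind multi-frame super-resolution can be shown in an idealized one-dimensional sketch. If successive frames sample the scene at slightly different sub-pixel offsets (here the offsets are known and there is no noise, both simplifying assumptions; in a real camera, hand shake supplies the offsets and they must be estimated), interleaving the frames recovers detail that no single low-resolution frame contains:

```python
import numpy as np

# A "true" high-resolution signal the camera never captures directly
hr = np.sin(np.linspace(0.0, 4.0 * np.pi, 64))

# Four low-resolution frames, each sampling at a different sub-pixel offset
frames = [hr[k::4] for k in range(4)]

# Shift-and-add: place each frame's samples back onto the fine grid
recon = np.empty_like(hr)
for k, frame in enumerate(frames):
    recon[k::4] = frame

# Each low-res frame holds 16 samples; the reconstruction has all 64
```

Real pipelines add sub-pixel alignment, weighting, and noise handling on top of this, but the core idea is the same: several cheap shots carry more information than one.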

A philosophical war over the iPhone camera app

Hindustan Times

17 hours ago

  • Hindustan Times

A philosophical war over the iPhone camera app

It is hardly surprising that Adobe's latest release, Project Indigo, a free experimental camera app for the Apple iPhone (an Android version is coming soon), brings mobile photography back into the conversation. Even more intriguing is how it retrains focus on two different approaches to often similar results: one that intersects computational photography with a camera app, and another that takes a 'zero processing' approach to delivering the photos a user captures on their iPhone.

Project Indigo has been put together by former Pixel camera engineers and combines computational photography with a layer of AI features. That is likely a significant moment in an increasingly competitive third-party camera app ecosystem. The app emerges from an impressive pedigree, having been developed by Marc Levoy and Florian Kainz, who were instrumental in establishing the Pixel phones as the benchmark smartphone cameras for many years (many consider that to be the case even now). It wasn't all plain sailing as competition caught up, but the Pixel phones made a smart pivot towards computational photography when the time was right. With Project Indigo, Levoy and Kainz have access to the iPhone's photography hardware. I've used it to a certain extent, and all I'll say for now is that it is not simply a reimagined version of the Pixel Camera app. It goes well beyond what the default Camera app can do. But here's the thing: not all the time. As a user, you have a choice, though for now perhaps not an undeniably definitive one.

Project Indigo has a unique computational photography pipeline. 'First, we under-expose more strongly than most cameras. Second, we capture, align, and combine more frames when producing each photo — up to 32 frames as in the example above. This means that our photos have fewer blown-out highlights and less noise in the shadows. Taking a photo with our app may require slightly more patience after pressing the shutter button than you're used to, but after a few seconds you'll be rewarded with a better picture,' Adobe says. This is where the big change lies: an aggressive multi-frame approach that is more computationally intensive than many competitor apps, with the insistence that image quality is the priority (requiring a dash of patience). This should work for casual users as well as the more enthusiastic demographic (I wouldn't call them professional; that side of the table has its own preferences), with the option of enabling the full array of manual controls, as well as both JPEG and raw formats.

Strength in diversity? The third-party camera app landscape as it stands reveals a fascinating philosophical divide between approaches to smartphone photography. Halide Mark II, Camera+ 2 and VSCO are some prime names, and Final Cut Camera and Leica Lux some very likeable ones too. The idea behind third-party camera apps has always been to offer a little more control and perhaps unlock functionality that the default camera app doesn't have. That's before we get to the main bit: image processing, and the differing approaches to it. At one end of the spectrum lies the 'zero processing' movement; Halide's Process Zero is an example. This means no AI input and no computational photography pipeline in image processing. There are two distinct schools of thought here: one believes shunning AI is a better bet to produce beautiful, film-like natural photos, while the other believes AI does enough to accentuate detail that may otherwise have been missed. It is a philosophical tension. VSCO, for instance, puts forward a proposition of blending the camera app with extensive editing capabilities as well as quick access to social media apps. Halide Mark II positions itself with professional-grade manual controls, and a tech called Neural Macro that allows iPhones without a dedicated macro lens to get photos with that effect. Camera+ 2 uses AI extensively for scene detection and automatic optimisation, while still providing full manual control when needed. I'd say Project Indigo embraces a bit of the latter, but with certain diversions towards improvement, as Adobe has explained.

The fundamental disagreement about image processing is perhaps why we have differing approaches, and thereby preference-based choice for users. A user perhaps has to ask themselves which side they lean on. Is the intent to capture reality as accurately as possible, or to create the most visually appealing image regardless of the computational gymnastics required? There will not be a one-size-fits-all answer.

Project Indigo's entry into this ecosystem represents more than just another camera app: it signals Adobe's serious interest in mobile photography and computational imaging. Of course they pitch for closer integration with their creative apps, including the Lightroom app for smartphones. I do see Adobe holding the biggest trump card: the mix of their own approach to research, the in-house AI development that Firefly resoundingly testifies to, and the expertise of former Pixel engineers who know what they're doing. We seem to be at a point where philosophy will provide a foundation for more sophistication.

Vishal Mathur is the Technology Editor at HT. Tech Tonic is a weekly column that looks at the impact of personal technology on the way we live, and vice-versa. The views expressed are personal.
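The first trick in Adobe's quoted pipeline, under-exposing more strongly than most cameras, is easy to illustrate: a sensor clips any brightness above its saturation point, so halving the exposure keeps highlights below clipping, and the lost brightness can be restored digitally once frame merging has cleaned up the noisier shadows. A toy NumPy sketch with synthetic brightness values and one stop of under-exposure assumed (not Adobe's actual numbers):

```python
import numpy as np

scene = np.linspace(0.0, 2.0, 9)        # true brightness; values above 1.0 are highlights

normal = np.clip(scene, 0.0, 1.0)       # normal exposure: highlights clip at the sensor
under = np.clip(scene * 0.5, 0.0, 1.0)  # one stop under: nothing reaches the clip point
recovered = under * 2.0                 # brighten digitally after merging frames

# recovered matches the scene exactly; normal exposure lost the top end
```

The cost of under-exposing is noisier shadows once brightened, which is exactly what the 32-frame merge is there to pay for.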

Google's Pixel phones may soon borrow a trick from Samsung's Now Bar

Phone Arena

a day ago

  • Phone Arena

Google's Pixel phones may soon borrow a trick from Samsung's Now Bar

[Image: Samsung Galaxy S25 Ultra with the Now Bar. Image credit: PhoneArena]

Google might be working on its own version of Samsung's Now Bar, according to new details spotted in the latest Android 16 beta. The feature, called "Gemini Space," could be a major step forward for Google's At a Glance widget, offering more real-time information on the lock screen.

Traces of this new experience, uncovered through a deep dive of the underlying code, first appeared in last month's Android 16 QPR1 Beta 1 release. A new system configuration file named "Ambient Data" was added, which appears to be the internal codename for Gemini Space. That file was found in firmware for both the Pixel 9 Pro and Pixel 8 Pro, suggesting the feature won't be limited to the upcoming Pixel 10.

Other clues point to an "Ambience Hub," though it's not clear how it will work. The name suggests a new interface for showing useful data on the lock screen or always-on display, which would align with other Google features that use 'ambient' in their name, such as ambient display or ambient AOD.

More interestingly, the Android System Intelligence app, which powers the current At a Glance widget, now includes hints about sports scores and finance updates. A toggle for finance recaps was even found in the At a Glance settings, further supporting the idea that these updates could appear as part of Gemini Space.

[Image: One UI's Now Bar and Now Brief, which launched with the Galaxy S25 series. Image credit: PhoneArena]

All signs point to Google preparing a rebranded and upgraded version of At a Glance. If true, Gemini Space would serve a similar purpose to Samsung's Now Bar and Now Brief: the former a live info chip on the lock screen, the latter a full-page summary of your day. While some may say that At a Glance already covers this, Pixel phones don't have anything quite like the Now Brief: a dedicated, rich feed of contextual updates accessible right from the lock screen. That could change if Gemini Space and the rumored Ambience Hub roll out as expected. Whether this will be a Pixel 10 exclusive or come to older models remains to be seen, but the feature looks like a natural evolution of At a Glance, and perhaps a strategic way to keep Gemini AI front and center.
