    Business

    Google’s Visual Search Can Now Answer Even More Complex Questions

By News Room | October 7, 2024 | 4 min read

    When Google Lens was introduced in 2017, the search feature accomplished a feat that not too long ago would have seemed like the stuff of science fiction: Point your phone’s camera at an object and Google Lens can identify it, show some context, maybe even let you buy it. It was a new way of searching, one that didn’t involve awkwardly typing out descriptions of things you were seeing in front of you.

    Lens also demonstrated how Google planned to use its machine learning and AI tools to ensure its search engine shows up on every possible surface. As Google increasingly uses its foundational generative AI models to generate summaries of information in response to text searches, Google Lens’ visual search has been evolving, too. And now the company says Lens, which powers around 20 billion searches per month, is going to support even more ways to search, including video and multimodal searches.

    Another tweak to Lens means even more context for shopping will show up in results. Shopping is, unsurprisingly, one of the key use cases for Lens; Amazon and Pinterest also have visual search tools designed to fuel more buying. Search for your friend’s sneakers in the old Google Lens, and you might have been shown a carousel of similar items. In the updated version of Lens, Google says it will show more direct links for purchasing, customer reviews, publisher reviews, and comparative shopping tools.

[Image: Courtesy of Google]

Lens search is now multimodal, a hot word in AI these days, which means people can search with a combination of video, images, and voice inputs. Instead of pointing their smartphone camera at an object, tapping the focus point on the screen, and waiting for the Lens app to drum up results, users can point the camera and ask a question aloud at the same time, for example, “What kind of clouds are those?” or “What brand of sneakers are those, and where can I buy them?”
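Lens itself isn’t exposed as a developer API, but the basic pattern it describes, a photo paired with a natural-language question in a single request, can be sketched against Google’s separate Gemini API. The model name, file name, and API key below are placeholder assumptions for illustration, not details Google has tied to Lens:

```python
# A minimal sketch of a multimodal "image plus question" query.
# Lens has no public developer API, so this uses the separate
# google-generativeai SDK (Gemini API) to illustrate the idea.
import google.generativeai as genai
from PIL import Image

genai.configure(api_key="YOUR_API_KEY")            # placeholder key
model = genai.GenerativeModel("gemini-1.5-flash")  # assumed model choice

photo = Image.open("sneakers.jpg")  # hypothetical snapshot from the camera
question = "What brand of sneakers are these, and where can I buy them?"

# The image and the question travel together as one multimodal prompt.
response = model.generate_content([photo, question])
print(response.text)
```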

Lens will also start working over real-time video capture, taking the tool a step beyond identifying objects in still images. If you have a broken record player or see a flashing light on a malfunctioning appliance at home, you could capture a quick video through Lens and see repair tips in a generative AI overview.
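The same caveat applies here: what follows is a rough sketch of the video version of that pattern using the Gemini API’s file-upload flow, with a hypothetical clip name standing in for whatever Lens actually captures, not Google’s own Lens pipeline:

```python
# A minimal sketch of question answering over a short video clip,
# again via the Gemini API rather than Lens itself. Videos are
# uploaded first and referenced once the file finishes processing.
import time
import google.generativeai as genai

genai.configure(api_key="YOUR_API_KEY")  # placeholder key

clip = genai.upload_file(path="record_player.mp4")  # hypothetical clip
while clip.state.name == "PROCESSING":
    time.sleep(2)                      # poll until the upload is ready
    clip = genai.get_file(clip.name)

model = genai.GenerativeModel("gemini-1.5-flash")  # assumed model choice
prompt = "The turntable in this clip spins but makes no sound. What should I check?"
response = model.generate_content([clip, prompt])
print(response.text)
```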

First announced at I/O, this feature is considered experimental and is available only to people who have opted into Google’s Search Labs, says Rajan Patel, an 18-year Googler and a cofounder of Lens. The other new Lens features, voice mode and expanded shopping, are rolling out more broadly.

The “video understanding” feature, as Google calls it, is intriguing for a few reasons. While it currently works only with video captured in real time, if or when Google expands it to previously captured videos, entire repositories of footage, whether in a person’s own camera roll or in a gargantuan database like Google’s, could potentially become taggable and overwhelmingly shoppable.

[Image: Courtesy of Google]

    The second consideration is that this Lens feature shares some characteristics with Google’s Project Astra, which is expected to be available later this year. Astra, like Lens, uses multimodal inputs to interpret the world around you through your phone. As part of an Astra demo this spring, the company showed off a pair of prototype smart glasses.

Separately, Meta just made a splash with its long-term vision for our augmented reality future, which involves mere mortals wearing dorky glasses that can smartly interpret the world around them and show them holographic interfaces. Google, of course, already tried to realize this future with Google Glass (which used fundamentally different technology than Meta’s latest pitch). Are Lens’ new features, coupled with Astra, a natural segue to a new kind of smart glasses?
