Google today shared several updates to the Google Lens search feature that's available in the Google app for iOS. Google is adding support for asking questions about videos, so that users can get information about moving objects.
For example, if you're at an aquarium and want to learn more about a fish you spot, you can use Google Lens to capture a short video of it and get details. According to Google, Lens interprets both the video and the question, using that information to provide an AI Overview with additional web links for learning more.
To use the video feature, open Lens in the Google app and hold down the shutter button to record a video while asking a question out loud.
Google has also added support for asking a voice-based question when capturing a photo with Lens. Users can point the camera, hold down the shutter button, and ask a question.
For those who use Google Lens to find visually similar images for shopping, Google says results will be "dramatically more helpful," with information about the product you want to buy, including pricing across retailers and where to buy it.
In addition to these Google Lens changes, Google is rolling out search results pages organized by AI in the United States. AI-organized results will show first for recipes and meal inspiration on mobile devices, with the search results page grouping relevant results by recipe options, ingredients, and more.
A new look for AI Overviews is rolling out, with the updated design showing prominent links to supporting webpages within the text of an AI Overview. Google says that this layout better drives traffic to supporting websites. Google is also adding ads to AI Overviews, and the company says that people find ads in AI Overviews helpful because "they can quickly connect with relevant businesses, products and services."
The updated Google Lens features are rolling out to iOS devices starting today, as are the search changes.