Google is enhancing Google Lens, its computer vision-powered app that surfaces information about the objects it identifies, with new features.
Starting today, Lens can surface skin conditions similar to what you might see on your own skin, such as moles and rashes. Uploading a photo through Lens kicks off a search for visual matches, which also works for other physical conditions you might not know how to describe in words (like a bump on the lip, a line on a nail, or hair loss).
It’s a step short of the AI-driven app Google launched in 2021 to diagnose skin, hair, and nail conditions. That app, which debuted first in the E.U., faced barriers to entry in the U.S., where it would have needed approval from the Food and Drug Administration. Still, this Lens feature could be helpful for those deciding whether to seek medical advice or prescription treatment.
In addition, as previously announced at I/O, Lens is integrating with Bard, Google’s AI-powered chatbot. Users will be able to include images in their Bard prompts, and Lens will work behind the scenes to help Bard understand what’s shown. For instance, if you show Bard an image of shoes and ask what they’re called, Bard, informed by Lens analysis, will provide an answer.
This is the latest update to Bard, Google’s answer to ChatGPT, as the company invests a growing amount of money in generative AI. This week, Google introduced the ability for Bard to write, run, and test its own code in the background, making it better at programming and at tackling complex math problems. In May, Google partnered with Adobe to bring AI-powered art generation to Bard.