Back in May, Google announced its Lens feature, which uses machine learning and image recognition to identify real-world objects and analyze complex scenes. According to users on Reddit, the feature now seems to be rolling out to Pixel and Pixel 2 smartphones.
Before the rollout, users with rooted Android devices were able to activate Google Lens inside the Assistant app. This, however, appears to be the official rollout, even though Google has not yet made any announcement about it. Pixel and Pixel 2 users running the latest updates and Assistant can check whether the feature is available to them: it appears in the bottom-right corner of the screen, as shown in the image above.
The capabilities of Google Lens are said to evolve over time through machine learning, but users can already get an idea of its power by pointing it at real-world objects such as maps, landmarks, or images to find out more information about them. According to Android Authority, the Assistant app can also provide more information about saved images or screenshots using Lens.
It should be noted that the feature is expected to remain exclusive to Pixel devices for at least some time. Users with other smartphones featuring Google Assistant will have to wait a bit longer to get in on the "fun".
Source: Reddit via Android Authority | Images via darkhawk75 on Reddit