Android 12 may get deeper Google Lens integration for quick onscreen translation
A new Google Lens feature has been spotted that will let users translate everything on their screen into a language of their choice. It is expected to debut in the Android 12 betas due in May, with deep integration into the Android ecosystem.
- Google Lens was recently spotted with the translate option in the Android 12 Developer Preview 2.
- It will enable users to translate everything on the screen to the language of their choice.
- The feature is expected to make its way to the beta versions of Android 12 due in May.
Google is always looking to deepen the integration between its various services, and an update to its image-recognition technology is the latest example. The reason for this becomes increasingly clear as Android 12 takes shape.
A new Google Lens feature has been spotted in the recent apps/multitasking view of Android 12 Developer Preview 2. It takes the form of a new button that appears over other apps and lets users translate whatever they see on the screen.
The button was recently spotted by Android Police on a Google Pixel 5. According to its report, the button appears whenever Google Lens detects a language other than the one the user has set as the device default.
Once triggered, Google Lens auto-detects the language on screen and translates it.
Not much is known about the feature yet. Since it is still in development, Google may have rolled it out in the preview build to test its capabilities. For now, the seemingly handy feature doesn’t work on screenshots, but it can operate offline.
For now, the new Google Lens translate button appears near the bottom of the display, above the screenshot and select options. It looks set to be the next big Google Lens feature, one that has long been in the making.
The Android Police report notes that the feature was spotted on a Pixel 5 running the latest Developer Preview 2.1, but that it doesn’t appear to be tied to a particular version and could be a server-side update or a limited A/B test.
Google is expected to release the Android 12 betas in May, and more should become clear about how the new Google Lens capability works once the update rolls out.
The feature’s mere presence indicates that Google has deepened Google Lens’s integration with the rest of its mobile ecosystem. As the underlying AI learns and improves, it is bound to assist users in ways that weren’t thought possible before.