A very useful feature is coming back to Android

An in-depth analysis of one of the latest versions of the Google Assistant app has revealed that Google is working on the possible return of a very useful feature

There was "something" before Google Assistant. The intelligent digital assistant from Mountain View has slowly worked its way into the daily lives of many, thanks to steadily growing functionality, an ever-improving ability to respond to voice input, and increasingly deep integration with smart home devices.

But, indeed, before Google Assistant there was something else. On Android smartphones of that era, a long press on the home button called up a feature that, even if it never became part of everyone's habits, was undeniably convenient: Google Now on Tap, a software trick through which Google's artificial intelligence analyzed whatever was on the screen in order to show results closely tied to that context. The name of a monument would prompt Now on Tap to offer the relevant Wikipedia page; an address would lead it to suggest Google Maps. Now on Tap proved useful to many, which is why Google tried to keep it alive even after Assistant debuted, renaming it What's on the screen.

The return of What's on the screen

What's on the screen has led a far more uncertain existence than Now on Tap. Google introduced it but never integrated it into Assistant deeply enough to make it a permanent feature, central to the assistant's day-to-day use. What's on the screen turned out to be an elusive feature.

According to findings from XDA, however, which has thoroughly analyzed one of the latest versions of the Assistant app, Google is testing the possibility of including What's on the screen in the new My Actions section that is expected to debut within the app in the coming weeks or months.

For the moment the new feature doesn't actually work, but it should show proactive results based on what is displayed on the screen, just as Now on Tap did first and What's on the screen did later. Needless to say, a full-fledged return would please the users left orphaned by such an assistive feature, one that put to work all the knowledge Google has accumulated through years of investment in artificial intelligence.

We are, however, still at a preliminary stage, and it would not be surprising if Google scrapped the whole thing, as it has done on several occasions over the years.

Google studies, tests, evaluates, and only then decides: unlike other companies, which tend to ship whatever they have invested time and resources into developing, Google may well leave the work unfinished if the result doesn't live up to its initial expectations.