As Google’s crawling technology becomes more sophisticated and pervasive with the release of Android Marshmallow, search behavior is expanding with new input methods that are more contextual and immediate than voice or text-based search.
This article is the third in a series about Android Marshmallow-related changes to the mobile search results that will impact mobile SEO strategies. The first article in this series focused on the Private Index, and the second article focused on Google Now Cards with Merchant data.
This third and final article focuses on another subtle change that Google has added to the mix: single-gesture search behavior. This is the ability to long-tap on a word or phrase from any page in Google Chrome to execute a new search. This single-gesture search functionality is currently only available on Android devices, but it fits well with the goals of Google’s new Android OS.
Since Google hasn’t officially announced or documented this change to Android Chrome, we are calling it “Click to Search.” Click to Search is only available on Android Chrome, unless you have Android Marshmallow, in which case you can also use a version of Click to Search from some stock OS apps like Mail (though not yet YouTube, Google+, Messenger, News, Play Store or Calendar).
With “Click to Search,” Google’s goal seems to be to enable “drill-down” style search activity without requiring searchers to go back to the search result page, or even find the address bar, to submit their new query.
This may seem like a minor improvement in user experience, but remember, it saves searchers from having to endure extra page loads on mobile devices, and more importantly, it eliminates the need for typing to submit a search. Both are very important aspects of the mobile search user experience.
How Does Click To Search Work?
The ability to initiate a search from a tap on a word is enabled by screen crawling, described in the first article in this series. From Google’s perspective, once the screen is crawled, why not make everything on it a potential search query? This is how Google is tapping into contextual relevance for chained queries and Now on Tap searches — based primarily on what is on your screen when you initiate a search.
This is great for mobile devices, but also for anything with a limited keyboard, or no keyboard at all, such as smartwatches and other wearables. On such devices, searchers may submit their first query through Google Now with a typed or voice command, then use “Click to Search” as they navigate the web on their Android Wear watch interface to find more detailed information about their original query.
This new method of initiating a search may also encourage people to search more from Android Wear devices and make them more willing to leave the site or app screen they are on: rather than simply investigating within one website, they can search again to find details on a topic that appears within other app or web content.
Barring telepathy, it actually seems like the most intuitive search experience possible on a smartwatch. For those of us who already have Android Marshmallow and Now on Tap, this single-gesture search behavior will seem remarkably similar to the long-press on the Home button that initiates a Now on Tap search.
This alternative single-gesture feature initiates an immediate screen crawl and works from any screen on your phone. It can surface more information about whatever text it reads on the screen — not just a single word. It can pull results from the public and Private Index, but it will also show website and app icons at the bottom of the result, as in the Facebook screen-crawling example.
Why Is Click To Search Important For SEO?
From an SEO perspective, the search results for the word you are highlighting do appear to be the same as the search results you would get if you started at Google.com and typed in the query there — the results do not seem to be chained or contextual yet. The difference between a regular Google search result and a “Click to Search” result is currently all in the display.
Only the first item in the “Click to Search” result is shown in a preview box, without an additional click. Often the first result will be some kind of curated Google result, such as a Featured Rich Snippet (Answer Box), Knowledge Graph result, Location Card, definition or something similar, so ranking a website in this “Click to Search” experience will be hard.
Curated Google results are not 100 percent guaranteed, though; sometimes the first result is a Wikipedia entry or an actual website. Images, YouTube videos and basically anything that can show up in a regular mobile search (including sponsored listings and PPC results) can become the top “Click to Search” result.
For SEOs, it will be important to watch how these types of search options change mobile user behavior, since the potential for an “exit” can now lead directly to websites, apps or search results from this new interface.
The “superhero” example Click to Search query below shows a search that is initiated from a Target.com shopping page. That search could just as easily have been on a highly strategic e-commerce keyword like “men’s graphic tee” or “men’s t-shirt,” for which a competing website or Google Shopping result ranks first.
This new type of query input brings up a lot of questions for SEOs. Will these new interactivity options decrease page views and time on site, or will users ignore this new search option until they are forced to embrace it on smaller devices like smartwatches? Similarly, will users embrace the long-hold on the home button to access Now on Tap, or will it be ignored?
Both certainly put more importance on ranking #1 for a query, but Google may just fill the results with Featured Rich Snippets, Google Play recommendations and PPC ads, so then what will we do? We are looking forward to the evolution of Android Marshmallow/Google Now on Tap so that we can really see how this new search option plays out.
The next thing to consider is when Google will begin allowing this type of search behavior on images — and leveraging the image relationships and machine learning discussed in the second article in this series. Perhaps soon, users will be able to click on an image in your site or your app to find similar products in other apps or websites.
These image query results might also be prioritized based on Private Index information, because the user has viewed it before, or they could be ranking because many users have it stored in their Private Index. “Click to Search” on images would add a much deeper layer of complexity to a variety of different types of searches and could really change some aspects of mobile SEO dramatically.
It might also represent a significant privacy concern for users. (Would strangers be able to submit a picture of me that they have taken on the street and learn more about me from image-recognition search? Hopefully not!) For e-commerce, however, product matching by image recognition does seem like a logical next step.
Another unknown is how we will report on this new type of search behavior; marketers have no obvious way to report on or attribute searches like this, and Google’s analytics and reporting platforms will need to adapt to the changes these features bring about. The potential traffic loss could be significant, especially for sites that provide a poor mobile user experience or are missing pertinent information.
It would be great if Google Analytics could report on the “exit keywords” that people click on while they are on your site, but don’t hold your breath. Because the Click to Search action is an on-site user behavior rather than part of Google’s private referrer data, it could in principle be reported; as useful as that would be, though, keyword-level reporting seems particularly unlikely.
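Google doesn’t expose this data, but site owners could approximate it themselves by listening for text selections on their own pages and logging the selected phrase as a candidate “exit keyword.” A minimal sketch follows; note that the `/analytics/exit-keyword` endpoint is hypothetical, and a text selection is only a rough proxy for an actual Click to Search long-tap, since we cannot observe the gesture itself:

```javascript
// Sketch: approximate "exit keyword" logging with on-page selection events.
// The analytics endpoint below is hypothetical; substitute your own tracker.

// Normalize a raw text selection into a candidate search phrase:
// trim, collapse whitespace, lower-case, and cap the length.
function normalizeSelection(text) {
  return text.trim().replace(/\s+/g, ' ').toLowerCase().slice(0, 80);
}

// Wire it up in the browser (guarded so the sketch also loads under Node).
if (typeof document !== 'undefined') {
  document.addEventListener('selectionchange', function () {
    var phrase = normalizeSelection(String(document.getSelection()));
    if (phrase.length > 1) {
      // Fire-and-forget logging that survives page unload.
      navigator.sendBeacon('/analytics/exit-keyword', phrase);
    }
  });
}
```

A real implementation would debounce the `selectionchange` event and batch the beacons, since the event fires repeatedly while a user drags a selection handle.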
Mobile search seems to have been a catalyst for change at Google, and they are trying hard to create browsers and operating systems that minimize the need for manual input of search queries in a variety of different ways.
The image recognition, machine learning, Click to Search and Private Index capabilities that are being added to Google Now, Now on Tap, Android Marshmallow and Chrome blur the lines between app and web, but they also blur the lines between private and public, sponsored and organic, and content and query.
All of this seems to make for a better, more intuitive experience for the user, but a more complicated and entangled effort for marketers and SEOs. We are all eager to see what Google will add to the mix next, but we hope that more documentation, tracking and focus on cross-account UX will be forthcoming, as we are all just as eager as Google is to embrace our mobile audience.