Last year, I had to do an extensive online reputation management job for a client. While I was able to outrank most of the other websites already in search engine results pages (SERPs), one specific issue proved more difficult: two top spots were occupied by some very old (c. 2005–2006) newspaper articles.
My task was to outrank those results by growing the digital properties owned by my client. Unfortunately, even after employing a link-building strategy for those properties, I was unable to reach my goal.
So I decided to test a different strategy: Recalling the numerous theories about the influence of click-through rate (CTR) on search results, I hired a number of people on a “micro jobs network” and paid them to click on the digital properties.
In a matter of a few weeks, the digital properties overtook the newspaper articles.
The result was so interesting that I decided to run an experiment with my friend, Andrea Scarpetta, to test the hypothesis that click-through rate is a ranking signal.
We developed a software tool which could simulate a random sequence of clicks on a query, with these characteristics:
- Using a specific proxy service, we had access to thousands of IP addresses within the USA.
- We collected around 500 user-agent strings across desktop, mobile and tablet browsers, to be used at random in order to emulate a wide variety of browsers.
- The software opened a single session, made a query on Google, clicked on a specific URL, opened the page and stayed there for around 20 seconds.
- Each session had a unique IP in 95 percent of cases, and repeated IPs were never used consecutively.
- We made a random number of requests (between 250 and 700 per day) with a random number of concurrent requests (between 2 and 4).
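The scheduling logic described above can be sketched in Python. This is a hypothetical reconstruction, not the authors' actual tool: the `USER_AGENTS` list stands in for the ~500 real strings they collected, `ip_pool` stands in for the proxy service's US IP addresses, and the `plan_day` helper is an assumed name.

```python
import random

# Placeholder for the ~500 real user-agent strings the authors collected.
USER_AGENTS = [
    "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36",   # desktop
    "Mozilla/5.0 (iPhone; CPU iPhone OS 9_1 like Mac OS X) AppleWebKit/601.1",  # mobile
    "Mozilla/5.0 (iPad; CPU OS 9_1 like Mac OS X) AppleWebKit/601.1",           # tablet
]

def plan_day(ip_pool, unique_ratio=0.95):
    """Build one day's click schedule: 250-700 sessions, 2-4 concurrent,
    ~95% unique IPs, and no IP repeated on two consecutive sessions."""
    n_sessions = random.randint(250, 700)
    concurrency = random.randint(2, 4)
    sessions = []
    last_ip = None
    for _ in range(n_sessions):
        if sessions and random.random() > unique_ratio:
            # ~5% of sessions reuse an earlier IP, but never the one just used
            candidates = [s["ip"] for s in sessions if s["ip"] != last_ip]
            ip = random.choice(candidates) if candidates else ip_pool.pop()
        else:
            ip = ip_pool.pop()  # fresh IP from the proxy pool
        sessions.append({
            "ip": ip,
            "user_agent": random.choice(USER_AGENTS),
            "dwell_seconds": 20,  # time spent on the clicked page
        })
        last_ip = ip
    return {"concurrency": concurrency, "sessions": sessions}
```

Each planned session would then drive a real browser session (query Google, click the target URL, wait out the dwell time); that network-facing part is omitted here.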
We know that Google takes hundreds of factors into consideration in order to calculate the ranking of a single URL; therefore, we tried to exclude as many on-site and off-site elements as possible that could have influenced the test. After a long debate, we agreed to target a query with the following features:
- an obsolete query without any traffic (related to the 2002 Winter Olympic Games)
- which had a PDF within the top 10 results (therefore excluding the majority of the on-site ranking factors)
- which had close to zero incoming links
- which was part of a rather stable SERP
- which was between the 8th and 10th positions on the first page for the given query
- which was part of a SERP with few or no universal search results
In order to monitor the changes, we used two different methods:
- We tracked daily rankings using Proranktracker.com.
- We recorded the position of every single URL clicked in a file.
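The second method above amounts to logging, for every session, where the clicked URL actually appeared in the results the software saw. A minimal sketch of that bookkeeping (the CSV layout and the `rank_of` / `log_rank` helpers are our own illustration, not the original tool's code):

```python
import csv
from datetime import date

def rank_of(target_url, serp_urls):
    """Return the 1-based position of target_url in the fetched SERP, or None."""
    for pos, url in enumerate(serp_urls, start=1):
        if url == target_url:
            return pos
    return None

def log_rank(path, query, target_url, serp_urls):
    """Append the observed rank of the clicked URL to a CSV log file."""
    with open(path, "a", newline="") as f:
        csv.writer(f).writerow(
            [date.today().isoformat(), query, target_url, rank_of(target_url, serp_urls)]
        )
```

Averaging this per-session log over a day gives an independent rank series to compare against the third-party tracker.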
After a week of activity, the clicked URL improved its ranking from the 10th to the 3rd position and maintained an average rank between the 4th and 5th positions for the rest of the test.
We weren’t completely sure of the results shown by Pro Rank Tracker; the service is usually accurate enough, but we know that Google varies its results depending on the user’s location. In order to have a proper rank check, we also recorded the position of every URL clicked by the software.
We noticed an interesting trend: the average rankings were shifting back and forth far more than we had imagined!
Even after the experiment, we can’t definitively say that the click-through rate is a ranking factor. We agree with AJ Kohn’s view that it’s probably an “offset” which changes the results depending on specific interests shown by an audience. We can say, however, that there is a correlation between the clicks and the “visible ranking” of a query.
We can’t say whether this kind of “offset” is stable or degrades over time, but we can at least affirm that an interesting title and meta description influence the click-through rate, which makes them an indirect way to influence the “visible ranking.”
We intend to run further tests in the future to measure whether the “pogosticking” effect is real and influences a page’s rankings.