Hi everyone. Dan Petrovic from Dejan Marketing here. I have got an interesting case of CTR optimization today, and I am going to take you through the steps.
After part one in this series, a lot of people asked what tool I used: it is Algoroo (algoroo.com). That is also the free tool we used for the CTR analysis in this case. The website we are looking at today is collectiveray.com. We will be analyzing its CTR, finding anomalies, and looking for ideas for CTR experiments.
After processing the data in Algoroo, we found the site-specific CTR averages for collectiveray.com for non-branded queries only. The reason we used non-branded queries only is that branded queries have abnormally high CTRs, and we didn't want them inflating our site averages. Why is it useful to know the site averages? Well, that is the only way to detect anomalies. When analyzing click-through rates, it is good to know when something underperforms or overperforms relative to expectations.
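For readers who want to reproduce this step themselves, the filtering and averaging can be sketched in Python. The rows, brand marker, and deviation threshold below are all illustrative assumptions, not the actual case data:

```python
# Illustrative Search Console rows: (query, impressions, clicks).
rows = [
    ("web design blog", 12000, 240),
    ("collectiveray tutorials", 900, 400),      # branded query
    ("psd to wordpress conversion", 8000, 80),
    ("hire app developer", 15000, 150),
]

BRANDED_TERMS = ("collectiveray",)  # assumed brand markers

# Drop branded queries so their abnormally high CTRs do not
# inflate the site average.
non_branded = [r for r in rows if not any(b in r[0] for b in BRANDED_TERMS)]

site_ctr = sum(c for _, _, c in non_branded) / sum(i for _, i, _ in non_branded)

# Flag queries whose CTR deviates notably from the site average
# (0.4 is an arbitrary illustrative threshold).
flagged = [q for q, i, c in non_branded if abs(c / i - site_ctr) > 0.4 * site_ctr]

print(f"site average CTR: {site_ctr:.2%}")
print("anomalies:", flagged)
```

The same idea scales to a full Search Console export; the threshold is where your judgment comes in.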
The first exercise is to find whether anything deviates negatively or positively. Here is our data sample:
We have a high confidence level, since a large number of keywords went into the statistics, and here is what we found:
The CTR-based traffic loss in this case is over 5,000 clicks. We expected over 7,500 clicks (close to 8,000, in fact) and got just over 2,500. A significant number of clicks was missing from the SERPs for some reason.
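As a back-of-the-envelope check, the loss figure is just expected clicks (impressions times a baseline CTR) minus actual clicks, summed per query. A minimal sketch with made-up numbers rather than the real case data:

```python
# Per-query expected clicks at an assumed site-average CTR,
# compared with actual clicks (all figures invented).
SITE_AVG_CTR = 0.05  # assumed non-branded site average

queries = {
    "font squirrel": {"impressions": 40000, "clicks": 600},
    "psd to wordpress conversion": {"impressions": 25000, "clicks": 500},
    "hire app developer": {"impressions": 30000, "clicks": 700},
}

total_loss = 0
for name, q in queries.items():
    expected = q["impressions"] * SITE_AVG_CTR
    loss = expected - q["clicks"]
    if loss > 0:  # only count queries performing below expectation
        total_loss += loss
        print(f"{name}: expected {expected:.0f}, got {q['clicks']}, missing {loss:.0f}")

print(f"total CTR-based traffic loss: {total_loss:.0f} clicks")
```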
Checking the Anomalies
What we found is that close to 1,300 queries were responsible for the loss of nearly 7% of the total organic non-branded traffic, and around 80% of that loss was caused by just 774 queries. So the idea is to have a look at those and discover whether anything stands out as an opportunity. Needless to say, we used a ton of keywords, close to 20,000, to run this analysis.
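The "80% of the loss from 774 queries" observation is a standard Pareto cut: sort queries by loss and accumulate until you reach 80% of the total. A sketch with hypothetical per-query losses:

```python
# Hypothetical per-query click losses (not the real case data).
losses = {"q1": 500, "q2": 300, "q3": 150, "q4": 40, "q5": 10}

total = sum(losses.values())
running, offenders = 0, []
# Walk queries from the biggest loss down, stopping at 80% of the total.
for query, loss in sorted(losses.items(), key=lambda kv: kv[1], reverse=True):
    offenders.append(query)
    running += loss
    if running >= 0.8 * total:
        break

print("queries covering 80% of the loss:", offenders)
```

Run against the full query set, this is how you shrink 1,300 offenders down to the short list worth inspecting by hand.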
Here are some of our offenders:
These are the queries that lost traffic by not being as attractive to click on as the rest of the website. The top one is "font squirrel", followed by "psd to wordpress conversion", "web design blog", "bloom email", "avada", "hire app developer", and so forth.
I went through a few of them manually and investigated what was happening.
With "font squirrel" query, I went a little bit deeper historically, and I found an unusual peak that happened at one point, which is probably worth investigating. Why did CTR suddenly jump for this query at that period of time?
Previously we looked at the query CTR, and this is the data for the landing page:
It seems to fluctuate on a regular basis; nothing unusual there. When I look at the actual SERPs, I see that there is an official site that most people seem to be looking for.
In this case, I would say that "font squirrel" is a false positive, and there is not much you can do there other than what we already did with the SERP snippet.
While analyzing "psd to wordpress", I found an interesting drop in a CTR that recovered, so there is definitely an area of investigation for this particular website.
I noticed rank changes as well, along with a somewhat counter-intuitive deviation in impressions at the same time. Unfortunately, when I looked at the SERPs, I thought, well, there is not much I can do about this, because there is a wall of ads for this query.
Following the ads are special search features, such as accordions and videos, which suit people looking for tutorial material.
Moving onto "web design blog", there is a sudden drop in CTR, definitely worth investigating.
One thing I did discover with this one is an interesting little SERP feature that sits at the top, along with a lot of ads:
And I thought: "well, if I was to focus my time experimenting, I probably wouldn't touch this one because I'm not really confident in being able to improve CTR in such a scenario."
One case that did stand out becomes visible if we invert the whole thing and look at the positives. Why did "hire app developer", whose click-through rate is otherwise poor, suddenly surge? What happened here, and what can we learn from it?
I investigated the SERPs and found the same wall of ads, a full pack of them getting in the way, which I think is part of the reason this search snippet is not being clicked on as much as we would expect.
But I think I have an idea for an experiment in this case.
Finding an Opportunity
This is the actual snippet: "Top 5 Places to Hire Freelance iOS/Android or App Developers":
I looked at this a little and considered the user intent. Part of the problem could be that users on Google can simply click an accordion element in the SERP, get their answer, and leave. But I wasn't entirely convinced, because when I landed on the page itself, I had to scroll several screens to reach the actual answer the page promises.
The page promises the top five places to find the best app developer. The actual answer is buried very deep down on the page.
CTR Experiment Idea
Here is my actual CTR experiment idea: tie the promise more directly to the answer. The page promises the top five places to find the best app developer, and the answer, at the end of all the fluff on the page, is those five names. So here is the experiment: I propose we take the description for this page and change it around a little.
What we are effectively doing is giving the answer upfront: "They are Toptal, Gun.io, Hired, X-Team, Fiverr Pro." Then comes the follow-up: "But which one is best for you? We explain how each platform works, its pros and cons, and pricing." I think this relates more directly to the user intent for the relevant query.
And we have got another element here:
What I would like to see is an accordion element that says, "Top five places, here they are, click through to read more." The only way to do that is to create a shortlist, instead of leaving it buried deep in the page content, and move it right to the very top.
Measuring the Results
Once we decide to run an experiment like this, we can go back to measuring it. You can do that on a piece of paper, in an Excel spreadsheet, or in Google Docs, but I like to do it in Algoroo because it provides a convenient platform and measures everything for me. So I click on the Experiments tab, click "Create new experiment", fetch the data for the page, and run it for 30 days.
The keyword is "hire app developer". Case A is the existing case, and Case B is the new one I am changing it to.
I name the top five; I ask the question "Which one is best for you?" and add "We explain how each platform works, its pros and cons, and pricing." Then I click "Create new experiment", and it is now in my list of running experiments. As the data comes in, I will be able to see the average CTR for this page on that particular query compared to the change.
Naturally, for this to work, when you create Case B you have to actually change the meta description on the page, add the suggested content element to the top of the page, and then submit the page to Google through Search Console to make sure the experiment runs with the new parameters.
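If you go the spreadsheet route instead, the underlying comparison is simple: pooled CTR over the Case A window versus the Case B window. A sketch with invented daily figures:

```python
# Daily (impressions, clicks) pairs for each window (figures invented).
case_a = [(1000, 20), (1100, 22), (900, 18)]   # before the snippet change
case_b = [(1000, 35), (1200, 40), (950, 30)]   # after the snippet change

def ctr(days):
    """Pooled CTR: total clicks over total impressions for the window."""
    return sum(c for _, c in days) / sum(i for i, _ in days)

uplift = ctr(case_b) / ctr(case_a) - 1
print(f"Case A CTR {ctr(case_a):.2%}, Case B CTR {ctr(case_b):.2%}, uplift {uplift:+.1%}")
```

Pooling clicks and impressions (rather than averaging daily CTRs) keeps low-impression days from skewing the result.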
That is the first experiment idea for this website. It will be interesting to see whether this webmaster can implement it and ping us back with the results, positive or negative. It is always good to know. Thanks, everyone.