The Importance of Automation
Imagine that you want to find out which SERP features your competitors have that you don't have yet. You can use a SEMrush report for that and check one competitor at a time. But what if you have to repeat this process every month, or across multiple competitors? We are going to learn how to leverage the SEMrush API; it takes a little bit of coding to accomplish that.
Why would you want to learn this yourself? Why would SEOs need to code anything beyond HTML?
Let's talk about some of the reasons:
- You can do more exciting work.
- You can increase your productivity.
- You can gain more credibility.
- You can also get job security.
Let me share some of the most interesting case studies that I have seen people tweeting:
I started encouraging people to learn Python last year, and Moshe, after just a couple of weeks, built a fantastic log analyzer that really impressed me. I learned from Matt Lacuesta that a team member at his agency built an amazing visualization of "People also ask" questions. How cool is that? Hülya came up with a machine learning model to predict the PageSpeed score of different sites. Incredibly powerful stuff is coming from the community.
Let me share a few articles that you can also dig into:
- "The Dangers of Misplaced Third-Party Scripts" is a more recent one that is about understanding and solving really complex problems. It digs into how the browser actually parses pages.
- "How to Uncover Powerful Data Stories with Python" explains how to write compelling stories from insights that are hard to find because engineers and marketers operate in silos. This is also something very powerful that you can do.
- "Brands Can Better Understand Users on Third-Party Sites by Using a Keyword Overlap Analysis" is an article that we are going to leverage for the idea that I want to share today. It shows how you can invent new processes that didn't even exist before. In this case, I was trying to understand the performance of a brand that sells in marketplaces. The marketplaces don't share that information, and I found a way to get it. Check this out; it is really cool.
- "How to Use BERT to Generate Meta Descriptions at Scale" covers one of the things that may help future-proof your job. A lot of this work is being done with AI and natural language processing, and what is possible right now is amazing. In this article, I show you how to generate meta descriptions from scratch leveraging BERT, if you are familiar with that.
Minimum Python Knowledge Level
What is the minimum programming knowledge that you need? I used to recommend tutorials and courses available on Coursera, Udacity, or DataCamp. The feedback I got was that a lot of those are designed for engineers, for people who want a programming career, which is not typically the case for marketers.
Thinking about that, I said, "Okay, let me put together a resource specifically for marketers." I recently published an article for Search Engine Journal with an introduction to Python for marketers. If you are familiar with working in Google spreadsheets (and which marketer isn't?), then you will be able to learn Python.
John Mueller once had a question about what type of content performed better on mobile versus desktop. He used Wikipedia data to find out, but what he did wouldn't have been possible manually. He created a Python notebook and shared it.
In my article, I walk through that notebook and explain, line by line, what is used and how everything works. At the same time, I took the opportunity to introduce the building blocks of the programming language and how you put solutions together with them. It is really cool and very powerful, so make sure you check it out.
The Problem to Solve
Let's talk about what we want to do: which SERP features does my competitor have that I don't? We have to start with what SERP features are. SEMrush's Knowledge Base page for the Position Tracking tool has a list of them.
Among others, there are featured snippets, local packs, reviews, AMP, sitelinks, videos, featured videos, top stories, and people also ask. All these features make the SERPs a lot more useful for end-users, but at the same time they pull attention away from the traditional organic listings. It is very important that you start mastering them and keep track of how many your competitors have that you don't. That is what we are going to learn.
Let's open SEMrush's Organic report for the AutoZone website as an example:
Here we can see all the SERP features they have and the number of keywords that rank in each of those features.
This is the same data for the Advance Auto Parts website:
When I compared the two, I found that AutoZone has a lot more of every SERP feature, except for top stories, where they are about the same.
I can quickly find this information and create an example report like this:
But you might also come up with workflows and capabilities that the tools don't offer out of the box. That might even become your competitive advantage, because you figured out a process nobody else knows; instead of doing it manually every time, you can automate it with a little bit of programming.
Automatic SERP Features Research with Python
I put together a Colab notebook that implements this workflow. It did not take me a lot of time, because I basically pulled code that I had already written myself. But you can do the same thing with any code: take it, make some small modifications, and stitch it together to do the work you want it to do.
Let me show you how it goes. You are not going to see any code, because after I design it in a notebook, I can create a form where you input the information:
All the code is hidden so it doesn't confuse the end-user. Once you enter the information, just run all the cells, and the notebook will go through all the steps of my workflow. At the end, you get the output. In this case, I didn't get it on the first try because I got an error from Colab:
I quickly checked whether I had made a mistake. I hadn't; that just happens sometimes. I ran it again, and this time I got the output: a CSV file downloaded to my computer.
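As a sketch of how such a notebook form can be built, Colab renders specially commented variables as input fields, so the end-user never touches the code. The parameter names below are illustrative stand-ins, not the exact ones from my notebook:

```python
# Colab turns "#@param" comments into a form above the cell,
# so the user types values into fields instead of editing code.
domain = "autozone.com"  #@param {type:"string"}
database = "us"          #@param {type:"string"}
api_key = ""             #@param {type:"string"}

# The hidden cells below the form would then call the SEMrush API
# with these values and save the result as a downloadable CSV.
```

Everything after the form can stay collapsed or hidden, which is what makes the notebook feel like a small app rather than code.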
Then I went back and provided a different domain:
Think about that: it is like I am building a new capability for SEMrush myself, without waiting for them to add it or depending on whether they ever will. It does not take a lot of time and effort once you have mastered the basics.
So, after doing the same thing for Advance Auto Parts, I get another CSV file. Now we have the output for our analysis. I took the CSV files and manually imported them into Google Sheets, but that is also something that can be automated if you follow the Python introduction article I mentioned.
This is the data for AutoZone:
With a little bit of modification, I can do this for any number of domains; instead of providing one at a time, I can provide a list. There is a lot of cool stuff you can do.
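Extending the notebook to a list of domains is mostly a loop. Here is a minimal sketch, assuming a hypothetical `fetch_serp_features(domain)` helper (standing in for the notebook's API code) that returns a per-domain count of keywords for each feature:

```python
import csv

# Hypothetical helper standing in for the notebook's SEMrush API call.
# It would return something like {"featured snippet": 120, "local pack": 45}.
def fetch_serp_features(domain):
    # ... call the SEMrush API and aggregate the feature counts ...
    return {}

domains = ["autozone.com", "advanceautoparts.com"]

rows = []
for domain in domains:
    counts = fetch_serp_features(domain)
    for feature, keywords in counts.items():
        rows.append({"domain": domain, "feature": feature, "keywords": keywords})

# Write one combined CSV instead of one file per domain,
# which is easier to pivot in Google Sheets.
with open("serp_features.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["domain", "feature", "keywords"])
    writer.writeheader()
    writer.writerows(rows)
```

With the data in one file, the side-by-side comparison becomes a simple pivot table.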
The Code Explained
Now let's look a little at the code, without getting into the gory details. I went to that article and basically copied and pasted the code I had already developed there. I made one very small change:
For the article, I needed to pull a number of columns from SEMrush. In this case, I only needed one: the column that contains the SERP features. You can learn about each column in the API documentation:
It says exactly what I need, and that is it.
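To make this concrete, here is a sketch of what such an API request can look like. Treat the report type and the column codes as assumptions to verify against the SEMrush API documentation before using them; the key is a placeholder:

```python
# Sketch of a SEMrush Analytics API request that pulls only the
# keyword and SERP-features columns for a domain's organic keywords.
# NOTE: verify the report type and "export_columns" codes against
# the API documentation; they are written from memory here.
import requests

API_KEY = "YOUR_SEMRUSH_API_KEY"  # placeholder

params = {
    "type": "domain_organic",     # organic keywords report
    "key": API_KEY,
    "domain": "autozone.com",
    "database": "us",
    "export_columns": "Ph,Fk",    # keyword + SERP features (check the exact codes)
    "display_limit": 100,
}

# Only fire the request once a real key has been filled in.
if API_KEY != "YOUR_SEMRUSH_API_KEY":
    response = requests.get("https://api.semrush.com/", params=params)
    print(response.text)  # semicolon-separated rows
```

Trimming `export_columns` down to what you need is the whole change I made to the original code.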
When I tested the changes that I made in the code, I found something interesting. The SERP features that I get back from the API are not the names, but the numbers that identify each feature:
When I pulled these features, I did not know what the numbers meant, and I had to translate them into their names. I could have done that manually, creating a dictionary that says, "SERP feature '0' is this, '1' is this." But I said, "Okay, why don't I do that automatically as well?"
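The translation itself is just a dictionary lookup. The numeric IDs and names below are illustrative placeholders only; the real mapping is the one scraped from the documentation page, as described next:

```python
# Illustrative placeholder mapping of SEMrush feature IDs to names.
# The real IDs/names come from the API documentation page.
FEATURE_NAMES = {
    "0": "instant answer",
    "3": "local pack",
    "11": "reviews",
}

def translate_features(raw):
    """Turn an API value like '0,3,11' into readable feature names."""
    ids = [part.strip() for part in raw.split(",") if part.strip()]
    return [FEATURE_NAMES.get(i, f"unknown ({i})") for i in ids]

print(translate_features("0,3,11"))
# -> ['instant answer', 'local pack', 'reviews']
```

Building `FEATURE_NAMES` by hand works, but scraping it keeps it correct if SEMrush adds new features.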
The only thing I had to do was copy a selector from the page. A selector lets you address an element on a page; without one, you cannot identify the element you want.
I copied the selector that identifies a single element, then made one small change: removing this part:
That tells it I don't want just that one element; I want the whole column.
I just have to change the selector in the code that I already wrote for a different article:
You can see I didn't change anything else; it is the same code. This task is not about Google Trends, but I can use that code for the same purpose just by changing the selector.
Now I copy and paste the code into the console of Chrome's developer tools while the page is open, call the function I just created, and get all the elements. That gave me what I needed to create the Python code.
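If you prefer to stay in Python rather than the browser console, the same copied-and-trimmed selector works with BeautifulSoup. This is a sketch: the selector and the inline table are stand-ins for the real documentation page, whose HTML you would fetch first:

```python
# Sketch: pull a whole column of cells with a CSS selector, the same
# idea as the devtools snippet (with ":nth-child(...)" removed so the
# selector matches every row, not a single one).
from bs4 import BeautifulSoup

def scrape_column(html, selector):
    """Return the text of every element matching the CSS selector."""
    soup = BeautifulSoup(html, "html.parser")
    return [el.get_text(strip=True) for el in soup.select(selector)]

# Demo on a tiny inline table so the sketch runs without network access;
# for the real page you would download the HTML first (e.g. with requests).
sample = """
<table>
  <tr><td>0</td><td>instant answer</td></tr>
  <tr><td>3</td><td>local pack</td></tr>
</table>
"""
print(scrape_column(sample, "tr > td:first-child"))
# -> ['0', '3']
```

Scraping both the ID column and the name column this way gives you the translation dictionary without typing it by hand.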
Hopefully, you will find this useful. Thank you very much.