Understanding Search Engine Results Pages (SERPs) is crucial to any competition analysis, especially if you want to enhance your online visibility and rank on Google.
However, doing this manually is repetitive and time-consuming. In this article, I’ll show you how, with simple Python code, SerpApi, and Google Colab, you can extract the first 10 Google results for several keywords at once.
Create a Google Colab document
Because this is a straightforward approach to extracting SERP results from Google, I won’t dive into every detail, but I’ll explain how it works and how you can use it in your day-to-day work.
The first step is easy. Sign in to your Google account and open Google Colab; you’ll land on Colab’s main dashboard. Click on New Notebook to create the document, then copy and paste the code I created below.
Copy and paste the Python code on Colab
After creating your Google Colab notebook, name your file in the left part of the document and copy and paste this code, which extracts 10 Google results for each keyword you add to the search_terms variable:
!pip install google-search-results

from serpapi import GoogleSearch
import pandas as pd
from datetime import date

# Define search terms
search_terms = ['que es seo', 'como crear una estrategia seo', 'como hacer una auditoria seo', 'mejores herramientas de seo']

# Initialize empty list to store search results
search_results = []

# Parameters for the API request
params = {
    "engine": "google",
    "gl": "co",
    "hl": "es",
    "api_key": "your_api_key"
}

# Loop through each search term
for term in search_terms:
    # Update query parameters
    params["q"] = term
    params["num"] = 10  # Number of results to fetch

    # Fetch data from the Google Search API
    fetch = GoogleSearch(params)
    data = fetch.get_dict()

    # Append organic search results to the list
    search_results.append(data['organic_results'])

# Define items to extract from the search results
items_to_search = ['position', 'title', 'link']

# Get today's date
today = date.today()

# Initialize an empty list to store filtered results
filtered_results = []

# Iterate through each search term and its corresponding results
for term, array in zip(search_terms, search_results):
    term_results = []
    # Iterate through each result in the search results
    for result in array:
        # Extract the specified items and add metadata
        result_dict = {var: result[var] for var in items_to_search}
        result_dict["keyword"] = term
        result_dict['fecha'] = today  # 'fecha' means date in Spanish
        term_results.append(result_dict)
    filtered_results.append(term_results)
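One caveat worth knowing: the script indexes data['organic_results'] directly, so a query that returns no organic results would raise a KeyError. A minimal defensive sketch (my addition, not part of the original script) uses dict.get with a default instead:

```python
# Mock response with no organic results, standing in for fetch.get_dict()
data = {"search_metadata": {"status": "Success"}}

# Defensive lookup: dict.get returns a default instead of raising
# a KeyError when 'organic_results' is missing from the response
organic = data.get('organic_results', [])
print(len(organic))  # 0 rather than a crash
```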
Before running the code, you must create a SerpApi account. In the first part of this article, I walk you through the process of creating the account to get the API key.
Run the code to fetch the data from SerpApi
After adding your API key to the code, it’s time to run the script to fetch the information from SerpApi. You will see that the requests complete successfully.
With all that data stored in the filtered_results variable, we must create the data frames to see the information with five columns: position on Google, title, link, keyword, and the current date.
As our information is nested inside one big list, we’ll access it by index. The results for the first keyword are stored in filtered_results[0], the second in filtered_results[1], and so on.
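To make the nesting concrete, here is a small mock of what filtered_results looks like after the loop above (the titles and links are invented placeholders; the real values come from the API):

```python
from datetime import date

# Mock of filtered_results: one inner list per keyword, one dict per result
filtered_results = [
    [  # filtered_results[0] -> results for the first keyword
        {'position': 1, 'title': 'Qué es SEO', 'link': 'https://example.com/seo',
         'keyword': 'que es seo', 'fecha': date.today()},
    ],
    [  # filtered_results[1] -> results for the second keyword
        {'position': 1, 'title': 'Crear una estrategia SEO', 'link': 'https://example.com/estrategia',
         'keyword': 'como crear una estrategia seo', 'fecha': date.today()},
    ],
]

print(filtered_results[1][0]['keyword'])  # prints: como crear una estrategia seo
```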
The first data frame will show 10 results after looking for the keyword “Que es SEO” or “What is SEO”.
df = pd.DataFrame(filtered_results[0])
df
The second data frame will show 10 results after looking for the keyword “Cómo crear una estrategia de SEO” or “How to create an SEO strategy”.
df = pd.DataFrame(filtered_results[1])
df
Download and export the results to a document
Now that you have the results separated into per-keyword lists, we’ll loop through them to flatten everything into a single final list:
df_results = []

for results in filtered_results:
    print(results)
    for r in results:
        df_results.append(r)

df = pd.DataFrame(df_results)
df
And finally, that information will be exported to an Excel file with this code:
from google.colab import files
df = pd.DataFrame(df_results)
serp_results = "SERP_results.xlsx"
df.to_excel(serp_results, index=False)
files.download(serp_results)
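If you would rather have a CSV (for example, to import into Google Sheets), the same DataFrame can be exported with to_csv instead. This is a sketch with placeholder rows standing in for the real df_results:

```python
import pandas as pd

# Placeholder rows standing in for the df_results list built earlier
df_results = [
    {'position': 1, 'title': 'Qué es SEO', 'link': 'https://example.com/seo',
     'keyword': 'que es seo', 'fecha': '2024-01-01'},
]

df = pd.DataFrame(df_results)
df.to_csv("SERP_results.csv", index=False)  # same data, CSV instead of Excel
```

In Colab, calling files.download("SERP_results.csv") afterwards triggers the browser download, just as with the Excel file.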
Here you will find the code on Google Colab. Remember, you must get your API key to perform this SERP analysis.
I am a skilled and self-motivated marketing and SEO professional with 11+ years of experience in different digital marketing areas: content planning, information architecture, and primarily creating, leading, and tracking SEO strategies. In the last six years, I have specialized my career in developing processes to redesign and launch websites successfully using Hubspot’s Growth-Driven Design methodology.