To do this, you need a basic understanding of XPath and Google search queries. You also need to know how to use the Screaming Frog SEO tool. I first learned about this technique from Rory Truesdale’s blog post on Search Engine Journal.

Before proceeding, be aware that there is a small possibility that scraping Google’s SERPs could get your IP blacklisted. Use this method at your own risk. If you’re ok with that risk, keep reading.

Configuring the Screaming Frog Crawl

First, set Screaming Frog’s mode to List.

I followed Rory’s advice and unchecked all of the boxes in Configuration -> Spider -> Crawl.

If you want to see a rendered screenshot after Screaming Frog crawls Google, you can go to the Rendering tab and select JavaScript.

I found the results to be the same (without the screenshot) if I selected Text Only.

Unfortunately, when I used the XPath selectors from Rory’s blog post, I didn’t get any results. His post was published in September 2018, and Google has probably changed its HTML and class names since then. By using Google’s Developer Tools and the Chrome Scraper extension, I was able to find XPath selectors that worked. To get the page titles from Google SERPs, you can use this XPath selector:

//h3[@class="LC20lb"]

If you want to grab the URL that the page title links to, you can use this XPath selector:

//div[@class="r"]/a[1]/@href

You would go to Configuration -> Custom -> Extraction to add these XPath selectors. A window will pop up. Type in a label for each type of data you want to scrape; I used SEO Title and URL. Here’s a screenshot of where you would add the XPath selectors.

Select XPath for the first dropdown menu and Extract Text for the second dropdown menu.

In order for these XPath selectors to work, you will need to select the browser that you are using under Configuration -> User-Agent -> Preset User-Agents. If you leave the Preset User-Agents field as the Screaming Frog Spider, you won’t see the extracted values.

I’ve run queries using Google Chrome, Firefox, and Microsoft Edge as the user-agent with no problem.

In Configuration -> Speed, I set the Max Threads to 1.0, checked Limit URL/s, and set the Max URL/s to 0.8 just to be safe; at that rate, a batch of 50 query URLs takes a little over a minute to crawl. If this value is too high, Google could block you with a CAPTCHA test.

Now, you’re all set to add your Google queries for the spider to crawl.

Adding Google Search Query URLs

This is the format that Google query URLs take:

https://www.google.com/search?q=your+keyword+phrase

The + is used for spaces. Running this query in Screaming Frog will extract data from the first page of Google’s search results.

If you want to produce more than just the first page of results, you can add a num parameter like so:

https://www.google.com/search?q=your+keyword+phrase&num=50

Google will show the first 50 results with that query.

You can also use the near parameter to see what the local pack results would look like in a given city or zip code:

https://www.google.com/search?q=your+keyword+phrase&num=50&near=chicago,+il

This only affects the local pack results; the organic results will still be tailored to the city that you are actually in. If you want to see organic results from an area other than your own, you’ll need to use a VPN and select a server in the city of interest. In the case above, that would be Chicago, IL.

I manually created my search queries, but you could create an Excel spreadsheet that automatically builds the queries for you. Read more about that in Rory’s blog post.
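
If you’re comfortable with a little scripting, here’s a rough Python sketch that builds the query URLs instead. It assumes a hypothetical keywords.txt file with one keyword phrase per line:

# A minimal sketch to build Google query URLs from a keyword list.
# keywords.txt is a hypothetical file with one phrase per line.
from urllib.parse import quote_plus

with open("keywords.txt") as f:
    for line in f:
        phrase = line.strip()
        if phrase:
            # quote_plus encodes spaces as +, matching the format above
            print(f"https://www.google.com/search?q={quote_plus(phrase)}&num=50")

The printed URLs can then be pasted straight into Screaming Frog as described below.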

Once you come up with your Google search queries, you can add them by clicking Upload -> Enter Manually.

Paste your queries in the window that pops up, then click Next, then click OK.

Click the Export button located right next to the Upload button to save your crawl data as an Excel file.

Organizing Google Search Query Data in Excel

When you open up your Excel file, you’ll want to delete all of the columns that you don’t need. In this case, that’s every column except the Original URL, SEO Title, and URL columns. After doing this, my spreadsheet looked like this:

I wanted the SEO Title and URL columns to be rows, and I wanted the Original URL rows to be columns. To do this, I selected the cell range that contained the SEO Title and URL data (including the headers). Then, I copied the cells, clicked where I wanted the copied data to be pasted, went to Edit -> Paste Special, checked the Transpose box, and clicked OK.

After I transposed the cells, my data looked like this:

After deleting the SEO Title and URL data from the top cells, I transposed the Google query URLs, placed them above the corresponding columns, and renamed Original URL to something more descriptive. This is what I ended up with.

The data is a lot neater to analyze. If you look at the spreadsheet data, you’ll see that Google thinks that the search query had the intent of looking for a local provider.
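
If you’d rather script this cleanup instead of doing it by hand, here’s a rough sketch using pandas. The file name and column labels are assumptions based on the export described above, so adjust them to match your own spreadsheet:

# A rough sketch of the same transpose step with pandas.
# crawl.xlsx and the column names are assumptions based on the
# export described above; rename them to match your file.
import pandas as pd

df = pd.read_excel("crawl.xlsx", usecols=["Original Url", "SEO Title", "URL"])
wide = df.set_index("Original Url").T  # queries become columns, fields become rows
wide.to_excel("transposed.xlsx")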

Leveraging Your Time by Running Multiple Queries

You could scrape data from many more than 3 queries, although I don’t know what the limit is with Screaming Frog. Let’s say you’ve done your keyword research and want to see what the top results are for all of the keywords you’re tracking. You could run this crawl and have your data organized in a fraction of the time it would take you to manually type the queries into Google.

You can also use Screaming Frog to scrape Google’s related search data for multiple queries with this XPath selector:

//*[@id="brs"]/g-section-with-header/div[2]/div/p/a

This could give you more keyword ideas to optimize your website for.
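
As a quick illustration, the same lxml approach from earlier can pull these suggestions out of a saved results page (serp.html is again a hypothetical local copy):

# Pull related-search suggestions from a saved SERP copy using the
# selector above. serp.html is again a hypothetical local file.
from lxml import html

tree = html.parse("serp.html")
for link in tree.xpath('//*[@id="brs"]/g-section-with-header/div[2]/div/p/a'):
    print(link.text_content())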

Thank you for reading this post. Feel free to share it or comment below.
