How to Build an Automated SEO Audit

Laura Ferruggia
February 3, 2023

An SEO audit is an incredibly important part of any SEO strategy.

There are many companies that try to make this easy. SEMrush, for example, lets you plug in your domain and then provides suggestions.

However, most of these suggestions are not helpful for most business owners.

How do I unminify my JavaScript? What is CSS, anyway?

[Image: example warnings from a typical SEO audit tool]

These warnings certainly aren’t actionable and may be tedious for most web developers to fix as well.

What if you could create an SEO audit that updates automatically with the latest crawl of your website?

This article will show you how to get started.

Why have an automated SEO audit instead of a regular audit?

Members of my marketing team know that I prefer automation over manual work all day long.

Creating an automated SEO audit involves pushing the crawl data from Screaming Frog to a Google Sheet on a daily, weekly, or monthly basis.

Crawl data includes a ton of valuable information such as meta titles and descriptions, alt text, and internal links.

But why not just open Screaming Frog and crawl the website each time you need to?

Well, depending on the size of the site, Screaming Frog may take between 5 and 15 minutes to fully crawl every page of your site and all of the links within those pages.

An automated crawl runs at the time of your choosing, allowing you to open and view the data instantly.

You’ll now have a Screaming Frog file saved to your computer. Opening this file brings up the most recent crawl of your website.

This only gives you a quicker way of viewing your website’s technical data, however. Once you push the data to Google Sheets, you can avoid going into Screaming Frog altogether.

From there, you can better visualize the data and take specific actions.

What data will this automated SEO audit allow you to see?

With this automated audit, you can view:

  • Categorized pages of your site (pages vs blogs)
  • Indexability of all pages
  • Meta titles (and length)
  • Meta descriptions (and length)
  • H1s and H2s
  • Status codes of all pages on your site (200, 301, 404)
  • Google’s page speed score (out of 100)
  • Alt text for all images (and which images are missing alts and where they are)
  • Internal links for every page
  • Pages with low word counts
  • Organic traffic data at a glance
  • Clicks, impressions, CTR and average position (from Google Search Console)
  • A checklist of all tagging and site structure recommendations your business site should follow

All of this data is included in generic SEO audits you may have from other companies like HubSpot and SEMrush.

The difference here is that you will have an actionable Google Sheets report at your disposal that updates automatically, with no need for a subscription to a third-party service to run the audit itself.

How to create the automated SEO audit

The first step in the audit is to set up the crawl to run automatically.

After that, we need to manipulate the data. I’ll be using Google Sheets’ QUERY function, which is an advanced function. If you follow this guide, you won’t need to modify any of the formulas very heavily.

My entire team has set up multiple automated SEO audits without knowing the QUERY function in depth.

After we manipulate the data, we can then use SEMrush via Supermetrics to pull in organic data to set benchmarks.

Finally, we’ll create a separate sheet within our report to track all internal links.

Setting up the crawl to run automatically

Begin by opening Screaming Frog. You can download it here if it is not already installed.

Ensure you have a paid license added to the program so you can crawl more than 500 URLs. Larger websites will take longer to crawl.

Navigate to File → Scheduling and create a new schedule.

You’ll need to modify the settings in General, Start Options, and Export.


General

[Image: General settings when scheduling a Screaming Frog crawl]

Give the crawl a relevant name and choose what time of day and frequency it will run.

A daily crawl allows you to open your automated SEO audit any morning to see the latest version of your website.

This is helpful if your team made updates the previous day and you want to ensure that all SEO items were taken care of.

Start Options

[Image: Start Options settings when scheduling a Screaming Frog crawl]

Add the root URL of the website.

You can leave the crawl config blank.

If you’d like to pull in Google Analytics or Google Search Console data, you can do so by connecting the Gmail associated with the account.

For this audit, we’re going to be pulling organic data from Google Search Console via Supermetrics. If you don’t have a paid Supermetrics account, you can pull data directly from these networks.

However, I’ve found that you can’t get as many helpful metrics out of this integration at this time.


Export

[Image: Export settings when scheduling a Screaming Frog crawl]

This tab has the most steps to check, but I’ll summarize them below.

  • Check “headless (required for exports)”
  • Choose the local folder that the crawl will save to
  • Connect a Gmail account (in order to use Google Sheets)
  • Choose “create timestamped folder in output.”
    • “Overwrite files” tends to not actually overwrite the file itself. This is a bug that Screaming Frog’s tech support acknowledged, but was unable to fix.
  • Format should be gsheet. You’ll end up seeing multiple folders in the output when it runs that look like this:
[Image: how the timestamped crawl folders look in Google Drive]
  • Click “Export Tabs” and choose “Internal:All” and “PageSpeed:All”
[Image: choosing the right reports to export in Screaming Frog]
    • You can also add in “Search Console:All” for more data
  • Click “Bulk Exports” and add “Links:All Inlinks” and “Images:All Image Inlinks”
    • This allows you to export image alt text and identify which pages link to each other.
[Image: choosing the right bulk exports in Screaming Frog]
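If you’d rather drive the schedule from your operating system instead of Screaming Frog’s built-in scheduler, the same headless export can also be run from the command line and scheduled with cron. Treat this as a sketch: the binary name and flags below are based on Screaming Frog’s CLI, but exact names can vary by version, so confirm against `screamingfrogseospider --help` before relying on it.

```shell
# Crontab entry: run a headless Screaming Frog crawl every Monday at 6 AM
# (Linux/macOS). Flag names are assumptions -- verify with --help.
# Note: the gsheet export format relies on the Google account you
# connected in the GUI; csv needs no account.
0 6 * * 1 screamingfrogseospider --crawl https://example.com --headless \
  --output-folder /home/me/crawls --timestamped-output \
  --export-tabs "Internal:All,PageSpeed:All" \
  --bulk-export "Links:All Inlinks,Images:All Image Inlinks" \
  --export-format gsheet
```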

PageSpeed data often adds significant time to the automated audit run. In order to connect PageSpeed Insights, you’ll need to create an API key via the API Access section of Screaming Frog.

This is simpler than it sounds: Google generates the key, and you just need to enter it in the settings.

Once you click PageSpeed Insights, you’ll be able to create a project in Google’s API dashboard and paste the API key into your Screaming Frog settings.

[Image: API Access settings in Screaming Frog]

After you allow the crawl to run, you can navigate to the folder on your computer and see the crawl as a Screaming Frog file.

Opening this file will immediately bring up the entire crawl of the domain you chose.

As mentioned above, this already saves a ton of time because you don’t have to run the crawl to see the data.

[Image: the Screaming Frog crawl file on your desktop]
[Image: the crawl open in Screaming Frog]

You should now see a folder in your Google Drive called “Screaming Frog SEO Spider” which has the data you created.


Manipulating the Data

After you’ve set up your automatic crawls, you’ll need a place to put the data.

Screaming Frog has an awesome setting that allows you to overwrite existing files.

Unfortunately, it doesn’t work.

Until this does work, you’ll have to apply some manual work to this automatic process.

Honestly, it’s not too bad, because you can keep a snapshot of your website from the last time you updated the crawl and have a before-and-after comparison.

That way, when you update the report to show the most recent version, you’ll be able to see the changes in action.

How do I set up the automatic spreadsheet?

To begin, make a copy of this template and update the first sheet to pull in the data from your crawl.

I like to call the first part the legend because it has all of the raw crawl files in one place. I also update column C to show when I last updated the crawl.

When you make a copy, add in the new URLs of each specific report to the legend section.

To find these reports, navigate to that Google Drive folder that had all of the crawls for that date.

It will look like this in Google Drive. Just make sure you have the date that you want (conveniently labeled as the folder name).

[Image: the reports you’ll see in your Drive folder]

Simply open each of these reports in a new browser tab and copy its URL to its respective place in column B.

This will automatically update your reports.

If you want to double-check each report to make sure they are pulling data in, you’ll have to unhide them in Google Sheets. Clicking the hamburger icon on the bottom left shows these raw files.

[Image: how to view hidden tabs in Google Sheets]

Ensuring your data makes sense

Now you will need to run through all of the tabs at the bottom to make sure they make sense.

Sometimes, data may be pulled from different columns from website to website, and you want to make sure it is accurate.

Let’s run through an example on the “SEO Info” sheet. The categories should be:

    • Address (page URL)
    • Status code
    • Meta robots
    • Indexability
    • Canonical
    • Title
    • Title length
    • Meta description
    • Meta description length
[Image: the SEO audit in action]

If your report doesn’t show these columns, you’ll need to update the query function to reflect it.

Don’t worry—I’ll explain exactly what that means.


Modifying the Query Function

When you’re on the “SEO Info” tab, click cell A1. You’ll notice a formula. It’s written in semi-plain English (very similar to SQL). With a little guidance, you’ll be able to make sure it reflects what it should.

[Image: modifying the formula in the automated SEO audit]

=QUERY('Raw html'!$A:$BA,

"select A, C, M, E, W, Z, G, H, J, K
where A is not null and not A contains 'wp-' and C = 200 order by E desc")

The formula pulls all of the data from the Raw HTML file (the one in the legend section we pulled from earlier). Then, it selects specific columns. These letters correspond to the information labeled above (meta title, description, etc.).

From there, it ignores any rows missing a page URL, filters out backend WordPress pages, and only shows pages that returned a 200 status code.
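If you ever want to sanity-check what the QUERY formula is doing, the same filter is easy to express in plain Python against rows from a CSV export of the crawl. The column names here ("Address", "Status Code", "Indexability") are assumptions matching the template's layout, so adjust them to your export.

```python
def filter_crawl(rows):
    """Mirror the Sheets QUERY: keep rows that have a URL, drop WordPress
    backend pages (addresses containing 'wp-'), keep only 200 responses,
    then sort descending -- the formula's 'order by E desc'."""
    kept = [
        r for r in rows
        if r.get("Address")               # A is not null
        and "wp-" not in r["Address"]     # not A contains 'wp-'
        and r.get("Status Code") == 200   # C = 200
    ]
    # Sorting by Indexability here is an assumption about what column E holds.
    return sorted(kept, key=lambda r: r.get("Indexability", ""), reverse=True)
```

Feed it the rows from `csv.DictReader` over your Internal:All export and you should get the same page list the “SEO Info” tab shows.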

All you (may) have to do is modify which columns it pulls from by changing the letters.

Based on our example above, it pulls in column A, then C, then M, then E.

If you go to the Raw HTML tab, you’ll see these correspond to the columns below.

[Image: organizing the columns]

Simply make sure the columns match up to what I have above, and your custom report is all set up. You’ll never have to modify this again.

Categorized pages

The goal of an SEO audit is to create organization and provide insight.

I like to use this as an opportunity to organize all of the pages on my site. This includes blogs, service pages, case studies, and more.

The best way to organize this data with Screaming Frog is to use an outdated tactic called meta keywords.

While they don’t help with SEO, they do help with categorization.

This audit allows you to organize categories of pages in your website based on their meta keyword grouping.

[Image: categorizing pages by meta keyword]

For example, these pages on our website have specific keywords associated with them. I can then categorize by consulting pages, industry pages, and anything else that makes sense.

Note: WordPress doesn’t allow you to add meta keywords by default, and you may need web developer assistance to add these keywords in.
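Under the hood, this categorization is just grouping URLs by their meta keyword value. A rough sketch (the URLs and keyword values here are hypothetical examples, not from any real crawl):

```python
from collections import defaultdict

def categorize_pages(pages):
    """Group page URLs by their meta keyword.
    `pages` is a list of (url, meta_keyword) pairs from the crawl;
    pages with no meta keyword fall into an 'uncategorized' bucket."""
    groups = defaultdict(list)
    for url, keyword in pages:
        key = keyword.strip() if keyword and keyword.strip() else "uncategorized"
        groups[key].append(url)
    return dict(groups)
```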

Adding Organic Results

I mentioned that the Screaming Frog report can pull data from Google Search Console and Google Analytics.

I prefer to use Supermetrics to pull this information because I have more control.

This data is pulled into the GA Raw section of the report.

Click Extensions → Supermetrics → Launch.

[Image: adding Supermetrics to the report]

You can choose Google Analytics as your connector and pull in your traffic by visitors, year and month, then segment by organic traffic.

[Image: metrics for Google Analytics via Supermetrics]

To have the results run automatically, click schedule at the top and add a daily, weekly, or monthly trigger, depending on the frequency you’ll check the report and need it to run.

[Image: the Supermetrics daily refresh trigger]

Thanks to all of this, you’ll be able to see a nice graph in the Organic Benchmark tab that shows how your traffic has improved year over year.

[Image: organic chart data from Supermetrics]

Adding SEMrush or Google Search Console keywords

In the same fashion as adding organic results, you can automatically pull in the major keywords that are driving traffic as well as their search volume via SEMrush.

For SEMrush (paid subscription required) you can use these settings to pull in keyword data.

[Image: metrics for SEMrush via Supermetrics]

For Google Search Console, you can use these settings to see total clicks by page.

[Image: metrics for Google Search Console via Supermetrics]


Internal Links

What if you could put a URL from your website into a Google Sheet and see all of the internal pages that this page links to?

To date, I’ve never seen an SEO tool out there that does this in a very simple manner.

To do this, make a copy of this report. The Internal Links report is kept separate because there is too much data for most websites.

All you have to do is update its legend section with the “all inlinks” report from your crawl, and you’ll be able to begin using it.

Simply add a single URL in B2 to view all of the inlinks to that one page.

[Image: the internal links report in the automated SEO audit]

You can add multiple pages in column A, from A3 down. Just add the pages whose inlinks you’d like to view, and it will display them all.

[Image: internal links reporting in the automated SEO audit]
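If you’re curious what the sheet is doing behind the scenes, the lookup amounts to filtering the all_inlinks export by destination URL. A minimal Python sketch, assuming rows shaped like Screaming Frog’s export with its “Source” and “Destination” columns:

```python
def inlinks_to(target_url, all_inlinks):
    """Return every unique source page that links to `target_url`.
    `all_inlinks` is a list of dicts from the all_inlinks export,
    each with at least 'Source' and 'Destination' keys."""
    return sorted({
        row["Source"]
        for row in all_inlinks
        if row["Destination"] == target_url
    })
```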


And there you have it.

It may be tricky to set up your automated SEO audit at first, but once it’s done, there’s very little ongoing upkeep.

Simply open your Google Sheet once a day, week, or month, and check that everything looks good.

I like to use this to make sure new pages have meta descriptions that aren’t too long or to check that all pages are fully indexed.
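Those length checks are easy to automate on top of the crawl data, too. Common guidelines put meta titles around 60 characters and meta descriptions around 155 before they truncate in search results; the thresholds below are those rules of thumb, not hard limits, and the column names are assumptions matching the audit’s layout.

```python
def flag_lengths(pages, title_max=60, desc_max=155):
    """Flag pages whose meta title or description will likely truncate.
    `pages` is a list of dicts with 'Address', 'Title', and
    'Meta Description' keys; returns (url, issue) pairs."""
    issues = []
    for p in pages:
        if len(p.get("Title", "")) > title_max:
            issues.append((p["Address"], "title too long"))
        if len(p.get("Meta Description", "")) > desc_max:
            issues.append((p["Address"], "description too long"))
    return issues
```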

From there, I can use this as a way to continue to improve my website month over month.

If you haven’t already, download this template to get started and follow this guide step by step to create your own automated SEO audit.

Meet Laura Ferruggia


Laura joined Miles IT in 2014, and in her current role of Marketing Strategy Director, she leads and supports our amazing and multi-talented team in providing high-quality consulting and services. With experience spanning from content writing to web development and paid advertising over 9+ years, Laura is equipped to take a holistic approach when discussing marketing and website solutions with businesses.
