You might already know that amazing content is the cornerstone of any solid SEO strategy.
However, your content can only get you so far.
If your website has glaring technical issues that prevent search engines and users from finding your content, it doesn’t matter how many unique and well-written resources you have.
In fact, technical issues can cause rankings in search results to plummet.
Luckily, there are ways to find and resolve website problems to keep your site healthy and up-to-date.
This work is generally called technical SEO, and it covers issues like broken links, missing pages, and more.
In this article, I’ll discuss technical SEO and core components to consider as part of your strategy so you can maintain your website’s overall health and visibility.
What is Technical SEO?
Before you can analyze your website and fix issues, it’s crucial to understand technical SEO and how it impacts your rankings.
Technical SEO refers to the architectural changes you can make to your website and server to improve how search engines crawl and understand it.
It involves complicated-sounding terms that we’ll break down in greater detail later.
Why Does Technical SEO Matter?
When done correctly, technical SEO ensures your website is error-free and enhances the user experience.
It helps increase your chances of ranking on search engines by ensuring crawlers can properly locate your website and associated page information.
Technical SEO matters from a user perspective, too.
When you take steps to resolve issues on your site, like fixing broken links, you help visitors find the information they need right when they’re searching for it.
If users have a fantastic experience browsing your website, they’ll be more likely to stay longer and explore.
Accordingly, lower bounce rates and longer time on page show search engines that your content is valuable to users.
This may boost your site’s rankings on search results.
As you can see, user behavior and search engine algorithms go hand-in-hand so that the most relevant, helpful websites appear in the highest positions in search results.
How Do You Find Technical SEO Information?
We’ve covered the value of technical SEO from user and search engine standpoints.
But how do you start improving it?
One recommendation is to perform an SEO audit that captures all of the information you need and makes it easy to identify areas for improvement.
An automated SEO audit can pull in all relevant data from tools like Screaming Frog, SEMrush, Supermetrics, Google Search Console, and more.
Your report can update daily, weekly, or monthly with no repeated manual effort, allowing you to fix issues, track patterns, and continuously improve.
We even created an audit template that you can use to get started.
When creating your audit, you’ll begin by running a crawl of your website using Screaming Frog.
If you’ve never used it before, Screaming Frog is an SEO tool designed to analyze the skeleton of your website and share info about its technical data.
To capture all of the relevant info you need, make sure your exports from Screaming Frog include these items in the report:
- Crawl rates
- Broken links
- XML sitemaps
- robots.txt
- URL structure
- Duplicate content
- 404 and 500 pages
- 301 redirects
- Thin content (sometimes seen as soft 404s)
- Meta titles and descriptions
We’ll describe these terms in greater detail in the next section.
Make sure that your automated SEO audit includes and updates this information regularly so you can always see a recent crawl of your website and resolve any issues.
Practical Tips for Small Business Owners Approaching Technical SEO
Let’s dive into each facet of technical SEO in more detail and how you can approach them as a small business owner.
Crawl Rates
To locate the crawl rates for your website, you’ll want to use Google Search Console instead of Screaming Frog.
To find it, navigate to Settings in Google Search Console and view Crawl Stats.
You can then open the report for more information about your website’s crawl rates.
Typically, your crawl rate doesn’t matter as long as your website has fewer than 5,000 pages.
If you’re unsure what your crawl rate is, chances are it won’t significantly impact your business, and you can focus on other efforts.
Broken Links
Identifying broken links and replacing them with new or updated ones is important for both users and Google.
A broken link occurs when users or Google cannot find or read a linked resource or webpage.
In this case, the page may have been deleted or moved, and the link was never updated accordingly.
To find a list of broken links on your website, you can follow these instructions to crawl your site with a tool like Screaming Frog.
Focus on following the first few steps, and don’t feel overwhelmed by all of the options. Most of them aren’t relevant to a small business owner’s needs.
Once you have this list of broken links, change the links to a relevant page or resource that works correctly (see the 301 redirect section below to find out how).
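If you’re comfortable with a little scripting, you can also spot-check an individual page yourself. Here’s a minimal Python sketch (using the requests and beautifulsoup4 libraries; the URL is a placeholder) — it’s no substitute for a full Screaming Frog crawl, but it’s handy for quick checks:

```python
import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin

def find_broken_links(page_url):
    """Fetch a single page and report links that return a 4xx/5xx status."""
    html = requests.get(page_url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    broken = []
    for anchor in soup.find_all("a", href=True):
        link = urljoin(page_url, anchor["href"])
        if not link.startswith("http"):
            continue  # skip mailto:, tel:, and other non-HTTP links
        try:
            status = requests.head(link, allow_redirects=True, timeout=10).status_code
        except requests.RequestException:
            status = None  # connection errors count as broken
        if status is None or status >= 400:
            broken.append((link, status))
    return broken

# Example usage (replace with a page from your own site):
for link, status in find_broken_links("https://www.example.com/"):
    print(f"{status}: {link}")
```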
XML Sitemaps
Submitting your website’s sitemap helps Google see and index new web pages as they’re added.
To submit your sitemap, you’ll add its URL in Google Search Console.
If your website is on WordPress, use Yoast’s plugin to generate a sitemap that automatically updates when pages are changed and posts are added.
Although these automatic updates don’t mean Google will crawl your new pages right away, they do increase the chances of your pages being found quickly.
To find your XML sitemap, head to the general settings of Yoast and then click the features tab.
Then, add “/sitemap_index.xml” in Google Search Console. You’ll find this option under “Sitemaps” in the “Index” section.
Once it’s submitted successfully, Google Search Console will list your sitemap with a “Success” status.
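For reference, the sitemap index file Yoast generates is just XML pointing to child sitemaps for your posts and pages. A simplified example (the URLs and dates are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>https://www.example.com/post-sitemap.xml</loc>
    <lastmod>2024-05-01T09:00:00+00:00</lastmod>
  </sitemap>
  <sitemap>
    <loc>https://www.example.com/page-sitemap.xml</loc>
    <lastmod>2024-04-20T14:30:00+00:00</lastmod>
  </sitemap>
</sitemapindex>
```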
Robots.txt
Your website’s robots.txt file tells Google the pages that should be indexed and displayed in search results (service pages, product pages, etc.) and the pages that should be blocked from search results (thank-you pages, test pages, etc.).
It’s an integral part of any SEO strategy.
If your website is built on WordPress, make sure your robots.txt looks something like this (a sketch of a typical setup; the individual thank-you URLs below are placeholders):
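```
User-agent: *
# Ignore all thank you pages
Disallow: /ty/
# Ignore all test pages
Disallow: /test/
# Ignore all backend WordPress pages
Disallow: /wp-admin/
# Individual thank you pages (already covered by /ty/, listed so we can be doubly certain)
Disallow: /ty/contact-thank-you/
Disallow: /ty/newsletter-thank-you/

Sitemap: https://www.example.com/sitemap_index.xml
```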
This tells Google to crawl your entire website while ignoring the disallowed pages.
In this case:
- ignore all thank you pages (/ty/),
- ignore all test pages (/test/), and
- ignore all backend WordPress pages (/wp-admin/)
For this specific example to work, your thank you pages and test pages must be kept in separate directories (/ty/ or /test/) so that they can be disallowed by naming only the common directory.
I also like to use my robots.txt to manage my thank you pages.
If you have a lot of them, they can become cumbersome to track, so the robots.txt file is a helpful place to keep a running list.
However, in the example, the thank you pages listed would already be blocked by /ty/ being disallowed, so they’re mainly there for organizational purposes (and so we can be doubly certain).
With sites that do not follow a clean directory structure, you absolutely need to list thank you pages individually. Just modify the example based on the correct URLs for web pages that you don’t want Google to find.
Tip: If you put the “#” symbol before a line, it comments it out. This helps you keep a more understandable structure.
Here’s a quick way to update your robots.txt from within WordPress using Yoast.
Head over to the Yoast plugin, then go to Tools → File editor and modify the text in your robots.txt file.
URL Structure
An organized URL structure makes it easier for Google to understand how pages on your website relate to one another.
Short, user-friendly URLs also appear more professional to readers.
There are two major tips to remember with URLs.
- Keep them as short as possible while still describing the content of the page
- Make them support the structure of the site
Cramming “relevant keywords” into a URL is just keyword stuffing (an outdated SEO tactic that won’t help you).
It’s also unnecessary because search engines don’t put a lot of emphasis on URLs for rankings.
If a keyword describes the page’s content, including it in the URL may be a natural choice, but don’t force it.
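For example, compare these two made-up URLs for the same page:

```
Keyword-stuffed:   /best-affordable-seo-services-company-near-me
Short and natural: /services/seo
```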
Studies have shown that shorter URLs tend to rank better, but their structure is also important.
If you have a services section of your website and there are sub-service pages, the ideal structure should be:
- /services/service1
- /services/service2
- /services/service3
Use your SEO audit as a time to better structure your URLs.
But for every URL you change, make sure a 301 redirect is added (the Redirection plugin is great for this). Google won’t index your new URLs immediately, and searchers who click your old URLs could encounter broken links, leading to lost rankings.
Keep in mind that you don’t want to change URLs every six months because you lose value in every redirect.
However, making one sweeping change that significantly improves the structure and length of your URLs will help you in the long run.
The impact of one round of redirects shouldn’t be an issue.
Duplicate Content
We may not be in school anymore, but plagiarism is still highly discouraged.
Don’t take content from another site exactly as it appears.
It’s okay to reference and use some parts of another website’s content, but always give a link back and always paraphrase.
Pages with 404 and 500 Errors
If pages on your website have 404 or 500 errors, they will prevent both users and Google from accessing what they need.
This could potentially impact website rankings if it’s clear that users can’t solve their problems or answer their questions on your website.
404 errors occur when a page is not found on a website.
500 errors indicate an issue with a website’s server.
Check Google Search Console and Screaming Frog for both 404s and 500s.
Then, 301-redirect any deleted or moved pages to the correct destination. For 500 errors, you’ll also need to resolve the underlying server issue (your developer or hosting provider can help).
From there, you can mark them as resolved in Google Search Console so that Google knows the pages are fixed the next time your website is crawled.
301 Redirects
You’ve already learned that 301 redirects can solve issues caused by 404 and 500 errors.
These redirects send users to a new page if the old page has been deleted or moved.
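How you add a redirect depends on your setup. On WordPress, the Redirection plugin handles this without editing any files; on an Apache server, you can add a one-line rule to your .htaccess file (the paths below are placeholders):

```apache
# Permanently redirect a removed page to its replacement
Redirect 301 /old-services-page /services/seo
```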
However, you’ll want to make sure that no links on your website rely on redirects.
While it’s important to implement redirects if pages have been removed, buttons or navigation items on your website should not depend on redirects.
Essentially, you’ll want to ensure that the links you use go directly to the page intended.
Let’s say a button on your page is meant to go to /services, but the link it actually points to is /service-area, which then redirects the user to /services.
While the user reaches the correct destination, you’ve created a “middleman” by routing the button through the 301 redirect. This is better than a broken link, but still not ideal.
To check for this, hover over a link (say, a “hire candidates” button pointing to /employers) and note the URL shown in your browser’s status bar.
If you see a different URL when you arrive on the next page after clicking, then your page relies on a 301 redirect.
You’ll want to “take out the middleman” by having the button go to the exact URL it should.
Thin Content (Sometimes Seen as Soft 404s)
Now that we’ve learned about 404s, 500s, and 301 redirects, let’s look at soft 404s.
A soft 404 occurs when a page loads normally, but Google decides its content is too thin or empty to be worth indexing and treats it like a 404 instead.
An example would be the /cart and /checkout pages in an online store. A cart or checkout page doesn’t belong in Google results because it isn’t relevant to users until they’ve already started shopping.
Google may view this as a “soft 404” and lower your site’s rankings as a result.
Always add a noindex tag to these types of pages to prevent this issue.
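Under the hood, a noindex directive is just a meta tag in the page’s <head>. For example:

```html
<!-- Tell search engines not to index this page (e.g., a cart or checkout page) -->
<meta name="robots" content="noindex, follow">
```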
With Yoast installed on WordPress, you can do this right from the page editor by scrolling to the Yoast section, clicking Advanced, and changing “Allow search engines to show this Page in search results?” to No.
Meta Titles and Descriptions
Meta titles and descriptions are among the most important elements of SEO and the easiest for a small business owner to optimize.
These fields appear in search results as a user decides whether to visit your website.
As such, they need to really entice a user to click.
Although Google sometimes creates its own meta content for your pages (even if you write your own), it is certainly worth optimizing the meta fields of each of your pages.
Every page should have a custom meta title and description.
Start by making sure that you reach the requirements for each.
Meta titles should have up to 65 characters.
Meta descriptions should have up to 155 characters.
Going over these character counts will cause Google to truncate what is displayed.
Going significantly under them means you are missing out on the opportunity to clearly describe your page and its value.
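In your page’s HTML, these fields look like this (the business and copy below are made up for illustration):

```html
<head>
  <!-- Meta title: aim for up to 65 characters -->
  <title>Emergency Plumbing Repairs in Austin | Smith Plumbing</title>
  <!-- Meta description: aim for up to 155 characters -->
  <meta name="description" content="Smith Plumbing offers same-day emergency plumbing repairs across Austin. Licensed, insured, and available 24/7. Call now for a free quote.">
</head>
```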
Start your first audit by making sure all meta titles and descriptions meet these basic character count requirements. In subsequent audits, use data to try to improve them.
Want to know if your meta title and description could be improved for a particular page?
I like to use Google Search Console to see how that page performed in the last six months. If it has a low CTR (click-through rate), then you know it could use some improvement.
Using Google Search Console to Find Site Errors
Now that you have more information about areas of technical SEO, you might be wondering how else you can find website errors and properly resolve them.
We already discussed using Screaming Frog and creating an automated SEO audit.
Another way to find information about your website’s health is through Google Search Console.
The indexing area of GSC helps you see how Google views technical aspects of your website.
Essentially, you can see how well Google crawled your website and what issues it found.
If you block certain pages from indexing with your robots.txt file, they will appear here.
You can also discover 404 and soft 404 errors, pages with redirects, and other pages that haven’t been indexed, along with the reasons for these omissions.
Plus, you can submit your sitemap (as previously mentioned) and request the removal of pages that shouldn’t appear in search results.
Technical SEO Checklist
For easy reference, here’s a quick checklist of everything covered above: crawl rates, broken links, XML sitemaps, robots.txt, URL structure, duplicate content, 404 and 500 pages, 301 redirects, thin content, and meta titles and descriptions.
Conclusion
As we’ve seen, technical SEO is more than just updating broken links and reviewing search engine data.
It’s also about creating an excellent experience for users so they enjoy visiting your site and keep coming back.
By regularly monitoring and improving the technical features of your site, you’ll let Google and readers know that your website is a credible source of valuable information.