How to Find and Fix Common Technical SEO Issues

Think you can’t tackle technical SEO issues in a day? Consider technical SEO to be the low-hanging fruit of your SEO endeavors; easy to identify and relatively quick to fix. And you don’t need to be a “webmaster” in order to address these issues. 


Check indexation

Is your site being indexed by Google? Indexation comes before rankings, always. Before you even think about driving traffic to your site, search engines have to be able to find you. 

This part is easy. Go to Google and search for { site:nameofyoursite.whatever }

If your site is being indexed, you'll see a list of your pages in the results. If it's not, you'll see no results at all.

If your site isn't being indexed, that's a pretty big technical SEO issue, but it's also really easy to address.

Some things to look out for:

  • Are you seeing roughly the number of pages you'd expect to be indexed?
  • Are you seeing any pages that should not be indexed?
  • Or do you notice pages missing from results that should be indexed? 
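If you want to compare the two lists side by side, here's a quick sketch. It assumes you've copied the indexed URLs out of the site: results and the expected URLs out of your sitemap by hand; `index_gaps` is just an illustrative helper, not a Google API:

```python
def index_gaps(expected_urls, indexed_urls):
    """Compare the pages you expect to be indexed (e.g. from your
    sitemap) against the pages actually showing up in a site: search.
    This is only a set comparison on two hand-gathered lists."""
    expected, indexed = set(expected_urls), set(indexed_urls)
    return {
        "missing": sorted(expected - indexed),     # should be indexed but isn't
        "unexpected": sorted(indexed - expected),  # indexed but shouldn't be
    }

gaps = index_gaps(
    ["https://example.com/", "https://example.com/blog/"],
    ["https://example.com/", "https://example.com/thank-you/"],
)
print(gaps["missing"])     # ['https://example.com/blog/']
print(gaps["unexpected"])  # ['https://example.com/thank-you/']
```

Anything in "missing" needs crawl or indexation attention; anything in "unexpected" is a candidate for noindex (more on that below).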

Make sure you're using Google Search Console to submit your site for indexing, or to correct any indexing issues.

Once you've submitted a sitemap, Search Console will notify you of any indexation issues.



If your site isn’t being indexed at all, check your robots.txt file. Your robots.txt file is a plain text file that search engine spiders read to learn which URLs on your site they’re allowed, or not allowed, to crawl. Consider it the rule book for crawling your site.

An example of a robots.txt file


Navigate to { yoursite.whatever/robots.txt } in your browser. Everyone’s file looks a little different, but what you don’t want to see is:

Disallow: /

If you see that, it’s a big red flag. Essentially your robots.txt file is telling search spiders not to crawl the entirety of your site. That's probably one of the biggest technical SEO issues your site could have! You should let your web developer know so she can address the issue.

On the other hand, an empty Disallow line means you’re not disallowing anything, so spiders can access all sections of your site, which you may not want either.

Important to note: making changes to your robots.txt file could have severe consequences for your site if you don’t know what you’re doing. It’s always best to let your developer work with this file.
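If you'd rather check what a rule does before anyone touches the live file, Python's standard library ships a robots.txt parser. The rules below are made up for illustration:

```python
from urllib.robotparser import RobotFileParser

# Parse a robots.txt body directly. (You could also point the parser
# at your live file with rp.set_url(...) followed by rp.read().)
rp = RobotFileParser()
rp.parse("""\
User-agent: *
Disallow: /private/
""".splitlines())

print(rp.can_fetch("*", "https://example.com/private/page"))  # False
print(rp.can_fetch("*", "https://example.com/blog/post"))     # True
```

If `can_fetch` returns False for a page you want in search results, that rule is the problem.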

You can test your robots.txt file in Search Console

Click on Crawl --> robots.txt Tester


If you're unsure whether a particular page might be inaccessible to search engines, you can plug in the URL and test. If a page is missing from search results but doesn't fall under any of your robots rules, you may have set the page to noindex instead. 



Noindex directives

Unlike a robots.txt rule, a ‘noindex’ directive tells Google not to index a page at all, and pages carrying it will be removed from Google’s index. If your site is in the development phase, ‘noindex’ prevents it from prematurely showing up in search results. This is fine and dandy. 

<meta name="robots" content="noindex, follow">

This line tells search engine spiders not to index the page, but to still follow the links on it.


Some pages are good candidates for ‘noindex’:

  • Thank you pages: If you have URL goals in your analytics setup that lead to a ‘thank you’ page, “noindex” will help keep those pages from being included in SERPs.
  • Other non-important pages: Terms & Conditions and Privacy Policies probably don’t need to be indexed. But that's up to you!
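You can also spot-check a page's HTML for the directive yourself. A minimal sketch using only the standard library (`NoindexChecker` is just an illustrative name):

```python
from html.parser import HTMLParser

class NoindexChecker(HTMLParser):
    """Scan a page's HTML for <meta name="robots" content="...noindex...">."""
    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name", "").lower() == "robots":
            if "noindex" in (attrs.get("content") or "").lower():
                self.noindex = True

checker = NoindexChecker()
checker.feed('<head><meta name="robots" content="noindex, follow"></head>')
print(checker.noindex)  # True
```

Feed it the HTML of any page you're unsure about; True means that page is asking to be kept out of the index.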


Use canonical URLs

It's common for the same content to be accessible through multiple URLs: http vs. https, www vs. non-www, with and without a trailing slash, or with tracking parameters tacked on. Canonicalization is the process of designating the preferred URL when there are several related URL choices.

Google's own documentation includes a table of exactly these kinds of duplicate-URL examples. 

You can designate the canonical URL with a simple line in your page head (example.com here stands in for your own preferred URL):

<link rel="canonical" href="https://www.example.com/preferred-page/" />
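The same idea can be sketched in code. The preferred form chosen here (https, www, no trailing slash, no query string) is only an assumption; what matters is picking one form and mapping every duplicate variant onto it:

```python
from urllib.parse import urlsplit, urlunsplit

def canonicalize(url):
    """Sketch: collapse common duplicate variants into one preferred
    form (https, www, no trailing slash, query string dropped).
    Your preferred form may differ -- just be consistent."""
    parts = urlsplit(url)
    host = parts.netloc.lower()
    if not host.startswith("www."):
        host = "www." + host
    path = parts.path.rstrip("/") or "/"
    return urlunsplit(("https", host, path, "", ""))

# All of these collapse to https://www.example.com/shop
for u in ["http://example.com/shop",
          "https://www.example.com/shop/",
          "http://www.example.com/shop?ref=twitter"]:
    print(canonicalize(u))
```

That single collapsed URL is the one you'd put in the rel="canonical" tag on every variant.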


URL Errors

Check Google Search Console for pages that are returning 404 responses. 

Click on Crawl —> Crawl Errors

If you've got any pages returning 404s, you'll see something like this:

URL Errors in Google Search Console
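If you'd like to spot-check URLs outside of Search Console, here's a small standard-library sketch (`status_of` is an illustrative helper, and note that redirects are followed, so a redirected page reports its destination's status):

```python
from urllib.error import HTTPError
from urllib.request import Request, urlopen

def status_of(url):
    """Return the HTTP status code for a URL using a lightweight
    HEAD request. urlopen raises HTTPError for 4xx/5xx responses,
    so in that case the code is pulled from the exception."""
    try:
        return urlopen(Request(url, method="HEAD"), timeout=10).getcode()
    except HTTPError as err:
        return err.code

# e.g. status_of("https://yoursite.whatever/old-page")
```

Run it over a list of your important URLs and anything returning 404 goes on the fix-it list below.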

What to do

Has the page moved?
If so, set up a 301 (permanent) redirect to the new URL, or a 302 if the move is only temporary.

Has the page been deleted?
You can ask Google to remove the page from its index, or just let it drop out on its own.

In Search Console click on Google Index —> Remove URLs

The removal tool lets you remove the page from Google's cache, temporarily hide it from search results, or both.

404s aren't necessarily a bad thing either. It's a good idea to enable a custom 404 page that will route users to other areas of your site.


Instead of driving users away, I've made it easier for them to stick around.



SSL

SSL, or Secure Sockets Layer, secures the connection between your browser and the website you're visiting. To verify that SSL is protecting a page, check that the URL begins with https:// instead of http:// (you'll also see a green padlock icon). This means any information you submit travels over an encrypted connection.

Straight from Google's Blog

Beginning in January 2017 (Chrome 56), we’ll mark HTTP pages that collect passwords or credit cards as non-secure, as part of a long-term plan to mark all HTTP sites as non-secure.

Emphasis mine. It's optional at this point but Google is making it very clear that this is something they're prioritizing.

Set your Squarespace site to Secure in your settings

Squarespace offers free SSL certificates for all Squarespace sites, even if you use a third party domain.

Navigate to Settings --> Security & SSL --> Secure





If you're using WordPress, you can purchase an SSL certificate from your hosting provider.
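One gotcha after switching to HTTPS: assets still loaded over plain http:// will break the green padlock. Here's a rough standard-library sketch that flags them; it only looks at src and href attributes, so treat it as a starting point rather than a complete audit:

```python
from html.parser import HTMLParser

class MixedContentChecker(HTMLParser):
    """Collect (tag, url) pairs for any src/href attribute that
    still points at a plain http:// address."""
    def __init__(self):
        super().__init__()
        self.insecure = []

    def handle_starttag(self, tag, attrs):
        for name, value in attrs:
            if name in ("src", "href") and value and value.startswith("http://"):
                self.insecure.append((tag, value))

checker = MixedContentChecker()
checker.feed('<img src="http://example.com/logo.png">'
             '<script src="https://example.com/app.js"></script>')
print(checker.insecure)  # [('img', 'http://example.com/logo.png')]
```

Anything in the list should be updated to https:// (or a protocol-relative/relative URL) once your certificate is live.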


XML sitemaps

A sitemap is an XML file that lists the URLs for a site, providing a sort of blueprint that helps search engines crawl the site more intelligently. 

Check to see if your site already has a sitemap by navigating to { yoursite.whatever/sitemap.xml }.
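If you want to sanity-check what's in the file, here's a small sketch that pulls every URL out of a standard sitemap using only the standard library:

```python
import xml.etree.ElementTree as ET

# The sitemap protocol's standard XML namespace.
SITEMAP_NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def sitemap_urls(xml_text):
    """Pull every <loc> URL out of a standard sitemap.xml body."""
    root = ET.fromstring(xml_text)
    return [loc.text for loc in root.findall("sm:url/sm:loc", SITEMAP_NS)]

sample = """<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/blog/</loc></url>
</urlset>"""
print(sitemap_urls(sample))  # ['https://example.com/', 'https://example.com/blog/']
```

The resulting list is exactly the "expected to be indexed" list from the indexation check earlier.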

Also, check out this post about submitting your sitemap to Search Console.

Hopefully this post gives you a better understanding of how to find and fix technical SEO issues that could potentially harm your website. These are easy fixes that are hugely beneficial to your site.

Just because you don’t have an SEO professional checking on your site's technical issues doesn't mean you can't be proactive and check on them yourself. And if you've never looked under the hood before, just remember that Google Search Console is your friend!