How to Find and Fix Common Technical SEO Issues

Think you can't tackle technical SEO issues in a day? Consider technical SEO the low-hanging fruit of your SEO endeavors: easy to identify and relatively quick to fix. And you don't need to be a "webmaster" to address these issues.

 
Check indexation

Is your site being indexed by Google? Indexation comes before rankings, always. Before you even think of driving any traffic to your site, search engines have to be able to find you.

This part is easy. Go to Google and search for { site:nameofyoursite.whatever }

If your site is being indexed, you'll see a list of your pages in the results:

Google site: search results

If not, you'll see something like this:

Site does not exist in Google's index

Some things to look out for:

  • Are you seeing approximately the number of pages you'd expect to be indexed?
  • Are you seeing any pages that should not be indexed?
  • Or do you notice pages missing from results that should be indexed? 

Make sure you're using Google Search Console to submit your site for indexing, or to correct any indexing issues.

 

Robots.txt

If your site isn't being indexed at all, check your robots.txt file. Your robots.txt file is a plain text file that search engine spiders read to learn which URLs on your site they are and aren't allowed to crawl. Consider it the rule book for crawling your site.

 
An example of a robots.txt file


Type { yoursite.com/robots.txt } into your browser's address bar. Everyone's file looks a little different, but what you don't want to see is:

User-agent: *
Disallow: /

If you see that, it’s a big red flag. Essentially your robots.txt file is telling search spiders not to crawl your site. You should let your web developer know so she can address the issue.

On the other hand, an empty Disallow line means you're not disallowing anything, so spiders can access all sections of your site.
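For illustration, here's a minimal sketch of a healthy robots.txt file. The /admin/ path is hypothetical; your disallowed sections (if any) will vary.

# Let all spiders crawl everything except /admin/
User-agent: *
Disallow: /admin/

# Point spiders at your sitemap (more on sitemaps below)
Sitemap: https://yoursite.com/sitemap.xml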

Important to note: making changes to your robots.txt file could have severe consequences for your site if you don’t know what you’re doing. It’s always best to let your developer work with this file.

 

Noindex

While robots.txt controls crawling, a 'noindex' directive controls indexing: you're telling Google not to include that page in its index, and pages that are already indexed will be dropped from it. If your site is in the development phase, 'noindex' prevents it from prematurely showing up in search results. This is fine and dandy.

<meta name="robots" content="noindex, follow">

This line tells search engine spiders not to index the page but to still follow the links on it.
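If a page isn't HTML at all (a PDF, for example), there's no head to put a meta tag in. In that case the same directive can be sent as an HTTP response header instead, assuming your host gives you that level of control (Squarespace doesn't):

X-Robots-Tag: noindex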

EXAMPLES OF WHEN TO USE NOINDEX:

  • Thank you pages: If you have URL goals in your analytics setup that lead to a ‘thank you’ page, “noindex” will help keep those pages from being included in SERPs.
  • Other non-important pages: Terms & Conditions and Privacy Policies probably don’t need to be indexed. But that's up to you!

 

Use canonical URLs

It's common for the same content to be accessible through multiple URLs: with and without www, over http and https, with and without a trailing slash, or with tracking parameters tacked on. Canonicalization is the process of designating the best URL when there are several related URL choices.


You can designate the canonical URL with a simple line in your page head.

<link rel="canonical" href="https://yoursite.com/your-page/" />
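So if the same post can be reached at several addresses, every variant should declare the same preferred URL. A hypothetical example:

https://yoursite.com/blog/my-post
https://yoursite.com/blog/my-post?utm_source=newsletter
https://www.yoursite.com/blog/my-post

<!-- all three pages carry the same tag in their head -->
<link rel="canonical" href="https://yoursite.com/blog/my-post" />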

 

URL Errors

Check Google Search Console for pages that are returning 404 responses. 

Click on Crawl --> Crawl Errors

If you've got any pages returning 404s, you'll see something like this:

URL Errors in Google Search Console


What to do

Has the page moved?
If so, set up a redirect to the current URL: a 301 if the move is permanent, a 302 if it's temporary.
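Squarespace handles this through its URL Mappings panel, but for illustration, on a self-hosted Apache site a redirect can be a single line in your .htaccess file (the paths here are hypothetical):

# Permanently send visitors and spiders from the old URL to the new one
Redirect 301 /old-page/ https://yoursite.com/new-page/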


Has the page been deleted?
You can ask Google to remove the page from its index or just let it drop out on its own.

In Search Console click on Google Index --> Remove URLs

The Remove URLs tool lets you remove the page from Google's cache, temporarily hide it from search results, or both.

Easy! 

 

SSL

SSL, or Secure Sockets Layer, encrypts the connection between your browser and the website you're visiting. To verify that SSL is protecting a page, check that the URL begins with https:// instead of http:// (you'll also see a green padlock icon). That encryption means information you submit can't be read in transit.
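If you're comfortable on the command line, one quick way to inspect a site's certificate is with curl; the verbose output includes the certificate's subject and expiry date:

curl -vI https://yoursite.com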

Straight from Google's Blog

Beginning in January 2017 (Chrome 56), we’ll mark HTTP pages that collect passwords or credit cards as non-secure, as part of a long-term plan to mark all HTTP sites as non-secure.

HTTPS is optional at this point, but Google is making it very clear that it's something they're prioritizing.

Set your Squarespace site to Secure in your settings

Squarespace offers free SSL certificates for all Squarespace sites, even if you use a third-party domain.

Navigate to Settings --> Security & SSL --> Secure

If you're using WordPress, you can purchase an SSL certificate from your hosting provider.

 

XML sitemaps

A Sitemap is an XML file that lists the URLs for a site. This provides a sort of blueprint to help search engines crawl the site more intelligently. 

Check to see if your site already has a sitemap by navigating to { yoursite.whatever/sitemap.xml }.
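If it exists, you'll see something like this stripped-down sketch (a real sitemap lists every URL you want crawled; the address and date below are placeholders):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://yoursite.com/</loc>
    <lastmod>2017-01-15</lastmod>
  </url>
</urlset>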

Also, check out this post about submitting your sitemap to Search Console.


Hopefully this post gives you a better understanding of how to find and fix technical SEO issues that could potentially harm your website. These are easy fixes that are hugely beneficial to your site.

Just because you don't have an SEO professional checking on your site's technical issues doesn't mean you can't be proactive and check on them yourself. And if you've never looked under the hood before, just remember that Google Search Console is your friend!