There is no need to convince you that an SEO audit is important. You already know that. But it still gives you shivers sometimes, right? 

Today, I invite you to break the daunting SEO audit down so that it becomes a no-brainer.

Here is a list of all the things you should check on a site to avoid sudden ranking drops. Follow it and search engines will never pick on your site.

You can also download a PDF cheat sheet for quick reference whenever you need it.

1. Domain 

– Affected aspects: Indexing, user experience, brand reputation, security, rankings –

Let’s start with a domain name audit, as your domain name underpins your entire online presence. You need to make sure your audience can easily remember your website’s address and find it later on.

Domain history

When choosing a domain name, you may have to buy an existing one. Before the purchase, make sure to trace the history of your future domain. It may turn out that this domain was involved in spammy activity, was penalized before, or had a bad reputation among users. This may influence your future site rankings.

To see how the site looked before and what content it delivered, use the Internet Archive’s Wayback Machine or any of its alternatives. With it, you can check whether the site was full of spammy or, on the contrary, quality content, and whether it was redirected to inappropriate sites.

Besides, you can use a WhoIs service to check who the previous owner(s) of the domain were.

Multiple versions of your site

This step requires WebSite Auditor. You can download it now for free. Download WebSite Auditor

If several versions of your site coexist at the same time, make sure users and search engines get access to only one of them.

First, there may be www and non-www versions:

  • https://yourdomain.com
  • https://www.yourdomain.com

Second, there may be HTTP and HTTPS versions:

  • http://yourdomain.com and https://yourdomain.com
  • http://www.yourdomain.com and https://www.yourdomain.com

Each of these versions is considered a separate site. Hence, only one of them should be made a master version. Otherwise, search engines will index several sites, which in turn will cause duplicate content issues. This may eventually affect your site rankings. 

To see if that’s the case on your website, check the Redirects report in WebSite Auditor. For that, go to Site Structure > Site Audit > Redirects and check out the first two factors:

spotting multiple versions of a website in WebSite Auditor
Download WebSite Auditor

If there are any issues, these two factors will be marked with a red Error icon.

Note: The multiple versions issue can be prevented in the following ways (see the sample .htaccess rules below):

  • Add all versions of your site as properties in Google Search Console (GSC) so you can monitor each of them.
  • For www – non-www versions: set up a 301 redirect to the master version, for example via the .htaccess file.
  • For HTTP – HTTPS versions: set up a redirect or a canonical tag pointing to the HTTPS version.
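
For illustration, here is a minimal sketch of such redirect rules in an Apache .htaccess file (yourdomain.com is a placeholder, and the exact rules depend on your server setup):

  RewriteEngine On
  # send all HTTP requests to HTTPS
  RewriteCond %{HTTPS} off
  RewriteRule ^(.*)$ https://yourdomain.com/$1 [L,R=301]
  # send the www version to the non-www master version
  RewriteCond %{HTTP_HOST} ^www\.yourdomain\.com$ [NC]
  RewriteRule ^(.*)$ https://yourdomain.com/$1 [L,R=301]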

Typosquatting possibility 

When you have an established name in your niche (big or small), fraudsters may try to exploit it. How?

When users make a typo (it may be a misspelling, another TLD, hyphenated spelling, etc.) while typing in your domain name, they may accidentally end up on an alternative website set up by some cyber criminals. As a result, your business may suffer losses due to traffic redirection. Besides, your reputation may be hurt if your name becomes associated with some malicious practices. 

To avoid such cases, monitor your site for typosquatting threats with tools like UpGuard and dnstwist.
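
For example, a quick command-line check with dnstwist could look like this (assuming the tool is installed; treat this as a sketch rather than a complete monitoring setup):

  # list look-alike permutations of your domain and check which of them resolve
  dnstwist yourdomain.com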

Note: You can also register all the possible domain name variations and redirect them to the correct version. We do exactly that with our own site: if you type in https://linkassistant.com, you will end up on https://link-assistant.com anyway.

2. Site structure

– Affected aspects: Crawlability and indexing, user experience, rankings, revenue –

If your site is larger than a couple of pages and is growing bigger, revising its structure once in a while becomes a must. 

Content taxonomy and hierarchy

Make sure the content on your site is organized in a way that satisfies both users and search engines. Your task here is to check whether the relationships between pages on the site (its taxonomy) are logical.

Thus, make sure there are: 

  • top-level pages – the most strategically important, largest pages, which target broad terms,
  • lower-level pages or categories that support the top-level pages and target more specific, long-tail keywords.

content taxonomy

You can only audit and revise your site taxonomy manually. However, base your decisions on data from Google Analytics (the Behavior Flow report, Pages per Session, Bounce Rate) and on keyword research.

Click depth 

Watch out for your important pages’ click depth. From search engines’ perspective, the placement of pages in a site structure is a signal of importance. Thus, a page that’s buried too deep in a site structure tree gets less weight (unless it’s got a lot of backlinks). 

To check your important pages’ click depth, move to the WebSite Auditor’s Pages report (the same Site Structure module). Find your top pages (sort pages by Organic Traffic or any other important metric) and check out their click depth.

checking click depth in WebSite Auditor
Download WebSite Auditor

If you find a strategically important page with a click depth of 4 or more, consider re-organizing your site structure so that this page gets closer to the homepage.

3. Internal linking

– Affected aspects: Crawling and indexing, PageRank distribution, rankings –

Internal linking is deeply connected to site structure and greatly depends on it. However, there are a few more things to consider beyond structure alone.

Some case studies clearly show that the number of internal links pointing to a page correlates with the traffic it gets. Thus, the more internal links, the better – but only up to the point where linking starts to look unnatural and spammy.

Moreover, links stand out. If there are too many blue links and colored buttons, the content becomes a bit stuffy and indigestible. We’ve got enough visual noise already. 

On top of that, too many internal links coming from a page dilute its own link juice. So, you need to check the following:

  • Number of internal links from and to a page
  • Number of unique internal links per page

To do that, in WebSite Auditor, go to Site structure > Pages > Links and technical factors. Here, check the Links to Page and Links from Page columns:

checking internal linking in WebSite Auditor
Download WebSite Auditor

A pro tip: Distribute internal links in such a way as to pass PageRank from the most weighty pages to the ones that don’t rank that well. This way, you will balance the PageRank of your pages and improve the overall site rankings.

Anchor text

Now check the context your links are surrounded by. John Mueller said anchor text was important for ranking. So, here is what you should check to make sure your anchor texts work for your site and do not spoil everything:

  • Type and variations. Ideally, your anchor texts should vary. Check your links for naked, random, generic, exact-match, and repetitive anchor texts – the share of such anchors should be kept to a minimum.
  • Length. Longer, descriptive anchor texts give search engines more information about a page, which potentially helps the linked page rank higher.

To find this information, go to WebSite Auditor > Site Structure > Pages > Links & technical factors:

checking anchor text in WebSite Auditor
Download WebSite Auditor

Click on any page and below you will find all its anchor texts.

Broken links

Broken links are hard to spot without a special audit. The reasons why broken links may appear:

  • You misspelled the URL in the <a> tag 
  • You changed the URL of a linked page
  • The page you link to doesn’t exist or was deleted (returns 404 Not Found).

To find broken links, check the corresponding column in the same Pages report in WebSite Auditor.

finding broken links in WebSite Auditor
Download WebSite Auditor

Orphan pages

Finally, you need to find out if there are orphan pages on the site – the ones that are not linked to. Such pages may not appear in the Google Index. 

The easiest way is to check via the Visualization tool in WebSite Auditor – orphan pages will be marked in gray.

finding orphan pages in Site Visualization tool
Download WebSite Auditor

Navigation

Navigation is also about internal linking. If set up right, it allows users to easily find the content they need on your site. When you check the navigation on your site, pay attention to the following things:

  • Navigation menus. Check if your header and footer are composed strategically: all selling pages should be placed in the header, while less important though potentially useful links belong in the footer.
  • Faceted navigation and pagination. If you run a large ecommerce site, most likely you have faceted navigation and pagination implemented. And it’s so easy to foul your site up while doing it. Here, check your rel=canonical, noindex tags, and robots.txt file. 
  • Breadcrumbs. You can implement any type of breadcrumbs – hierarchical or dynamic – but make sure you do it correctly. Breadcrumbs should be located at the top of the page, and their Schema markup should be valid (see the sample markup below). Moreover, there should be no breadcrumbs on the homepage (otherwise, it’s a link to itself).
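
For reference, valid hierarchical breadcrumb markup in JSON-LD might look like the sketch below (names and URLs are placeholders):

  <script type="application/ld+json">
  {
    "@context": "https://schema.org",
    "@type": "BreadcrumbList",
    "itemListElement": [
      {
        "@type": "ListItem",
        "position": 1,
        "name": "SEO Tools",
        "item": "https://yourdomain.com/seo-tools/"
      },
      {
        "@type": "ListItem",
        "position": 2,
        "name": "Website Audit",
        "item": "https://yourdomain.com/seo-tools/website-audit/"
      }
    ]
  }
  </script>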

4. URL structure

– Affected aspects: User experience, rankings –

Your URL structure is mostly determined by your site structure. However, there are still things you should watch out for.

URLs’ length

Super long URLs are not user-friendly and may result in a poor user experience. Compare:

  • Example 1: https://www.searchenginejournal.com/category/digital/strategy-digital/what-is-a-target-audience-and-how-do-you-find-it/467926/ 
  • Example 2: https://www.searchenginejournal.com/digital/target-audience/

You can check for lengthy URLs in WebSite Auditor’s Site Audit report. Find the URLs section > Too Long URLs:

finding too long URLs in WebSite Auditor
Download WebSite Auditor

If you see an Error or Warning, consider whether you could shorten the URLs.

Note: Always choose the simplest URLs possible so that even if your page is located far from the homepage, its URL doesn’t look like example 1.

Dynamic URLs 

URLs that contain special symbols (?, _, &) and various parameters are considered neither user- nor SEO-friendly. They may be hard to perceive and can cause duplicate content issues. However, such URLs are inevitable if you have faceted navigation and/or pagination, or when you need to track session IDs and website traffic, for example.

To check your site for dynamic URLs, go to WebSite Auditor’s Site Audit report and, in the same URLs section, find Dynamic URLs:

finding dynamic URLs in WebSite Auditor
Download WebSite Auditor

Again, if the issue urgently needs a fix, it will be marked as Error. If it's an orange Warning, consider tackling the issue in the near future too.

5. Content

– Affected aspects: Rankings, brand reputation –

Now let’s check your content for issues that may be stopping you from reaching the top of the SERPs. 

Amount of content

The amount of content should be checked to spot thin content – pages with so little content that they provide no value for users.

To prevent that from happening, you first need to check the number of words on each page. In the same Pages report of WebSite Auditor, check the Word Count column.

checking word count in WebSite Auditor
Download WebSite Auditor

To streamline the process, filter the list down to pages with fewer than, say, 300 words (300 is a ballpark figure that depends on your site’s niche).

Then look through all the pages you just filtered and check content on each of them. Are these words enough to convey a message and help users? If yes, leave the page as it is. If not, consider updating it with more quality content. 

Performance 

Sometimes, content may underperform, or its rankings may decline with time. You need to regularly check your site for such low-hanging fruit and optimize these pages so that resources aren’t wasted.

You can find such pages in Google Search Console. Go to the Performance report > Pages, click on the Filter icon and set Positions > Greater than > 11. Thus you will find pages that don't perform well enough to appear on the first SERP.

finding underperforming pages in GSC

Once detected, make sure to optimize this content further – elaborate on the topic more, add keywords, revise technical SEO aspects like meta titles and descriptions, H1–H6 tags, etc. To dive deeper into the topic, read our 8-Step Guide for Full Website Content Audit.

Note: Use WebSite Auditor’s Content Editor to make sure you optimize your content enough to beat your SERP competitors. 

Lacking content 

This step requires Rank Tracker. You can download it now for free. Download Rank Tracker

I bet there are a couple of topics that your competitors have covered and you missed. We need to close that gap.

To spot topics you haven’t covered yet, use Rank Tracker’s Content Gap Analysis tool. Go to Keyword Research > Keyword Gap, and add your main competitors:

doing Keyword Gap analysis in Rank Tracker
Download Rank Tracker

You will get a list of keywords to create and optimize your future content for.

Outdated content

Some information tends to become irrelevant or inaccurate over time. Not taking care of it may harm your reputation and rankings, as users are unlikely to return to websites with outdated content.

There is no automation tool that can spot such content for you. However, you can use filters in WebSite Auditor to find all the articles that mention past years in their titles.

finding outdated content in WebSite Auditor
Download WebSite Auditor

Note: If you make changes to your application or service, it’s also worth updating your manuals and visuals accordingly so that you do not confuse newcomers. 

E-A-T signals 

According to recent news, E-A-T applies to every query and search result – not only to Your Money or Your Life sites (though for those, the requirements are stricter).

To send these E-A-T signals to search engines, your site needs specific content. Here we talk about author bio pages with links to social media profiles. Besides, don’t forget to update your About Us and Contact Us pages with all the necessary information about your company.

Moreover, make sure you don’t mix YMYL and non-YMYL content on one site. Google says this may confuse its systems when ranking a page.

Unfortunately, there is no way to track E-A-T issues automatically. However, you can track your Domain Strength in Rank Tracker and compare it to your competitors’. It will give you a rough estimate of how authoritative your site looks to search engines.

checking domain strength in Rank Tracker
Download Rank Tracker

Interstitials

Pop-ups are important for marketing. Plus, there are pop-ups you must put on your site (like cookie consent). However, if they overlap your content and make it less accessible, they result in a poor user experience. If they become intrusive, search engines may notice, and your rankings may be lowered.

Check the sizes of your pop-ups so that they don’t cover too much content, especially on mobile devices, where screens are super small.

You should additionally check the Page Experience report in Google Search Console to make sure that you optimized your pop-ups for Web Vitals correctly.

Scraped content, aka external duplicates

It sometimes happens that disreputable sites steal your content and publish it under their own names. And it is unlikely that Google will penalize them for scraped content. As a result, your page will compete with the copy for the same keywords, and that’s where you may feel the consequences.

To check your site’s content for external duplicates, you can use plagiarism-detection services like Copyscape. Or, you can use social listening tools like Awario – specify a certain phrase from your content as a target and track all the exact matches on the web.

If you find out that your content has been stolen, contact the offenders with a cease-and-desist request.

6. Images

– Affected aspects: User experience, site speed, rankings –

Images are just as important as text in terms of content (for ecommerce, sometimes even more important). They are more visible, and technically, they make up the largest part of a page’s weight.

Below, I’ll walk you through the most important aspects of image optimization. But if you want to learn all the details of image SEO, read our Image SEO Optimization.

Format and size

You need to make sure your images are in the correct format: PNG, JPEG, WebP, or AVIF.

As for image size, I’d like to say the smaller the better, but it’s not quite true. Opt for the smallest size that doesn’t damage image quality or make the objects in it indistinguishable. Consider compressing your images before uploading them to your site – most image compressors will let you significantly reduce the size of an image without compromising its quality.

You can quickly check the format and size of your images in WebSite Auditor. Go to Site structure > All resources > Images and quickly scan the list:

image size in WebSite Auditor
Download WebSite Auditor

Use filters to speed up the search for under-optimized images – for example, filter for images larger than 200 KB.

Name and file structure 

Just like URL structure, image file structure is important. First, the image name and its URL path help search engines better understand what your image is about. Second, image names are part of the user experience – when users save any of your images, it’s great if the files are correctly named and not confusing. Compare:

  • Example 1: www.link-assistant.com/images/seo-tools/1548308.jpg
  • Example 2: www.link-assistant.com/images/seo-tools/seo-tools-chart.jpg

However, don’t rush to change your image file names if you spot some not-very-optimized URLs. Google says it may take months to pick up new image URLs, since images are crawled less often than pages. Therefore, rather than renaming hundreds of images already uploaded to your site, take note of all the above for the future images on your site.

Alt text 

Alt text also hints to search engines what an image is about. In fact, it's an even stronger signal than the image name and file structure. So, you should check your pages for empty or badly written alt tags.
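
For example, the difference between an empty and a descriptive alt attribute looks like this (the file name and description are placeholders):

  <!-- empty alt text tells search engines nothing about the image -->
  <img src="/images/seo-tools/seo-tools-chart.jpg" alt="">

  <!-- descriptive alt text explains what the image shows -->
  <img src="/images/seo-tools/seo-tools-chart.jpg" alt="Chart comparing organic traffic across five SEO tools">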

You can check all your site's alt texts in WebSite Auditor > Site Structure > Site Audit > Images > Empty Alt Text.

checking out alt text in WebSite Auditor
Download WebSite Auditor

Broken images

An image is considered broken if:

  • it returns a 4xx or 5xx status code, 
  • the image URL is not specified in the <img> tag, 
  • its URL leads to non-image content, 
  • a DNS error is detected.

In these cases, users see such images as follows:

broken image

To detect broken images on a site, move to the Broken images factor.

finding broken images in WebSite Auditor
Download WebSite Auditor

7. Backlinks

This step requires SEO SpyGlass. You can download it now for free. Download SEO SpyGlass

– Affected aspects: Rankings, brand awareness –

Backlinks are one of the most important ranking factors. They make up your site’s authority and directly affect your rankings.

Number and progress

First, you need to check the overall number of your backlinks and track the progress you’ve made over time. 

Quickly check that out in SEO SpyGlass > Backlink Profile > Summary.

backlink profile
Download SEO SpyGlass

It’s also worth comparing your figures to your competition's. It will give you an understanding of your place in a competitive landscape. To get that done, in SEO SpyGlass, go to Domain Comparison > Summary. 

backlink comparison in SEO SpyGlass
Download SEO SpyGlass

Note: You can also check your competitors’ linking domains to spot some backlink prospects in the Link Intersection module. 

Quality 

The quality of backlinks plays a significant role in SEO. If you get primarily low-quality backlinks, it may do more harm than good.  

That’s why you need to check how authoritative the sites linking to you are. First, open SEO SpyGlass and go to Backlink Profile > Backlinks. There, check both the Domain InLink Rank and the InLink Rank of the specific page linking to your website.

checking inLink rank in SEO SpyGlass
Download SEO SpyGlass

If you see too many links marked in red, consider disavowing them in order to avoid Google penalties. 

Additionally, check the Penalty Risk each linking domain and page brings up. For that, go to Backlink Profile > Penalty Risk. There might be a number of reasons why a site/page presents a high penalty risk. You can see the exact reason by clicking the ⓘ icon near each penalty score. 

checking penalty risks
Download SEO SpyGlass

Anchor texts 

Anchor texts are also a signal of relevance – they provide more context for search engines and users as to what a linked page is about. That's why it’s important to keep track of your anchor texts. 

The thing is, if there are too many irrelevant or overly generic anchor texts, it may look like spam activity. Such links may be considered low quality and won’t do your site any good.

Check your domain’s anchor texts in SEO SpyGlass > Backlink Profile > Anchor Texts:

checking anchor texts in SEO SpyGlass
Download SEO SpyGlass

Unusual spikes

From time to time, check your backlink growth for unusual spikes. Rapid backlink growth may indicate that your site has undergone a negative SEO attack: for example, competitors can deliberately point a bunch of spammy links at your site to push you down in the search results.

Of course, it won’t necessarily work out for them (Google may simply ignore such spam links). Still, it’s worth checking periodically.

For that, go to SEO SpyGlass > Historical Data. Find Backlinks, set the needed date range, and see how your backlinks grew. You may notice unusual spikes with a day’s precision.

checking backlinks historical data
Download SEO SpyGlass

8. Localization

– Affected aspects: Rankings, user experience, revenue –

If your business operates in several markets and has an international website, you might have already implemented site localization to target different audiences. If so, you need to audit your localization implementation.

Hreflang implementation

The rel=“alternate” hreflang setup is required for localization to show the relations between language versions of pages. If it’s implemented incorrectly, that may cause a number of issues like duplicate content, de-ranking, and others.

Here are the things to audit:

  • Return or self-links. Each language version of a page should have hreflang attributes that point to the page itself and also to other language/region versions of the page. 
  • X-default values for unmatched languages. X-default tells search engines which page version they should use for languages and regions that have not been defined through your hreflang attributes. It is not necessary but advisable to use. 
  • Language or region codes. They should be set correctly – language codes follow ISO 639-1, and region codes follow ISO 3166-1 Alpha 2 (see the sample markup below). Remember that specifying a region alone is not valid.
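
As a quick reference, a correct hreflang block on the English version of a page could look like the sketch below (URLs are placeholders; every language version, including the English one itself, should carry the same set of links):

  <link rel="alternate" hreflang="en" href="https://yourdomain.com/page/" />
  <link rel="alternate" hreflang="de-DE" href="https://yourdomain.com/de/page/" />
  <link rel="alternate" hreflang="x-default" href="https://yourdomain.com/page/" />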

To check your site for localization issues, proceed to the Localization report in WebSite Auditor > Site Audit:

return links for localization
Download WebSite Auditor

Go through each point to make sure your international SEO works like a charm. 

Correctly localized and optimized content 

Correct localization isn’t limited to technical aspects only. Cultural peculiarities should also be considered: content created for individualistic cultures will most likely not be understood or correctly perceived in collectivist cultures.

Even design and layout should be localized. For example, a left-to-right layout won’t work for Arabic audiences, who read right to left.

Besides, people in different countries have different search habits. They may use different search phrases and different search engines. Yes, in some countries Google is not the default choice: in China, it’s mostly Baidu, and in Korea – Naver.

So, make sure you revise your content optimization strategy as well. Do that with Rank Tracker as it shows your rankings as if searched from a specific location. You can specify the preferred search engines and then add a preferred location to track rankings more precisely. 

9. Redirects

– Affected aspects: Crawlability and indexing, user experience –

Redirects may affect both the indexation of your pages and user experience. That’s why they deserve due attention.

Types

The most common mistake even professionals make is mixing up the 301 and 302 redirect types. It mostly happens because, in many setups, a 302 is applied by default until you explicitly specify a 301. A 301 redirect is permanent, while a 302 is temporary.

With a 301 redirect, search engines stop indexing the old URL and pass some of its link juice to the new destination – which is what you want when a page has moved for good.

Conversely, if you use a 302, search engines may continue to index the old URL and consider the new one a duplicate, dividing the link juice between the two versions. That may hurt your pages' rankings.

Additionally, it’s worth auditing your client-side redirects. Ideally, there shouldn’t be any. But if there are some meta refresh or JavaScript redirects, make sure you implement them correctly. 
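
For illustration, here is a minimal sketch of a single permanent redirect in an Apache .htaccess file, which is usually preferable to a client-side meta refresh (paths and the domain are placeholders):

  # permanent (301) redirect for a page that has moved for good
  Redirect 301 /old-page/ https://yourdomain.com/new-page/

  # a client-side meta refresh like the one below is better replaced with the rule above:
  # <meta http-equiv="refresh" content="0; url=https://yourdomain.com/new-page/">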

To detect any redirect issues, go to the corresponding section in WebSite Auditor > Site Structure > Site Audit and check out the following:

auditing redirects in WebSite Auditor
Download WebSite Auditor

Everything that is marked in red should be fixed as soon as possible.

Number

Any redirect is a certain burden for your site. An excessive number of redirects may hurt your site speed, and if your pages are heavily interlinked, it may significantly complicate crawling and indexing.

To check how many pages on your site are redirected, go to WebSite Auditor, find Site Structure > Pages and apply filters for HTTP Status Code to be = 302 or 301. 

auditing redirects
Download WebSite Auditor

To quickly check whether the number of redirects affects your site speed, proceed to Site Structure > Site Audit > Page Speed and find Avoid multiple page redirects. Alternatively, you can find the same information in Google Search Console (see Experience > Core Web Vitals).

Chains and loops

If page 1 redirects to page 2, which in turn redirects to page 3, and so on, you have a redirect chain. And if a redirect chain ends up back at the initial URL, you have a redirect loop.

As a rule, redirect chains and loops are created accidentally and are just a waste of resources (link juice, crawl budget, page speed). 

Fortunately, they can be easily detected by WebSite Auditor in the same Redirects report:

spotting redirect chains
Download WebSite Auditor

Note: If the issue occurs, redirect the initial page straight to the final destination, bypassing all the intermediate “hops”. And if there is a loop, just remove all the redirects.

10. HTTPS

– Affected aspects: Security, user experience –

Security is a ranking factor and should be maintained no matter what. Here are the basic things that should be audited first-hand. 

Active SSL certificate

You may purchase an SSL certificate once, but make sure it gets renewed on time. Otherwise, your users will see a warning like this one:

your connection is not secure message

You can use any SSL checker to see if your certificate is fine. Also, don’t forget to set up notifications about the upcoming renewals. 

To check if there are any HTTPS issues on your website, go to Google Search Console > Experience report > HTTPS. 

https report

Mixed content 

Besides your site not being served over HTTPS at all, there can be another issue – mixed content, when HTTP and HTTPS resources meet on one page. It weakens the security of the whole page.
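
For example, mixed content typically looks like this (the image URL is a placeholder), and the fix is simply requesting the resource over HTTPS:

  <!-- mixed content: an HTTP resource loaded on an HTTPS page -->
  <img src="http://yourdomain.com/images/logo.png" alt="Company logo">

  <!-- fixed: the same resource requested over HTTPS -->
  <img src="https://yourdomain.com/images/logo.png" alt="Company logo">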

To check your website for this type of issue, use WebSite Auditor. Go to Site Structure > Site Audit > Encoding and technical factors > HTTPS pages with mixed content issues:

finding mixed content issues
Download WebSite Auditor

11. Core Web Vitals

– Affected aspects: Site speed, user experience, rankings –

Core Web Vitals are not only about speed, as many think – they are about the overall user experience: how fast pages load, and how responsive and visually stable they are.

Largest Contentful Paint 

LCP reflects the render time of the largest image or text block visible within the viewport, relative to when the page first started loading. In plain words, this metric shows how long it takes for the main content to appear.

Ideally, your LCP should be less than 2.5 seconds. The time greatly depends on your:

  • Images
  • Image tags
  • Video thumbnails
  • Background images with CSS
  • Text elements

You can check the LCP metric for each page in WebSite Auditor: from Site Structure, move to the Pages report > Page Speed:

LCP
Download WebSite Auditor

You will see the list of all your pages and their LCP scores. Those requiring improvement will be marked in red. 

First Input Delay 

The responsiveness of your pages is measured with FID. Basically, this metric reflects the time between a user’s first interaction with your site (a click on a button or link) while it is loading and the moment the browser is actually able to respond to it.

Ideally, it should be 100 ms or less. Things that may worsen FID:

  • JavaScript
  • Unnecessary scripts
  • Images 
  • External fonts and excessive CSS 

Again, check FID for each page in WebSite Auditor. In the same workspace (Site Structure > Pages > Page Speed) find the First Input Delay column:

FID
Download WebSite Auditor

Cumulative Layout Shift 

CLS measures every unexpected layout shift that occurs during the entire lifespan of a page. Such a layout shift happens when a visible element changes its start position.

What may worsen CLS: 

  • Images and embeds without specified dimensions 
  • Ads, embeds, and iFrames without specified dimensions
  • Dynamic content 
  • Web fonts causing flash of invisible text or unstyled text.

To check CLS for your pages, look at the column of the same name:

CLS
Download WebSite Auditor

Note: You can check Core Web Vitals for your whole site in Google Search Console. For that, go to the Experience report > Core Web Vitals. You will immediately see how many pages need better optimization:

CWV report in GSC

Alternatively, you can use WebSite Auditor to check your Core Web Vitals in bulk. From the Site Structure module, go to Site Audit > Page Speed. You will get not only the list of pages that do not pass CWV assessment but also a list of recommendations that will help you improve these metrics. 

page speed audit
Download WebSite Auditor

I also recommend reading our case study How We Improved Core Web Vitals & What Correlations We Found to learn from SEO PowerSuite’s own experience.

12. Mobile-friendliness

– Affected aspects: User experience, rankings –

Mobile friendliness is the gold standard for a quality website. However, I suggest you first check your mobile traffic and other metrics (like Bounce Rate and Conversions) to understand how much mobile traffic you get and whether you meet the needs of mobile users. 

Note that even if you don’t have so many mobile users, working on the mobile-friendliness of your website is crucial anyway. Since Google sticks to mobile-first indexing, it’s the mobile version of your site that Google sees and on the basis of which it makes ranking decisions. 

You can check your mobile traffic in Google Analytics > Audience > Mobile > Overview:

mobile analytics in GA

Then check the technical aspects of mobile-friendliness.

Responsive design, dynamic serving, separate mobile version

Whatever option you’ve chosen, make sure you audit the corresponding aspects:

  • If you use responsive design, check out your viewport meta tag in the page header. It should be set like this: <meta name="viewport" content="width=device-width, initial-scale=1.0">. This way, your page will display correctly on any device.
  • If you use dynamic serving, check your vary HTTP header (Vary: User-Agent). Thus, you tell search engines that different content will be served on desktops and mobile devices. 
  • If you have a separate mobile version, check the use of the link rel=alternate tag. It’s used to indicate the relation between the desktop and mobile versions of your website to search engines (see the example below).
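
For the separate mobile version scenario, the cross-annotations typically look like the sketch below (m.yourdomain.com is a placeholder for your mobile subdomain):

  <!-- on the desktop page: -->
  <link rel="alternate" media="only screen and (max-width: 640px)" href="https://m.yourdomain.com/page/">

  <!-- on the corresponding mobile page: -->
  <link rel="canonical" href="https://yourdomain.com/page/">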

Note: From an SEO perspective, responsive design is the preferred option, so choose it over the others whenever possible.

Readability and touchpoints 

This is something that is easy to check with your eyes using Device Mode in your browser. Pay attention to how the following things look on different devices:

  • Element sizes (text, images, icons) are large enough to be readable
  • Touch elements are not placed too close to each other

You can check out your mobile usability for any issues in Google Search Console > Experience > the Mobile Usability report. If something is wrong, there will be details:

mobile usability report in GSC


Alternatively, you can check each page separately with Google’s Mobile-Friendly Test.

13. Code and script

– Affected aspects: Crawlability and indexing, site speed, rankings –

Now, let’s audit your code and scripts, as these are technical factors that may severely impact your SEO.

Unnecessary script code

You need to keep your code clean and neat. If there are unnecessary elements (for example, multiple head elements), they may slow down page speed.

What you need is to minify your source code. For that, first detect unused JavaScript and CSS code.

Once detected, remove this unused code to speed up your page load.

Analytics tags 

You can’t do any SEO or marketing without analytics tools. If something is wrong with them, tracking becomes impossible, or even worse, your data can end up in the hands of third parties.

So you need to study your source code for the analytics snippets and make sure they are set up correctly. 

Rel canonical

If you have similar content on several pages of your site, search engines won’t understand which page to rank until you tell them. That’s why you need the rel=”canonical” element in your page markup – it tells search engines which version should be given preference.

There may be broken canonical links or multiple canonical URLs.
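
As a reminder, the canonical element is a single line placed in the <head> of every duplicate or parameterized version of a page (the URL is a placeholder):

  <link rel="canonical" href="https://yourdomain.com/category/page/">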

To detect if there are any, go to Site Structure > Pages and add the needed columns in WebSite Auditor’s workspace – Rel Canonical and Multiple Rel Canonical URLs:

adding columns in WebSite Auditor
Download WebSite Auditor

Meta titles and descriptions

Titles and descriptions should not only be present on every page, be descriptive, and contain a keyword. There should also be no duplicates among them. Besides, both titles and descriptions should be of optimal length.

You can spot issues with your meta titles and descriptions in WebSite Auditor > Site Audit > On-Page:

finding duplicate titles and descriptions
Download WebSite Auditor

H1-H6 tags

H1-H6 tags inform search engines what your page is about and how it is structured. Plus, they help users navigate the page. Examine your H tags and make sure (a short example follows the list):

  • They are consistent and hierarchical. As a rule, H2s are main points, H3s are sub-points, and so on. It’s not advisable to structure content deeper than H4 – otherwise, it becomes too difficult for users to comprehend.
    h1-h6 hierarchy

  • Single H1 on a page. The H1 heading is the most important one and should concisely summarize the content’s key message in one phrase. Basically, it’s your headline.
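
For illustration, a consistent heading outline might look like this simplified sketch (the indentation is only for readability; the heading levels themselves carry the hierarchy):

  <h1>SEO Audit Checklist</h1>
    <h2>Domain</h2>
      <h3>Domain history</h3>
      <h3>Typosquatting</h3>
    <h2>Site structure</h2>
      <h3>Click depth</h3>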

You can conveniently audit your H1-H6 tags in WebSite Auditor > Site Structure > Pages > the On-page tab. You just need to add the appropriate columns in your workspace. 

adding columns
Download WebSite Auditor

Robots meta tags

Robots meta tags (both the meta robots tag and the x-robots tag) help control crawling and indexing. With their help, we tell search engines whether we want them to follow the links found on a page and to index the page and the images on it. Sometimes, meta robots tags are also used to control snippets and whether cached results are shown on SERPs.

These are the most common values added to the robots tag:

  • Index
  • Noindex 
  • Follow
  • Nofollow
  • None
  • Nocache
  • Nosnippet

Very often, these tags are implemented incorrectly. For example, some important pages might be tagged as noindex, or a page may be blocked in the robots.txt file and tagged as noindex at the same time (which makes the noindex tag ineffective, because crawlers never get to see it).
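
For reference, the same noindex instruction can be set in two ways – and for crawlers to see either of them, the page must not be blocked in robots.txt (the values below are just an example):

  Meta robots tag in the page's <head>:
  <meta name="robots" content="noindex, follow">

  The same instruction sent as an HTTP response header:
  X-Robots-Tag: noindex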

To avoid possible issues, check if any of your pages with a noindex tag got into the robots.txt file. For that, go to WebSite Auditor > Site Structure > Pages. Add the Robots Instructions column to your workspace to see those pages.

robots instructions
Download WebSite Auditor

Make sure you rebuild the project, enable expert options, and uncheck the Follow robots.txt instructions option so that the tool can see the instructions but not follow them.

tuning crawler settings
Download WebSite Auditor

Structured data

Structured data is needed to make the most of your search appearance opportunities. It helps search engines understand your page faster. Besides, structured data is your chance to get rich snippets instead of plain blue links (which may result in higher click-through rates).

Different types of pages call for their own markup, so here we’ll focus on auditing for faults and finding opportunities.

Audit current markup for issues

Go to Google Search Console > Enhancements to see if all the markup on your website works as intended. If you stumble across any invalid items, make sure to check the reasons behind them.

review snippets in GSC

If you need to audit a specific page, use the Rich Results Test.

Audit for opportunities

You may be missing out on markup you could implement but haven’t yet. You need to detect the features you can optimize for.

First, check which pages have structured data markup and which do not in WebSite Auditor: Site Structure > Pages > the Open Graph & Structured Data Markup tab:

structured data in WebSite Auditor
Download WebSite Auditor

For the pages that aren’t marked up, consider the possibilities (a sketch follows the list). For example:

  • Product pricing pages and Product markup 
  • Product videos and VideoObject markup
  • Product reviews and Review markup
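
For instance, a product page could carry Product markup along the lines of the sketch below (all names and values are placeholders – adjust the properties to your actual offer):

  <script type="application/ld+json">
  {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Example Product",
    "image": "https://yourdomain.com/images/example-product.jpg",
    "description": "Short description of the product",
    "offers": {
      "@type": "Offer",
      "price": "99.00",
      "priceCurrency": "USD",
      "availability": "https://schema.org/InStock"
    }
  }
  </script>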

Once opportunities are detected and you are ready to implement the markup, use Schema Markup Validator to make sure you did everything correctly.

14. Sitemap

– Affected aspects: Crawlability and indexing –

Once the rest of the possible issues have been worked out, you need to look at the things that may prevent your page from appearing in search results. And one of the basic things is an XML sitemap, of course. 

There can be several issues here: 

No sitemap

Having no sitemap is not an issue as such. However, without it, crawlers don’t know which pages to prioritize. A sitemap lets you point search engines to the pages you want crawled and indicate how often they are updated, so that they are crawled accordingly.

And if you have a huge website with a complicated structure and high click depth or an international one, you can’t do without an XML sitemap. 
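
For reference, a minimal XML sitemap is just a list of URLs with optional metadata (URLs and dates below are placeholders):

  <?xml version="1.0" encoding="UTF-8"?>
  <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
    <url>
      <loc>https://yourdomain.com/</loc>
      <lastmod>2023-01-15</lastmod>
    </url>
    <url>
      <loc>https://yourdomain.com/blog/seo-audit-checklist/</loc>
      <lastmod>2023-01-10</lastmod>
    </url>
  </urlset>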

Sub-sitemaps

If you run a large site, segmenting your sitemaps by sections is a good SEO practice. 

For example, if you have an e-commerce website, you can create a single sitemap for your static pages (Privacy Policy, Copyright Policy, Terms of Use, etc.) and separate sitemaps for your category pages. Or, if you have a business site with mostly static product pages and a blog section that is updated more frequently, you can create two different sitemaps.

By creating sub-sitemaps, you manage the crawl budget more effectively. That’s why you should maintain several sitemaps based on how static your pages are (see the sitemap index sketch below).
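
A sitemap index file that ties such sub-sitemaps together might look like this sketch (file names are placeholders):

  <?xml version="1.0" encoding="UTF-8"?>
  <sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
    <sitemap>
      <loc>https://yourdomain.com/sitemap-static.xml</loc>
    </sitemap>
    <sitemap>
      <loc>https://yourdomain.com/sitemap-blog.xml</loc>
    </sitemap>
  </sitemapindex>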

Empty, blank, and 404 sitemap

There are a dozen possible faults that may cause issues with your sitemap: a wrong format, invalid tags, or an incorrect sitemap URL, for instance.

You can review your sitemaps in Google Search Console > Index > Sitemaps:

sitemaps in GSC

If there is something wrong with your sitemap, its status will say so.

Wrong pages listed

A sitemap may become outdated (some pages have already been removed from your site or redirected, but they are still in the sitemap), or you might simply have put the wrong URLs into your sitemap.

So first of all, you need to check your sitemaps for pages that shouldn’t be included:

  • Pages blocked in robots.txt file 
  • Pages with a noindex tag or X-robots tag
  • Pages redirected with 301 status code (permanently)
  • Deleted pages with 404 status code
  • Canonicalized pages. 

Note: Generate your XML sitemaps right in WebSite Auditor’s Sitemap Generator to avoid any mistakes.

15. Robots.txt file

– Affected aspects: Crawlability and indexing –

Finally, the last thing to check is your robots.txt file. It enables you to block the crawling of certain pages. Here are the common issues that might appear:

Wrong pages disallowed/allowed

You may accidentally block the wrong page. Or you may have deleted a page or set up a permanent redirect, while the related rule remained in the robots.txt file.

To see what pages are blocked from crawling, go to Site Structure > Site Audit > Indexing and Crawlability. Find the Resources Restricted from Indexing factor and check out the list of pages and their robots instructions. 

finding indexability issues in WebSite Auditor
Download WebSite Auditor

If you see that some important pages were blocked, fix the issue by removing the corresponding robots.txt rule. You can manage your robots.txt file right in Website Auditor.
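
For reference, a simple robots.txt that blocks only what should stay out of the crawl might look like this sketch (the disallowed paths are placeholders – adjust them to your site):

  User-agent: *
  # keep crawlers out of internal search results and cart pages
  Disallow: /search/
  Disallow: /cart/

  Sitemap: https://yourdomain.com/sitemap.xml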

Logic errors

There may also be another issue: you block a page in robots.txt, but it still gets indexed because the page is well-interlinked.

With WebSite Auditor, you can check which pages are blocked in the robots.txt file but are still linked to.

For that, go to Site Structure > Pages and look over the pages, their robots instructions, and whether they have any internal links.

logic errors
Download WebSite Auditor

Note: You can use Google’s robots.txt Tester to spot any issues that occurred. 

Take-home message

Proactive diagnosis, though terribly routine, is better than no-doubt-exciting rehabilitation of lost rankings. 

As you can see, there are plenty of things that can go wrong in terms of SEO. So, don’t wait until something bad happens to your site – run an SEO audit regularly. Download the PDF to keep the cheat sheet at hand whenever you need it.




 