Technical SEO Checklist


1-Findability Checks

Tools used in this section:
  • Google Search Console
  • Google PageSpeed Insights
  • Google Sheets (or Excel)
  • Screaming Frog SEO Spider
  • Pingdom (uptime and website speed)
  • GTmetrix (speed)
  • HTML sitemap generator
  • XML sitemap generator
  • Video XML sitemap generator
  • Image XML sitemap generator
  • NoFollow browser plugin
WHAT DOES INDEXATION MEAN?
The number of your website's pages that search engines have indexed.
WHY DOES THAT MATTER?
The algorithm favors content-rich websites. If your business is in a competitive field, 4 or 5 low-standard pages just won't cut it anymore. A healthy index count confirms that Google has noticed the content on your pages. Low indexation can indicate many issues (internal link problems, low domain authority, render-blocking code and so on).
HOW CAN I FIND MY SITE'S INDEX RATE?
1. Head to Google Search Console and open the indexation (Coverage) report.
2. Search for "" in Google and see how many of your pages appear in the SERPs.
What does server uptime mean?
It's the measure of how consistently your hosting server keeps your website online and minimizes downtime.
Why does it matter?
Website stability and consistent speed are important because they allow search engines to index your website reliably.
How can I check my site's uptime?
1. Run a free check using Pingdom.
2. If you are facing issues, consider Pingdom's paid service for real-time monitoring.
3. Regular monitoring is essential for large websites.
What is a robots.txt file?
A plain-text file at the root of your domain that tells search engine crawlers which parts of your site they may access.
Why does it matter?
You can decide how bots can access your website, or whether they can at all. Crawl directives are strictly recommended for search engines.
How can I check my robots.txt file?
1. Visit and confirm it is accessible on the web.
2. In Google Search Console, navigate to Crawl > robots.txt Tester.
What's a meta robots tag?
It's an HTML tag inserted in the <head> of a page that instructs search engines how to interpret that page's content. For example, <meta name="robots" content="noindex"> lets search engines know not to index the page. The tag can be used to index or noindex pages and images, and to follow or nofollow links.
Why are robots tags essential?
If a developer accidentally inserts a "noindex" directive on an important page, it could lower the site's index rate. If there is duplicate content on your website, noindex tags can help you avoid a duplicate content penalty.
Where can I find my robots tags?
1. Insert your URL into Screaming Frog.
2. Under the "Internal" tab, scroll to the right.
3. You will see "Meta Robots 1" and "Meta Robots 2".
4. Check those columns to see which pages carry a specific directive.
What do 40x errors mean?
"Bad Request" errors happen when a request sent to the server fails. There are several client-side errors (401, 403), but the most common is the 404, which happens when traffic is directed to a page that no longer exists. Why do 40x errors matter?
  • Anytime your website isn't rendering properly, it's bad for users and therefore crucial for your SEO ranking.
  • 404s are bad practice for SEO: moving or deleting pages can cause a bad user experience (UX), so before doing so, ensure proper steps have been taken.
  • 404 pages with inbound links should be 301 (permanently redirected) to similar content.
Where and how to check my site errors?
  1. Log into Google Search Console
  2. Navigate into Index> Coverage
  3. Find the list of 40x errors happening on your site.
What is an XML sitemap? An XML file that enables search engines to understand and crawl your website. Why do XML sitemaps matter? They help search engines crawl and index your website. They also help against duplicate content, and establish ownership of pages as they are indexed and listed. Where can I find my site's XML sitemap? Visit, or in Google Search Console navigate to Crawl > Sitemaps.
What does a meta robots tag mean? Meta robots are HTML tags inserted in the code of a page that instruct search engines how to treat that page's content. For example, they can instruct a search engine not to index the page. They can be used to index or noindex pages and images, and to follow or nofollow the links on a page. Note: do not confuse this NOFOLLOW with the rel="nofollow" link attribute. A "nofollow" directive in the head section nofollows ALL links on the page. What's the way to check my site's meta robots tags?
    Insert your URL into Screaming Frog SEO Spider. Look for the "Internal" tab and scroll to the right. Find "Meta Robots 1" and "Meta Robots 2". Check the two columns for the directives of each page on your website.
What do client-side 40x errors mean?
  • 40x "Bad Request" errors happen when a request to the web server fails.
  • There are several client-side errors that can occur (401, 403), but the most common is the 404 error, which most of the time happens when traffic is directed to a page that no longer exists.
  • A full list of 40x errors is available online.
Why do 40x errors matter?
  • In general, anytime your website isn't rendering properly for visitors, it is a bad experience.
  • 404 errors go against SEO best practices. If for some reason a page must be removed or deleted, ensure that the proper steps have been followed for a good user experience (UX) and minimal loss of inbound link equity.
  • 404 pages that have inbound links or quality inbound traffic should be 301 (permanently redirected) to a similar piece of content on your site.
  • How can I check my site’s 40x errors?
  • Log into your Google Search Console.
  • Navigate to Index> Coverage.
  • You will see a list of 40x errors occurring on your site.
  • Are these pages meant to be gone completely? Do they have sufficient inbound links targeted at them? Plan 301 redirects to manage them.
  • What is an HTML sitemap?
  • A sitemap that lives on a web page, rather than in an XML file.
  • Why do HTML sitemaps matter?
  • HTML sitemaps make navigation easier for website users.
  • HTML SiteMaps by Matt Cutts
  • How to verify if my site has an HTML sitemap?
  • Do you have a page on your website that links to every page on your site?
  • Use this tool to generate an HTML sitemap.
  • What is a video XML sitemap?
  • A video sitemap enables search engines to quickly find your video content and index it for video searches.
  • Why are video XML sitemaps needed?
  • When you have created video content, a sitemap will help you rank in the Google Videos vertical. This is one of the easiest ways to increase organic traffic.
  • How do I verify if my site has a video XML sitemap?
  • Visit your sitemap file - is a video sitemap listed?
  • In Google Search Console, navigate to Crawl > Sitemaps.
  • Use this tool to generate a video sitemap.
  • Submit your video sitemap in Google Search Console, (Crawl > Sitemaps > Add/Test Sitemap).
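For reference, a video sitemap entry follows Google's video sitemap schema. A minimal sketch with one video; every URL and title below is a placeholder:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns=""
  <url>
    <video:video>
      <video:title>How to re-pot a houseplant</video:title>
      <video:description>A short walkthrough of re-potting.</video:description>
    </video:video>
  </url>
</urlset>
```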
  • What is an image XML sitemap?
  • An image sitemap helps your images appear quickly in search engines once they are indexed.
  • Why are image XML sitemaps important?
  • People actively bypass organic results for image results, depending on the query.
  • Getting your images ranked for image-based searches can skyrocket organic traffic.
  • How can I find out if my site has an image XML sitemap?
  • Visit your sitemap file - is an image sitemap listed?
  • In Google Search Console, navigate to Crawl > Sitemaps.
  • Use this tool to generate an image sitemap.
  • Submit your image sitemap in Google Search Console (Crawl > Sitemaps > Add/Test Sitemap).
  • What is pagination?
  • HTML tags (rel="prev" and rel="next") on content that spills over into multiple pages but should be considered as one.
  • For example, your blog might grow to multiple pages as you create more content.
  • Pagination tags should be used on the "Next page" and "Previous page" links to inform search engines of this.
  • This also applies to eCommerce websites with multiple product pages, and to long guides that are broken down into multiple pages.
  • Why does pagination matter?
  • These tags can help avoid duplicate content penalties and very low indexation rates.
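In markup, the tags described above sit in the <head> of each paginated page. A minimal sketch for a hypothetical page 2 of a blog archive (the URLs are placeholders):

```html
<!-- On -->
<link rel="prev" href="">
<link rel="next" href="">
```

Page 1 carries only a rel="next" tag and the final page only a rel="prev" tag, so the sequence has clear endpoints.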
  • What is a 404 page?
  • A 404 page informs users and search engines when a page has been removed from your site.
  • Why do 404 pages matter?
  • Sometimes removing content (aka 404’ing it) is what’s necessary for your site, even if users will still be visiting it.
  • If that’s the case, a custom 404 page should inform users where to find other important pages on your site.
  • How can I verify if my site has a 404 page?
  • Type a URL that doesn't exist on your site into a browser and see what page comes up.
  • What is a subdomain?
  • Extensions of your root domain that you can use for a number of purposes:
  • Hosting content - for example, setting up a blog on its own subdomain.
  • Testing - a place to test content offline before pushing it live to your final domain.
  • Private content - for example, a subdomain used to host internal training content for staff members.
  • Why are subdomains important?
  • They should be strategically planned into your SEO. For example:
  • A blog subdomain should be set to "index" and have its own set of directives (robots.txt and sitemap.xml) for proper search engine indexation.
  • A testing subdomain should be set to "noindex" to avoid duplicate content penalties and searchers stumbling on content that wasn't approved to go live yet.
  • How can I check my subdomains?
  • You can use a free subdomain finder tool to discover your site's subdomains.

    2-Architecture Checks

    What are breadcrumbs?
  • A trail, or secondary navigation, clearly visible to website users to help them navigate your website.
  • Why do breadcrumbs matter?
  • Search engines crawl from page to page through links. Breadcrumbs enforce page hierarchy and navigation to search engines.
  • Breadcrumbs also help users to navigate content, particularly on eCommerce websites with a number of product categories and high page depth.
  • If you have a content-heavy website, users can get lost deep in your site. Breadcrumbs help them easily find their place and continue browsing, without having to use the top level navigation to start over.
  • This check matters most for eCommerce sites, and less for smaller sites with few pages.
  • How can I check my site’s breadcrumbs?
  • If you have breadcrumbs enabled, you should see them appear underneath your navigation as you dive deep into your site.
  • Not every website needs breadcrumbs, but content heavy and eCommerce websites should always have them setup.
  • How do I set up breadcrumbs in WordPress? How do I code a breadcrumb menu?
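Beyond the visible trail, breadcrumbs can also be declared to search engines with structured data. One common approach is's BreadcrumbList in JSON-LD; the names and URLs below are placeholders:

```html
<script type="application/ld+json">
{
  "@context": "",
  "@type": "BreadcrumbList",
  "itemListElement": [
    {"@type": "ListItem", "position": 1, "name": "Home", "item": ""},
    {"@type": "ListItem", "position": 2, "name": "Shoes", "item": ""},
    {"@type": "ListItem", "position": 3, "name": "Running Shoes"}
  ]
}
</script>
```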
    What is a TLN (top level navigation)?
  • The main menu and navigation on a website.
  • Why do TLNs matter?
  • TLNs have a tremendous impact on both search indexation and the overall user experience.
  • Users want a logical, easy-to-find and easy-to-use menu that clearly directs them where they need to go.
  • Search engines want the same thing – your top pages should be linked to from the TLN, whether that’s a dedicated section or a drop down depends on the amount of content on your website.
  • How can I analyze my site’s TLN?
  • Are your target pages linked to from your TLN?
  • Are you using SEO optimized titles in your TLN?
  • Is your TLN well organized so that users can find what they need with minimal clicks?
  • Your TLN should be coded in HTML, NOT JS!
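The point about HTML over JS can be made concrete: links rendered only by JavaScript may never be seen by a crawler, while plain anchor tags always are. A minimal sketch of a crawlable TLN (the page names and paths are placeholders):

```html
<nav>
  <ul>
    <li><a href="/">Home</a></li>
    <li><a href="/services/">Services</a>
      <ul>
        <li><a href="/services/seo-audit/">SEO Audit</a></li>
      </ul>
    </li>
    <li><a href="/contact/">Contact</a></li>
  </ul>
</nav>
```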
  • Why do footers matter?
  • You can’t link to every page on your website from your TLN (unless you have a small website).
  • Footers provide a great opportunity to pass equity to various pages or sections on your website.
  • How can I analyze my site’s footer?
  • It really depends on your website’s goals, niche, and depth of pages.
  • Generally speaking, I like to use the footer to link to important, non-sales pages.
  • Google’s latest update looks hard at the quality and depth of content on your website to ensure that you’re a legitimate, functioning business.
  • FAQ, locations, privacy policy, careers, HTML sitemap and other important pages that search engines look for to determine the quality of your website.
  • It's important to get these pages crawled by search engines, and the footer is the best place to show them that these pages are an important part of your website.
  • What is site depth / structure?
  • This refers to the number of ‘clicks’ your pages are away from the starting URL.
  • Why does site depth / structure matter?
  • Both search engines and users shouldn’t have to click 1,000 times to get to important content on your website.
  • General SEO best practice says to keep important content (i.e. pages you want to rank) within 4 clicks of the starting URL (i.e. your home page).
  • Basically, that means target pages should be easily accessible from the top level nav or footer, or located within a few clicks of those pages.
  • How can I analyze site depth / structure?
  • In Screaming Frog, run a crawl of your website.
  • All the way to the right, click on “Site Structure”.
  • You will see stats about how many pages you have and their depth from the starting URL.
  • Analyze which pages are over 3 clicks and decide if there’s a better place for them to live within your website.

    3-URL Analysis

    What are hyphens in the URL?
  • The default URL structure should use hyphens ("-"). For example:
    Good: /seo-professionals/
    Bad: /seo_professionals/
    Worse: /seo,professionals/
  • Why do hyphens matter?
  • Using "_" or "," as your URL separator causes search engines to read URL strings the wrong way.
  • Search engines read hyphens ("-") as spaces, so using them ensures your content will be read the right way.
  • How can I check my URLs?
  • In Screaming Frog, run a crawl of your website.
  • Set the tab to “URI”.
  • In the search bar, enter “_”.
  • Filter the results to see if your URLs contain underscores. NOTE: you can also export your crawl to Excel for better filtering and analysis.
  • What is URL friendliness?
  • URLs should be structured (when possible) to be clean, short, memorable and shareable.
  • User experience signals (SERP click through rate in particular) are increasingly important ranking factors. Short, clean and readable URLs drive more SERP clicks than non friendly URLs.
  • For example, which would you click: a short, readable URL or a long string of IDs and parameters?
  • How can I check my URLs?
  • In Screaming Frog, run a crawl of your website.
  • Set the tab to “URI”.
  • Export data to Excel and analyze.
  • It could be a CMS issue forcing URLs into unfriendly states. If URLs are human generated, you should recommend creating a URL structuring guide for those pushing URLs live to ensure friendliness going forward.
  • Consider swapping extremely unfriendly URLs for cleaner ones, and 301 redirect the old URLs to the new.
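A URL structuring guide can be backed by a small helper that turns titles into consistent slugs. A sketch of one possible rule set (lowercase, hyphens as the only separator):

```python
# Turn a page title into a clean, hyphenated URL slug.
import re

def slugify(title):
    slug = title.lower()
    slug = re.sub(r"[^a-z0-9]+", "-", slug)  # collapse everything else to hyphens
    return slug.strip("-")

print(slugify("Best SEO Professionals!"))  # best-seo-professionals
```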
  • What are absolute and relative URLs?
  • Relative URLs are often used by web developers as shorthand to code internal links on a website. They do not contain the full URL, but still link to the destination page.
  • Absolute URLs contain the full URL string when linking internally to another page.
  • Why does this matter?
  • Relative URLs are SLIGHTLY better for page loading times and easier for developers when coding HTML.
  • Absolute URLs are better for SEO as they contain the full URL string, better optimized for search engine crawling.
  • How can I check my URLs?
  • On any given page on the website, right click and select “view source code”.
  • Find an internal link.
  • If it’s coded as anchor, it’s an absolute link. relative-URLs