Is Your Website Crawler Friendly?

In addition to making sure the content on your landing pages is spiderable, you should also confirm that search engine robots will be able to crawl your site effectively.

What is crawlability?

We define crawlability as the ability of search engine robots to crawl and index the pages on a website. If the links on your site are formatted in a way that is not search-engine friendly, your site’s crawlability is poor.

It is very difficult to achieve good search engine rankings with poor crawlability, because you essentially give up the benefit of your own internal links. In general, aim for plain text or image links, and avoid using unspiderable JavaScript or Flash links as your navigation.

Tools to Use

One of the best tools we use to estimate how a search engine robot will crawl your site is Xenu’s Link Sleuth. The tool starts crawling from your home page, spiders every page it finds during the scan, and reports any broken links and pages that redirect.

You can also export linking statistics from the scan including how many inbound and outbound links each page has. If Link Sleuth cannot reach pages that you know exist, then you can assume that search engine robots will not find them either (unless there are external links to the page).
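To make the crawl concrete, here is a minimal sketch of what a tool like Link Sleuth does: start at the home page, follow every `<a href>` it finds, and flag links that point at missing pages. The site is modelled as an in-memory dictionary of URL → HTML (a hypothetical four-page site, not a real network crawl), using only Python's standard library.

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collects href targets from <a> tags, as a crawler would."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(site, start="/"):
    """Breadth-first crawl over an in-memory site: {url: html}.

    Returns (reachable, broken) URL sets."""
    reachable, broken, queue = set(), set(), [start]
    while queue:
        url = queue.pop(0)
        if url in reachable or url in broken:
            continue
        if url not in site:          # link points at a missing page
            broken.add(url)
            continue
        reachable.add(url)
        parser = LinkExtractor()
        parser.feed(site[url])
        queue.extend(parser.links)
    return reachable, broken

# Hypothetical site: /orphan exists but has no inbound links,
# and the home page links to a page that no longer exists.
site = {
    "/":       '<a href="/about">About</a> <a href="/missing">Old page</a>',
    "/about":  '<a href="/">Home</a>',
    "/orphan": '<a href="/">Home</a>',
}
reachable, broken = crawl(site)
print(sorted(reachable))  # ['/', '/about'] -- /orphan is never found
print(sorted(broken))     # ['/missing']
```

This mirrors the point above: the crawler never discovers `/orphan` because nothing links to it, so a search engine robot would miss it too unless an external link points there.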

You can also use site: commands to determine how many pages the search engines have indexed from your site (check out my SEO bookmarklets that help save HUGE amounts of time with commands).

If you have 100 pages and Google has indexed only 20 of them, you could have a crawlability issue. You may also want to check whether your pages are supplemental. Your crawlability might need to be improved if a large percentage of your pages are in Google’s supplemental index.
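The check above is simple arithmetic: compare how many pages the engine has indexed against how many you know exist. A quick sketch, using the 100-page example from the text and an assumed (not official) 50% alarm threshold:

```python
def indexation_ratio(total_pages, indexed_pages):
    """Fraction of known pages that the search engine has indexed."""
    return indexed_pages / total_pages

# 100 pages on the site, only 20 returned by a site: query.
ratio = indexation_ratio(100, 20)
print(f"{ratio:.0%} indexed")  # 20% indexed
if ratio < 0.5:                # 50% is an assumed threshold, not a Google rule
    print("Possible crawlability issue")
```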

Diagnosing the crawlability issue

Remember that if your links are contained within frames, JavaScript, or Flash, they may not be getting indexed. You should compare Google’s text-only cached version of your page with the full page. If the links do not appear in the cached text, they are probably not distributing PageRank properly.

Another crawlability test can be done by disabling JavaScript in your browser. If you cannot see and/or click a link on the page, then the search engines might not be able to follow it. You should strongly consider reformatting your links or creating new links to improve crawlability if your site fails either of these tests.
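The JavaScript-disabled test above can be approximated in code: a plain HTML parser sees only real `href` targets, so script-driven links drop out exactly as they do for a robot. This sketch (a hypothetical navigation fragment, not any specific site) sorts a page's links into crawlable and script-only:

```python
from html.parser import HTMLParser

class HrefAudit(HTMLParser):
    """Splits <a> tags into crawlable hrefs and script-only pseudo-links."""
    def __init__(self):
        super().__init__()
        self.crawlable, self.script_only = [], []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        d = dict(attrs)
        href = d.get("href", "")
        if href and not href.startswith("javascript:"):
            self.crawlable.append(href)
        else:
            # No usable href: the link only works with JavaScript enabled.
            self.script_only.append(d.get("onclick", href))

nav = """
<a href="/products">Products</a>
<a href="javascript:void(0)" onclick="go('/deals')">Deals</a>
<a onclick="go('/contact')">Contact</a>
"""
audit = HrefAudit()
audit.feed(nav)
print(audit.crawlable)    # ['/products'] -- the only link a robot can follow
print(audit.script_only)  # the other two are invisible without JavaScript
```

If a navigation menu fails this kind of check, reformatting those links as plain `<a href>` anchors (or adding parallel text links) is the usual fix.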

If you’re unsure where to start, get in touch. I’m an expert SEO consultant and have audited hundreds of websites for this exact aspect of search engine optimisation.

I’ll help identify crawlability issues and provide fixes if necessary.

Request your SEO audit today.