In addition to making sure the content on your landing pages is spiderable, you should also confirm that search engine robots will be able to crawl your site effectively.
What is crawlability?
We define crawlability as the ability of search engine robots to crawl and index the pages on a website. If the links on your site are formatted in a way that search engine robots cannot follow (links generated entirely by JavaScript, for example), your site’s crawlability is extremely poor.
Tools to use
One of the best tools we use to estimate how a search engine robot will crawl your site is Xenu’s Link Sleuth. The tool begins crawling at your home page, spiders every page it finds during the scan, and reports any broken links and pages that redirect.
You can also export linking statistics from the scan, including how many inbound and outbound links each page has. If Link Sleuth cannot reach pages that you know exist, you can assume that search engine robots will not find them either (unless there are external links to those pages).
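To illustrate roughly what a link-checking crawl does, here is a minimal sketch of a same-site crawler in Python. This is not how Link Sleuth itself works, just an approximation under simple assumptions: the start URL is a placeholder, and the requests and BeautifulSoup libraries are used for fetching and link extraction.

```python
from collections import deque
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

START_URL = "https://www.example.com/"  # placeholder; use your own home page


def crawl(start_url, max_pages=200):
    """Breadth-first crawl of one host, reporting broken links and redirects."""
    host = urlparse(start_url).netloc
    queue, seen = deque([start_url]), {start_url}
    while queue and len(seen) <= max_pages:
        url = queue.popleft()
        try:
            resp = requests.get(url, timeout=10, allow_redirects=True)
        except requests.RequestException as exc:
            print(f"ERROR     {url}  ({exc})")
            continue
        if resp.history:  # the request was redirected at least once
            print(f"REDIRECT  {url} -> {resp.url}")
        if resp.status_code >= 400:
            print(f"BROKEN    {url}  (HTTP {resp.status_code})")
            continue
        if "text/html" not in resp.headers.get("Content-Type", ""):
            continue  # skip images, PDFs, etc.
        # Queue every same-host link found on the page, once each.
        for a in BeautifulSoup(resp.text, "html.parser").find_all("a", href=True):
            link = urljoin(resp.url, a["href"]).split("#")[0]
            if urlparse(link).netloc == host and link not in seen:
                seen.add(link)
                queue.append(link)


if __name__ == "__main__":
    crawl(START_URL)
```

Any page a crawl like this never reaches has no internal link path from your home page, which is exactly the situation that hurts crawlability.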
You can also use site: queries to determine how many pages the search engines have indexed from your site (check out my SEO bookmarklets, which save a huge amount of time when running these commands).
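For example, searching Google for queries like these (with example.com standing in for your own domain) gives a rough count of indexed pages:

```
site:example.com          all indexed pages on the domain
site:example.com/blog/    indexed pages under a single directory
```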
If you have 100 pages and Google has indexed only 20 of them, you could have a crawlability issue. You may also want to check whether your pages are supplemental: your crawlability might need improving if a large percentage of your pages sit in Google’s supplemental index.
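If your site publishes an XML sitemap, this check can be less of a guess. The sketch below assumes a single standard urlset sitemap at a placeholder URL and compares the number of URLs you expect to be indexed against the count a site: query reports (entered by hand):

```python
import xml.etree.ElementTree as ET

import requests

SITEMAP_URL = "https://www.example.com/sitemap.xml"  # placeholder; use your own sitemap
INDEXED = 20  # result count from a site: query, copied in by hand

# Standard namespace for a sitemap <urlset> document.
ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
root = ET.fromstring(requests.get(SITEMAP_URL, timeout=10).content)
total = len(root.findall("sm:url", ns))

if total:
    ratio = INDEXED / total
    print(f"{INDEXED}/{total} pages indexed ({ratio:.0%})")
    if ratio < 0.5:  # arbitrary threshold, for illustration only
        print("A large share of pages is missing from the index; check crawlability.")
```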
Diagnosing the crawlability issue
If you’re unsure where to start, get in touch. I’m an expert SEO consultant and have audited hundreds of websites for this exact aspect of search engine optimisation.
I’ll help identify crawlability issues and provide fixes if necessary.
Request your SEO audit today.