
Why are only a few of my website’s pages being crawled?


If you’ve noticed that only 4-6 pages of your website are being crawled (your homepage, sitemap URLs, and robots.txt), the most likely cause is that our bot couldn’t find any outgoing internal links on your homepage. Below you will find the possible reasons for this issue.
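To illustrate how a crawler discovers further pages, here is a minimal sketch of collecting outgoing internal links from a homepage’s HTML using Python’s standard library. The URLs and HTML are placeholders, not taken from any real site; if the collector finds no internal links, a crawler starting from that page has nowhere to go.

```python
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse

# Minimal sketch: collect outgoing internal links from a page's HTML,
# the way a crawler discovers further pages to visit.
class InternalLinkCollector(HTMLParser):
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.internal_links = set()

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        href = dict(attrs).get("href")
        if not href:
            return
        # Resolve relative URLs and keep only links on the same host
        absolute = urljoin(self.base_url, href)
        if urlparse(absolute).netloc == urlparse(self.base_url).netloc:
            self.internal_links.add(absolute)

# Hypothetical homepage snippet: one internal link, one external link
html = '<a href="/about">About</a> <a href="https://other.example/">Ext</a>'
collector = InternalLinkCollector("https://example.com/")
collector.feed(html)
print(sorted(collector.internal_links))  # ['https://example.com/about']
```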

Our crawler may have been blocked on some pages by the website’s robots.txt file or by noindex/nofollow tags. You can check whether this is the case in your Crawled Pages report:

[Screenshot: blocked pages in the Site Audit Crawled Pages report]

You can inspect your robots.txt file for any Disallow directives that would prevent crawlers like ours from accessing your website.
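As a quick check, Python’s standard `urllib.robotparser` module can evaluate robots.txt rules against a given user agent. This is a sketch with a hypothetical robots.txt that blocks everything; swap in your own file’s contents and crawler name:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt that blocks all crawlers from the entire site
robots_txt = """
User-agent: *
Disallow: /
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# A "Disallow: /" rule prevents any compliant bot from fetching any page
print(parser.can_fetch("SiteAuditBot", "https://example.com/"))  # False
```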

If the following code appears on the main page of a website, it tells us that we are not allowed to index it or follow its links, so our access is blocked. Likewise, any page whose robots meta tag contains "nofollow" or "none" will lead to a crawling error.

<meta name="robots" content="noindex, nofollow">
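To spot such tags in a page’s source programmatically, here is a minimal sketch using Python’s standard `html.parser` to flag blocking directives ("noindex", "nofollow", "none") in a robots meta tag; the sample HTML is a placeholder:

```python
from html.parser import HTMLParser

# Minimal sketch: detect robots meta tag directives that block crawling.
class RobotsMetaChecker(HTMLParser):
    def __init__(self):
        super().__init__()
        self.blocking_directives = set()

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name", "").lower() == "robots":
            content = attrs.get("content", "").lower()
            directives = {d.strip() for d in content.split(",")}
            # Keep only the directives that block indexing/following
            self.blocking_directives |= directives & {"noindex", "nofollow", "none"}

html = '<html><head><meta name="robots" content="noindex, nofollow"></head></html>'
checker = RobotsMetaChecker()
checker.feed(html)
print(sorted(checker.blocking_directives))  # ['nofollow', 'noindex']
```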

You will find more information about these errors in our troubleshooting article.

Site Audit can currently parse homepages no larger than 4 MB.


The limit for other pages of your website is 2 MB. If a page’s HTML exceeds this limit, you will see the following error:
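A quick way to check a page against these limits is to measure the size of its HTML in bytes. This sketch assumes the 4 MB and 2 MB figures from this article refer to the raw HTML byte size; the sample pages are placeholders:

```python
# Rough check of a page's HTML size against Site Audit's stated limits:
# 4 MB for the homepage, 2 MB for every other page.
HOMEPAGE_LIMIT = 4 * 1024 * 1024  # bytes (assumed interpretation of "4MB")
PAGE_LIMIT = 2 * 1024 * 1024      # bytes

def exceeds_limit(html: str, is_homepage: bool = False) -> bool:
    limit = HOMEPAGE_LIMIT if is_homepage else PAGE_LIMIT
    return len(html.encode("utf-8")) > limit

small_page = "<html><body>Hello</body></html>"
big_page = "x" * (3 * 1024 * 1024)  # 3 MB of content

print(exceeds_limit(small_page))  # False
print(exceeds_limit(big_page))    # True: 3 MB exceeds the 2 MB page limit
print(exceeds_limit(big_page, is_homepage=True))  # False: under the 4 MB homepage limit
```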

[Screenshot: HTML size error in Site Audit]