Why do working pages on my website appear as broken?

There may be times when Site Audit shows some of your internal pages as broken even though they are actually fully up and running. While this does not happen often, it can be confusing for some users.

Generally, when this happens it is because of a false positive. The most common reasons for a false-positive reading are:

  • Our Site Audit crawler may be blocked from some pages by robots.txt rules or noindex tags (a quick robots.txt check is sketched below)
  • Hosting providers might block Semrush bots because they mistake the crawl for a DDoS attack (a large number of hits within a short period of time); a quick way to check this is sketched after this list
  • At the moment of the campaign re-crawl, the domain could not be resolved by DNS
  • The website's server cache is storing old data and serving it to crawler bots
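
If you want to rule out the bot-blocking and DNS causes above yourself, the sketch below (Python, standard library only) first resolves the domain and then requests the page with a crawler-like user agent. The URL and the SiteAuditBot user-agent string are illustrative assumptions; check Semrush's documentation for the exact user agent the Site Audit crawler sends.

    import socket
    import urllib.error
    import urllib.request
    from urllib.parse import urlparse

    URL = "https://www.example.com/some-page/"             # hypothetical page
    BOT_UA = "Mozilla/5.0 (compatible; SiteAuditBot/1.0)"  # assumed UA string

    host = urlparse(URL).hostname

    # 1. DNS check: a socket.gaierror here means the domain did not resolve,
    #    which matches the "could not be resolved by DNS" cause above.
    print("Resolved IP:", socket.gethostbyname(host))

    # 2. HTTP check: compare the status returned to a bot-like request with
    #    what you see in a normal browser. A 403/429/5xx here but a 200 in
    #    the browser points at the host blocking or rate-limiting crawlers.
    request = urllib.request.Request(URL, headers={"User-Agent": BOT_UA})
    try:
        with urllib.request.urlopen(request, timeout=10) as response:
            print("Status for bot-like request:", response.status)
    except urllib.error.HTTPError as err:
        print("Server refused the bot-like request:", err.code)

If the page opens normally in your browser but this request fails with a 403 or 429, your hosting provider or firewall is most likely filtering crawler traffic, and whitelisting the Semrush bot or lowering the crawl speed (see below) should help.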

If you believe this is happening because of a crawler issue, you can learn how to troubleshoot your robots.txt in this article.
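
For a quick local check before (or alongside) that article, Python's built-in robots.txt parser can report whether a given user agent is allowed to fetch a specific URL under your current rules. The domain, page, and bot names below are placeholders rather than confirmed Semrush identifiers.

    from urllib.robotparser import RobotFileParser

    # Hypothetical site and page; replace them with your own.
    parser = RobotFileParser("https://www.example.com/robots.txt")
    parser.read()  # fetches and parses the live robots.txt

    page = "https://www.example.com/some-page/"
    for agent in ("SiteAuditBot", "SemrushBot", "*"):
        print(f"{agent:>12} allowed: {parser.can_fetch(agent, page)}")

If the crawler-specific agents come back disallowed while "*" is allowed, a bot-specific Disallow rule is the likely source of the false positive.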

You can also lower the crawl speed in order to avoid a large number of hits on your pages at one time. When the server throttles or blocks our bot for this reason, you can still open the page and see it working correctly, while our bot cannot reach it and reports a false-positive result.

Site Audit crawler settings

To solve the problem with the server cache, please try clearing it and then re-running Site Audit. This will bring you updated results that include all of your fixes.
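
If you suspect stale cache, one rough way to confirm it is to look at the cache-related headers the server returns for the affected page; the URL below is hypothetical and the exact header names vary by host and CDN.

    import urllib.request

    URL = "https://www.example.com/some-page/"  # hypothetical page
    with urllib.request.urlopen(URL, timeout=10) as response:
        # Header names vary by server and CDN; print whichever are present.
        for name in ("Cache-Control", "Age", "Expires",
                     "Last-Modified", "X-Cache", "CF-Cache-Status"):
            value = response.headers.get(name)
            if value:
                print(f"{name}: {value}")

A large Age value or a cache "HIT" status on a page you have recently fixed suggests the crawler is still being served the old response, in which case clearing the cache and re-running Site Audit is the right move.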
