Why does Semrush say I have duplicate content?
If the Site Audit bot finds multiple pages whose content is at least 80% similar, it flags them as duplicate content.
HTTP & HTTPS / WWW & non-WWW
In most cases, domains have duplicate content due to HTTP/HTTPS issues. According to web standards, when a page exists at two URLs (one on HTTP and the other on HTTPS), the two versions are considered two separate documents.
The same goes for a site that serves both a www and a non-www version of the same page - search bots see these as two separate documents.
So when SemrushBot sees these two separate documents, it identifies them as duplicates, because that is how GoogleBot would see them.
To avoid this issue, add a canonical tag to each duplicate page pointing to the page you want indexed (the canonical version).
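For example, a canonical tag placed in the `<head>` of each duplicate version might look like this (the domain here is a placeholder):

```html
<!-- In the <head> of every duplicate version of the page,
     point rel="canonical" at the preferred (indexed) URL -->
<link rel="canonical" href="https://www.example.com/page/" />
```

Search engines then consolidate ranking signals from the duplicates onto the canonical URL.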
You should also set up a 301 redirect from the HTTP version to the HTTPS version so that users and search engine bots only ever see your HTTPS pages.
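As a sketch, on an Apache server this HTTP-to-HTTPS 301 redirect could be set up in an `.htaccess` file like the following (this assumes `mod_rewrite` is enabled; your hosting setup may use a different mechanism):

```apache
# Redirect every HTTP request to its HTTPS equivalent with a permanent (301) redirect
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
```

The same logic applies to www vs. non-www: pick one version as canonical and 301-redirect the other to it.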
Pages with little content
Site Audit may also flag two pages that share the same header and footer content but have very little body content (one or two sentences on the page). In that case the pages end up at least 80% similar overall, so the bot treats them as duplicates. To fix this, expand the content on your pages so that bots can identify each one as unique.
- What Issues Can Site Audit Identify?
- How many pages can I crawl in a Site Audit?
- How long does it take to crawl a website? It appears that my audit is stuck.
- How do I audit a subdomain?
- Can I manage the automatic Site Audit re-run schedule?
- Can I set up a custom re-crawl schedule?
- How is Site Health Score calculated in the Site Audit tool?
- How Does Site Audit Select Pages to Analyze for Core Web Vitals?
- How do you collect data to measure Core Web Vitals in Site Audit?
- Why is there a difference between GSC and Semrush Core Web Vitals data?
- Why are only a few of my website’s pages being crawled?
- Why do working pages on my website appear as broken?
- Why can’t I find URLs from the Audit report on my website?
- Why does Semrush say I have duplicate content?
- Why does Semrush say I have an incorrect certificate?
- What are unoptimized anchors and how does Site Audit identify them?
- What do the Structured Data Markup Items in Site Audit Mean?
- Can I stop a current Site Audit crawl?
- Configuring Site Audit
- Troubleshooting Site Audit
- Site Audit Overview Report
- Site Audit Thematic Reports
- Reviewing Issues in Site Audit
- Site Audit Crawled Pages Report
- Site Audit Statistics
- Exporting Site Audit Results
- How to Optimize your Site Audit Crawl Speed
- How To Integrate Site Audit with Zapier