Duplicate Content and SEO

While having duplicate content on your website won’t trigger a penalty or manual action from Google, it doesn’t help your overall SEO efforts either.

Google actually indexes only a small fraction of the webpages on the internet, reportedly around 4%. They don’t want to index more than they need to, so there is no point in having multiple pages in their index that use the same or nearly identical content.

What is duplicate content in SEO?

Duplicate content refers to large blocks of content that are identical or nearly identical across webpages, whether on a single domain or across multiple domains. Duplicate content is bad for SEO because Google does not want to keep identical pieces of content in its index, and will therefore choose to index and display only one version of the content in search results.

What is considered duplicate content?

Duplicate content generally refers to large chunks of text on a page that can be found verbatim on another webpage. This can be in the form of product descriptions, republished posts, press releases, and other content that is repurposed without being rewritten.

Can I have two websites with the same content?

Yes, you can; however, Google will generally display only one version of the content in search results. You can suggest a preferred version with a cross-domain canonical tag, but search engines treat that as a hint rather than a directive, so it can be difficult to know which version they will choose.

Do duplicate page titles affect SEO?

Yes. It’s not just on-page content that you should check. If different pages use the same page title, also known as the title tag, Google will have a tough time deciding which page to rank for the keyword(s) specified in that title. Your page titles are supposed to provide a quick overview of the content found on the page, so if the on-page content differs from page to page, the page titles should differ as well.
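
For reference, the page title is set with the title tag in the head of the page’s HTML. A quick illustration, with placeholder wording:

    <head>
      <!-- Each page should carry its own unique, descriptive title -->
      <title>Denver SEO Services | Example Company</title>
    </head>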

A good way to check for duplicate page titles is to use a tool like Screaming Frog, which gives you a report of pages with duplicate page titles.
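
If you prefer to run a quick spot check yourself, a small script along these lines works as well. This is just a sketch: it assumes the requests and beautifulsoup4 Python packages are installed, and url_list is a placeholder for the pages you want to compare.

    from collections import defaultdict

    import requests
    from bs4 import BeautifulSoup

    # Placeholder list of pages to compare; swap in URLs from your own site.
    url_list = [
        "https://www.example.com/",
        "https://www.example.com/services",
        "https://www.example.com/about",
    ]

    # Group URLs by the text of their title tag.
    titles = defaultdict(list)
    for url in url_list:
        html = requests.get(url, timeout=10).text
        tag = BeautifulSoup(html, "html.parser").title
        titles[tag.get_text(strip=True) if tag else ""].append(url)

    # Flag any title shared by more than one URL.
    for title, urls in titles.items():
        if len(urls) > 1:
            print(f"Duplicate title '{title}': {', '.join(urls)}")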

How does Google detect duplicate content?

Generally, Google will be able to detect duplicate content simply by crawling a website. Google will likely choose to index one version of the content, or only display one page with the content in search results. Which version gets chosen is often determined by many factors, including but not limited to:

  • Which page is in the sitemap
  • Canonical tag
  • Referring internal and external links
  • Your specified preferred version of your website

Do duplicate meta descriptions hurt SEO?

Just like page titles, meta descriptions across your website should also be unique. Using the same or similar meta descriptions on multiple pages prevents you from including target keywords in each description and from giving users a brief, page-specific summary of what to expect before they actually visit. Again, Screaming Frog is a great tool for identifying pages with duplicate meta descriptions.
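
As with titles, the meta description is simply a tag in the head of the page, and each page should get its own. A made-up example:

    <head>
      <!-- A unique, page-specific summary that can appear in search results -->
      <meta name="description" content="Our Denver team provides SEO audits, link building, and local search services.">
    </head>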

How do I check my website for duplicate content?

There are tools built for this, such as SEO Review Tools’ duplicate content checker. You can also grab a few lines of text, wrap them in quotation marks, and do a Google search for that exact phrase. This will only return webpages that contain that exact text.

You may also be able to find other webpages with that text by repeating the search with the omitted results included.

How To Fix Duplicate Content Issues

Once you have found sources of duplicate content, there are a few options for addressing the duplicate content issues:

Rewrite content

If you have several distinct pages that happen to use the same content, it is likely best to rewrite the content altogether. This can add up to a lot of time and work, but in the long run, it is likely in your best interest.

Canonical tags

Implement a canonical tag pointing to the preferred version when the same page is accessible at two different URLs. For example, if both URLs https://www.example.com/services and https://example.com/services are live on your website, then you can establish the preferred URL for search engines by creating a canonical tag.
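
Continuing that example, placing this tag in the head of both URLs signals that the www version is the preferred one:

    <!-- On https://example.com/services and https://www.example.com/services -->
    <link rel="canonical" href="https://www.example.com/services">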

301 Redirects

Ideally, you shouldn’t need to rely on canonical tags for truly identical pages; in the example above, a 301 redirect from the non-preferred URL to the preferred one is the cleaner fix.
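
What that looks like depends on your server. As a rough sketch, on an Apache server with mod_rewrite enabled, an .htaccess rule along these lines permanently redirects the non-www URLs to the www versions:

    RewriteEngine On
    RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
    RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]

Nginx and most content management systems or SEO plugins have their own equivalents.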

Check your sitemap

If you have canonical tags set up on your website and have submitted a sitemap, make sure that you are not providing conflicting information to search engines. For example, it would be a problem if your canonical tags tell Google to index https://example.com/services but your sitemap lists https://www.example.com/services instead.
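
To stay consistent with the canonical tag example above, the sitemap entry should list the www URL:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <!-- Matches the URL the canonical tag points to -->
        <loc>https://www.example.com/services</loc>
      </url>
    </urlset>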

Robots.txt

You do have the option of blocking crawlers from accessing certain webpages via your robots.txt file; however, this is a last-resort option, and Google does not recommend the practice for handling duplicate content. Instead, try to use the canonical tag method or create 301 redirects.
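
If you do go this route, a robots.txt entry blocking a duplicate section might look like the following, where the /print/ path is just a placeholder:

    User-agent: *
    Disallow: /print/

Keep in mind that blocking a page this way also prevents Google from seeing any canonical tag on it, which is part of why it is discouraged.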


Michael Hall

Michael Hall is an Account Manager at Netvantage Marketing, which specializes in SEO, PPC and social media. Mike also runs our Denver office.
