How To Avoid The Google.com Duplicate Content Filter


More and more webmasters are building sites with publicly available content (data feeds, news feeds, short articles). Multiple copies of the exact same content in a search engine's index do no one any good, so Google understandably chose to weed out some of this duplicate material in order to deliver cleaner, better search results.

When a webmaster published the exact same content on more than one domain, all of the domains in question were eventually removed from Google's index. Soon after this started, webmaster forums filled with the same problems and stories, and once one and one were put together, a clear picture of the situation emerged: a duplicate content filter was being applied.

Duplicate content is not always bad and will always exist in one form or another. News sites are the best example: the same wire stories appear on many of them, yet nobody expects those sites to be dropped from Google's index.

How can webmasters avoid the duplicate content filter? There are quite a few things a webmaster can do to use duplicate content of any sort and still produce unique pages from it.

Let’s look at a few of these options.

1) Unique content on pages with duplicate content

On pages where duplicate content is being used, unique content should be added. If you (the webmaster) can add 15% – 30% unique content to pages that display duplicate content, the overall ratio of duplicate content to total content on that page goes down, and the filter is less likely to trigger.
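As a rough illustration of that ratio, here is a minimal sketch that estimates a page's duplicate share by word count. This is only a back-of-the-envelope model (Google's actual filter is far more sophisticated, and the 15% – 30% figure is a rule of thumb, not a published threshold):

```python
# Sketch: estimate how much of a page is duplicate content, by word count.
# This is a simplified model for illustration, not Google's actual algorithm.

def duplicate_ratio(duplicate_words: int, unique_words: int) -> float:
    """Return the share of the page made up of duplicated text (0.0 to 1.0)."""
    total = duplicate_words + unique_words
    return duplicate_words / total if total else 0.0

# A 1000-word syndicated article with nothing added is 100% duplicate:
print(duplicate_ratio(1000, 0))              # 1.0

# Adding ~300 unique words drops the duplicate share to about 77%:
print(round(duplicate_ratio(1000, 300), 2))  # 0.77
```

The point of the exercise: every unique word you add dilutes the duplicate share of the page.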

2) Randomization of content

Ever seen those “Quote of the Day” boxes on some websites? A script adds a random quote to the page each time it is served, so the page looks different every time you come back. With just a few code modifications, such scripts can be used for many more things than displaying a quote of the day.

With some creativity, a webmaster can use such a script to create the impression that pages are constantly updated and always different. This can be a great tool to keep Google from applying the duplicate content filter.
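A minimal version of such a script might look like the sketch below. On a real site this would run server-side (PHP, a template engine, etc.) and pull from a larger pool; the quotes here are just placeholders:

```python
import random

# Placeholder quote pool; on a real site these might come from a file or database.
QUOTES = [
    "The secret of getting ahead is getting started.",
    "Content is king.",
    "Well begun is half done.",
]

def quote_of_the_day() -> str:
    """Pick a random quote to inject into the page on each request."""
    return random.choice(QUOTES)

print(quote_of_the_day())
```

Swap the quote pool for headlines, tips, or related links and the same trick freshens any page.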

3) Unique content.

Yes, unique content is still king. Sometimes you simply cannot work around using duplicate content at all, and that is all right. But how about adding unique content to your site, too? If the overall ratio of unique to duplicate content is well balanced, the chances that the duplicate content filter applies to your site are much lower.

I personally recommend that a site offer at least 30% unique content (I admit I sometimes have trouble reaching that level myself, but I try).

Will this guarantee that your site stays in Google’s index? I do not know. To be most effective, a site should be entirely unique. Unique content is what draws visitors to a site; everything else can be found elsewhere too, and visitors have no reason to visit one particular site if they can get the same thing somewhere else.

4) Rewriting

Another way to address the issue is to rewrite your articles. You can do this manually, but if you run several sites with several posts on each, that quickly becomes impractical. You may want to look into a tool such as Spin Rewriter, which can be set up to automatically rewrite your articles and post them as new content. Fresh content like this encourages Google to revisit your site for faster and more frequent indexing.
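To make the idea of automated rewriting concrete, here is a deliberately naive toy: it swaps known words for random synonyms from a hand-built map. This is not how Spin Rewriter or any commercial tool works internally (they use far richer language analysis); it only illustrates the basic concept of programmatically varying text:

```python
import random
import re

# Toy synonym map for illustration only -- real rewriting tools are far smarter.
SYNONYMS = {
    "content": ["material", "copy"],
    "unique": ["original", "distinct"],
    "website": ["site", "web page"],
}

def naive_rewrite(text, seed=None):
    """Swap known words for random synonyms; leave everything else untouched."""
    rng = random.Random(seed)

    def swap(match):
        word = match.group(0)
        options = SYNONYMS.get(word.lower())
        return rng.choice(options) if options else word

    return re.sub(r"[A-Za-z]+", swap, text)

print(naive_rewrite("unique content on a website"))
```

Run it twice and you will usually get two different variants of the same sentence, which is the whole premise of article spinning.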


This was just a very brief history and discussion of Google’s duplicate content filter. Please let us know of any info we could add to this post, and if a correction is in order, please be kind and let us know that as well.


Kindest Regards



