Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be possible to view - we have locked both new posts and new replies.
Percentage of duplicate content allowable
-
Can you have ANY duplicate content on a page or will the page get penalized by Google?
For example, if you used a paragraph of Wikipedia content for the definition/description of a medical term, but wrapped it in unique content, is that OK, or will that land you in the Google/Panda doghouse?
If some level of duplicate content is allowable, is there a general rule-of-thumb ratio of unique to duplicate content?
Thanks!
-
I don't believe you have a problem if you have a bit of duplicate content. Google does not penalize you for duplicate content; it just doesn't award you points for it.
-
That sounds like something Google will hate by default. Your problem there is the ratio of page quantity to quality and uniqueness.
-
It's quite difficult to provide exact figures, as Google's algorithm is Google's hidden treasure. It's safer to keep yourself out of trouble by creating completely unique content. Referring to your example of a Wikipedia definition, you can add something like "According to Wikipedia..." when copying the definition, or add reference links when quoting any content from other sources.
Remember that Google doesn't only give importance to unique content - it should also be of high quality. That means the article should be innovative, like a completely new thing, and well researched, so it shouldn't be 200 words or fewer. Google will compare the quality of the whole article against the copied portion and then decide whether or not to treat it as a duplicate-content article.
-
"We recently launched a large 3500 page website that auto-generates a sentence after we plug statistical data into our database."
So the only unique content is a single sentence?
Within that sentence many of the words would need to be common as well. Consider a simple site that offered the population for any given location. "The population of [California] is [13 million] people."
In the above example, only three words are unique. Maybe your pages are a bit more elaborate, but it seems to me those pages are simply not worth indexing. What you can do is index the main page, where users can enter the location they want to learn about, but not each possible result (i.e. the California page).
Either add significantly more content, or only index the main page.
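In practice, the "only index the main page" approach usually means adding the standard robots meta tag to each auto-generated result page. A minimal sketch (the example URL path is hypothetical; the exact markup depends on your templates):

```html
<!-- On each auto-generated result page (e.g. /population/california): -->
<!-- "noindex" keeps the page out of search results; "follow" still lets
     crawlers follow the links on the page. The main search/entry page
     simply omits this tag and stays indexable. -->
<meta name="robots" content="noindex, follow">
```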
-
We recently launched a large 3500 page website that auto-generates a sentence after we plug statistical data into our database. All pages are relevant to users and provide more value than other results in the SERPs, but I think a penalty is in place - the Farmer update may have detected us and applied a sort of automatic penalty.
I sent in a reconsideration request last week, the whole project is on hold until we get a response. I'm expecting a generic answer from them.
We are debating either writing more unique content for every page or entering more statistical data to run some interesting correlations. I feel the statistical data would be three times more beneficial to the user, but unique content is what Google seeks, and it's the safer bet just to get us indexed properly.
-
We're currently observing a crumbling empire of websites with auto-generated content. Google is somehow able to understand how substantial your content is and devalue the page - or even the whole site - if it does not meet their criteria. This is especially damaging for sites that have, say, 10% great unique content while 90% of their pages are generated via tagging, browsable search, and variable-driven paragraphs of text.
Having citations is perfectly normal, but I would include a references section just in case.
-
You can have some duplicate content in the manner you mentioned above. It is a natural and expected part of the internet that existing sources of information will be utilized.
There is no magic number that says "30% duplication is OK, but 31% is not". Google's algorithms are private and constantly changing. Use good sense to guide you as to whether your page is unique and offers value to users.
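If you want a rough self-check before publishing, a simple character-overlap comparison can flag pages that lean too heavily on a copied source. This is only an illustrative sketch (the function name and sample texts are invented for this example) - it is in no way how Google actually measures duplication:

```python
from difflib import SequenceMatcher

def duplication_ratio(page_text: str, source_text: str) -> float:
    """Rough fraction (0.0-1.0) of page_text that also appears in source_text."""
    matcher = SequenceMatcher(None, page_text.lower(), source_text.lower())
    # Sum the sizes of all matching blocks shared between the two texts
    matched = sum(block.size for block in matcher.get_matching_blocks())
    return matched / max(len(page_text), 1)

# Hypothetical example: a page that quotes a Wikipedia-style definition
page = ("Managing diabetes starts with understanding it. According to Wikipedia, "
        "diabetes is a group of metabolic disorders characterized by high blood sugar.")
wiki = "Diabetes is a group of metabolic disorders characterized by high blood sugar."

print(f"About {duplication_ratio(page, wiki):.0%} of the page overlaps the source")
```

A high ratio doesn't mean a penalty; it just tells you the page adds little beyond its source, which is the real thing to fix.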