Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies.
Duplicate Content on a Competitor's Site?
I've recently discovered large blocks of content on a competitor's site that have been copied and pasted from a client's site. From what I know, this will only hurt the competitor and not my client, since my guy was the original. Is this true? Is there any risk to my client? Should we take action?
Dino
Thanks, EGOL. It's definitely weaker, but I will ask them to change it.
EGOL
...this will only hurt the competitor and not my client... Is this true?
No
Is there any risk to my client?
Yes
Should we take action?
Maybe
If copied content is placed on a stronger site, or simply on a site that Google likes better, it can:
- outrank the original site
- pull traffic away from the original site for primary and long-tail keywords
- acquire links, likes, and tweets that should belong to the original site
- cause Panda problems for the original site
So, if the competitor who took the content is strong, your client could have a huge problem. If the competitor is weak, you probably don't have a problem now, but you could have one in the future.
Go out and look at the SERPs for short and long-tail keywords to see whether this site is competitive.
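For anyone who wants to script that SERP check rather than do it by hand, below is a minimal Python sketch of the idea using Google's Custom Search JSON API. The API key, engine ID, and sample phrase are placeholders, and the choice of API is an assumption - any rank-checking service would do.

```python
import requests

# Placeholders - you need your own Custom Search JSON API key and
# search engine ID (cx) for this to run.
API_KEY = "YOUR_API_KEY"
CX = "YOUR_SEARCH_ENGINE_ID"

def who_ranks_for_phrase(phrase, num=10):
    """Search an exact-match phrase and return the ranking URLs in order."""
    resp = requests.get(
        "https://www.googleapis.com/customsearch/v1",
        params={"key": API_KEY, "cx": CX, "q": f'"{phrase}"', "num": num},
        timeout=30,
    )
    resp.raise_for_status()
    return [item["link"] for item in resp.json().get("items", [])]

# Pick a distinctive sentence from the copied block: if the competitor's
# domain ranks above the client's, the copy is already doing damage.
for rank, url in enumerate(who_ranks_for_phrase("a distinctive sentence from the copied page"), start=1):
    print(rank, url)
```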
Related Questions
Word Count - Content site vs ecommerce site
Hi there, what are your thoughts on word count for a content site vs. an ecommerce site? A lot of content sites have no problem pushing out 500+ words per page, which for me is a decent amount to help you get traction. However, on ecommerce sites the product description often only needs to be sub-100 words, and the total word count on the page comes in at under 300 words, a lot of which could be considered duplicate. So what are your views? Do ecommerce sites still need a high word count on the product description page to rank better?
On-Page Optimization | Bee1590
How do I fix my portfolio causing duplicate content issues?
Hi, I'm new to this whole duplicate content issue. I have a website, fatcatpaperie.com, where I use the portfolio feature in WordPress as my gallery for all my wedding invitations. I have a ton of duplicate content issues from this, and I don't understand at all how to fix it. I'd appreciate any help! Below is an example of one duplicate content issue. The pages have slightly different names, different URLs, and different images, and all have no text, but they are coming up as duplicates. Would it be as easy as putting a different meta description on each? Thanks for the help! Rena
- "Treasure" by Designers Fine Press - Fat Cat Paperie - http://fatcatpaperie.com/portfolio-item/treasure-designers-fine-press (200, 3 duplicates)
- "Perennial" by Designers Fine Press - Fat Cat Paperie - http://fatcatpaperie.com/portfolio-item/perennial-by-designers-fine-press (200, 1 of 3 duplicates)
- "Primrose" by Designers Fine Press - Fat Cat Paperie - http://fatcatpaperie.com/portfolio-item/8675 (200, 2 of 3 duplicates)
- "Catalina" by Designers Fine Press - Fat Cat Paperie - http://fatcatpaperie.com/portfolio-item/catalina-designers-fine-press
On-Page Optimization | HonestSEOStudio0
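To quantify how similar pages like those actually look to a crawler, a minimal Python sketch (requests and BeautifulSoup assumed installed; the URLs are the ones from the report above). Image-only pages sharing one template usually score very high, which is why unique on-page text - not just a unique meta description - tends to be the fix.

```python
import difflib
from itertools import combinations

import requests
from bs4 import BeautifulSoup

# The portfolio URLs from the crawl report above.
URLS = [
    "http://fatcatpaperie.com/portfolio-item/treasure-designers-fine-press",
    "http://fatcatpaperie.com/portfolio-item/perennial-by-designers-fine-press",
    "http://fatcatpaperie.com/portfolio-item/8675",
    "http://fatcatpaperie.com/portfolio-item/catalina-designers-fine-press",
]

def visible_text(url):
    """Fetch a page and return its visible text with markup stripped."""
    soup = BeautifulSoup(requests.get(url, timeout=30).text, "html.parser")
    for tag in soup(["script", "style"]):
        tag.decompose()
    return " ".join(soup.get_text(separator=" ").split())

texts = {url: visible_text(url) for url in URLS}
for a, b in combinations(URLS, 2):
    ratio = difflib.SequenceMatcher(None, texts[a], texts[b]).ratio()
    print(f"{ratio:.0%} similar: {a} vs {b}")
```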
Duplicate content with tagging and categories
Hello, Moz is showing that a site has duplicate content, which appears to be because of tags and categories. It is a relatively new site with only a few blog publications so far. This means that the same articles are displayed under a number of different tags and categories... Is this something I should worry about, or should I just wait until I have more content? The 'tag' and 'category' pages are not really pages I would expect or aim for anyone to find in Google results anyway. I would be glad to hear any advice or opinions on this. Thanks!
On-Page Optimization | wearehappymedia1
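The common remedy when archive pages are thin is to noindex them (most WordPress SEO plugins have a toggle for this). As a quick way to see where a site currently stands, a small Python sketch - the URLs are placeholders:

```python
import requests
from bs4 import BeautifulSoup

# Placeholder archive URLs - substitute your own tag/category pages.
ARCHIVE_URLS = [
    "https://example.com/tag/sample-tag/",
    "https://example.com/category/blog/",
]

for url in ARCHIVE_URLS:
    soup = BeautifulSoup(requests.get(url, timeout=30).text, "html.parser")
    robots = soup.find("meta", attrs={"name": "robots"})
    directive = robots.get("content", "(empty)") if robots else "(no robots meta tag)"
    print(f"{url} -> {directive}")
```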
Content hidden behind a 'read all/more...' button
Hi
Does anyone know the latest thinking re 'hidden content', such as body copy behind a 'read more' type button/link, in light of John Mueller's comments toward the end of last year (that they discount hidden copy) and the follow-up posts on Search Engine Roundtable, Moz, etc.? Lots of people were testing it and finding such content was still being crawled and indexed, so they presumed it was not a big deal after all. But if Google says they discount it, surely we now want to reveal/unhide such body copy if it contains text important to the page's SEO efforts. Do you think it could be the case that Google is still crawling and indexing such content, BUT any contribution that copy may have made to the page's SEO efforts is now lost if hidden? So to get its contribution to SEO back, does one need to reveal it and have it fully displayed? OR is there no need to worry, and can such copy be kept behind a 'read more' button/link?
All Best
Dan
On-Page Optimization | Dan-Lawrence0
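Whatever weight Google ultimately gives hidden copy, the first thing worth verifying is whether the 'read more' text is in the server's initial HTML at all, or only fetched by JavaScript after a click - in the latter case a crawler may never see it. A minimal Python sketch; the URL and snippet are placeholders:

```python
import requests

# Placeholders: a page with a 'read more' toggle and a snippet of the
# copy that is hidden until the button is clicked.
PAGE_URL = "https://example.com/some-page/"
HIDDEN_SNIPPET = "a sentence from behind the read more button"

html = requests.get(PAGE_URL, timeout=30).text
if HIDDEN_SNIPPET.lower() in html.lower():
    print("Hidden copy IS in the initial HTML - crawlers can at least fetch it.")
else:
    print("Hidden copy is NOT in the initial HTML - it is likely loaded "
          "on click, so crawlers may never see it at all.")
```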
Duplicate content penalty
When Moz crawls my site, they say I have 2x the pages that I really have, and they say I am being penalized for duplicate content. I know years ago I had my old domain resolve over to my new domain. It's the only thing that makes sense as the source of the duplicate content, but would search engines really penalize me for that? It is technically only on one site. My business took a significant sales hit starting early July 2013; I know Google did an algorithm update then that did have SEO aspects. I need to resolve the problem so I can stay in business.
On-Page Optimization | cheaptubes0
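One thing worth checking here: if the old domain 'resolves over' by serving the same pages on both hostnames (rather than 301-redirecting), that alone can create a full duplicate of the site in a crawler's eyes. A minimal Python sketch to verify the redirects - the domains are placeholders:

```python
import requests

# Placeholder domains standing in for the old and new sites.
OLD_URLS = [
    "http://old-domain.example/",
    "http://old-domain.example/some-page/",
]
NEW_DOMAIN = "new-domain.example"

for url in OLD_URLS:
    resp = requests.get(url, allow_redirects=False, timeout=30)
    location = resp.headers.get("Location", "(no redirect)")
    ok = resp.status_code == 301 and NEW_DOMAIN in location
    print(f"{url} -> {resp.status_code} {location} {'OK' if ok else '** CHECK **'}")
```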
Multilingual site with untranslated content
We are developing a site that will have several languages. There will be several thousand pages; the default language will be English. Several sections of the site will not be translated at first, so the main content will be in English but the navigation/boilerplate will be translated. We have hreflang alternate tags set up on each individual page pointing to each of the other languages, e.g. the English version lists the alternates, and in the Spanish version we would point to the French version and the English version, etc. My question is: is this sufficient to avoid a duplicate content penalty from Google for the untranslated pages? I am aware that from a user perspective having untranslated content is bad, but in this case it is unavoidable at first.
On-Page Optimization | jorgeapartime0
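One caveat on that setup: hreflang only works if the annotations are reciprocal - if the Spanish page names the English page but not vice versa, Google may ignore the pair. A minimal Python sketch to spot missing return tags (requests and BeautifulSoup assumed; the URL is a placeholder):

```python
import requests
from bs4 import BeautifulSoup

def hreflang_map(url):
    """Return {hreflang: href} from a page's alternate link tags."""
    soup = BeautifulSoup(requests.get(url, timeout=30).text, "html.parser")
    return {
        link["hreflang"]: link["href"]
        for link in soup.find_all("link", rel="alternate", hreflang=True)
    }

# Placeholder: the English version of one page.
en_url = "https://example.com/en/some-page/"
alternates = hreflang_map(en_url)
for lang, alt_url in alternates.items():
    # Each alternate should list the same language set back (reciprocity).
    missing = set(alternates) - set(hreflang_map(alt_url))
    if missing:
        print(f"{alt_url} is missing return tags for: {sorted(missing)}")
    else:
        print(f"{alt_url} is fully reciprocal")
```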
Will "internal 301s" have any effect on page rank or the way in which an SE see's our site interlinking?
We've been forced (for scalability) to completely restructure our website in terms of setting out a hierarchy. For example, the old structure was: country / city / city area, where we had about 3,500 nicely interlinked pages for relevant things like taxis, hotels, apartments etc. in each city. We needed to change the structure to: country / region / area / city / city area. As part of the change we put in place lots of 301s for the permanent movement of pages to the new structure, and then we tried to actually change the physical on-page links too. Unfortunately we have left a good 600 or 700 links that point to the old pages but are picked up by the 301 redirect, so we're slowly going through them to ensure the links go to the new location directly (not via the 301). So my question is (sorry for the long waffle): whilst it must surely be best practice for all on-page links to go directly to the 'right' page, are we harming our own interlinking and even PageRank by being tardy in working through them manually? Thanks for any help anyone can give.
On-Page Optimization | TinkyWinky0
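A few hundred internal links passing through a 301 is unlikely to be catastrophic - the redirect still consolidates signals - but linking directly is cleaner, and 600-700 stragglers are easier to find with a script than by hand. A minimal Python sketch; the page list and hostname are placeholders:

```python
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

# Placeholders: pages of your own site to audit, and your hostname.
PAGES_TO_AUDIT = ["https://example.com/country/region/area/city/"]
SITE_HOST = "example.com"

for page in PAGES_TO_AUDIT:
    soup = BeautifulSoup(requests.get(page, timeout=30).text, "html.parser")
    for a in soup.find_all("a", href=True):
        target = urljoin(page, a["href"])
        if urlparse(target).netloc != SITE_HOST:
            continue  # only audit internal links
        resp = requests.head(target, allow_redirects=False, timeout=30)
        if resp.status_code in (301, 308):
            print(f"{page}: link to {target} redirects to "
                  f"{resp.headers.get('Location')}")
```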
Does schema.org assist with duplicate content concerns?
The issue of duplicate content has been well documented, and there are lots of articles suggesting we noindex archive pages on WordPress-powered sites. Schema.org allows us to mark up our content, including marking a component's URL. So my question, simply: is noindexing archive (category/tag) pages still relevant when considering duplicate content? These pages are in essence a list of articles, each of which can be marked up as an article or blog posting, with the URL of the main article and all the other cool stuff the schema gives us. Surely Google et al. are smart enough to recognise these article listings as gateways to the main content, therefore removing duplicate content concerns. Of course, whether or not doing this is a good idea will be subjective and based on individual circumstances - I'm just interested in whether or not the search engines can handle this appropriately.
On-Page Optimization | MarkCA0
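For reference, marking up one listing entry as the question describes might look like the sketch below - a Python snippet that emits BlogPosting JSON-LD pointing at the main article's URL (all values are hypothetical placeholders). Whether that markup actually influences duplicate-content handling is exactly the open question above; the conventional tools remain rel=canonical and noindex.

```python
import json

# All values are hypothetical placeholders for one entry in an archive listing.
posting = {
    "@context": "https://schema.org",
    "@type": "BlogPosting",
    "headline": "Example post title",
    "url": "https://example.com/blog/example-post/",
    "mainEntityOfPage": "https://example.com/blog/example-post/",
}

# Embed the output in the archive page inside:
# <script type="application/ld+json"> ... </script>
print(json.dumps(posting, indent=2))
```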