Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies.
Duplicate Content behind a Paywall
-
We have a website that is publicly visible. This website has content.
We'd like to take that same content, put it on another website, behind a paywall.
Since Google will not be able to crawl those pages behind the paywall, is there any risk to us in doing this?
Thanks!
Mike
-
Hi Mike, just to be clear on what Thomas is suggesting, as I think he might be getting mixed up between noindex and robots.txt.
If you simply add noindex,nofollow to a bunch of pages, this could still get you in trouble. Noindex doesn't mean DO NOT CRAWL; it means DO NOT INDEX. There's a big difference.
If something has noindex, Google can still crawl that content but they won't put it in the search results.
The only way to completely make sure that Google won't crawl content is by blocking it in robots.txt or, in your case, putting it behind a username and password.
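The difference between the two mechanisms is visible in how they're written. A robots.txt rule stops the crawler before it fetches the page, while a meta robots tag is only read after the page has been crawled. A minimal sketch of each (the `/members/` path is illustrative, not from the question):

```
# robots.txt — blocks crawling of everything under /members/
User-agent: *
Disallow: /members/
```

```html
<!-- meta robots — the page IS still crawled, but kept out of the index -->
<meta name="robots" content="noindex,nofollow">
```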
So to answer your question: yes, it's fine as long as it's behind a login. Google can't penalize you for it, since they can't see it.
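For completeness, a minimal sketch of the username-and-password approach Craig describes, assuming an Apache server; the directory, realm name, and .htpasswd path are all hypothetical:

```apacheconf
# .htaccess — require a login for everything in this directory,
# so crawlers (and everyone else) get a 401 without credentials
AuthType Basic
AuthName "Members Only"
AuthUserFile /path/to/.htpasswd
Require valid-user
```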
I hope this helps,
Craig
-
Make sure you set the pages to noindex,nofollow and there's no risk of Google penalizing you for duplicate content whatsoever.
Build away. I hope I have helped you.
Sincerely,
Thomas
Related Questions
-
Reusing content on different ccTLDs
We have a client with many international locations, each of which has their own ccTLD domain and website. E.g. company-name.com, company-name.com.au, company-name.co.uk, company-name.fr, etc. Each domain/website only targets their own country, and the SEO aim is for each site to only rank well within their own country. We work for an individual country's operations, and the international head office wants to re-use our content on other countries' websites. While there would likely be some optimisation of the content for each region, there may be cases where it is re-used identically. We are concerned that this will cause duplicate content issues. I've read that the separate ccTLDs should indicate to search engines that content is aimed at the different locations - is this sufficient or should we be doing anything extra to avoid duplicate content penalties? Or should we argue that they simply must not do this at all and develop unique content for each? Thanks Julian
Content Development | Bc.agency
-
Internal blog with history and some SEO value versus new external blogs with specialized content?
We operate a blog inside a folder on our site and are considering the launch of 4 highly focused blogs with specialized content, which are now categories on the internal blog. Wondering if there is more value in using the new external blogs or just continuing to grow the internal blog content. Does the fact that the internal blog is buried amongst millions of pages have any impact if we want the content indexed and value given to the links from the blog content to our main site pages?
Content Development | CondoRich
-
Should cornerstone content have 3,500 words? Does Google discern words from the main text and from the references?
Is it true that cornerstone content should have at least 3,500 words? I've done some research and found that the recommended amount is between 2,000 and 10,000. Also, the content that we create/publish has a lot of references/citations at the end of each article. Does Google discern words from the main text and from the references? Meaning, should I count references as part of the word count? Thanks for the help!
Content Development | kvillalobos
-
I want to use some content that I sent out in a newsletter and post as a blog, but will this count as duplicate content?
I want to use some content that I sent out in a newsletter a while ago - adding it as a blog to my website. The newsletter exists on a http://myemail.constantcontact.com URL and is being indexed by Google. Will this count as duplicate content?
Content Development | Wagada
-
Images & Duplicate Content Issues
Here's a scenario for you: The site is running WordPress and the images are uploaded to the media section. You can set image attributes there such as the Description & Alt Tag. Let's say you'd like to reuse the same image in two different blog posts. The image keeps the same Description & Alt Tag associated with it in the media section. Would this be considered duplicate content? What would be the best practice in this case to reuse the same image in multiple posts?
Content Development | VicMarcusNWI
-
Can We Publish Duplicate Content on Multi Regional Website / Blogs?
Today, I was reading Google's official article on multi-regional websites and the use of duplicate content. Right now, we are working on 4 different blogs for the following regions, and we're writing unique content for each blog. But I am thinking of using one piece of content / subject for all 4 regional blogs. USA: http://www.bannerbuzz.com/blog/ UK: http://www.bannerbuzz.co.uk/blog/ AUS: http://www.bannerbuzz.com.au/blog/ CA: http://www.bannerbuzz.ca/blog/ Let me give you a very clear idea of it. Recently, we published one article on the USA website: http://www.bannerbuzz.com/blog/choosing-the-right-banner-for-your-advertisement/ And we want to publish this article / blog on the UK, AUS & CA blogs without making any changes. I have read the following paragraph in Google's official guidelines, and it inspired me to consider this. Which is the best solution for it? Websites that provide content for different regions and in different languages sometimes create content that is the same or similar but available on different URLs. This is generally not a problem as long as the content is for different users in different countries. While we strongly recommend that you provide unique content for each different group of users, we understand that this may not always be possible. There is generally no need to "hide" the duplicates by disallowing crawling in a robots.txt file or by using a "noindex" robots meta tag. However, if you're providing the same content to the same users on different URLs (for instance, if both example.de/ and example.com/de/ show German language content for users in Germany), you should pick a preferred version and redirect (or use the rel=canonical link element) appropriately. In addition, you should follow the guidelines on rel-alternate-hreflang to make sure that the correct language or regional URL is served to searchers.
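The rel-alternate-hreflang annotations mentioned in the quoted guidance would look roughly like this for the four blogs. The domains are taken from the question, but the assumption that the same article path exists on each regional site is mine:

```html
<!-- Placed in the <head> of each regional copy of the article,
     telling Google which URL serves which English-speaking region -->
<link rel="alternate" hreflang="en-us" href="http://www.bannerbuzz.com/blog/choosing-the-right-banner-for-your-advertisement/" />
<link rel="alternate" hreflang="en-gb" href="http://www.bannerbuzz.co.uk/blog/choosing-the-right-banner-for-your-advertisement/" />
<link rel="alternate" hreflang="en-au" href="http://www.bannerbuzz.com.au/blog/choosing-the-right-banner-for-your-advertisement/" />
<link rel="alternate" hreflang="en-ca" href="http://www.bannerbuzz.ca/blog/choosing-the-right-banner-for-your-advertisement/" />
```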
Content Development | CommercePundit
-
Is there a way to repost content (with permission) to another site without being penalized by Google?
I write a monthly Social Media Marketing column for a local Business Journal and the column is printed in their paper as well as posted on their website. Is there any way I can repost these articles on my website's blog without being penalized by Google for "duplicate content"?
Content Development | vyki
-
Onsite Content - Word Count & KW Density
Does the word count of a webpage make a difference to search engines? Are longer word counts on pages indexed higher or given higher priority? For example, say you have 300 words of copy packed with 20 keywords, and say you also have 700 words of copy that have the same 20 keywords worked in; does Google have a preference over which one it ranks higher?
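"Keyword density" in the sense the question uses it is simply the share of words on the page that match the keyword, so the same 20 keyword occurrences yield a higher density in 300 words than in 700. A minimal Python sketch (the function name and sample text are mine, for illustration only):

```python
def keyword_density(text, keyword):
    """Fraction of words in `text` that exactly match `keyword` (case-insensitive)."""
    words = text.lower().split()
    if not words:
        return 0.0
    return words.count(keyword.lower()) / len(words)

# 100-word sample: 10 occurrences of "banner" -> density 0.1 (i.e. 10%)
copy = "banner printing " * 10 + "filler " * 80
print(keyword_density(copy, "banner"))
```

The same occurrence count in a longer text simply dilutes the ratio, which is the trade-off the question is asking about.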
Content Development | greentent