    Can PDF be seen as duplicate content? If so, how to prevent it?

    Intermediate & Advanced SEO
    • Gestisoft-Qc (Subscriber)

      I see no reason why a PDF couldn't be considered duplicate content, but I haven't seen any threads about it.

      We publish loads of product documentation provided by manufacturers, as well as white papers and case studies. These give our customers and prospects a better idea of our solutions and help them along their buying process.

      However, I'm not sure if it would be better to make them non-indexable to prevent duplicate content issues. Clearly we would prefer a solution where we benefit from the keywords in the documents.

      Does anyone have insight on how to deal with PDFs provided by third parties?

      Thanks in advance.

    • ilonka65

      It looks like Google is no longer crawling tabbed content, so if your PDFs are presented in tabs within pages, it might not be an issue: https://www.seroundtable.com/google-hidden-tab-content-seo-19489.html

    • ASriv (Subscriber)

      Sure, I understand - thanks, EGOL

    • EGOL @ASriv

      I would like to give that to you but it is on a site that I don't share in forums. Sorry.

    • ASriv (Subscriber)

      Thanks EGOL

      That would be ideal.

      For a site that has multiple authors, and with it being impractical to get a developer involved every time a web page / blog post and the PDF are created, is there a single line of code that could be used to accomplish this in .htaccess?

      If so, would you be able to show me an example please?

    • EGOL

      I assigned rel=canonical to my PDFs using .htaccess.

      Then, if anyone links to the PDFs, the link value gets passed to the webpage.
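
      [Editor's note: for anyone after the mechanics, here is a minimal sketch of what a rule like the one EGOL describes might look like, assuming an Apache server with mod_headers enabled; the file and page names are hypothetical:]

        # .htaccess: point the canonical for one PDF at its HTML twin
        <Files "white-paper.pdf">
          Header set Link '<https://www.example.com/white-paper.html>; rel="canonical"'
        </Files>

      [A single pattern-based rule covering many PDFs is trickier, since each PDF needs to point at a different page; mapping file names to page URLs generally requires mod_rewrite environment variables on top of this, so per-file blocks like the one above are the simpler route.]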

    • ASriv (Subscriber)

      Hi all

      I've been discussing the topic of making content available as both blog posts and PDF downloads today.

      Given that there is a lot of uncertainty and complexity around this issue of potential duplication, my plan is to house all the PDFs in a folder that we block with robots.txt.

      Anyone agree / disagree with this approach?
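
      [Editor's note: for reference, a folder-level block like that is a two-line robots.txt rule; the folder name here is hypothetical:]

        User-agent: *
        Disallow: /pdfs/

      [Worth remembering that robots.txt stops crawling rather than indexing: already-indexed PDFs can linger in the results for a while, and links inside blocked files can't be seen or counted, which is the trade-off Dr. Pete describes elsewhere in this thread.]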

    • Dr-Pete (Staff) @ATMOSMarketing56

      Unfortunately, there's no great way to have it both ways. If you want these pages to get indexed for the links, then they're potential duplicates. If Google filters them out, the links probably won't count. Worst case, it could cause Panda-scale problems. Honestly, I suspect the link value is minimal and outweighed by the risk, but it depends quite a bit on the scope of what you're doing and the general link profile of the site.

    • ATMOSMarketing56 (Subscriber)

      I think you can set it to public or private (logged-in only) and even put a price tag on it if you want. So yes, setting it to private would help eliminate the duplicate content issue, but it would also hide the links that I'm using to link-build.

      I would imagine that since this guide would link back to our original site, it would be no different than if someone were to copy the content from our site and link back to us with it, thus crediting us as the original source. Especially if we make sure it's indexed through GWMT before submitting it to other platforms. Any good resources that delve into that?

    • Dr-Pete (Staff)

      Potentially, but I'm honestly not sure how Scribd's pages are indexed. Don't you need to log in or something to actually see the content on Scribd?

    • ATMOSMarketing56 (Subscriber)

      What about this instance:

      (A) I made an "ultimate guide to X" and posted it on my site as individual HTML pages for each chapter.

      (B) I made a PDF version with the exact same content that people can download directly from the site.

      (C) I uploaded the PDF to sites like Scribd.com to help distribute it further, and to build links with the links that are embedded in the PDF.

      Would those all be duplicate content? Is (C) recommended or not?

    • EGOL @Gestisoft-Qc

      Thanks! I am going to look into this. I'll let you know if I learn anything.

    • Dr-Pete (Staff) @Gestisoft-Qc

      If they duplicate your main content, I think the header-level canonical may be a good way to go. For the syndication scenario, it's tough, because then you're potentially knocking those PDFs out of the rankings in favor of someone else's content.

      Honestly, I've seen very few people deal with canonicalization for PDFs, and even those cases were small or obvious (like a page with the exact same content being outranked by the duplicate PDF). It's kind of uncharted territory.

    • EGOL @Gestisoft-Qc

      Thanks for all of your input, Dr. Pete. The example that you use is almost exactly what I have - hundreds of PDFs on a fifty-page site. These PDFs rank well in the SERPs, accumulate PageRank, and pass traffic and link value back to the main site through links embedded within the PDFs. They also have natural links from other domains. I don't want to block them or nofollow them, but your suggestion of using a header directive sounds pretty good.

    • Dr-Pete (Staff) @Gestisoft-Qc

      Oh, sorry - so these PDFs aren't duplicates of your own web/HTML content so much as duplicates of the same PDFs on other websites?

      That's more like a syndication situation. It is possible that, if enough people post these PDFs, you could run into trouble, but I've never seen that. More likely, your versions just wouldn't rank. Theoretically, you could use the header-level canonical tag cross-domain, but I've honestly never seen that tested.

      If you're talking about a handful of PDFs, they're a small percentage of your overall indexed content, and that content is unique, I wouldn't worry too much. If you're talking about hundreds of PDFs on a 50-page website, then I'd control it. Unfortunately, at that point, you'd probably have to put the PDFs in a folder and outright block it. You'd remove the risk, but you'd stop ranking on those PDFs as well.

    • EGOL @Gestisoft-Qc

      @EGOL: Can you expand a bit on your Author suggestion?

      I was wondering if there is a way to do rel=author for a PDF document. I don't know how to do it and don't know if it is possible.

    • Gestisoft-Qc (Subscriber) @Dr-Pete

      To make sure I understand what I'm reading:

      • PDFs don't usually rank as well as regular pages (although it is possible)
      • It is possible to configure a canonical tag on a PDF

      My concern isn't that our PDFs may outrank the original content, but rather getting slammed by Google for publishing them.

      Am I right in thinking a canonical tag prevents the PDFs from accumulating link juice? If so, I would prefer not to use it, unless skipping it would lead to Google slamming us.

      Has anyone experienced Google retribution for publishing PDFs coming from a third party?

      @EGOL: Can you expand a bit on your Author suggestion?

      Thanks all!

    • Dr-Pete (Staff)

      I think it's possible, but I've only seen it in cases that are a bit hard to disentangle. For example, I've seen a PDF outrank a duplicate piece of regular content when the regular content had other issues (including massive duplication with other, regular content). My gut feeling is that it's unusual.

      If you're concerned about it, you can canonicalize PDFs with the header-level canonical directive. It's a bit more technically complex than the standard HTML canonical tag:

      http://googlewebmastercentral.blogspot.com/2011/06/supporting-relcanonical-http-headers.html
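
      [Editor's note: for illustration, a PDF response using that directive carries a canonical in the HTTP headers rather than in the document itself, along these lines; the URL is hypothetical:]

        HTTP/1.1 200 OK
        Content-Type: application/pdf
        Link: <https://www.example.com/white-paper.html>; rel="canonical"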

      I'm going to mark this as "Discussion", just in case anyone else has seen real-world examples.

    • EGOL

      I am really interested in hearing what others have to say about this.

      I know that PDFs can be very valuable content. They can be optimized, they rank in the SERPs, they accumulate PageRank, and they can pass link value. So, to me it would be a mistake to block them from the index...

      However, I see your point about dupe content... they could also be thin content. Will Panda whack you for thin and duplicate content in your PDFs?

      How can canonical be used... what about author?

      Anybody know anything about this?

    • MargaritaS

      Just like any other piece of duplicate content, you can use canonical link elements to specify the original piece of content (if there's indeed more than one identical piece). You could also block these types of files in robots.txt, or use noindex,follow meta tags.

      Regards,

      Margarita
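
      [Editor's note on the last point: a PDF has no HTML head, so it can't carry a noindex meta tag; for non-HTML files the equivalent is the X-Robots-Tag HTTP response header. A minimal sketch, assuming Apache with mod_headers enabled:]

        # Apply noindex (while still allowing links to be followed) to all PDFs
        <FilesMatch "\.pdf$">
          Header set X-Robots-Tag "noindex, follow"
        </FilesMatch>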
