How we help users and webmasters with duplicate content
We've designed algorithms to help prevent duplicate content from negatively affecting webmasters and the user experience.
1. When we detect duplicate content, such as through variations caused by URL parameters, we group the duplicate URLs into one cluster.
2. We select what we think is the "best" URL to represent the cluster in search results.
3. We then consolidate properties of the URLs in the cluster, such as link popularity, to the representative URL.

Consolidating properties from duplicates into one representative URL often provides users with more accurate search results.
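To make the clustering step concrete, here is a minimal sketch in Python of how parameter-driven duplicates could be grouped under one representative URL. The tracking-parameter list and the "shortest URL wins" heuristic are illustrative assumptions, not the algorithm we actually use:

```python
from collections import defaultdict
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Hypothetical parameters that don't change what the page displays.
TRACKING_PARAMS = {"sessionid", "utm_source", "utm_medium", "ref"}

def cluster_key(url: str) -> str:
    """Normalize a URL so parameter-only variants share the same key."""
    parts = urlsplit(url)
    # Drop tracking parameters and sort the rest for a stable key.
    query = sorted((k, v) for k, v in parse_qsl(parts.query)
                   if k.lower() not in TRACKING_PARAMS)
    return urlunsplit((parts.scheme, parts.netloc, parts.path,
                       urlencode(query), ""))

def cluster_duplicates(urls):
    """Group URLs into clusters and pick a representative for each."""
    clusters = defaultdict(list)
    for url in urls:
        clusters[cluster_key(url)].append(url)
    # Stand-in policy: the shortest member represents the cluster.
    return {min(members, key=len): members
            for members in clusters.values()}

urls = [
    "https://example.com/shoes",
    "https://example.com/shoes?utm_source=newsletter",
    "https://example.com/shoes?color=red",
    "https://example.com/shoes?color=red&sessionid=a1b2",
]
for representative, members in cluster_duplicates(urls).items():
    print(representative, "<-", members)
```

In a real system the representative is chosen from signals such as redirects and internal linking rather than URL length; the point here is only that every parameter variant maps to one cluster.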
If you find you have duplicate content of the kind described above, can you help search engines understand your site?
First, don't worry: many sites on the web use URL parameters, and for valid reasons. But yes, you can help reduce potential problems for search engines by:
1. Removing unnecessary URL parameters -- keep the URL as clean as possible (see the first sketch after this list).
2. Submitting a Sitemap with the canonical (i.e. representative) version of each URL. While we can't guarantee that our algorithms will display the Sitemap's URL in search results, it's helpful to indicate the canonical preference (see the second sketch after this list).
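For the first suggestion, a minimal sketch of parameter cleanup, assuming a hypothetical whitelist of parameters that actually affect the page's content:

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Hypothetical whitelist: only parameters that change what is rendered.
CONTENT_PARAMS = {"color", "size", "page"}

def clean_url(url: str) -> str:
    """Drop any query parameter that isn't needed to render the page."""
    parts = urlsplit(url)
    query = [(k, v) for k, v in parse_qsl(parts.query)
             if k in CONTENT_PARAMS]
    return urlunsplit((parts.scheme, parts.netloc, parts.path,
                       urlencode(query), parts.fragment))

print(clean_url("https://example.com/shoes?color=red&sessionid=a1b2"))
# -> https://example.com/shoes?color=red
```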
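For the second suggestion, a minimal sketch that writes one canonical URL per cluster into a Sitemap file; the URLs and output filename are placeholders:

```python
import xml.etree.ElementTree as ET

def write_sitemap(canonical_urls, path="sitemap.xml"):
    """Write a minimal Sitemap listing the canonical version of each URL."""
    urlset = ET.Element(
        "urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for url in canonical_urls:
        # One <url><loc>...</loc></url> entry per canonical URL.
        ET.SubElement(ET.SubElement(urlset, "url"), "loc").text = url
    ET.ElementTree(urlset).write(path, encoding="utf-8",
                                 xml_declaration=True)

write_sitemap([
    "https://example.com/shoes",
    "https://example.com/shoes?color=red",
])
```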