Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we’re not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies. More details here.
Should I disable the indexing of tags in WordPress?
-
Hi,
I have a client that is publishing 7 or 8 news articles and posts each month. I am optimising selected posts and I have found that they have been adding a lot of tags (almost like using hashtags).
There are currently 29 posts but already 55 tags, each of which has its own archive page, and all of which are added to the site map to be indexed (https://sykeshome.europe.sykes.com/sitemap_index.xml).
I came across an article (https://crunchify.com/better-dont-use-wordpress-tags/) that suggested that tags add no value to SEO ranking, and as a consequence Wordpress tags should not be indexed or included in the sitemap.
I haven't been able to find much more reliable information on this topic, so my question is - should I get rid of the tags from this website and make the focus pages, posts and categories (redirecting existing tag pages back to the site home page)?
It is a relatively new website and I am conscious of the fact that category and tag archive pages already substantially outnumber actual content pages (posts and news) - I guess this isn't optimal.
I'd appreciate any advice.
Thanks
-
Yes, it would be best to turn the tags option off. It worked well for my site, Shillong Teer Club chart, for example.
-
Disabling the indexing of tags in WordPress can be beneficial for SEO purposes, as it prevents search engines from indexing individual tag pages, which may otherwise lead to duplicate content issues. However, whether to disable tag indexing depends on your specific website goals and content structure. If you use tags sparingly and they add value to your site's organization, leaving them indexed may be beneficial. Evaluate your SEO strategy and content structure to determine the best approach for your WordPress site.
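If it helps, here is a minimal sketch of one way to do this without a plugin, using the wp_robots filter that WordPress core added in 5.7 (if you run Yoast or another SEO plugin, it normally has a "show tags in search results" style toggle that achieves the same thing, so check there first). Treat this as an illustration to drop into a child theme's functions.php rather than a finished implementation:

// Add a noindex,follow robots meta tag on tag archive pages only (WordPress 5.7+).
add_filter( 'wp_robots', function ( array $robots ) {
    if ( is_tag() ) {
        $robots['noindex'] = true; // keep tag archives out of the index
        $robots['follow']  = true; // but still let crawlers follow the links on them
    }
    return $robots;
} );

Note that Google has to be able to crawl a page to see the noindex, so don't block the tag URLs in robots.txt at the same time.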
-
I'm having the same problem right now. My site is fairly new but it was already receiving some organic traffic; then, all of a sudden, traffic sank like a brick and I can't find the answer why.
The only thing that changed was adding tags to blog posts, which I think might be creating duplicate content, so I'm proceeding to disable those. I will leave categories alive for the moment because they were bringing traffic, but if nothing changes I will de-index them as well. The site in question is sluthpass; I hope I can recover traffic after disabling those annoying tags.
-
Hello experts, I have disabled the tags but they are still showing in Google. What should I do now? Here you can check redeemcodecenter.com. Thanks in advance.
-
If you have a large number of tags that don't add clear value to your site's content, disabling tag indexing in WordPress can be beneficial for search engine optimization. However, if your tags are well-curated and provide meaningful navigation for your users, enabling indexing can improve discoverability and site organization. Evaluate the relevance and usefulness of your tags, considering both SEO considerations and user experience before making a decision.
-
I experienced the same problem. I once read an article saying that Google prefers websites with a neat structure, and in my opinion it is difficult to make tags more structured on my website. My site is malasngoding.com. What do you think?
-
I was also looking for an answer to the same question for my website https://abcya.in/. I think tags should not be indexed.
-
I don't think it's a good idea. I'm testing tons of articles with and without tags for my website, Dizzibooster. It seems that adding tags will provide an edge for indexing purposes. However, you can test these things yourself.
-
I am facing the same issue on my website, AmazingFactsHindi. As per our expert discussion, and after reading this forum, I decided to de-index all the tag and category pages that are creating duplicate content issues on our website.
-
We had similar questions on SEO. We experimented with disabling tags for the last 4 weeks. The only impact I have been able to find so far is that the thecodebuzz website (https://www.thecodebuzz.com/) did not get hits for a few impressions which were based on tag keywords. We are still evaluating the impact.
-
Heyo,
If your tags and categories are providing value to your users or helping with your site's SEO, you might not want to remove them from search engine indexes. I disabled tag indexing on my site, OceanXD, and it was a good decision for me.
-
I had the same question, but I have found that blocking category and tag pages is good for SEO.
I have a blog, Tech News Blog, where I created around 400 tags, and I saw that this was creating duplicate content issues. In my opinion, de-indexing tags and categories is better for SEO.
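If you de-index them, it is also worth keeping them out of the XML sitemap so you are not sending mixed signals. A rough sketch, assuming the built-in WordPress sitemap (wp-sitemap.xml) introduced in 5.5 - if Yoast is generating your sitemap_index.xml, you would instead switch the taxonomies off in its Search Appearance settings, which typically handles both the noindex and the sitemap entry:

// Drop tag (and optionally category) archives from the core wp-sitemap.xml.
add_filter( 'wp_sitemaps_taxonomies', function ( array $taxonomies ) {
    unset( $taxonomies['post_tag'] ); // remove tag archive URLs from the sitemap
    unset( $taxonomies['category'] ); // optional: remove category archives too
    return $taxonomies;
} );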
-
@JCN-SBWD You can index your tags as long as it doesn't affect the indexing of your posts. Tags do get traffic as well. The only reason I stopped indexing my tags is that it affected the indexing of my posts: tags got indexed in a matter of minutes, while it took hours, sometimes days, before my posts got indexed.
-
I would recommend disabling tag indexing, as there are cases where you have multiple tags for the same topic. You can keep categories indexed; as mentioned above, they are more structured and help define your website. If you write a custom excerpt for each post, it also helps the category pages show unique content for each post excerpt.
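To illustrate the excerpt point, the idea is to print the hand-written excerpt in the category/archive loop instead of the full post, so the category page carries summary text that isn't a straight copy of the posts. A minimal sketch of a theme loop (your theme's category.php or archive.php will differ):

// Simplified loop for a category archive template.
while ( have_posts() ) {
    the_post();
    the_title( '<h2>', '</h2>' ); // post title as a heading
    the_excerpt(); // uses the custom excerpt if one was written, otherwise an auto-trimmed one
}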
-
It's a good idea to block tags, since they are duplicate content and may dilute the performance of your real pages. But if you find that certain tag or author pages bring valid traffic, you can make an exception for them. It's up to you.
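If you want that exception, it can be baked into the same kind of wp_robots snippet shown earlier in the thread: noindex every tag archive except a short whitelist of slugs that genuinely earn traffic. The slugs below are hypothetical placeholders, purely to sketch the idea:

// Noindex all tag archives except a whitelist of tag slugs (placeholders).
add_filter( 'wp_robots', function ( array $robots ) {
    $keep_indexed = array( 'case-studies', 'press' ); // hypothetical tag slugs to leave indexable
    if ( is_tag() && ! is_tag( $keep_indexed ) ) {
        $robots['noindex'] = true;
        $robots['follow']  = true;
    }
    return $robots;
} );

The same pattern works for author archives by swapping in is_author().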
-
Can you please explain what exactly you do?
-
Many thanks for the prompt response and also for confirming my suspicions, it is much appreciated.
The robots suggestion is handy too.
-
Personally I usually do this as well as blocking them in robots.txt to save on crawl allowance, but you should no-index first: if Google is blocked from crawling (robots.txt), how will it find the no-index tags? So it needs to be staggered (rough sketch at the end of this post).
I find that the tag URLs result in quite messy SERPs, so I prefer to de-index those and then really focus on adding value to 'actual' category URLs. Because categories have a defined structure, they're better for SEO (IMO).
Categories are usually good for SEO if you tune and tweak them (and if their architecture is linear), but tags are very messy.
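For the staggered part: once Search Console shows the tag URLs have actually dropped out of the index, the crawl-saving block can go in. A sketch using WordPress's own robots_txt filter - note this only affects the virtual robots.txt, so if a physical robots.txt file already exists in the web root you would edit that file instead, and /tag/ assumes the default tag base:

// Step 2 only: add this AFTER the noindexed tag pages have left the index,
// otherwise Google can no longer crawl them to see the noindex in the first place.
add_filter( 'robots_txt', function ( $output, $public ) {
    if ( $public ) { // only when the site is set to be visible to search engines
        $output .= "\nUser-agent: *\nDisallow: /tag/\n";
    }
    return $output;
}, 10, 2 );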