Is noindex,follow a waste of link juice?
-
On my WordPress shopping cart plugin I have three pages, /account, /checkout and /terms, on which I have added the "noindex,follow" meta tag. But I think I may be wasting link juice on these pages: they are not going to be indexed anyway, so is there any point in giving them any link juice? I can change the tag on the pages themselves to "noindex,nofollow". However, the anchor text links to these pages in the site header will remain followed, as I have no means of amending that right now. So this presents the following two scenarios:
– No juice flows from the homepage to these 3 pages (GOOD) – this would be perfect, as the pages themselves have the nofollow attribute.
– Juice flows from the homepage to these pages (BAD) – the juice flows from the homepage anchor text links into these 3 pages BUT then STOPS there, because they have the "nofollow" attribute. This would be the bigger problem; if this is the case and I can't stop the juice from flowing in, then I'd rather let it flow out to other pages.
Hope you understand my question; any input is very much appreciated. Thanks
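For reference, here is roughly what the markup in question looks like (just a sketch – the paths and anchor text are placeholders, not the plugin's actual output):

<!-- In the <head> of /account, /checkout and /terms: the current setup -->
<meta name="robots" content="noindex, follow" />

<!-- The alternative I am considering: keep the page out of the index AND stop juice flowing out of it -->
<meta name="robots" content="noindex, nofollow" />

<!-- In the site header template: these anchors stay followed, so juice still flows in -->
<a href="/account">My Account</a>
<a href="/checkout">Checkout</a>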
-
If you noindex a page, link juice will still flow into that page. If you nofollow it, juice will still flow in but will not flow back out of it again.
You should use noindex,follow if you want the link juice to flow back out to your indexed pages. Even then, some link juice will be lost – the portion that stays on the noindex page.
I tried as well and could not find it, but here is a quote from Matt Cutts:
"Eric Enge: Can a NoIndex page accumulate PageRank?
Matt Cutts: A NoIndex page can accumulate PageRank, because the links are still followed outwards from a NoIndex page.
Eric Enge: So, it can accumulate and pass PageRank.
Matt Cutts: Right, and it will still accumulate PageRank, but it won't be showing in our Index. So, I wouldn't make a NoIndex page that itself is a dead end. You can make a NoIndex page that has links to lots of other pages.
For example you might want to have a master Sitemap page and for whatever reason NoIndex that, but then have links to all your sub Sitemaps.
Eric Enge: Another example is if you have pages on a site with content that, from a user point of view, you recognize it's valuable to have the page, but you feel that it is too duplicative of content on another page on the site.
That page might still get links, but you don't want it in the Index and you want the crawler to follow the paths into the rest of the site.
Matt Cutts: That's right. Another good example is, maybe you have a login page, and everybody ends up linking to that login page. That provides very little content value, so you could NoIndex that page, but then the outgoing links would still have PageRank.
Now, if you want to you can also add a NoFollow metatag, and that will say don't show this page at all in Google's Index, and don't follow any outgoing links, and no PageRank flows from that page. We really think of these things as trying to provide as many opportunities as possible to sculpt where you want your PageRank to flow, or where you want Googlebot to spend more time and attention."
http://www.stonetemple.com/articles/interview-matt-cutts.shtml
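To make the sitemap example concrete, a noindexed hub page along these lines would stay out of the index while its outgoing links can still be followed (the URLs are hypothetical, purely for illustration):

<!-- In the <head> of the master sitemap page that should stay out of the index -->
<meta name="robots" content="noindex, follow" />

<!-- In the body: these links are still followed and can pass PageRank -->
<a href="/sitemap-products.html">Product sitemap</a>
<a href="/sitemap-articles.html">Article sitemap</a>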
-
I just wanted to share that I completely agree with EGOL and the understanding he shared. I skipped responding to this question because I didn't want to write out all of the necessary disclaimers, but EGOL tackled it anyway and offered great detail in both his original reply and the follow-up.
-
Great answer. In this specific case, I too have the "noindex, follow" attribute on the pages I do not want indexed.
Regarding competitors – I study them, both on-site and their link profiles, especially the successful ones, to learn from them. Most of the SEO strategies I've learned have come from reading forums, blogs, etc., and people there quite often have conflicting views. So I try to find real-life examples of things that are quite likely working for a successful site, look for a pattern, and where I spot one, I try to implement it on my own sites.
You, on the other hand, have experience and proven philosophies :) – something I am dying to acquire.
Thanks
-
Here is a philosophy that I have... (I am not trying to be a wise guy... just sayin'....)
I don't pay a lot of attention to the methods used by my competitors. Instead I decide what I think will work best for me and then do it.
Right now I have pages on my site that I don't want in the search engines' index. So I have code on them as follows....
name="robots" content="noindex, follow" />
I believe that code keeps them out of the index but allows pagerank to flow through them to other pages. I offer that here so that anyone can tell me if it is wrong.
I welcome anyone who can set me straight or anyone who can suggest a better method.
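If you are on WordPress, as the original poster is, one way to print that tag only on selected pages is a small wp_head hook in the theme's functions.php. This is just a sketch – the function name and the page slugs are assumptions taken from the question, not code from my own site:

<?php
// In the theme's functions.php: output a robots meta tag on the pages to keep out of the index
function my_noindex_follow_meta() {
    // is_page() accepts an array of page slugs; these three come from the original question
    if ( is_page( array( 'account', 'checkout', 'terms' ) ) ) {
        echo '<meta name="robots" content="noindex, follow" />' . "\n";
    }
}
add_action( 'wp_head', 'my_noindex_follow_meta' );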
However, I am not going to look at my competitors and try to figure out what they are doing because there is a very good chance that they don't know what they are doing. (I think your competitors don't know what they are doing.)
I have absolutely no problem with doing things differently from my competitors. In fact I think that mimicking them is the best way to finish behind them.
-
EGOL, thank you so much for your input; I really value your opinion. However, I have a follow-up question. I may have muddled things up here, but here it is –
Many of my successful competitors in various niches have added rel=nofollow to certain internal links.
For example –
1. On the homepage of this WordPress site, the anchor text links to the WP tag pages have rel=nofollow. The tag pages themselves are "noindex,follow".
2. Also, all links in the header are rel=nofollow. The only followed links are to post pages, and the post pages are being used for navigation.
Any page targeted by a rel=nofollow anchor is itself "noindex,follow". Nowhere has "nofollow" been added to a whole page; it appears only on certain anchor text links.
Is that slightly different from making the whole page nofollow? Because here only specific pages are being stopped from receiving any link juice.
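To spell out the difference I'm asking about, here is how the two forms look (illustrative markup only):

<!-- rel="nofollow" on a single anchor: only this one link is affected -->
<a href="/tag/widgets/" rel="nofollow">widgets</a>

<!-- meta robots nofollow in a page's <head>: every outgoing link on that page is affected -->
<meta name="robots" content="noindex, nofollow" />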
-
I am going to explain this as I understand it. I could be wrong on some of the details for one of two reasons.... 1) I am simply wrong... or.... 2) I am correct according to what the search engines have said in public, but they are doing something different in practice.
When nofollow was first introduced, a lot of people used it to "sculpt" the flow of pagerank. At the time, some search engine employees said that pagerank did not flow through nofollowed links. That is how the search engines that made public statements about it were supposed to be treating it in the beginning.
Later we learned that Google (and maybe other search engines) changed their mind about how they handle nofollow: the change was to evaporate ALL of the pagerank that would have flowed through a nofollow link. In that situation it would be a bad idea to use nofollow, because that pagerank is permanently lost.
Do they still handle nofollow links that way? I don't know.
However...... as I currently understand it, if you designate a page as noindex,follow then pagerank flows into that page and through the links on that page. This would conserve any pass-through pagerank, but it would mean losing whatever pagerank is retained by the page itself (or maybe it all passes through, since the page is noindexed - I don't know).
So, if I had pages that I wanted to link to on my site but didn't want in the index, I would use noindex,follow to allow the pagerank that flows into those pages to pass through to other pages on my site. But I would never be sure that it really works that way. Also, keep in mind that there are numerous search engines and there could be many different ways of treating these links - and pagerank is a substance unique to Google.
If anyone understands this differently or suspects that it does not work as explained, please let us know.