Link Juice + multiple links pointing to the same page
-
Scenario
The website has a menu consisting of 4 links: Home | Shoes | About Us | Contact Us
Additionally, within the body content we write about various shoe types. We create a link with the anchor text "Shoes" pointing to www.mydomain.co.uk/shoes.
In this simple example, we have 2 instances of the same link pointing to the same URL location.
We have 4 unique links.
In total we have 5 on-page links.
Question
How many links would Google count as part of the link juice model?
How would the link juice be weighted in terms of percentages?
Would changing the anchor text in the body content to, say, "fashion shoes" have a different impact?
Any other advice or best practice would be appreciated.
Thanks Mark
-
Hi Remus & Kurt,
Thank you for your advice.
Mark
-
Remus's answer is good. I would add that Google has a "first link" filter. If you have two links pointing from page A to page B, Google only passes link authority (PageRank) and reputation (keywords in the anchor text and relevant surrounding text) through the first link that appears in the code. The second link does not pass anything. So whatever the anchor text of the first link in the code is, that's the anchor text Google is going to use (and Remus is right that anchor text has become less important).
The second link does, however, dilute the amount of PageRank passed. So, as Remus pointed out, each link in your scenario only passes 20% of the PageRank. Since Google ignores the second link to the Shoes page, that 20% of PageRank does not get passed; I'm not sure whether it stays on the page or just gets lost.
So, what does this all mean? From an SEO standpoint, if you have more than one link to a page, you want the link with the targeted keyword to come first in the code, and ideally you don't want two links to the same page on one page at all. That's the SEO perspective. From a user perspective, it may make perfect sense to have that second link, and the page may convert better with it. You just have to decide which matters more, and it's probably the user perspective.
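To put rough numbers on Mark's scenario, here is a minimal Python sketch of the simplified model described above: PageRank is split evenly across all on-page links, and a first-link filter means only the first link to a given target URL (in source-code order) passes its share and its anchor text. This is only an illustration of that model, not Google's actual algorithm; the starting PageRank of 1.0, the URLs for the other menu items, and the "duplicate share is simply lost" assumption are all hypothetical.

```python
# Toy model of the "even split + first-link filter" idea discussed above.
# Illustration only -- NOT Google's actual algorithm; all values are hypothetical.

def distribute_pagerank(links, page_rank=1.0):
    """links: list of (anchor_text, target_url) tuples in source-code order."""
    share = page_rank / len(links)   # even split across all on-page links
    passed = {}                      # target_url -> what the first link passes
    lost = 0.0                       # share attached to duplicate links
    for anchor, url in links:
        if url not in passed:
            # First link to this target: passes its share and its anchor text.
            passed[url] = {"pagerank": share, "anchor": anchor}
        else:
            # Duplicate target: in this toy model its share is simply not passed.
            lost += share
    return share, passed, lost

# Mark's scenario: 4 menu links + 1 in-content link = 5 on-page links.
links = [
    ("Home", "http://www.mydomain.co.uk/"),
    ("Shoes", "http://www.mydomain.co.uk/shoes"),          # first link to /shoes
    ("About Us", "http://www.mydomain.co.uk/about"),
    ("Contact Us", "http://www.mydomain.co.uk/contact"),
    ("fashion shoes", "http://www.mydomain.co.uk/shoes"),  # duplicate target
]

share, passed, lost = distribute_pagerank(links)
print(f"Each of the {len(links)} links gets {share:.0%} of the page's PageRank")
for url, info in passed.items():
    print(f"{url}: receives {info['pagerank']:.0%} via anchor {info['anchor']!r}")
print(f"Share attached to duplicate links (not passed in this model): {lost:.0%}")
```

In this toy model the duplicate link's 20% simply disappears, which matches the "stays on the page or just gets lost" uncertainty above; swap in whatever assumption you prefer.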
Kurt Steinbrueck
OurChurch.Com
-
Hi Mark, really good questions.
- How many links would Google count as part of the link juice model?
There are a lot of opinions on this subject and no clear answer (it's really hard to test). Some time ago Google changed how the "nofollow" attribute is handled on internal links, to cut the advantage SEOs gained through "PageRank sculpting" (nofollowed links still use up a share of PageRank; they just don't pass it on). I think they did this so that search engine optimizers don't have a big advantage over standard websites. My personal opinion is that, in terms of link juice spent, Google would count 5 links, but the benefiting page won't get double the value; Google would probably only count the advantages of one of those links, whichever is best (likely the one in the content). On the other hand, the link juice lost is not that important, since the rest of the pages won't necessarily need to rank for popular terms.
I think in-content links bring far more advantages than just the "juice" and the anchor text. The neighbouring text is also important, as is the fact that the link sits within a block of text. It also brings value to users, who might want to see all the shoe models while reading about them. I think you should definitely use this approach; just make sure you don't take it to an extreme.
-
20% to each link, but the Shoes page won't get 2 × 20% from those two duplicate links; it might get something like 25%, plus some other advantages (personal opinion!).
-
Changing the anchor text had some effect in the past, but anchor text has been carrying less and less weight recently. It probably still has value; it's still an important ranking factor in 2013, and I would use it if I were you. But I would take it a step further and also think about the words surrounding the link. Try to link from all the relevant sections of the website, and as you point to the Shoes page from different contexts, the anchor text will vary naturally. For example, you could link through "shoe collection" from an article that compares your shoes with competitors' shoes.
I wrote an article for YouMoz a few years ago; some concepts might be a bit outdated because the ranking factors have changed a lot since then, but it might give you some ideas to explore from a new perspective:
-> An Intelligent Way to Plan Your Internal Linking Structure