Should 301-ed links be removed from sitemap?
-
In an effort to do some housekeeping on our site, we want to change the URL format for a couple thousand links. Those links will all be 301 redirected to corresponding links in the new URL format. For example, old URLs such as /tag/flowers and /search/flowers will be 301-ed to the new URL format /content/flowers.

Question: Since the old links also exist in our sitemap, should we add the new links to the sitemap in addition to the old links, or replace the old links with the new ones? We just want to make sure we don't lose the ranking we currently have for the old links.

Any help would be appreciated. Thanks!
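As an illustration only (the domain below is a placeholder and the requests library is just one convenient option), here is a minimal Python sketch for spot-checking that each old URL answers with a single 301 hop to the new /content/ format before the sitemap is changed:

```python
import requests

# Hypothetical examples -- substitute your own domain and old URLs.
BASE = "https://www.example.com"
OLD_PATHS = ["/tag/flowers", "/search/flowers"]

for path in OLD_PATHS:
    # allow_redirects=False so we see the redirect itself, not the final page.
    resp = requests.head(BASE + path, allow_redirects=False, timeout=10)
    target = resp.headers.get("Location", "")
    verdict = "OK" if resp.status_code == 301 and "/content/" in target else "CHECK"
    print(f"{path}: {resp.status_code} -> {target or 'no Location header'} [{verdict}]")
```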
-
I'm going to disagree a little bit with the other commenters. I've done quite a few large-scale redirect projects, and I'm not 100% opposed to using a "dirty sitemap" for a short duration. The better option is to leave some internal links pointed at the old URLs. I know what the search engines say, but I also know what I've experienced when it comes to getting 301'd links crawled again.
Read this post by Everett Sizemore for more info on what I'm describing:
http://cloudz.click/blog/uncrawled-301s-a-quick-fix-for-when-relaunches-go-too-well
-
"A sitemap should only contain links to active pages."
Hi shawn81
Alex is absolutely correct there.
In fact, Duane Forrester has said repeatedly that Bing absolutely does not like to find such pages in a sitemap and that you should make sure there are never 3XX, 4XX or 5XX status pages included because it will stop Bingbot from crawling your site.
While Googlebot is not so sensitive, the reality is that all search engines allocate a certain amount of crawl capacity to your site. If your sitemaps include a load of pages that are not likely to be indexed, the result is twofold:
- you are wasting capacity on useless pages and the crawler may never get to the stuff you really want indexed
- if the crawler encounters a lot of non-active pages when it crawls, future crawl capacity (not to mention trust) is likely to be reduced
Replace the old URLs with the new ones and give the bots a little thrill of adventure.
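For anyone wanting to act on this, here is a rough Python sketch of the kind of sitemap audit being described: pull every <loc> out of sitemap.xml and flag anything that does not answer 200. The sitemap URL is a placeholder and the requests library is just one way to fetch things; treat it as a starting point, not a finished tool.

```python
import requests
import xml.etree.ElementTree as ET

SITEMAP_URL = "https://www.example.com/sitemap.xml"  # placeholder
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

root = ET.fromstring(requests.get(SITEMAP_URL, timeout=10).content)
for loc in root.findall(".//sm:loc", NS):
    url = loc.text.strip()
    status = requests.head(url, allow_redirects=False, timeout=10).status_code
    if status != 200:
        # Anything 3XX/4XX/5XX is a candidate for removal or replacement.
        print(status, url)
```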
Hope that helps,
Sha
-
There shouldn't be any 301 links in a sitemap. A sitemap should only contain links to active pages. So in your case, you should remove all the 301 links and replace them with the new links.
A couple of notes: having 301 links in your sitemap won't hurt your site or SEO unless the sitemap is so huge that you need to split it up into multiple files. But you should really only have the final links in the sitemap; neither people nor bots want to be redirected around. If you've set up the 301s properly, the crawlers will automatically update their links.
Changing links around in the sitemap generally won't hurt your site, especially if the old links no longer exist and you're improving the list. There are very few cases where making changes will hurt the site.
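A hedged sketch of the "replace them with the new links" step, assuming the sitemap is a plain XML file you can edit locally (file names are placeholders): follow each URL's redirect chain and write the final destination back into the sitemap.

```python
import requests
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
ET.register_namespace("", NS)  # keep the default namespace when re-saving

tree = ET.parse("sitemap.xml")  # assumes a local copy of the sitemap
for loc in tree.getroot().iter(f"{{{NS}}}loc"):
    old_url = loc.text.strip()
    final_url = requests.head(old_url, allow_redirects=True, timeout=10).url
    if final_url != old_url:
        loc.text = final_url  # swap the 301'd URL for its final destination

tree.write("sitemap.updated.xml", encoding="utf-8", xml_declaration=True)
```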
-
We have had a problem with this ourselves. We put a 301 redirect on our domain when we were building a new site (went from new. to www.) and search engines are still crawling the new. domain. Bing Webmaster Tools registers it as an error because it can't find the old site. I would lean toward removing the redirected URLs from the sitemap, just because your users are probably being redirected somewhere they wouldn't necessarily want to go.
Related Questions
-
Can I use a 301 redirect to pass 'back link' juice to a different domain?
Hi, I have a backlink from a high DA/PA government website pointing to www.domainA.com, which I own and can set up 301 redirects on if necessary. However, www.domainA.com is not used and has no active website (but has hosting available which can 301 redirect). www.domainA.com is also contextually irrelevant to the backlink. I want the government website's link to go to www.domainB.com, which is both the relevant site and the one that should be benefiting from the SEO juice from the backlink. So far I have had no luck getting the government website's administrators to change the URL on the link to point to www.domainB.com. Q1: If I use a 301 redirect on www.domainA.com to redirect to www.domainB.com, will most of the backlink's SEO juice still be passed on to www.domainB.com? Q2: If the answer to the above is yes, would there be a benefit to taking this a step further and redirecting www.domainA.com to a deeper directory on www.domainB.com which is even more relevant? I.e. redirect www.domainA.com to www.domainB.com/categoryB, passing the link juice deeper.
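In practice this 301 would normally be configured in domainA's hosting control panel or web server rather than in application code, but as a minimal sketch of what the redirect itself does (the target URL is taken from Q2 in the question, everything else is hypothetical), the only difference between Q1 and Q2 is the Location target:

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

TARGET = "https://www.domainB.com/categoryB"  # or just https://www.domainB.com for Q1

class PermanentRedirect(BaseHTTPRequestHandler):
    def do_GET(self):
        # 301 = permanent, which is what passes the old URL's equity to the target.
        self.send_response(301)
        self.send_header("Location", TARGET)
        self.end_headers()

if __name__ == "__main__":
    HTTPServer(("", 80), PermanentRedirect).serve_forever()
```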
Technical SEO | DGAU
-
Upgrade old sitemap to a new sitemap index. How to do this without danger?
Hi MOZ users and friends. I have a website with a PHP template we developed ourselves, and a WordPress blog in the /blog/ subdirectory. Currently we have a sitemap.xml file in the root domain containing all the site sections and the blog's posts. We update the sitemap manually once a month, adding the new posts created on the blog. I want to automate this process, so I created a sitemap index with two sitemaps inside it: one is the old sitemap without the blog's posts, and the other is a new one created with the "Google XML Sitemap" WordPress plugin, inside the /blog/ subdirectory. That is, in the sitemap_index.xml file I have: Domain.com/sitemap.xml (the old sitemap, after removing the blog post URLs) and Domain.com/blog/sitemap.xml (the auto-updating sitemap created with the Google XML plugin). Now I have to submit this sitemap index to Google Search Console, but I want to be completely sure about how to do this. I think the only thing I have to do is delete the old sitemap in Search Console and upload the new sitemap index - is that OK?
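A sitemap index with those two entries could also be generated rather than hand-edited; here is a rough Python sketch of the structure described (file names are taken from the question, the domain is a placeholder). The resulting sitemap_index.xml is the file that gets submitted in Search Console.

```python
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
ET.register_namespace("", NS)

index = ET.Element(f"{{{NS}}}sitemapindex")
for child in ("https://domain.com/sitemap.xml",        # old sitemap, blog posts removed
              "https://domain.com/blog/sitemap.xml"):  # kept fresh by the blog plugin
    entry = ET.SubElement(index, f"{{{NS}}}sitemap")
    ET.SubElement(entry, f"{{{NS}}}loc").text = child

ET.ElementTree(index).write("sitemap_index.xml", encoding="utf-8", xml_declaration=True)
```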
Technical SEO | ClaudioHeilborn
-
Updating inbound links vs. 301 redirecting the page they link to
Hi everyone, I'm preparing myself for a website redesign and finding conflicting information about inbound links and 301 redirects. If I have a URL (we'll say website.com/website) that is linked to by outside sources, should I get those outside sources to update their links when I change the URL to website.com/webpage? Or is it just as effective from a link juice perspective to simply 301 redirect the old page to the new page? Are there any other implications to this choice that I may want to consider? Thanks!
Technical SEO | Liggins
-
301 vs 302 & Link Juice
Has anyone come across any recent cases of a 302 link passing more link juice than before?
Technical SEO | CeeC-Blogger
-
Removing Redirected URLs from XML Sitemap
If I'm updating a URL and 301 redirecting the old URL to the new URL, Google recommends I remove the old URL from our XML sitemap and add the new URL. That makes sense. However, can anyone speak to how Google transfers the ranking value (link value) from the old URL to the new URL? My suspicion is this happens outside the sitemap. If Google already has the old URL indexed, the next time it crawls that URL, Googlebot discovers the 301 redirect and that starts the process of URL value transfer. I guess my question revolves around whether removing the old URL (or the timing of the removal) from the sitemap can impact Googlebot's transfer of the old URL value to the new URL.
Technical SEO | RyanOD
-
What are link schemes?
Hello Friends, today I am reading about link schemes at http://support.google.com/webmasters/bin/answer.py?hl=en&answer=66356. It covers several ways to avoid Google penalties and also talks about low-quality links. But I can't understand the part about "Low-quality directory or bookmark site links". Is it talking about low PageRank, low Alexa rank, or something else?
Technical SEO | KLLC
-
HTML Sitemap Pagination?
I'm creating an A-to-Z-style directory of internal pages within a site of mine; however, there are cases where there are over 500 links on a page. I intend to use pagination (rel=next/prev) to avoid too many links on the page, but am worried about indexation issues. Should I be worried?
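As an illustration of the approach being described (the link list, page size, and file names below are made up), a small Python sketch that splits a long directory into pages and emits the rel=prev/next hints for each one:

```python
# Placeholder link list and page size -- adjust to the real directory.
LINKS = [f"/internal-page-{i}" for i in range(520)]
PER_PAGE = 100
pages = [LINKS[i:i + PER_PAGE] for i in range(0, len(LINKS), PER_PAGE)]

for n, chunk in enumerate(pages, start=1):
    head = []
    if n > 1:
        head.append(f'<link rel="prev" href="/directory/page-{n - 1}">')
    if n < len(pages):
        head.append(f'<link rel="next" href="/directory/page-{n + 1}">')
    body = "\n".join(f'<a href="{href}">{href}</a>' for href in chunk)
    html = "<head>\n" + "\n".join(head) + "\n</head>\n<body>\n" + body + "\n</body>"
    with open(f"directory-page-{n}.html", "w", encoding="utf-8") as fh:
        fh.write(html)
```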
Technical SEO | DMGoo
-
Add to Cart Link
We have shopping cart links (<a href>s, not input buttons) that link to a URL along the lines of /cart/add/123&return=/product/123. The SEOMoz site crawls are flagging these as a massive number of 302 redirects, and I also wonder what sort of effect this is having on link juice flowing around the site. I can see several possible solutions:
- Make the links nofollow
- Make the links input buttons
- Block /cart/add with robots.txt
- Make the links 301 instead of 302
- Make the links javascript (probably worst case)
All of these would result in an identical outcome for the UX, but are very different solutions. What would you suggest?
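Not a recommendation for any one of those options, but if the robots.txt route is taken, a quick Python check (placeholder domain and URL) that /cart/add really is disallowed for the crawlers you care about:

```python
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.set_url("https://www.example.com/robots.txt")  # placeholder domain
rp.read()

test_url = "https://www.example.com/cart/add/123?return=/product/123"
for bot in ("Googlebot", "bingbot", "rogerbot"):
    verdict = "allowed (not blocked)" if rp.can_fetch(bot, test_url) else "blocked"
    print(f"{bot}: {verdict} for {test_url}")
```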
Technical SEO | Aspedia