Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies.
Best way to handle outdated & years old Blog-posts?
-
Hi all,
We have almost 1,000 pages and posts from our blog indexed in Google. A few of them are years old but still have relevant, credible content that appears in search results. What worries me are the hundreds of other non-relevant posts that are years old. By hosting hundreds of these useless indexed pages, our website might be taking a small negative hit for keeping non-ranking pages. What's the best way to handle them? Are these pages okay as they are, or should they be noindexed or deleted?
Thanks
-
I can tell you what we do. With over 35,000 posts, we are always tweaking the better ones and dropping the dead weight. I'm currently going through a bunch of posts from 2010. I run two quick tests on them. First, I check the PA (Page Authority) in Moz: if it's 1, that's one strike; anything higher and I consider working on the post. Next is a quick check in Google Analytics for traffic over the past six months. As you can imagine, many posts from seven years ago have zero traffic. That's strike two, and in my ballpark, two strikes means you're out. I delete the posts with a hard 404. As we cut the driftwood from our nets, I feel we will be more efficient at catching more fish.
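The two-strike audit described above can be sketched as a simple filter. This is a minimal illustration with made-up post data; `page_authority` and `sessions_last_6_months` stand in for numbers you would pull manually from Moz and Google Analytics (neither service's API is shown here).

```python
# Two-strike content audit: strike 1 = Page Authority of 1,
# strike 2 = zero traffic in the last six months.
# Post data below is hypothetical; in practice these numbers
# come from Moz and Google Analytics reports.

def should_delete(post):
    """Return True when a post earns both strikes."""
    strikes = 0
    if post["page_authority"] <= 1:          # strike 1: no authority
        strikes += 1
    if post["sessions_last_6_months"] == 0:  # strike 2: no traffic
        strikes += 1
    return strikes == 2

posts = [
    {"url": "/blog/2010/old-news",        "page_authority": 1,  "sessions_last_6_months": 0},
    {"url": "/blog/2010/evergreen-guide", "page_authority": 24, "sessions_last_6_months": 310},
    {"url": "/blog/2011/thin-post",       "page_authority": 1,  "sessions_last_6_months": 12},
]

to_delete = [p["url"] for p in posts if should_delete(p)]
print(to_delete)  # only the post with both strikes is flagged
```

A post with either some authority or some recent traffic survives; only a post with both strikes gets the hard 404.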
-
I have a couple of suggestions for this.
1. You can 301 redirect the low-quality pages/posts to higher-quality, more relevant pages, or you could even decide to rewrite the topics in a new, updated post and then 301 the old posts to the updated content. If you decide to 301, make sure the destination pages use a rel=canonical tag so Google knows which pages are the right ones to index.
2. You can also recycle your high-performing content into new posts. For example, if you had a post called "10 Best _____ for 2010," you could rewrite the same post now, updating any necessary info, and name it "10 Best _____ for 2017."
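As a rough sketch of the redirect side of suggestion 1, think of it as an old-to-new URL map, with each outdated post pointing at its rewritten replacement. The URLs here are hypothetical, and in practice you would configure this in your web server or CMS rather than in application code; this only illustrates the mapping.

```python
# Hypothetical old-to-new URL map for 301 redirects.
# Each outdated post points at its rewritten, updated version.
REDIRECTS = {
    "/blog/10-best-widgets-2010": "/blog/10-best-widgets-2017",
    "/blog/outdated-howto":       "/blog/updated-howto",
}

def resolve(path):
    """Return (status, location) for an incoming request path."""
    if path in REDIRECTS:
        return 301, REDIRECTS[path]  # permanent redirect to the new post
    return 200, path                 # no redirect; serve the page as-is

print(resolve("/blog/10-best-widgets-2010"))
```

The point of the 301 (rather than a 404) is that visitors and link equity from the old URL are forwarded to the updated content instead of hitting a dead end.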
Hope that helps some. Let me know if you have any other questions.
Related Questions
-
More pages or less pages for best SEO practices?
Hi all, I would like to know the community's opinion on this. Will a website with more pages or with fewer pages rank better? Websites with more pages have the advantage of more landing pages for targeted keywords. Fewer pages have the advantage of concentrating PageRank across a limited set of pages, which might result in better rankings. I know this is highly dependent; I mean to get answers for an ideal website. Thanks,
Algorithm Updates | | vtmoz1 -
Why is old site not being deindexed post-migration?
We recently migrated to a new domain (16 days ago), and the new domain is being indexed at a normal rate (2-3k pages per day). The issue is that the old domain has not seen any drop in indexed pages. I was expecting a drop in the number of pages indexed on the old site inversely related to the increase of indexed pages on the new site. Any advice?
Algorithm Updates | | ggpaul5620 -
Republish An Updated Blog or not?
We need to update one of our older posts (a new federal ruling came down that now applies), but do I just update it, or should I republish it? I know that just republishing something that hasn't been changed can get you flagged, but if it's an update, I would think republishing would be appropriate. I'd just like some guidance before I proceed. Thanks and Happy Friday! Ruben
Algorithm Updates | | KempRugeLawGroup0 -
301-Redirects, PageRank, Matt Cutts, Eric Enge & Barry Schwartz - Fact or Myth?
I've been trying to wrap my head around this for the last hour or so and thought it might make a good discussion. There's been a ton about this in the Q&A here. Eric Enge's interview with Matt Cutts from 2010 (http://www.stonetemple.com/articles/interview-matt-cutts-012510.shtml) said one thing, and Barry Schwartz seemed to say another: http://searchengineland.com/google-pagerank-dilution-through-a-301-redirect-is-a-myth-149656 Is this all just semantics? Are all of these people really saying the same thing, and have they been saying the same thing ever since 2010? Cyrus Shepherd shed a little light on things in this post when he said that it seemed people were confusing links and 301-redirects and viewing them as the same thing, when they really aren't. He wrote, "There's a huge difference between redirecting a page and linking to a page." I think he is the only writer who is getting down to the heart of the matter. But I'm still in a fog.
In this video from April 2011, Matt Cutts states very clearly that "There is a little bit of pagerank that doesn't pass through a 301-redirect," continuing on to say that if this wasn't the case, then there would be a temptation to 301-redirect from one page to another instead of just linking. VIDEO - http://youtu.be/zW5UL3lzBOA So it seems to me, it is not a myth that 301-redirects result in loss of pagerank.
In this video from February 2013, Matt Cutts states that "The amount of pagerank that dissipates through a 301 is currently identical to the amount of pagerank that dissipates through a link." VIDEO - http://youtu.be/Filv4pP-1nw Again, Matt Cutts is clearly stating that yes, a 301-redirect dissipates pagerank. Now for the "myth" part: apparently the "myth" was about how much pagerank dissipates via a 301-redirect versus a link.
Here's where my head starts to hurt. Does this mean that when Page A links to Page B, it looks like this?
A -----> (reduces PageRank by about 15%) -----> B (inherits about 85% of Page A's PageRank if no other links are on the page)
But say the "link" that exists on Page A is no longer good, but it's still the original URL, which, when clicked, now redirects to Page B via a URL rewrite (301 redirect). Based on what Matt Cutts said, does the PageRank scenario now look like this?
A (with an old URL to Page B) -----> (reduces PageRank by about 15%) -----> URL rewrite (301 redirect, reduces PageRank by another 15%) -----> B (inherits about 72% of Page A's PageRank if no other links are on the page)
Forgive me, I'm not a mathematician, so I'm not sure if that 72% is right. It seems to me, from what Matt is saying, the only way to avoid this scenario would be to make sure that Page A was updated with the new URL, thereby avoiding the 301 rewrite? I recently had to rewrite 18 product page URLs on a site and do 301 redirects. This was brought about by our hosting company initiating rules in the back end that broke all of our custom URLs. The redirects were to exactly the same product pages (so, highly relevant). PageRank tanked on all 18 of them, hard. Perhaps this is why I am diving into this question more deeply. I am really interested to hear your point of view.
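For what it's worth, the arithmetic in the second scenario checks out: if each hop passes roughly 85% of PageRank (the ~15% damping figure is a commonly cited approximation, not an official Google number), then two hops pass 0.85 × 0.85 = 72.25%, close to the 72% guessed above. A quick check:

```python
# Each hop (a link or a 301) is assumed to pass ~85% of PageRank.
damping = 0.85

one_hop = damping             # direct link: A -> B
two_hops = damping * damping  # link to old URL, then 301: A -> redirect -> B

print(f"one hop:  {one_hop:.2%}")   # 85.00%
print(f"two hops: {two_hops:.2%}")  # 72.25%
```

So under the two-hops assumption, B would inherit about 72.25% of A's PageRank, matching the back-of-the-envelope figure in the question.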
Algorithm Updates | | danatanseo0 -
Server Location & SEO
So I just read an interesting Tweet: #SEO Tip: #Google takes into account the location of the server (the IP) when projecting the search results #web This is something I had not thought of. I suppose my question then is HOW does it factor this information into its results? For some reason, one of our sites is hosted on a Canadian server. We are a cloud hosting company and we serve all of NA with data centers in the US and Canada... For whatever reason we've used the Canadian server farm for our web server. Could this possibly be hurting our NA Google SERPs? Anyone have any thoughts on this?
Algorithm Updates | | jesse-landry0 -
Frequency & Percentage of Content Change to get Google to Cache Every Day?
How frequently would your homepage (for example) have to update, and what percentage of the page's content would need to change, for Google to cache it every day? What are your opinions on other factors?
Algorithm Updates | | bozzie3110 -
Local SEO - How to handle multiple businesses at the same address
I have a client who shares the same address and suite number with multiple businesses. What should be done to optimize their website and citations for local SEO? Is this a huge issue? What should we do so our rankings aren't affected? Will changes take a long time to take effect? Thanks
Algorithm Updates | | caeevans0