Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we’re not completely removing the content - many posts will still be possible to view - we have locked both new posts and new replies. More details here.
Is it a good idea to remove old blogs?
-
So I have a site right now that isn't ranking well, and we are trying everything to help it out. One of my areas of concern is that we have A LOT of old blogs that were not well written and honestly are not overly relevant. None of them rank for anything, and they could be causing a lot of duplicate content issues. Our newer blogs are written in more of a Q&A format and seem to be doing better.
So my thought is to basically wipe out all the blog posts from 2010-2012 -- probably 450+ posts.
What do you guys think?
-
You may find this case study helpful, from a blog that decided to do exactly that:
http://www.koozai.com/blog/search-marketing/deleted-900-blog-posts-happened-next/
-
It depends on what you mean by "remove."
If the content of all those old blogs truly is poor, I'd strongly consider going through them one by one and seeing how you can re-write, expand upon, and improve each post. Can you tackle the subject from another angle? Are there images, videos, or other visual assets you can add to the post to make it more intriguing and shareable?
Then, you can seek out some credible places to strategically place your blog content for additional exposure and maybe even a link. Be careful here, however. I'm not talking about forum and comment spam, but there may be some active communities that are open to unique and valuable content. Do your research first.
When going through each post one by one, you'll undoubtedly find blog posts that are simply "too far gone" or not relevant enough to keep. Essentially, it wouldn't even be worth your time to re-write them. In this case, find another page on your website that's MOST SIMILAR to the blog post. This may be similar in topic, but it could also be an author's page, another blog post that is valuable, a contact page, etc. Then 301 redirect the crap blog posts to those pages.
Not only are you salvaging any little value those blog posts may have had, but you're also preventing crawl and index issues by telling the search engine bots where that content is now (assuming it was indexed in the first place).
This is an incredibly long process and could easily take you months, especially if there's a lot of content that's good enough to be re-written, expanded upon, and added to. However, making that content relevant and useful is the best thing you can do. It's a long process, but if your best content writers need a project, this would be it.
To recap: **1)** Go through each blog post one by one and determine what's good enough to edit and what's "too far gone." **2)** Re-write, edit, add to (content and images/videos), and re-promote the keepers socially and to appropriate audiences and communities. **3)** For the posts that were "too far gone," 301 redirect them to the most relevant posts and pages that remain live.
Again, I can say firsthand that this is a LONG process. I've done it for a client in the past. However, the return was well worth the work. And by doing it this way, rather than just deleting posts, you're saving yourself a lot of crawl/index headaches with the search engines.
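If you end up scripting part of the cleanup, a minimal sketch like the one below (Python with the requests library; the URLs in redirect_map are placeholders, not real pages) can confirm that each retired post really does return a 301 pointing at the page you mapped it to before you call the job done:

```python
# Minimal sketch for verifying 301 redirects after the cleanup.
# Assumes the `requests` library is installed; all URLs are placeholders.
import requests

# Hypothetical mapping: retired blog post -> most relevant remaining page
redirect_map = {
    "https://www.example.com/blog/2010/old-post-1/": "https://www.example.com/blog/new-guide/",
    "https://www.example.com/blog/2011/old-post-2/": "https://www.example.com/services/",
}

for old_url, expected_target in redirect_map.items():
    # Don't follow the redirect; inspect the first response directly.
    resp = requests.get(old_url, allow_redirects=False, timeout=10)
    status = resp.status_code
    location = resp.headers.get("Location", "")
    if status == 301 and location.rstrip("/") == expected_target.rstrip("/"):
        print(f"OK    {old_url} -> {location}")
    else:
        print(f"CHECK {old_url} returned {status}, Location: {location or 'none'}")
```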
-
> we have A LOT of old blogs that were not well written and honestly are not overly relevant.
Wow.... it is great to hear someone looking at their content and deciding that they can kick it up a notch. I have seen a lot of people who would never, ever, pull the kill switch on an old blog post. In fact, they are still out there hiring people to write stuff that is really crappy.
If this was my site, I would first check to be sure that I don't have a Penguin or unnatural-links problem. If you think you are OK there, here is what I would do.
-
I would look at those blog posts to see if any of them have any traffic, link, or revenue value. Value is defined as... A) traffic from any search engine or other quality source, B) valuable links, C) views by current website visitors, D) visitors who enter through those pages and generate any income through ads or purchases.
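With a few hundred posts, that value test is easy to run as a spreadsheet or a short script. Here is a rough sketch, assuming you've exported CSVs from your analytics and backlink tools (the filenames and column names below are hypothetical, not any particular product's format):

```python
# Rough sketch of the "value test": flag old posts that show any sign of
# traffic, links, or revenue. CSV filenames and columns are hypothetical
# exports from your analytics / backlink tools.
import csv

def load_metrics(path, url_col, value_col):
    """Read a CSV export into a {url: numeric value} dict."""
    metrics = {}
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            metrics[row[url_col].strip()] = float(row[value_col] or 0)
    return metrics

organic = load_metrics("organic_landing_pages.csv", "url", "sessions")      # A) search traffic
links = load_metrics("backlinks_by_page.csv", "url", "referring_domains")   # B) valuable links
pageviews = load_metrics("all_pageviews.csv", "url", "pageviews")           # C) current visitors
revenue = load_metrics("revenue_by_landing_page.csv", "url", "revenue")     # D) ads/purchases

with open("old_posts.txt", encoding="utf-8") as f:
    old_posts = [line.strip() for line in f if line.strip()]

for url in old_posts:
    has_value = any([
        organic.get(url, 0) > 0,
        links.get(url, 0) > 0,
        pageviews.get(url, 0) > 0,
        revenue.get(url, 0) > 0,
    ])
    print(("IMPROVE " if has_value else "REVIEW  ") + url)
```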
-
If any of them pass the value test above then I would improve that page. I would put a nice amount of work into that page.
-
Next I would look at each of those blog posts and see if any have content value. That means an idea that could be developed into valuable content... or valuable content that could be simply rewritten to a higher standard. Valuable content is defined as a topic that might pull traffic from search or be consumed by current site visitors.
-
If any pass the valuable content test then I would improve them. I would make them kickass.
-
After you have done the above, I would pull the plug on everything else.... or if I was feeling charitable I would offer them to a competitor.
Salutes to you for having the courage to clean some slates.
-
I would run them through Copyscape to check for plagiarism/duplicate content issues. After that, I would check for referral traffic. If there are some pages that draw enough traffic, you might not want to remove them. Finally, round it off with a page-level link audit. Majestic can give you a pretty good idea of where they stand.
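Copyscape looks for copies across the wider web; if you also want a rough internal check of how similar the old posts are to one another, a quick pairwise comparison will shortlist candidates. A sketch using only Python's standard library (the blog_posts/ folder of exported .txt files is hypothetical):

```python
# Rough internal duplicate check: pairwise similarity between exported
# post texts. Not a substitute for Copyscape, which checks the wider web.
# The blog_posts/ directory of .txt exports is hypothetical.
from difflib import SequenceMatcher
from itertools import combinations
from pathlib import Path

posts = {p.name: p.read_text(encoding="utf-8") for p in Path("blog_posts").glob("*.txt")}

for (name_a, text_a), (name_b, text_b) in combinations(posts.items(), 2):
    ratio = SequenceMatcher(None, text_a, text_b).ratio()
    if ratio > 0.8:  # arbitrary threshold for "suspiciously similar"
        print(f"{name_a} vs {name_b}: {ratio:.0%} similar")
```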
The pages that don't make the cut should be set to throw 410 status codes. If you still don't like the content on pages with good links and/or referral traffic, 301 those to better content on the same subject.
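Once the 410s and 301s are in place, a quick status-code sweep over the retired URLs will confirm nothing is still quietly serving a 200. A sketch with the requests library (removed_urls.txt is a placeholder for your list of retired URLs):

```python
# Sketch: confirm retired posts return 410 (gone) or 301 (redirected),
# and that nothing is still serving a 200. "removed_urls.txt" is a placeholder.
import requests

with open("removed_urls.txt", encoding="utf-8") as f:
    urls = [line.strip() for line in f if line.strip()]

for url in urls:
    resp = requests.head(url, allow_redirects=False, timeout=10)
    if resp.status_code in (301, 410):
        print(f"OK    {resp.status_code} {url}")
    else:
        print(f"CHECK {resp.status_code} {url}")
```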