Subdomain vs Main Domain Penalties
-
We have a client whose main root.com domain is currently penalized by Google, but subdomain.root.com is ranking very well. We're stumped - any ideas why?
-
Extremely helpful insight, Marie - I will be contacting you directly soon.
It appears that the duplicate content you've found (and other duplicate content we've found) is actually our content that other sites have repurposed. It seems Google has decided our site is the culprit, so this is an issue we need to address. The first thought that comes to mind is adding authorship markup (sketched below), and then starting what looks like a hefty cleanup project - something you appear to be an expert in, so we will most likely be working directly with you in the near future!
The second-level pages that have little content and lots of links are already 'noindex,follow', but I'm nervous about the sheer number of these tags across our site, which a search engine could read as spammy. Of note, the second-level section you found actually ranks quite well, since it sits on a subdomain - which is interesting. Our suspicion is that ever since the error Google detected on Dec. 9, 2011 (our 404 page returning a 200 status), we have been on some sort of Google 'watch-list', and any little thing we do incorrectly that they find gets us penalized immediately.
The homepage description of our company is reused on the industry directories we are listed in, so perhaps we should rewrite that description to be unique. Adding more content to the homepage would also be a good thing and is certainly easily doable.
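For reference, authorship markup at the time of this thread was a rel="author" link in the page head pointing at the author's Google+ profile - a minimal sketch, with a placeholder profile URL:
<link rel="author" href="https://plus.google.com/00000000000000000000"/>
Each article page would carry this tag, and the Google+ profile would typically link back to the site in its "Contributor to" section.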
-
You have some significant duplicate content issues with www.ides.com. There is not a lot of text on your home page and what is there is duplicated in many places across the web.
Your second level pages are all just links. I would noindex, follow these.
I looked at two inner pages:
http://plastics.ides.com/generics/6/alphamethylstyrene-ams - extremely thin content
Here is a Google search for text I copied from the styrene-acrylonitrile page. There are 247 pages that use this opening sentence.
My guess is that this is indeed a Panda issue. But please know that I've only taken a quick look, so I can't say for sure. What doesn't make sense is that your traffic drops don't fall on known Panda dates, which they really should if Panda were the cause.
Panda definitely can affect just one part of a site (such as a root and not a subdomain). I would work on making these pages completely unique and also noindexing the thin pages.
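For clarity, 'noindex, follow' is just a robots meta tag placed in the head of each thin page, along these lines:
<meta name="robots" content="noindex, follow">
Google will then drop the page from its index while still following (and passing value through) the links on it.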
-
Thank you Marie,
We 301 redirect any traffic going to root.com to www.root.com, and any content that we moved from www.root.com to subdomain.root.com has been completely removed from www.root.com. There doesn't appear to be any duplicate content between the two. There is some duplicate content on subdomain.root.com that we handle with canonicals - a very small portion of total pages (less than 1%).
As for your other questions: no warnings in WMT, the robots.txt file looks clean, canonicals are in place correctly, and no accidental noindexing that we know of.
Here is the actual site that might help to look at:
http://www.ides.com
http://plastics.ides.com/materials
http://www.ides.com/robots.txt
-
I think the answer here depends on whether or not you have actually been penalized and why the site is dropping out of the SERPs. Do you have a warning in WMT? If not, then you're probably not penalized.
It's unlikely to be Penguin, because Penguin has not refreshed recently. Similarly, Panda did not refresh on the dates you mentioned. So it's likely not a penalty but rather some type of site structure issue.
Is there duplicate content between the subdomain and the root? If so, then Google will choose one as the owner and not show the other prominently. Any issues with robots.txt? Are the canonicals set correctly? Any chance of accidental noindexing?
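For reference, a correctly set canonical is a link element in the head of each page naming the preferred URL - a generic sketch with a placeholder address:
<link rel="canonical" href="http://www.example.com/preferred-page/">
If that URL points at the wrong host (for example, the subdomain instead of the root), Google will consolidate ranking signals onto the wrong version.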
-
Subdomains and root domains are not necessarily always owned by the same person and therefore will not always be given the same penalties. As Scott mentioned, they are seen as different sites.
e.g. If I create a new WordPress account and create me.wordpress.com and then build a black hat site which gets penalized, this is not going to affect you.wordpress.com or www.wordpress.com.
-
Thank you all for your insight - good stuff, but still stumped.
Here's everything we know that might help point out why the main domain (i.e. www.root.com) was penalized by Google. We redirect root.com to www.root.com with a 301 redirect, and it is set up this way in Google Webmaster Tools too.
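(For context, that root-to-www redirect is the standard .htaccess pattern, roughly like the sketch below - root.com stands in for the real domain:)
RewriteEngine On
RewriteCond %{HTTP_HOST} ^root\.com$ [NC]
RewriteRule ^(.*)$ http://www.root.com/$1 [R=301,L]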
December 9, 2011 - the site's 404 error page was incorrectly set up to return a 200 status, resulting in a quick bloat of over 1 million indexed pages. The website dropped from Google immediately. The error page was fixed 2 days later. The site still appeared in Google's index via a site: query; however, it didn't reappear in Google's SERPs until May 2, 2012.
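(As an aside: on Apache, one common cause of a 404 page returning 200 is pointing ErrorDocument at a full URL, which makes the server redirect to the error page instead of serving it with a 404 status. A sketch of the difference, with placeholder paths - whether this was the cause here is only a guess:)
# Correct: a local path preserves the 404 status code
ErrorDocument 404 /errors/404.html
# Mistake: a full URL triggers a redirect, and the error page
# is then served with a 200 status - a "soft 404"
# ErrorDocument 404 http://www.root.com/errors/404.html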
October 25, 2012 - the website again dropped from Google for an unknown reason. We then moved a significant portion of content from www.root.com to subdomain.root.com. Within 3 days, pages from subdomain.root.com began appearing as high in Google as they had previously. From December 9, 2011 through this entire period, we were correcting any errors reported in Google Webmaster Tools on a daily basis.
February 26, 2013 - the website is dropped from Google yet again, while subdomain.root.com continues to appear and rank well.
Because we moved most of the content from www.root.com to subdomain.root.com, the indexed page count for www.root.com fell from 142,000 in October 2012 to an average of 21,400, and stands at 4,230 today. This count fluctuates greatly every few days (probably due to the ongoing content move).
Of note, the site is NOT a content farm; it hosts legitimate, unique technical content for hundreds of clients.
Again any ideas are most welcome!
-
From my understanding, subdomains are considered completely separate from root domains unless you have a 301 redirect or canonical that tells search engines to treat the root and the subdomain as the same; for example, http://www.yourdomain.com (the www subdomain) pointing to http://yourdomain.com.
Therefore, you could have a subdomain outrank a root domain - or, as in your case, a root domain penalized while the subdomain continues to rank well. The fact that they share an IP address shouldn't matter: many websites sit on shared hosting, where the same IP address serves many unrelated domains.
-
This isn't necessarily surprising. Penalties and negative ranking algorithms can be applied at a page level, a subdomain level, a root domain level, etc.
For example, HubPages used subdomains to help escape from a Panda slap.
Another example: Google placed a manual penalty on a single page of BBC's website.
-
hmmm...
do they point to the same IP address?
Related Questions
-
Tool to Generate All the URLs on a Domain
Hi all, I've been using xml-sitemaps.com for a while to generate a list of all the URLs that exist on a domain. However, this tool only works for websites with under 500 URLs on a domain. The paid tool doesn't offer what we are looking for either. I'm hoping someone can help with a recommendation. We're looking for a tool that can:
- Crawl, and list, all the indexed URLs on a domain, including .pdf and .doc files (ideally in a .xls or .txt file)
- Crawl multiple domains with unlimited URLs (we have 5 websites with 500+ URLs on them)
Seems pretty simple, but we haven't been able to find something that isn't tailored toward management of a single domain or that can crawl a huge volume of content.
Technical SEO | | timfrick0 -
Clients domain expired - rankings lost - repurchased domain - what next?
It's only been 10 days, and I have repurchased/renewed the domain name. The WHOIS info, website, and contact information are all still the same. However, we have lost all rankings, and I am hoping that our top rankings come back. Does anyone have experience with such a crappy situation?
Technical SEO | | waqid0 -
Correct linking to the /index of a site and subfolders: what's the best practice? link to: domain.com/ or domain.com/index.html ?
Dear all, starting with my .htaccess file:
RewriteEngine On
RewriteCond %{HTTP_HOST} ^www\.inlinear\.com$ [NC]
RewriteRule ^(.*)$ http://inlinear.com/$1 [R=301,L]
RewriteCond %{THE_REQUEST} ^.*/index\.html
RewriteRule ^(.*)index\.html$ http://inlinear.com/ [R=301,L]
1. I redirect all URL requests with www. to the non-www version...
2. All requests for "index.html" are redirected to "domain.com/"
My questions are:
A) When linking from a page to my frontpage (home), the best practice is to link to "http://domain.com/" and NOT to "http://domain.com/index.php", correct?
B) When linking to the index of a subfolder, "http://domain.com/products/index.php", I should likewise link to "http://domain.com/products/" and not include the index.php, right?
C) When I define the canonical URL, should I define it as just "http://domain.com/products/", or should I point to the actual file, "http://domain.com/products/index.php"?
Is A) and B) the best practice? And C)? Thanks for all replies! 🙂 Holger
Technical SEO | | inlinear0 -
Transfer a Main Domain to a Sub-Domain
My IT department tells me they want to transfer my main site domain, which has been in existence since 1999 as an e-commerce site (maindomain.com), to a sub-domain (www2.maindomain.com) or a completely new domain (newdomain.net). This is because we are launching a new website and B2C e-commerce engine, but we still have to maintain the legacy B2B e-commerce engine, which contains hard-coded URLs, and both systems can't use the same domain. I've been researching the issue across SEOmoz, but I haven't come across this exact type of scenario (mostly I've seen moves from a sub-domain to a new domain). I see major problems with their proposal, including negative SEO impact, loss of domain authority/ranking, and issues with branding. Does anyone know the exact type of impact I can expect to see in this scenario and the specific steps I should take to minimize the impact? Btw, I will be using Danny Dover's guide on properly moving domains where appropriate. Thanks!
Technical SEO | | AscendLearning0 -
How to increase your Domain Authority
Hi Guys, Can someone please provide some pointers on how to best increase your Domain Authority?? Thanks Gareth
Technical SEO | | GAZ090 -
Mobile Domain Setup
Hi, If I want to serve only a subset of my desktop site's pages on my mobile site, or the content is significantly different (i.e. it is not one-to-one, or pages are a summarised version of the desktop), should I use m.site.com, or is it still better to use site.com? Many thanks, any help appreciated.
Technical SEO | | MarkChambers0 -
Will errors on a subdomain effect the overall health of the root domain?
As stated in the question, we have 2 subdomains that contain over 2000 errors reported by SEOmoz. The root domain has a clean bill of health, and I was just wondering if these errors on the subdomains could have a negative effect on the root domain in the eyes of Google. Your comments will be appreciated. Regards, Greg
Technical SEO | | AndreVanKets0 -
Multiple Domains, Same IP address, redirecting to preferred domain (301) -site is still indexed under wrong domains
Due to acquisitions over time and the merging of many microsites into one major site, we currently have 20+ TLDs pointing to the same IP address as the preferred domain for our consolidated website http://goo.gl/gH33w. They are all set up as 301 redirects on Apache, including both the www and non-www versions. When we launched this consolidated website (April 2010), we accidentally left the settings of our site open to accept any of our domains on the same IP. This was later fixed, but unfortunately Google had indexed our site under several of these domains (ignoring the redirects), using the same content from our main website but swapping out the domain.
We added some additional redirects on Apache to send the individual pages indexed under the wrong domain to the same page under our main domain http://goo.gl/gH33w. This seemed to help resolve the issue and moved hundreds of pages off the index. However, in December of 2010 we made significant changes to our external DNS for our IP addresses, and since December we have seen pages indexed under these redirecting domains on the rise again.
If you do a search query of site:laboratoryid.com you will see a few hundred examples of pages indexed under the wrong domain. When you click on a link, it does redirect to the same page under the preferred domain. So the redirect is working and has been confirmed as a 301, but for some reason Google continues to crawl our site and index it under these incorrect domains. Why is this? Is there a setting we are missing? These domain-level and page-level redirects should be decreasing the pages indexed under the wrong domains, but they appear to be doing the reverse. All of these old domains currently point to our production IP address, where our preferred domain is also pointing. Could this be the issue? None of the pages indexed today are from the old versions of these sites; they only seem to be the new content from the new site, just not under the preferred domain. Any insight would be much appreciated, because we have tried many things without success to get this resolved.
Technical SEO | | sboelter0
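A sketch of one way to close the gap described in that last question - a catch-all .htaccess rule that 301s any request whose Host header is not the preferred domain (preferred-domain.com is a placeholder; the real setup will differ):
RewriteEngine On
RewriteCond %{HTTP_HOST} !^www\.preferred-domain\.com$ [NC]
RewriteRule ^(.*)$ http://www.preferred-domain.com/$1 [R=301,L]
With a rule like this in place, no legacy domain pointed at the same IP can serve the site's content directly under its own name.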