Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we’re not completely removing the content - many posts will still be possible to view - we have locked both new posts and new replies. More details here.
The use of a ghost site for SEO purposes
-
Hi Guys,
We've just taken on a new client (a .co.uk domain) and during our research we've identified that they also have a .com domain which is a replica of the existing site, but all of its links lead to the .co.uk domain. As a result, the .com replica is pushing 5,000,000+ links to the .co.uk site.
After speaking to the client, it appears they were approached by a company who said they could get the .com site ranking for local search queries and then push all of that traffic to the .co.uk. From analytics we can see that very little referral traffic is actually coming from the .com.
It sounds remarkably dodgy to us. Surely the duplicate site is an issue in its own right for obvious reasons, and these links could also be deemed to have been created purely for SEO gain?
Does anyone have any experience of this as a tactic?
Thanks,
Dan
-
The appropriate solution here would be to employ the hreflang tag to relate the two geographically targeted sites to one another. However, before you take that step, I would make sure the previous SEO company which created the .com did not point any harmful links at the .com domain, which would make it inadvisable to connect the two sites. Use Open Site Explorer to look at the backlink profile, and verify the .com site in Google Webmaster Tools to check for any manual penalty notices. If all looks clear, go ahead with the hreflang implementation, along the lines of the sketch below.
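Purely for illustration - the domains, paths, and en-gb/en-us targeting below are placeholder assumptions rather than your client's actual setup - a rough sketch of the reciprocal hreflang link elements both versions of a page would carry in their <head>:

```python
# Sketch: print the reciprocal hreflang <link> elements that both the
# .co.uk and .com versions of a page should carry in their <head>.
# Domains, paths, and the en-gb/en-us targeting are placeholder assumptions.

PAGE_PAIRS = [
    ("https://www.example.co.uk/", "https://www.example.com/"),
    ("https://www.example.co.uk/services/", "https://www.example.com/services/"),
]

def hreflang_tags(uk_url: str, com_url: str) -> str:
    """Tags for the UK and US variants, with the .co.uk assumed as the
    x-default fallback - pick whichever version actually suits the client."""
    return "\n".join([
        f'<link rel="alternate" hreflang="en-gb" href="{uk_url}" />',
        f'<link rel="alternate" hreflang="en-us" href="{com_url}" />',
        f'<link rel="alternate" hreflang="x-default" href="{uk_url}" />',
    ])

if __name__ == "__main__":
    for uk, com in PAGE_PAIRS:
        print(f"<!-- head of both {uk} and {com} -->")
        print(hreflang_tags(uk, com))
        print()
```

Remember the annotations need to be reciprocal - both versions of each page carry the same set of tags pointing at each other - otherwise Google may ignore them.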
Good luck and feel free to ask more questions here!
-
Thanks both,
Pretty much confirms our thoughts here, and yes Eddie - it appears to be a smash-and-grab job.
Dan
-
Certainly sounds dodgy, but suddenly removing all of those backlinks might cause you some SEO issues.
Depending on how Google is currently reading your site, it may improve, since the site would look less spammy without them, or it may really hurt the site (at least to start with) - losing that many backlinks might make Google think something is up with your site.
I would bite the bullet and remove the duplicate content, but warn your clients that it may take a while for the natural benefits to come through, because if the site isn't penalised yet for having that many dodgy backlinks and duplicate content, it soon will be!
-
Certainly seems like the wrong thing to do. A good test is that if you think it may be dodgy, it probably is. I certainly wouldn't recommend it as a tactic. There are potentially multiple issues with this: duplicate content, as you mentioned, but also dilution of real links. Good quality, legitimate links could point to the ghost site and therefore not count for the real site.
I have seen cases where a legitimate attempt to run a .com and a .co.uk off the same shop ended up with both versions online due to incompetent development, although I didn't have to deal with cleaning it up.
Unpicking this could be messy. A good example of quick-fix SEO for a fast buck, I suspect.
-
5mil+ links?! Wow!
What's their spam score? I'm surprised they are not blocked or something.
To answer your question - what does common sense tell you? The job of Google and its bots is pretty much based on common sense. So: a duplicate-content website, a ridiculous number of links, no referral traffic - all of these are obvious signals to run, Forrest, run!
Related Questions
-
What is the proper URL length?
I learned that having 50 to 60 characters in a URL is OK and that shorter is preferable to Google, but I would like to know: as I am going to include keywords in the URLs, I am afraid it will increase the length. Is that going to hurt me slightly? My competitors have an 8-character domain and a keyword length of 13, while my site has a 15-character domain and a keyword length of 13 - which one will be preferred by Google?
White Hat / Black Hat SEO
-
Old subdomains - what to do SEO-wise?
Hello, I wanted the community's advice on how to handle old subdomains. We have https://www.yoursite.org. We also have two subdomains directly related to the main website: https://www.archive.yoursite.org and https://www.blog.yoursite.org. As these pages are not actively updated, they are triggering lots and lots of errors in the site crawl (missing meta descriptions, and much more). We have no particular intention of keeping them up to date in terms of SEO. What do you think is the best option for handling these? I considered de-indexing, but the content of these pages is still relevant and may be useful - yet it is not up to date and never will be again. Many thanks in advance.
White Hat / Black Hat SEO
-
Moz was unable to crawl your site? Redirect Loop issue
Moz was unable to crawl your site on Jul 25, 2017. I am getting this message for my site: it says Moz was unable to access the homepage due to a redirect loop (https://kuzyklaw.com/). The site is working fine and was last crawled on 22nd July, so I am not sure why this issue is appearing. When I checked the website in a Chrome extension it says: "The server has previously indicated this domain should always be accessed via HTTPS (HSTS protocol). Chrome has cached this internally, and did not connect to any server for this redirect. Chrome reports this redirect as a '307 Internal Redirect', however this probably would have been a '301 Permanent Redirect' originally. You can verify this by clearing your browser cache and visiting the original URL again." I'm not sure if this is the actual issue - the site was migrated to HTTPS just 5 days ago, so maybe it will resolve itself. Can anybody from the Moz team help me with this?
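A rough sketch of how the redirect chain could be traced outside the browser, so Chrome's locally cached HSTS redirect can't get in the way - the starting URL and hop limit below are just placeholders:

```python
# Sketch: follow each redirect hop yourself with a fresh HTTP client, so a
# browser's cached "307 Internal Redirect" (HSTS) is taken out of the picture.
# Swap the placeholder start URL for the homepage Moz reports as looping.
import http.client
from urllib.parse import urljoin, urlsplit

def trace(url: str, max_hops: int = 10) -> None:
    for hop in range(1, max_hops + 1):
        parts = urlsplit(url)
        conn_cls = (http.client.HTTPSConnection if parts.scheme == "https"
                    else http.client.HTTPConnection)
        conn = conn_cls(parts.netloc, timeout=10)
        path = parts.path or "/"
        if parts.query:
            path += "?" + parts.query
        conn.request("HEAD", path)
        resp = conn.getresponse()
        location = resp.getheader("Location")
        print(f"hop {hop}: {resp.status} {url} -> {location or '(final)'}")
        conn.close()
        if resp.status not in (301, 302, 303, 307, 308) or not location:
            break  # reached a non-redirect response, so no server-side loop
        url = urljoin(url, location)
    else:
        print(f"Still redirecting after {max_hops} hops - that would be a genuine loop.")

trace("https://example.com/")  # placeholder URL
```

If the chain settles on a 200 after a hop or two, the server isn't looping now, and the Moz report may simply reflect a temporary issue around the HTTPS migration.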
White Hat / Black Hat SEO
-
Does type of hosting affect SEO rankings?
Hello, I was wondering whether hosting on shared, VPS, or dedicated servers matters at all in terms of the rankings of websites, given that all other factors would be exactly equal. I know this is a big question with many variables, but mainly I am wondering whether it is more the risk of resource usage taking a site down under heavy traffic, and therefore making it un-crawlable if that happens at the moment a bot is trying to index the site (factoring out the UX of a downed site). Any and all comments are greatly appreciated! Best regards, Mark
White Hat / Black Hat SEO
-
Should I submit a sitemap for a site with dynamic pages?
I have a coupon website (http://couponeasy.com). Being a coupon website, my content keeps changing (new coupons are added and expired deals are removed automatically). I wish to create a sitemap, but I realised that there is not much point in creating one for all pages, as they will be removed sooner or later and/or are canonical. I have about 8-9 pages which are static, and hence I can include them in a sitemap. Now the question is: if I create the sitemap for these 9 pages and submit it to Google Webmaster Tools, will the Google crawlers stop indexing the other pages? NOTE: I need to create the sitemap to get expanded sitelinks. http://couponeasy.com/
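For what it's worth, a minimal sketch of the kind of static-pages-only sitemap described above - the paths are placeholder guesses, not the site's real URLs:

```python
# Sketch: build a sitemap.xml containing only the handful of static pages.
# The paths below are placeholders - substitute the site's real static URLs.
from xml.etree.ElementTree import Element, SubElement, ElementTree

STATIC_PAGES = [
    "http://couponeasy.com/",
    "http://couponeasy.com/about/",       # placeholder path
    "http://couponeasy.com/contact/",     # placeholder path
    "http://couponeasy.com/categories/",  # placeholder path
]

urlset = Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for page in STATIC_PAGES:
    url = SubElement(urlset, "url")
    SubElement(url, "loc").text = page
    SubElement(url, "changefreq").text = "monthly"

ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)
```

A sitemap is a hint about which URLs you would like crawled, not an exclusion list, so a sitemap containing only the static pages will not stop Google from crawling and indexing the coupon pages it discovers through links.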
White Hat / Black Hat SEO
-
Negative SEO Click Bot Lowering My CTR?
I am questioning whether one of our competitors is using a click bot to do negative SEO on our CTR for our industry's main term. Is there any way to detect this activity? Background: we've previously been hit by DoS attacks from this competitor, so I'm sure their ethics/morals wouldn't prevent them from doing negative SEO. We sell an insurance product that is only offered through broker networks (insurance agents), not directly by the insurance carriers themselves. However, our suspect competitor (another agency) and the insurance carriers are the only ones who rank on the 1st page for our biggest term. I don't think the carrier sites would do very well on their own, since they don't even sell the product directly (they have pages with info only). Our site and one other agency site pop onto the bottom of page one periodically, only to be bumped back to page 2. I fear they are using a click bot that continuously bounces us out of page 1; then we do well relative to the other pages on page 2 and naturally earn our way back to page 1, only to be pushed back to page 2 by the negative click SEO - or so my theory goes. Is there anything I can do to research whether my theory is right, or if I'm just being paranoid?
White Hat / Black Hat SEO
-
Do pingbacks in Wordpress help or harm SEO? Or neither?
Hey everyone, just wondering: do pingbacks in WordPress help or harm SEO, or neither?
White Hat / Black Hat SEO
-
Using an auto directory submission tool
Has anyone used easysubmits.com, and what's your experience with it? Are there any other directory submission or link-building tools that help automate and manage the process, like easysubmits.com says it can? I'm just looking at it currently and wanted to hear others' thoughts before I get taken in by some black hat method that hurts my websites instead.
White Hat / Black Hat SEO