Redirect the main site to keyword-rich subfolder / specific page for SEO
-
Hi,
I have two questions.
Question 1: is it worthwhile to redirect the main site to a keyword-rich subfolder / specific page for SEO? For example, my company's webpage is www.example.com. Would it make sense to 301-redirect the main site to the address www.example.com/service-one-in-certain-city? I am asking this because I have learned that it is important for SEO to have keywords in the URL, and I was thinking that we could do this and include the most important keywords in the subfolder / specific URL. What are the pros and cons of this? Should I create folders or pages just for the sake of keywords?
Question 2: Most companies have their main URL shown as www.example.com when you access their domain. However, some multi-language sites show e.g. www.example.com/en or www.example.com/en/main when you type the domain into your web browser to access the site. I understand that it is common practice to use subdomains or subfolders to separate different language versions.
My question is regarding subfolders. Is it better to have only the subfolder shown (www.example.com/en), or should I also include the specific page's URL, with keywords, after the subfolder (www.example.com/en/main or www.example.com/en/service-one-in-certain-city)? I don't really understand why some companies show only the subfolder of a specific language page and some show the page's URL after the subfolder.
Thanks in advance,
Sam
-
Thanks. One more question: does this also mean that the main page www.example.com/index.php (whether the index.php is shown to the user or not) gets the same domain authority as the domain itself (www.example.com), since it is the main page?
-
Choose a domain and stick with it, then build pages out from there. Redirect the non-www to the www (if this is what you choose to go with) and forget about the rest... redirect site-wide.
Admittedly I'm simplifying it for two reasons: 1) I'm not quite sure I get it, as this is rather confusing, and 2) it is that simple.
You want a domain and then you want pages and subdirectories targeting your keywords. That's it, that's all. I would not build links for the non-www version if you are going to redirect it to the www. Build links for the domain you settle on.
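If it helps, here is a bare-bones sketch of what that site-wide redirect looks like. It's a hypothetical Python/Flask illustration with www.example.com as a placeholder; on a real site you would normally do this in your web server or CMS configuration instead.

```python
# Hypothetical sketch only: send every request to the one host you settled on,
# preserving the path and query string, with a permanent (301) redirect.
from flask import Flask, redirect, request

app = Flask(__name__)

CANONICAL_HOST = "www.example.com"  # the domain you choose to stick with

@app.before_request
def force_canonical_host():
    # e.g. http://example.com/about -> http://www.example.com/about
    if request.host != CANONICAL_HOST:
        return redirect(request.url.replace(request.host, CANONICAL_HOST, 1), code=301)
```

The point is simply that the redirect applies to every URL, not just the homepage, so all of the old addresses pass their value to the version you keep.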
I'm not sure I'm helping but hope so!
-
Thanks a lot for your comments and advice; things are starting to become clearer to me now. However, I have one more question. I have now done a more detailed analysis and discovered the following.
Our company has these domains (the structure is the same as in these examples):
When you go to either of these addresses, you are 301-redirected to the following URL: http://namegroup.com/en/accounting-outsourcing-and-legal-services (no external linking root domains).
In addition, past link building has gone partly to http://www.namegroup.com (15 linking root domains), partly to http://namegroup.com (1 linking root domain), partly to www.name-group.com (6 linking root domains), and partly to http://namegroup.com/en/main (2 linking root domains). All of these are now 301-redirected to http://namegroup.com/en/accounting-outsourcing-and-legal-services (no external linking root domains). Each of these URLs was the main page at the time the links to it were created.
It would make the most sense to me to start using www.namegroup.com as the main URL (as this URL has the most linking root domains) and then 301-redirect all the rest there. Where possible, I would also ask the other linking domains to change their links to point to www.namegroup.com.
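To make the plan concrete, here is a rough sketch (Python, purely illustrative; the host names are just the placeholders used above) of the redirect map I have in mind, where every old variant ends up on www.namegroup.com with its path preserved:

```python
# Illustrative only: map every legacy host variant to the single host we want
# to keep (www.namegroup.com) via a 301, preserving the path and query string.
from urllib.parse import urlsplit, urlunsplit

CANONICAL_HOST = "www.namegroup.com"
LEGACY_HOSTS = {"namegroup.com", "name-group.com", "www.name-group.com"}

def redirect_target(url: str):
    """Return the 301 target for a legacy URL, or None if no redirect is needed."""
    scheme, host, path, query, _ = urlsplit(url)
    if host in LEGACY_HOSTS:
        return urlunsplit((scheme, CANONICAL_HOST, path, query, ""))
    return None

if __name__ == "__main__":
    for old in ("http://namegroup.com/en/main",
                "http://www.name-group.com/",
                "http://namegroup.com/en/accounting-outsourcing-and-legal-services"):
        print(old, "->", redirect_target(old))
```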
It is quite a big mess now, and I would like to bring some order and consistency to it (and use only this form in the future). The reason I am hesitating is that since I optimized the title tags and changed the URL to the current one (http://namegroup.com/en/accounting-outsourcing-and-legal-services) a few weeks ago, we have been ranking very well in Google for some of the most important keywords that we now have in the title tag and URL (we are on the first page of the SERP, in third and sixth place). I think this is mostly because of the title tag optimization (though the URL change may have had an effect as well).
Should I still make the change, start using www.namegroup.com as the main domain, and 301-redirect all the others to it? What do you think?
If I don't change anything and keep the current main page URL, should I focus my link building on http://namegroup.com or on http://namegroup.com/en/accounting-outsourcing-and-legal-services? I am reluctant to focus link building on the current URL in case we decide to change it in the future (it is also quite long), and would prefer to build links to http://namegroup.com or www.namegroup.com.
Thank you in advance for your valuable comments.
Best regards
Sam
-
I agree with Chris and Jesse here!
For question one, you should not do this just because you want to have keywords in your URL; Google looks more at the quality of the content on the site than at keyword-based domains and URLs. You could also go with Jesse's idea of creating an internal page that targets the keywords you want!
For question 2, that's really your call, and I won't comment on moving to subdomains unless you ask: subfolders are fine. Google treats subdomains as separate domains, but with subfolders, both versions you describe are fine to me!
Hope this helps!
-
Sam, from an SEO standpoint, there's no need to jump through any hoops in order to get keywords into your URLs, as the value that brings is negligible and still decreasing. On the other hand, it can bring value in the form of click-throughs once the result makes it near the top of the search results.
As far as the folders and URLs go, a URL that shows the directory (folder) but no page name is simply the default page for that directory. Just as /index.php isn't usually shown in the URL for a domain's homepage (the default page for the domain), /index.php is often not shown in the URL for the default page of a directory.
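If it helps to see that mechanically, here is a toy sketch (Python, with a hypothetical document root and file names) of how a server typically resolves a directory URL to its default page:

```python
# Toy illustration: a directory URL like /en/ is really served by a default
# document inside that directory (index.php, index.html, ...), just as the
# homepage is served by the default document in the document root.
from pathlib import Path

DOCROOT = Path("/var/www/example.com")          # hypothetical document root
INDEX_CANDIDATES = ("index.php", "index.html")  # typical default-page names

def resolve(url_path: str):
    """Return the file that would serve this URL path, or None if nothing matches."""
    target = DOCROOT / url_path.lstrip("/")
    if target.is_dir():
        for name in INDEX_CANDIDATES:
            candidate = target / name
            if candidate.is_file():
                return candidate  # e.g. /en/ resolves to .../en/index.php
        return None
    return target if target.is_file() else None

print(resolve("/en/"))  # on a real docroot this would print .../en/index.php
```

So whether a visitor sees www.example.com/en or www.example.com/en/index.php, it is the same page; the server is just filling in the default document for the directory.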
-
Question 1: No! Why not just create the internal page and have it target the specific keyword? Your homepage is your brand, not a product/service page. Those are internal. They will rank for whatever you are targeting (if your SEO campaign is strong). Why are you worried about what your homepage ranks for?
Short answer: No. Make internal product/service pages targeting specific keywords and do not redirect your home page.
Question 2: Huh? Those two examples seem exactly the same to me. Are you asking why some pages show a subdirectory and some show the HTML page in the URL? If so, it's all in your structure. It doesn't really matter which way you want to do it, but having multiple directories may give you the opportunity to attach keyword targets such as "example.com/services/stuff-i-do.html" as opposed to "example.com/stuff-i-do.html".
The former example brings the word "services" into your URL string. If you are trying to get your page to read just "example.com/services", then create that directory and drop an index page in it.
Hope this answers your questions or at least comes close.
Related Questions
-
Has anyone seen negative SEO effects from using the Google Translate API?
We have a site currently in development that is using the Google Translate API. I am having a massive issue getting Screaming Frog to crawl it, and all of our non-native English-speaking employees have read through the translated copy in their native language; the general consensus is that it reads at a 5th-grade level at best. My question to the community is: has anyone implemented this API on a site, and has it a) helped with gaining traffic from other languages/countries and b) hurt their site from an SEO standpoint?
International SEO | VERBInteractive
-
International Site Merge
Hello, I've never had to deal with an international site before, let alone a site merge. These are two large sites, and we've also got a few smaller old sites that are currently redirecting to the main (UK) site. We are looking at moving all the sites to the .com domain. We are also currently not using SSL on the main pages (we are on the checkout). We also have an m.domain.com site. Are there any good guides on what needs to be done? My current strategy would be: convert the site to SSL; keep the mobile site and desktop site on the same domain; start link building to the .com domain now (it currently has the weaker link profile). What's the best way of handling the domains and languages? We're currently using a .tv site for the UK and .com for the US. I was thinking, and please correct me if I'm wrong, that we move the US site from domain.com to domain.com/us/ and the .tv site to domain.com/en/. Would I then reference these by the following: What would we then do with the canonicals? Would they just reference their "local" version? Any advice or articles to read would really be appreciated.
International SEO | ThomasHarvey
-
What are the best practices for translation of city/state names for international SEO? (i.e. New York in English vs. Nueva York in Spanish)
I'm working on international SEO / translation of a global travel site. While we have a global keyword research and translation strategy in process for each market they serve, I've run into a unique question. Overall, we are translating (and localizing) content for each market but aren't sure what to do with location names. Each country/state has cities and locations that have their own dedicated pages. I see three options for these location names (when titling a page and writing content): keep them in English, translate the names into the market languages, or use a combination of the two. The challenge with altering the location names to the market languages is that they are truly not known by those names. There are some instances where it may make sense; for instance, New York in Spanish would be "Nueva York", with 'Nueva' being the Spanish translation of 'new'. There are other instances where no translation exists. If you've had a similar experience, I'd love to hear your approach/recommendation.
International SEO | JonClark15
-
Massive jump in pages indexed (and I do mean massive)
Hello mozzers, I have been working in SEO for a number of years but have never seen anything like a jump in pages indexed of this proportion (the image is from the Index Status report in Google Webmaster Tools: http://i.imgur.com/79mW6Jl.png). Has anyone ever seen anything like this? Anyone have an idea about what happened? One thing that sprang to mind is that the same pages may now be getting indexed in several more Google country sites (e.g. google.ca, google.co.uk, google.es, google.com.mx), but I don't know if the Index Status report in WMT works like that. A few notes to explain the context: It's an eCommerce website with service pages and around 9 different pages listing products. The site is small, only around 100 pages across three languages. 1.5 months ago we migrated from three language subdomains to a single subdomain with language directories. Before and after the migration I used hreflang tags across the board. We saw about a 50% uplift in traffic from unbranded organic terms after the migration (although on day one it was more like +300%), especially from more language diversity. I had an issue where the 'sort' links on the product tables were giving rise to thousands of pages of duplicate content, although I had used URL parameter handling to tell Google that these were not significantly different and to index only the representative URL. About 2 weeks ago I blocked them using robots.txt (Disallow: *?sort). I never felt these were doing us too much harm in reality, although many of them are indexed and can be found with a site:xxx.com search. At the same time as adding *?sort to the robots.txt, I made an hreflang sitemap for each language, linked to them from an index sitemap, and added these to WMT. I added some country-specific alternate URLs as well as language ones, just to see if I started getting more traffic from those countries (e.g. xxx.com/es/ for Spanish, xxx.com/es/ for Spain, xxx.com/es/ for Mexico, etc.). I didn't seem to get any benefit from this. The Webmaster Tools profile is for a URL that is the root domain xxx.com. We have a lot of other subdomains, including a blog that is far bigger than our main site. But looking at the Search Queries report, all the pages listed are on the core website, so I don't think it is the blog pages etc. I have seen a couple of good days in terms of unbranded organic search referrals: no spike or drop-off, but a couple of good days in keeping with recent improvements in these kinds of referrals. We have some software mirror subdomains that are duplicated across two websites: xxx.mirror.xxx.com and xxx.mirror.xxx.ca. Many of these don't even have sections, and Google seemed to be handling the duplication, always preferring to show the .com URL despite no cross-site canonicals being in place. Very interesting, I'm sure you will agree! THANKS FOR READING!
International SEO | Lina-iWeb
-
SEO for .com vs. .com.au websites
I have a new client from Australia who has a website on a .com.au domain. He has the same domain name registered for .com. Example: exampledomain.com.au and exampledomain.com. He started with the .com.au site for a product he offers in Australia. He's bringing the same product to the U.S. (it's a medical device product) and wants us to build a site for it and point it to the .com. Right now, he has what appears to be the same site showing on the .com as on the .com.au. Both domains point to the same host, but there are separate sections or directories within the hosting account for each website, and the content is exactly the same. Would this be viewed as duplicate content by Google? What's the best way to structure or build the new site on the .com to get the best SEO in the USA, maintain the .au version, and not have the websites compete or be viewed as having duplicate content? Thanks, Greg
International SEO | gregelwell
-
Do Keyword and Location Matter?
Hi Everyone, I'm always learning, but here's a question. I would like to know if keyword and location truly matter. For example, I've been trying to rank my website for a LONG time for a UK English term. My site is hosted in the US. My site has great content and internal and external links using the keyword. I cannot seem to climb the SERPs, although my "American" keywords do fine and I see results. If anyone wants to take a look, that would be great. My website is JourneyBeyondTravel.com and I wish to rank for "Morocco Holidays" (I am at about #20 currently). I am also having trouble with "Morocco travel", although I have continuously ranked well for "Morocco tours" and "trips." Along the same lines, I've been doing some quality guest posting and blogging. I've used longer, phrase-type keywords (3-5 words) in the article text that contain the keyword terms. Should keywords be varied like this? How long until I see results? And should I look for blogs in different countries to keep things balanced (such as blogging on .co.uk sites so that I can get link juice for UK keywords)? Thanks again! Thomas
International SEO | journeybeyondtravel
-
Non-US site pages indexed in US Google search
Hi, We are having a global, site-wide issue with non-US site pages being indexed by Google and served up in US search results. Conversely, we have our US English pages showing in the Japan Google search results. We currently use IP detection to direct users to the correct regional site, but it isn't effective if users enter through an incorrect regional page. At the top of each of our pages we have a drop-down menu to allow users to manually select their preferred region. Is it possible that Googlebot is crawling these links and indexing these other regional pages as US, and not detecting the region due to our URL structure? Below are examples of two of our URLs for reference, one from Canada, the other from the US: /ca/en/prod4130078/2500058/catalog50008/ and /us/en/prod4130078/2500058/catalog20038/. If that is, in fact, what is happening, would setting the links within the drop-down to 'nofollow' address the problem? Thank you. Angie
International SEO | Corel
-
IP Redirection vs. cloaking: no clear directives from Google
Hi there, Here is our situation: we need to force an IP redirection for our US users to www.domain.com, and at the same time we have different country-specific subfolders in their own language, such as www.domain.com/fr. Our fear is that by forcing an IP redirection for US IPs, we will prevent Googlebot (which has a US IP) from crawling our country-specific subfolders. I haven't found any clear directives from Google representatives on the matter. In this video Matt Cutts says it's always better to show Googlebot the same content as your users (http://www.youtube.com/watch?v=GFf1gwr6HJw&noredirect=1), but on the other hand, in another video he says "Google basically crawls from one IP address range worldwide because (they) have one index worldwide. (They) don't build different indices, one for each country". This seems like a contradiction to me... Thank you for your help!! Matteo
International SEO | H-FARM