Best way to indicate multiple Lang/Locales for a site in the sitemap
-
So here is a question that may be obvious, but I'm wondering if there is some nuance I may be missing.
Question: Consider an ecommerce company with multiple sites around the world that are all variations of the same thing, just in different languages. Some of these live on the normal .com domain while others sit on different ccTLDs. When building out the XML sitemap for these sites, especially the ones on the other ccTLDs, we want to confirm that using
<loc>http://www.example.co.uk/en_GB/</loc>
<xhtml:link rel="alternate" hreflang="en-AU" href="http://www.example.com.au/en_AU/" />
<xhtml:link rel="alternate" hreflang="en-NZ" href="http://www.example.co.nz/en_NZ/" />
would be the correct way of doing this. I know I have to change this for each different ccTLD, but it just looks odd once you start listing 10-15 different language/locale variations as alternate links. I guess I am just looking for a bit of reaffirmation that I am doing this right.
Thanks!
-
Yes, you are doing the right thing. You may also want to look at including the hreflang link tags in the <head> of your pages as well.
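If it helps, here is a minimal sketch of what those annotations could look like as link elements in the <head> of the UK page, using the illustrative domains from the question. Note the self-referencing en-GB entry; Google generally treats head link elements and sitemap annotations as alternative ways to declare hreflang, so one method is usually enough.
<link rel="alternate" hreflang="en-GB" href="http://www.example.co.uk/en_GB/" />
<link rel="alternate" hreflang="en-AU" href="http://www.example.com.au/en_AU/" />
<link rel="alternate" hreflang="en-NZ" href="http://www.example.co.nz/en_NZ/" />
Each locale's page would carry the same set of links, so every version points back to all the others.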
-
Maybe the best solution is to use a tool like this one by MediaFlow: http://www.themediaflow.com/resources/tools/href-lang-tool/.
You feed the tool a .csv file and it returns a sitemap.xml with all the hreflang annotations included.
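For reference, a generated file along those lines would typically look something like the minimal sketch below (domains taken from the question, purely for illustration). The urlset has to declare the xhtml namespace, and each <url> entry should list the full set of alternates, including a self-referencing one; the sitemaps on the other ccTLDs need the matching return links, or the annotations may be ignored.
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>http://www.example.co.uk/en_GB/</loc>
    <xhtml:link rel="alternate" hreflang="en-GB" href="http://www.example.co.uk/en_GB/" />
    <xhtml:link rel="alternate" hreflang="en-AU" href="http://www.example.com.au/en_AU/" />
    <xhtml:link rel="alternate" hreflang="en-NZ" href="http://www.example.co.nz/en_NZ/" />
  </url>
</urlset>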
Related Questions
-
Site Migration due to Corporate Acquisition
Hey everyone,
Wanted to check in on something that I've been thinking way too much about lately. I'll do my best to provide background, but due to some poor planning it is rather confusing to wrap your head around.
There are currently three companies involved: Holding Corp (H Corp) and two operating companies, both in the same vertical, but one B2B and the other B2C. The B2C corp has been pushed down the line and we're focusing primarily on H Corp and the B2B brand. Due to an acquisition of H Corp and all of its holdings, things are getting shuffled and I've been brought in to ensure things are done correctly. What's bizarre is that H Corp and its web property are the dominant authority in the SERPs for the B2B brand, as in the B2B brand loses on brand searches to H Corp, let alone any product/service related terms. As such, they want to effectively migrate all related content from the H Corp site to the B2B brand site and hand over authority as effectively as possible. Summary: domain migration from the H Corp site to the B2B brand site.
I've done a few migrations in my past and been brought in to recover a few post-launch, so I have decent experience and a trusted process. One of my primary objectives initially is to change as little as possible with content, URL structure (outside the root), etc., so 301s are easy but also so it doesn't look like we're trying to play any games. Here's the thing: the URL structure for H Corp is downright bad from both a UX perspective and a general organizational perspective. So I'm feeling conflicted and wanted to get a few other opinions. Here are my two paths as I see them, and I'd love opinions on both:
1. Stick with a similar URL structure to H Corp through the migration (my normal process) but deviate from pretty much every best practice for structuring URLs with keywords, common sense and logic. Pro: follow my process (which has always worked in the past). Con: don't implement SEO/on-page best practices at this stage and wait for the site redesign to implement them (more work).
2. Implement a new URL structure now and deviate from my trusted process.
Do you see a third option? Am I overthinking it? Other important details: the B2B brand is undergoing a site redesign, mostly aesthetic, but they're a big corporation and it will likely take 6-9 months to get up. Any input greatly appreciated.
Cheers, Brent
Web Design | pastcatch1
-
Have an eBook. What is best practice for SEO?
Hello,
We have a free eBook - it's a great resource and a great piece of content. It is available to download on our website here - http://re-timer.com/the-product/how-to-sleep-better/ The book is available as a whole or as individual chapters (e.g. http://re-timer.com/app/uploads/2015/07/Chapter8.pdf?b0df38). The PDF chapters appear to be doing well in Google search for certain keywords, though I can't measure this in GA. I would like the eBook to assist the SEO of my website overall. If I create a web page and embed the PDF into it, will Google still crawl this page? At the moment we are also using the eBook to collect email addresses; this is a nice-to-have and it is OK if people get it without doing so (if they find a chapter in Google they currently don't have to enter their email address). I'm sure lots of people have eBooks now. What is best practice, and what is the best way to use this as a tool to maximise SEO for the whole website (http://re-timer.com)? Thank you! Laura
Web Design | LauraFalls1
-
Best practice for multilanguage website ( PHP feature based on Browser or Geolocalisation)
Hi Moz Experts, I would like to know what the best practice is for choosing the default language on a multilanguage website. There are several PHP features to help users get the right language when they arrive from search or direct: presenting the default language based on browser language, geolocation, etc. However, which one is the most appropriate for a Quebec company trying to expand outside Canada? Pros and cons, please. Thank you in advance.
Web Design | johncurlee0
-
One Page Guide vs. Multiple Individual Pages
Howdy, Mozzers! I am having a battle with my inner self regarding how to structure a resources section for our website. We're building out several pieces of content that are meant to be educational for our clients and I'm having trouble deciding how to lay out the content structure. We could either lay out all eight short sections on a single page, or create individual pages for each section. The goal is obviously to attract new potential clients by targeting the terms that they may be searching for in an information-gathering stage. Here's my dilemma...
With the single-page guide, it would be nice because it will have a lot of content (and of course, keywords) to be picked up by the SERPs, but I worry that it is going to be a bit crammed (because of eight sections) for the user. The individual pages would be much better organized and you can target more specific keywords, but I worry that they may get flagged for light content, as some pages may have as little as a 150-word description. I have always been mindful of writing copy for searchers over spiders, but now I'm at a more technical crossroads as far as potentially getting dinged for not having robust content on each page. Here's where you come in...
What do you think is the better of the two options? I like the idea of having the multiple pages because of the ability to hone in on a keyword and the clean, organized feel, but I worry about the lack of content (and possibly losing out on long-tail opportunities). I'd love to hear your thoughts. Please and thank you. Ready annnnnnnnnnnnd GO!
Web Design | jpretz0
-
White Text / Black Background & SEO Impact
Does anyone know of any testing / studies with evidence that Google prefers dark text on a light background vs. light text on a dark background? I have a website that currently has light text on a black background, and really like the way it looks, but am concerned that the style may be hurting SEO. Moreover, redesigning something inverse with the same quality would be a large project and fairly costly, so I'd like to make sure the benefit will really be worth the cost before moving forward.
Web Design | Bromtec0
-
What is the best tool to view your page as Googlebot?
Our site was done with asp.net and a lot of scripting. I want to see what Google can see and what it can't. What is the best tool that duplicates Googlebot? I have found several but they seem old or inaccurate.
Web Design | EcommerceSite0
-
Best Website Builder - Help Me Choose
I need to build a multi-language site (a Pilates/yoga site) and I will use a site builder. After posting questions on wix.com, I came to the conclusion that I should continue my research because they are not SEO friendly. Do you have any suggestions? With my limited HTML knowledge, using a website builder is my only option. Here are some of the features I need: multilanguage website, mobile version, SEO friendly, nice template selection (this is important), HTML customization, Twitter, Facebook, blog... I'm not looking at free website builders; when you want good features, there is a price to pay. Thank you for your help and suggestions, BigBlaze
Web Design | BigBlaze2050
-
Best method to stop crawler access to extra Nav Menu
Our shop site has a three-tier drop-down mega-menu so it's easy to find your way to anything from anywhere. It contains about 150 links and probably 300 words of text. We also have a more context-driven single layer of sub-category navigation as well as breadcrumbs on our category pages. You can get to every product and category page without using the drop-down mega-menu.
Although the mega-menu is a helpful tool for customers, it means that every single page in our shop has an extra 150 links on it that go to stuff that isn't necessarily related or relevant to the page content. This means that, when viewed from the context of a crawler, rather than a nice tree-like crawling structure we've got more of an unstructured mesh where everything is linked to everything else.
I'd like to hide the mega-menu links from being picked up by a crawler, but what's the best way to do this? I can add a nofollow to all mega-menu links, but are the links still registered as page content even if they're not followed? It's a lot of text if nothing else. Another possibility we're considering is to set the mega-menu to only populate with links when its main button is hovered over, so it's not part of the initial page-load content at all. Or we could use a crude yet effective system we have used for some other menus, of base-encoding the content inline so it's not readable by a spider.
What would you do and why? Thanks, James
Web Design | DWJames0