Should I Add Location to ALL of My Client's URLs?
-
Hi Mozzers,
My first Moz post! Yay! I'm excited to join the squad.
My client is a full-service entertainment company serving the Washington DC metro area (DC, MD & VA) and offers a host of services for those wishing to throw events/parties. Think DJs for weddings, cool photo booths, ballroom lighting, etc.
I'm wondering what the right URL structure should be. I've noticed that some of our competitors put DC-area keywords in their URLs, but with SERPs moving to focus far more on quality than keyword density, I'm wondering whether we should target location-based keywords in the traditional on-page places (title tags, headers, metas, content, etc.) instead of also putting them in the URLs. So, on every product-related page, should we do something like:
example.com/weddings/planners-washington-dc-md-va
example.com/weddings/djs-washington-dc-md-va
example.com/weddings/ballroom-lighting-washington-dc-md-va
OR
example.com/weddings/planners
example.com/weddings/djs
example.com/weddings/ballroom-lighting
In both cases, we'd put the necessary location-based keywords in the proper places on-page. If we follow the location-in-URL tactic, we'd use DC-area terms in all subsequent product page URLs as well. Essentially, every page outside of the home page would have a location in it.
Thoughts?
Thank you!!
-
No website in particular springs to mind, I'm afraid. But it's not an uncommon practice, and I'm sure you'll find plenty within your industry with a little competitor research.
Good luck!
-
This is great stuff. Thank you! Would you happen to have an example of a site that does this well? I think you're spot on in your suggestions and would love to see it in practice.
-
(I had posted my response, but Moz didn't fancy saving it for some reason and it's just gone. So I'll try and remember what I typed and repost it...)
I wouldn't dilute the site authority by using subdomains for your locations.
From a user's perspective, I'd recommend your main site navigation list the different event types (weddings, parties, corporate, etc.) and branch your locations from there.
e.g.

- Weddings - /weddings/ (Weddings)
  - Miami - /weddings/miami/ (Weddings in Miami)
    - Planners - /weddings/miami/planners/ (Wedding Planners in Miami)
    - DJs - /weddings/miami/djs/ (Wedding DJs in Miami)
    - Ballroom Lighting - /weddings/miami/ballroom-lighting/ (Ballroom Lighting for Weddings in Miami)
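If it helps to picture that hierarchy as data, here's a minimal sketch (Python, with placeholder event types, locations, and services - they're illustrative, not a real sitemap) that expands the event -> location -> service tree into the full set of URL paths:

```python
from itertools import product

# Placeholder data only -- the real event types, locations, and
# services would come from the client's actual offering.
EVENT_TYPES = ["weddings", "parties", "corporate"]
LOCATIONS = ["washington-dc", "miami"]
SERVICES = ["planners", "djs", "ballroom-lighting"]

def build_paths(event_types, locations, services):
    """Expand the event -> location -> service hierarchy into URL paths."""
    paths = [f"/{e}/" for e in event_types]
    paths += [f"/{e}/{loc}/" for e, loc in product(event_types, locations)]
    paths += [f"/{e}/{loc}/{s}/"
              for e, loc, s in product(event_types, locations, services)]
    return sorted(paths)

for path in build_paths(EVENT_TYPES, LOCATIONS, SERVICES):
    print(path)
# /corporate/
# /corporate/miami/
# /corporate/miami/ballroom-lighting/
# ... and so on for every branch of the tree
```

Notice the locations only ever appear as a path segment, never stuffed into the slug itself, so adding a new city is purely additive.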
That structure seems the most logical to me, but you should do your own research to back this up. Conduct thorough keyword research for each service in each location and structure your landing page content accordingly. For example, have the main category pages broadly target the root keyword, but display "cards" or sections that link to each location without optimising those category pages for the locations themselves - save that for the location-based landing pages. This sub-navigation then sits in the body, rather than in the main navigation, which is friendlier for users.
I think with something like events, you don't want to shove the locations in the user's face first thing. Let them see what you offer (the different event types), then delve down into the locations, and the specific services within those locations.
People are free to disagree with me, and I welcome critique on these thoughts. I do think that with SEO, once you get beyond "best practices", it comes down to personal preference.
-
Excellent advice, Ria. I'll likely give that advice to the client.
Another question that brewed from this: how should the main navigation be handled as we expand? Obviously we can't have DC-centric keywords in the main navigation as the business grows into other markets. I think we could create unique content and landing pages for each individual service and location, but how would that be incorporated into the overall user flow and URL structure?
Would it be more of a sitemap play? If someone goes to www.example.com, should they be given the option to choose their location, then be routed to that specific city's subdomain and browse from there?
I guess my main question is: how exactly should we structure the site navigation for users from multiple cities to please both UX and the big G?
Thank you!
-
For a handful of different locations, it's quite common to structure them as different subdirectories, as you said: site.com/weddings/miami/planners or site.com/miami/weddings/planners - whichever makes the most sense for your customer base and how you're targeting the content.
Just ensure that these aren't seen as doorway pages and don't appear too templated. Make each landing page unique and tailored specifically to your customers in that location. If you have nothing unique to say, you don't need separate pages - you'd be better off targeting the different locations on the same landing page. But with you being the expert in the industry, I imagine it'll be easy enough to cater to each audience specifically, especially as you're not dealing with tens, hundreds, or thousands of different towns.
If you're certain about expanding to different cities soon, then it might be best to work a /washington-dc/ subdirectory into the URL structure from the start, so you don't have to change it later.
-
Thank you, Ria. That's very helpful.
I'm curious: when the business expands to other cities in the coming months (Miami and Chicago are being considered, for example, though nothing is finalized), I assume we'd need the location in the URL path for the sake of designation and differentiation. It may be a subfolder in and of itself, though. Thoughts?
-
I'd avoid adding the location to the URL if you only offer those services in a single location. It looks messy to the user, and can look spammy to Google. It would also save you from having to change the URLs and set up redirects if you ever need to strip the location keywords out at a later date to please the Big G. Optimising for location within the content, title, and meta can be tweaked easily over time; tweaking URLs can be a lot messier.
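To put that in concrete terms, here's a rough sketch (Python; the paths are hypothetical, mirroring the examples from the original question) of the redirect map you'd be committed to maintaining indefinitely if location-stuffed slugs ever had to come out:

```python
# The clean-up cost of removing locations from slugs: every legacy URL
# needs a permanent 301 to its new home. These paths are hypothetical,
# echoing the examples from the original question.
LEGACY_REDIRECTS = {
    "/weddings/planners-washington-dc-md-va": "/weddings/planners/",
    "/weddings/djs-washington-dc-md-va": "/weddings/djs/",
    "/weddings/ballroom-lighting-washington-dc-md-va": "/weddings/ballroom-lighting/",
}

def resolve(path: str) -> tuple[int, str]:
    """Return (status, path): a 301 for mapped legacy URLs, else a plain 200."""
    if path in LEGACY_REDIRECTS:
        return 301, LEGACY_REDIRECTS[path]
    return 200, path

assert resolve("/weddings/djs-washington-dc-md-va") == (301, "/weddings/djs/")
```

Titles, metas, and content need no such mapping when you retarget them - which is exactly the point.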