Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we’re not completely removing the content - many posts will still be possible to view - we have locked both new posts and new replies. More details here.
Location Pages on Website vs. Landing Pages
-
We have been having a terrible time in the local search results for 20+ locations. I have Places set up and all, but we decided to create location pages on our site for each location - a brief description and content optimized for our main service. The path would be something like .com/location/example.
One option that has come up is to create landing pages / "mini websites" that would probably live at location-example.url.com.
I believe that the latter option, mini sites for each location, is a bad idea, as that kind of tactic was considered spammy in the past.
What are your thoughts, and what resources can I use to convince my team of the best practice?
-
Hi KJ,
Agree with the consensus here that building mini sites is not the right approach. Take whatever energy you would have put into developing these and channel it into making the landing pages for your locations the best in their industry/towns. I was just watching a great little video by Darren Shaw in which this is one of the things he covers. Might be worth sharing with your team:
http://www.whitespark.ca/blog/post/70-website-optimization-basics-for-local-seo
And earlier this year, Phil Rozek penned some pretty fine tips on making your pages strong:
I am curious about one element of your original post. You mention, "We have been having a terrible time in the local search results for 20+ locations." I wasn't sure whether you were saying that you've never done well in them, that you were doing well until something changed (such as the universal rollout of Local Stacks), or something else. If it's the latter, I would guess that a huge number of businesses are now struggling to cope with the fact that there are only 3 spots to rank for any keyword, which necessitates greater focus on lower-volume keywords/categories, organic results, and paid results. Everybody but the top 3 businesses is now in this boat. Very tough.
-
Hi KJ,
First things first: do you have a physical address for each location, and are these set up in Google My Business? I doubt you have premises in every location, so ranking for all the areas is going to be an uphill task.
Google is smart and knows whether you have physical premises in the targeted location; after all, it's all about delivering highly relevant results to its users. Let's say, for example, you're an electrician and a user searches for "Electrician in Sheffield" - realistically, if you only have premises in Leeds, it's going to be difficult to rank above a company that is actually located in Sheffield.
First, I would target 2-3 of your primary locations and focus on building 10x content. I would aim to write 1,000+ words of completely unique content for each page while focusing on your target keywords - but be natural and don't keyword stuff. Put reviews from customers in that specific area on the landing page, and build citations from local directories.
Again, you can't build citations unless you have physical premises in the location. Trust me, I've done this for years for a roofing company, and it's taken some time to see results. He's #1 for the city he is located in, but other cities are a very difficult task. Writing about the same service for each location is a daunting task too; you could consider a service like Great Content to outsource the writing if you're stuck for ideas. It's a low-budget solution and will save you mountains of time.
I would also use folders, not subdomains. Build a 'service areas' page; examples of URLs for the roofing company are below.
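For illustration only (the domain and city names here are hypothetical, not the actual roofing company's), a 'service areas' folder structure along those lines might look like:

```text
example-roofing.com/service-areas/
example-roofing.com/service-areas/leeds/
example-roofing.com/service-areas/sheffield/
```

Each city page then carries its own unique copy, local reviews, and citations, while all the authority stays on one domain.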
-
Hello KJ,
You absolutely don't want to start creating subdomains for different locations. That will split your link equity across multiple domains rather than consolidating it within a single domain.
It sounds like you are attempting a silo structure for your website (multiple locations targeting the same keyword), but this can be seen as keyword stuffing if done incorrectly. Using multiple pages to rank for a single keyword is problematic, as it raises both Panda and Penguin red flags. What you want to do is rank for different keywords, or at least ensure that the content for each of these location pages is unique and sufficiently long (500+ words) to avoid arousing suspicion.
Your site structure sounds okay. For example, a silo we put in place for one of our clients followed this pattern:
domain.com/country/region/city/service
We hit about 15 cities using this tactic, and they have been sitting on the first page for the last year or so. We also built sufficient links to the home page and relevant pages and ensured that our technical SEO was spotless, so perhaps these are the areas you might engage your team on moving forward.
If you want to know more about our process, feel free to touch base and I will provide what advice I can.
Hope this helps and best of luck moving forward!
Rob
-
Right. You will not beat the other folks with the subdomain approach. You are getting beaten because your competitors are taking the time to make better content in a niche. Find a way to get better content on those pages, and mark them up with schema to make the info more readable to the search engines - and possibly earn an enhanced listing in the SERPs.
We went through a site relaunch, and the review schema on our location pages got messed up. It did not impact our rankings, but it did impact click-through from the search engines: none of the stars were showing up in the SERPs due to the schema goof-up. Once the schema was fixed, traffic was back up.
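To make the schema point concrete, here is a minimal sketch of the kind of markup involved: a LocalBusiness block with an AggregateRating, which is what drives the review stars in the SERPs. Every value below (business name, city, rating, review count) is a placeholder, not real data, and this is one way to generate the JSON-LD rather than anyone's actual implementation:

```python
import json

# Hypothetical LocalBusiness markup with an AggregateRating block.
# All names, ratings, and counts are placeholders for illustration.
location_schema = {
    "@context": "https://schema.org",
    "@type": "LocalBusiness",
    "name": "Example Roofing - Sheffield",
    "address": {
        "@type": "PostalAddress",
        "addressLocality": "Sheffield",
        "addressCountry": "GB",
    },
    "aggregateRating": {
        "@type": "AggregateRating",
        "ratingValue": "4.8",
        "reviewCount": "37",
    },
}

# Emit the payload that would sit inside a
# <script type="application/ld+json"> tag on the location page.
print(json.dumps(location_schema, indent=2))
```

If this block is malformed or dropped (as happened in our relaunch), rankings may hold but the stars - and the click-through they drive - disappear.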
This link will point you toward the relevant Moz resources
If you are happy with my response, please feel free to mark as a "Good Answer" thanks!
-
I agree with you. Some marketing people believe that the reason we cannot beat out smaller companies is that we are too diverse in services. We do great with niche keywords and markets, but we are being beaten by companies that focus on only one of our key services. That is why they thought subdomains would do better. I remember Rand posting something on subdomains vs. subfolders, but I cannot find the original source.
Thanks for your answer...
-
This is similar to the question of whether a blog should be on a subdomain (blog.website.com) or in a folder (website.com/blog).
Most people agree that the folder is the better option: with every blog post that earns links, you are building your domain authority, and generally speaking, a rising tide lifts all ships.
You would run into the same issue with your option to set up subdomains for each location, and you would also end up having to deal with separate webmaster accounts for each. I don't think the subdomain is the solution. I run a site with thousands of locations using a folder structure, and the business pages rank well for a given location if you search on the name of the location - so I know it works, and I manage it at scale.
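If any location subdomains had already gone live, the standard consolidation move would be a permanent redirect into the folder structure. A minimal sketch, assuming Apache with mod_rewrite and using the hypothetical location-example.url.com pattern from the original question:

```apache
# Hypothetical: 301-redirect location subdomains
# (location-example.url.com/...) to the equivalent folders
# (url.com/location/example/...), consolidating authority
# on the main domain.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^location-([^.]+)\.url\.com$ [NC]
RewriteRule ^(.*)$ https://url.com/location/%1/$1 [R=301,L]
```

Here `%1` is the city captured from the hostname and `$1` the original path; the exact rule would depend on the real URL scheme and server setup.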
I would get back to looking at any technical issues you have and your on-page options for these pages. Is there anything more you can do to make these pages 10x better than any other page on the net for those locations?
Good luck!