Website Redesign - What to do with old 301 URLs?
-
My current site is on WordPress. We are currently designing a new WordPress site with the same URLs.
Our current approach is to go into the server, delete the current website files, and add the new website files.
My current site has old URLs which are 301 redirected to the current URLs. Here is my question: in the redesign process, do I need to create pages for the old 301-redirected URLs so that we do not lose them in the launch of the new site, or do the 301 redirects exist outside of our server, so this does not matter?
Thank you in advance.
-
Your redirects have to live on your server; otherwise it would be hard to manage them yourself.
What you have to be sure of is how those redirects have been set up.
If they're in a dedicated .htaccess file, then it should be fine since, as you said, you'll be maintaining the exact same URL structure. But if they're managed by a WordPress plugin or JavaScript, be sure not to overwrite them; or, if you want to get rid of the plugin, export your rules and rewrite them in your .htaccess.
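For reference, a minimal sketch of what explicit 301 rules in .htaccess might look like; the paths and domain below are placeholders, not the poster's actual URLs. If the rules are written with mod_rewrite rather than mod_alias, they generally need to sit above the standard WordPress block so they run before its catch-all rewrite.

```
# Hypothetical mod_alias rules -- placeholder paths and domain only
Redirect 301 /old-services/ https://www.example.com/services/
Redirect 301 /2015/old-blog-post/ https://www.example.com/blog/new-post/

# BEGIN WordPress
# ... the standard WordPress rewrite block stays below, untouched ...
# END WordPress
```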
-
Hi there
If you are keeping your URL structure the same, then you should not have to update your redirects file, as long as all of the destination pages are staying in place. If you are removing any current URLs that have old pages redirected to them, you will have to update those rules in your redirects file.
I would do a backlink audit of the old URLs, however, and see whether any backlinks are worth updating to point at the new URLs so those URLs can earn equity directly.
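To make that concrete, a hedged example with placeholder URLs: if an old URL currently 301s to a page that is being renamed in the redesign, point the old rule straight at the final destination so two redirects don't stack into a chain.

```
# Before the redesign: the old URL redirects to the current page
Redirect 301 /old-page/ https://www.example.com/current-page/

# After the redesign renames /current-page/ to /new-page/:
# update the old rule so it skips the intermediate hop...
Redirect 301 /old-page/ https://www.example.com/new-page/
# ...and add a rule for the page that just moved
Redirect 301 /current-page/ https://www.example.com/new-page/
```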
Hope this helps! Good luck!
Related Questions
-
No meta description pulling through in SERP with React website - requesting indexing & submitting to Google with no luck
Hi there, A year ago I launched a website using React, which has caused Google to not read my meta descriptions. I've submitted the sitemap and there was no change in the SERP. Then I tried "Fetch and Render" and requested indexing for the homepage, which did work; however, I have over 300 pages and I can't do that for every one. I have requested a fetch, render, and index for "this url and linked pages," and while Google's cache has updated, the SERP listing has not. I looked in the Index Coverage report in the new GSC and it says the URLs are valid and indexable, and yet there's still no meta description. I realize that Google doesn't have to index all pages, and that Google may not use your meta description, but I want to make sure I do my due diligence in making the website crawlable. My main questions are: 1. If Google didn't reindex ANYTHING when I submitted the sitemap, what might be wrong with my sitemap? 2. Is submitting each URL manually bad, and if so, why? 3. Am I simply jumping the gun, since it's only been a week since I requested indexing for the main URL and all the linked URLs? 4. Any other suggestions?
Web Design | DigitalMarketingSEO1
-
Mergers & Acquisitions - Website Transition Good practice
Hi everyone, I was wondering if anyone has come across good practice for maintaining websites after a merger or acquisition, where there needs to be an association between the two companies' websites. For an acquisition, I'm considering moving the acquired company to a subdomain of the parent company, e.g. acquiredcompany.parentcompany.com. On both websites there may be a prominent popup so visitors can switch between the websites if they have visited the incorrect one. One worry I have is that the acquired company has some good rankings, which I want to keep. I will of course manage the process through 301 redirects, but I was wondering if anyone has any thoughts on this approach or can suggest any better solutions. Thanks in advance, Stuart
Web Design | Stuart260
-
SEO strategy for UK / US websites
Hi, we currently have a UK-focused site on www.palmatin.com; we're now targeting the North American market as well, but the contents of the site need to be different from the UK version. One option was to create another domain for the NA market, but I assume it would be easier to rank with palmatin.com. What would you suggest doing if a company is targeting two different countries in the same language? Thanks, Jaan
Web Design | JaanMSonberg0
-
Question Mark In URL??
So I am looking at a site for a client, and I think I already have my answer, but wanted to check with you guys. First off, the site is in Flash and HTML. I told the client to dump the Flash site, but she isn't willing right now. So the URLs are generated like this. Flash: http://www.mysite.com/#/page/7ca2/wedding-pricing/ HTML: http://www.mysite.com/?/page/7ca2/wedding-pricing/ Checking the site in Google with a site:mysite search, none of the interior pages are indexed at all. That tells me Google is pretty much ignoring everything past the # or ?. Is that correct? My recommendation is to dump the Flash site and redo the URLs in an SEO-friendly format.
Web Design | netviper0
-
Subdomains For Real Estate Website
I am currently working on a proposal for a client's WordPress website development which includes ongoing SEO after the website is developed. I have looked into a number of options, and the one that seems the most cost-effective involves using subdomains for the individual listings pages. What I want: clientsdomain.com/listings/idxnumber/ What I can get for a decent price: listings.clientsdomain.com/idxnumber/ So the majority of the website will actually exist on a subdomain, because the IDX API will automatically populate pages for all of the MLS listings in the area (hundreds or thousands). Meanwhile the domain itself will have all the neighborhood pages and other optimized content, blogs and whatnot. My concern is that dividing the website like this will have negative effects on SEO. There won't be duplicate content across the subdomain and main domain, but they will share a lot of links back and forth. I haven't found any recent sources on the topic. Almost everything I have found says that dividing a website in this manner is bad for SEO, but these articles are often many years old. Does anyone know of a WordPress plugin/IDX company that can provide a solution that doesn't use a subdomain and actually just lists each MLS page within a directory? I am open to using another platform; I am just most familiar with WordPress. Will using a subdomain in the ways mentioned above have a profound negative effect on SEO? Thank you for your time in responding, I greatly appreciate it.
Web Design | TotalMarketExposure0
-
Redirects (301/302) versus errors (404)
I am not able to convincingly decide between using redirects versus using 404 errors. People are giving varied opinions. Here are my cases: 1. Coding errors - we put out a bad link. a. Some people are saying redirect to the home page; the user at least has something to do, PLUS more importantly it does NOT hurt your SEO ranking. b. Counter - the page ain't there. Return 404. 2. Product removed - link1 to product1 was out there. We removed product1, so link1 is also gone. It is either lying in people's bookmarks, OR because of coding errors we left it hanging out at some places on our site.
Web Design | proptiger0
-
Rankings Dropped After Redesign
Hi, I've recently redesigned our website, with the main changes being sidebar changes and source ordering (making the main content appear before the sidebars). No URL changes have been made. A few days after making these changes our positions dropped heavily and have been dropping ever since. It's been a week and a half now and traffic is down by around 40%. Google has the new changes cached. Do people feel this is just a temporary drop and rankings will come back, at least, or should we revert to the old structure? Website: http://www.diyorgasms.co.uk (NSFW) Thanks
Web Design | diyorgasms0
-
The use of foreign characters and capital letters in URLs?
Hello all, We have 4 language domains for our website, and a number of our Spanish landing pages are written using Spanish characters - most notably: ñ and ó. We have done our research around the web and realised that many of the top competitors for keywords such as Diseño Web (web design) and Aplicación iPhone (iPhone application) DO NOT use these special characters in their URL structure. Here is an example of our URLs, e.g. http://www.twago.es/expert/Diseño-Web/Diseño-Web However, when I simply copy-paste a URL that contains a special character, it is automatically translated and encoded, e.g. http://www.twago.es/expert/Aplicación-iPhone/Aplicación-iPhone (when written out longhand it appears as http://www.twago.es/expert/Aplicaci%C3%B3n-iPhone/Aplicaci%C3%B3n-iPhone). My first question is: seeing how the overwhelming majority of website URLs DO NOT contain special characters (and even for Spanish/German characters these are simply written using the standard English Latin alphabet), is there a negative effect on our SEO rankings/efforts because we are using special characters? When we write anchor text for backlinks to these pages we USE the special characters in the anchor text (so do most other competitors). Does the anchor text have to match the URL exactly? I know most web browsers can understand the special characters, especially when returning search results to users that either type the special characters within their search query (or not). But we seem to think that if we were doing the right thing, then why does everyone else do it differently? My second question is the same, but focusing on the use of capital letters in our URL structure. NOTE: When we do a broken link check with some link tools (such as Xenu), the URLs that contain the special Spanish characters are marked as "broken". Is this a related issue? Any help anyone could give us would be greatly appreciated! Thanks, David from twago
Web Design | wdziedzic0