How to set canonical link rel in CS-Cart
-
I want to specify a rel=canonical link for each category page. How can I do that without changing the code (just from the admin section)? The filter and sorting parameters are creating duplicate content across the site.
If there is a way, please describe the method;
I want to avoid hours of work on a script for this.
Thanks.
-
Thanks, Tom.
-
Take a look at the SEO section of the CS-Cart forums:
http://forum.cs-cart.com/forumdisplay.php?f=45
I bet they know the best way!
-
Thank you, Thomas.
I think I'll have to find another way. I already specified the parameters in Google Webmaster Tools, but it doesn't seem to work for the AJAX filters.
Best,
Ion
-
Hi there,
I'm afraid there is no way around working with a script when using rel=canonical. Either you develop a script that generates the canonical URL automatically, or you insert the link tag with the correct URL manually.
I would strongly discourage skipping proper work on rel=canonical, as getting it wrong can completely mess up your site's rankings. Read this post by Dr. Pete if you need convincing to spend some hours on proper development: http://www.seomoz.org/blog/catastrophic-canonicalization
Best,
Thomas
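For anyone going down the script route, the logic it needs is small: take the requested category URL, strip the filter and sorting parameters, and print a single canonical link tag in the page head. CS-Cart itself runs on PHP/Smarty, so the snippet below is only a language-neutral sketch of that logic in Python; the parameter names (sort_by, sort_order, features_hash) are assumptions about typical filter/sort URLs, not a confirmed CS-Cart list.

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Parameters that only change how a category is filtered/sorted, not which
# category is shown. These names are assumptions -- check your own category
# URLs for the exact ones your cart generates.
PRESENTATION_PARAMS = {"sort_by", "sort_order", "features_hash"}

def canonical_url(requested_url: str) -> str:
    """Drop presentation-only parameters so every filtered or sorted view
    resolves to one canonical category URL."""
    parts = urlsplit(requested_url)
    kept = [(k, v) for k, v in parse_qsl(parts.query)
            if k not in PRESENTATION_PARAMS]
    return urlunsplit((parts.scheme, parts.netloc, parts.path,
                       urlencode(kept), ""))

def canonical_tag(requested_url: str) -> str:
    """The tag to print inside the page head."""
    return f'<link rel="canonical" href="{canonical_url(requested_url)}" />'

print(canonical_tag(
    "http://example.com/furniture/?sort_by=price&sort_order=asc&features_hash=13-25"))
# -> <link rel="canonical" href="http://example.com/furniture/" />
```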
Related Questions
-
How Many Links to Disavow at Once When Link Profile is Very Spammy?
We are using Link Detox (Link Research Tools) to evaluate our domain for bad links. We ran a domain-wide Link Detox Risk report. The report showed a "High Domain DETOX RISK" with the following results:
- 42% (292) of backlinks with a high or above-average detox risk
- 8% (52) of backlinks with an average or below-average detox risk
- 12% (81) of backlinks with a low or very low detox risk
- 38% (264) of backlinks already reported as disavowed
This looks like a pretty bad link profile. Additionally, more than 500 of the 689 backlinks are "404 Not Found", "403 Forbidden", "410 Gone", or "503 Service Unavailable". Is it safe to disavow these? Could Google be penalizing us for them? I would like to disavow the bad links; however, my concern is that there are so few good links that removing the bad ones will kill link juice and really damage our rankings and traffic. The site still ranks for terms that are not very competitive, and we receive about 230 organic visits a week. Assuming we need to disavow about 292 links, would it be safer to disavow 25 per month while we build new links, so that we do not radically shift the link profile all at once? Also, many of the bad links are 404 or page-not-found errors; would it be OK to disavow all of these at once, and is there any risk to that? Would we be better off just building links and leaving the bad links up? Alternatively, would disavowing the bad links potentially help our traffic? It just seems risky because the overwhelming majority of links are bad.
Intermediate & Advanced SEO | Kingalan1
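On the mechanics of a staged disavow: Google's disavow file is plain text, one entry per line, with "domain:example.com" disavowing an entire domain, uploaded through the disavow tool. A minimal sketch of batching flagged domains into monthly files (the domain names are placeholders, and the batch size of 25 simply mirrors the pace discussed in the question, not a recommendation):

```python
# Sketch: write flagged referring domains into monthly disavow files.
# The disavow file format is plain text: "#" lines are comments and
# "domain:example.com" disavows a whole domain.

flagged_domains = [
    "spam-directory-one.example",   # placeholder entries -- in practice
    "spam-directory-two.example",   # these come from the link audit export
    "link-farm.example",
]

BATCH_SIZE = 25  # the "25 per month" pace discussed in the question

batches = [flagged_domains[i:i + BATCH_SIZE]
           for i in range(0, len(flagged_domains), BATCH_SIZE)]

for month, batch in enumerate(batches, start=1):
    with open(f"disavow-batch-{month}.txt", "w") as fh:
        fh.write(f"# Batch {month} of {len(batches)}, from link audit\n")
        fh.writelines(f"domain:{domain}\n" for domain in batch)
```
-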
Pages with excessive number of links
Hi all, I work for a retailer and I've crawled our website with RankTracker for optimization suggestions. The main suggestion is "Pages with excessive number of links: 4178". The page with the largest number of links has 634 links (627 internal, 7 external); the page with the lowest has 382 links (375 internal, 7 external). However, when I view the source of any of the example pages, it becomes obvious that the site's main navigation header contains 358 links, so every page starts with 358 links before any content. Our rivals, and much larger sites like argos.co.uk, appear to have just as many links in their main navigation menus. So my questions are:
1. Will these excessive links really be causing us a problem, or is having fewer links just 'good practice'?
2. Can I use nofollow to stop Google etc. from counting the 358 main navigation links?
3. Is having 4,000+ pages of your website all dumbly pointing at other pages a help or a hindrance?
4. Can we minify this code so it's cached on first load and therefore loads faster?
Thank you.
Intermediate & Advanced SEO | Bee159
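As a quick sanity check on numbers like these, it is easy to count a page's links yourself rather than rely on the crawler's summary. A minimal sketch, assuming the requests and beautifulsoup4 packages are available and using a placeholder URL:

```python
import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin, urlparse

PAGE = "https://www.example.co.uk/some-category/"  # placeholder URL

html = requests.get(PAGE, timeout=10).text
soup = BeautifulSoup(html, "html.parser")
site_host = urlparse(PAGE).netloc

internal = external = nofollowed = 0
for a in soup.find_all("a", href=True):
    href = urljoin(PAGE, a["href"])            # resolve relative URLs
    if "nofollow" in (a.get("rel") or []):     # rel="nofollow" links
        nofollowed += 1
    if urlparse(href).netloc == site_host:
        internal += 1
    else:
        external += 1

print(f"internal={internal}  external={external}  nofollowed={nofollowed}")
```
-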
Multiple Internal links to same destinations
My company is redoing our homepage and there will be four links to our main play pages (five games): two in the menu and two within the content. I was thinking I should nofollow one of the links on the homepage plus one in the menu, so that we don't have link dilution from having multiple internal links to the same destination on a single page. Does this make sense? Are there any downsides, or can you suggest a solution that may be more effective? Thanks!
Intermediate & Advanced SEO | theLotter
-
Rel="canonical" and rel="alternate" both necessary?
We are fighting some duplicate content issues across multiple domains. We have a few Magento stores with different country codes: for example, domain.com and domain.ca, where domain.com is the "main" domain. We have set up the different rel="alternate" tags accordingly. The question is: do we need to add custom rel="canonical" tags on domain.ca that point to domain.com? For example, should domain.ca/product.html point to domain.com/product.html? Also, how far does rel="canonical" follow? For example, if we have:
domain.ca/sub/product.html canonical to domain.com/sub/product.html
then
domain.com/sub/product.html canonical to domain.com/product.html
Intermediate & Advanced SEO | AlliedComputer
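As a side note on the mechanics being asked about: the common pattern is for each page to declare hreflang alternates for both country versions and, where the .ca page is a true duplicate that should consolidate to .com, a canonical pointing directly at its .com equivalent rather than through a chain. A minimal sketch using the domains from the question; the hreflang locale codes and the helper itself are illustrative assumptions, not Magento-specific output:

```python
# Illustrative helper: hreflang alternates for the .com/.ca pair plus, for a
# Canadian page that is a true duplicate, a canonical pointing straight at
# the .com equivalent (avoiding a chain such as domain.ca/sub/product.html
# -> domain.com/sub/product.html -> domain.com/product.html).

def head_tags(path: str, ca_is_duplicate: bool) -> list[str]:
    com = f"http://domain.com{path}"
    ca = f"http://domain.ca{path}"
    tags = [
        f'<link rel="alternate" hreflang="en-us" href="{com}" />',
        f'<link rel="alternate" hreflang="en-ca" href="{ca}" />',
    ]
    if ca_is_duplicate:
        tags.append(f'<link rel="canonical" href="{com}" />')
    return tags

for tag in head_tags("/sub/product.html", ca_is_duplicate=True):
    print(tag)
```
-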
Google Indexing Feedburner Links???
I just noticed that for lots of the articles on my website, there are two results in Google's index. For instance: http://www.thewebhostinghero.com/articles/tools-for-creating-wordpress-plugins.html and http://www.thewebhostinghero.com/articles/tools-for-creating-wordpress-plugins.html?utm_source=feedburner&utm_medium=feed&utm_campaign=Feed%3A+thewebhostinghero+(TheWebHostingHero.com). Now, my FeedBurner feed is set to "noindex" and it has always been that way. The canonical tag on the webpage is set to: <link rel='canonical' href='http://www.thewebhostinghero.com/articles/tools-for-creating-wordpress-plugins.html' /> The robots tag is set to: <meta name="robots" content="index,follow,noodp" /> I found out that there are scraper sites linking to my content using the FeedBurner link. So should the robots tag be set to "noindex" when the requested URL is different from the canonical URL? If so, is there an easy way to do this in WordPress?
Intermediate & Advanced SEO | sbrault74
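The rule being asked about (noindex any request whose URL differs from its canonical) boils down to a URL comparison in whatever layer renders the head; in WordPress that would be PHP, typically via a plugin or filter. A minimal sketch of the comparison itself, using the URLs from the question:

```python
from urllib.parse import urlsplit, parse_qsl

# Sketch of the rule: if the requested URL matches the canonical page but
# carries extra query parameters (utm_* and friends), emit noindex,follow
# instead of the page's normal robots value.

CANONICAL = ("http://www.thewebhostinghero.com/articles/"
             "tools-for-creating-wordpress-plugins.html")

def robots_content(requested_url: str, canonical_url: str) -> str:
    req, canon = urlsplit(requested_url), urlsplit(canonical_url)
    same_page = (req.netloc, req.path) == (canon.netloc, canon.path)
    has_extra_params = bool(parse_qsl(req.query))
    if same_page and has_extra_params:
        return "noindex,follow"     # drop the duplicate, keep following links
    return "index,follow,noodp"     # the page's normal robots value

print(robots_content(CANONICAL + "?utm_source=feedburner&utm_medium=feed",
                     CANONICAL))          # -> noindex,follow
print(robots_content(CANONICAL, CANONICAL))  # -> index,follow,noodp
```
-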
Links from new sites with no link juice
Hi guys, do backlinks from a bunch of new sites pass any value to our site? I've heard some "SEO experts" say that building a bunch of new sites and linking them to your main site is an effective link building strategy. I highly doubt that. To me, a new site is a new site, which means it won't have any backlinks in the beginning (most likely), so a backlink from that site won't pass much link juice, right? In my humble opinion this is no longer a good strategy if you build new sites just for the sake of getting links; that is simply wrong. But if you do have some unique content you want to share on a particular topic, then you can definitely create a blog, write content, and start earning links. Over time the domain authority will increase, and then will a backlink from that site become more valuable? I am not an SEO expert myself, so I am eager to hear your thoughts. Thanks.
Intermediate & Advanced SEO | witmartmarketing
-
De-indexed Link Directory
Howdy guys, I'm currently working through our fourth reconsideration request and just have a couple of questions. Using Link Detox's (www.linkresearchtools.com) new tool, we have had 64 links flagged as toxic that should be removed. After analysing them further, most of them are link directories that have now been de-indexed by Google. Do you think we should still ask for them to be removed, or is this a pointless exercise since the links have effectively already been removed because the directories are de-indexed? I would like your views on this, guys.
Intermediate & Advanced SEO | ScottBaxterWW
-
How do Rel=Prev & Rel=Next work for me?
I have implemented rel="prev" and rel="next" tags on my website. I would like to give example URLs to explain:
http://www.vistapatioumbrellas.com/market-umbrellas?limit=40&p=3
http://www.vistapatioumbrellas.com/market-umbrellas?limit=40&p=4
http://www.vistapatioumbrellas.com/market-umbrellas?limit=40&p=5
Previously, I had blocked paginated pages in robots.txt with the following rule: Disallow: /*?p= I have since removed that disallow rule for the paginated pages. But I am confused about duplicate page titles: if you check all three pages, you will find the same page title across all of them, and I know that duplicate page titles are harmful for SEO. Will Google crawl and index all paginated pages? If yes, which page will get the maximum benefit in organic rankings? Is there a specific approach that would help me solve this issue?
Intermediate & Advanced SEO | CommercePundit
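For what it's worth, the usual mechanics for this setup are: leave the paginated URLs crawlable (a robots.txt disallow would hide the rel="prev"/"next" hints entirely), give each page a self-referencing canonical plus prev/next links to its neighbours, and append the page number to the title so the titles stop being duplicates. A minimal sketch using the URLs from the question (the title text is a placeholder):

```python
# Sketch: head tags for one page of the paginated category, using the URLs
# from the question. The title text is a placeholder.

BASE = "http://www.vistapatioumbrellas.com/market-umbrellas?limit=40"

def pagination_head(page: int, last_page: int) -> list[str]:
    def url(p: int) -> str:
        return BASE if p == 1 else f"{BASE}&p={p}"

    tags = [
        f"<title>Market Umbrellas - Page {page} of {last_page}</title>",
        f'<link rel="canonical" href="{url(page)}" />',  # self-referencing
    ]
    if page > 1:
        tags.append(f'<link rel="prev" href="{url(page - 1)}" />')
    if page < last_page:
        tags.append(f'<link rel="next" href="{url(page + 1)}" />')
    return tags

for tag in pagination_head(page=4, last_page=5):
    print(tag)
```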