Problems indexing a website built with Magento
-
Hi all
My name is Riccardo and I work for a web marketing agency. We are currently having some problems getting this website indexed: www.farmaermann.it, which is built on Magento.
In Google Webmaster Tools the sitemap is fine (no errors) and was uploaded correctly. However, only 72 of the 1,772 URLs have been indexed; we submitted the sitemap in Google Webmaster Tools 8 days ago. We also checked the structure of the robots.txt against several Magento guides and it looks well structured.
In addition, we noticed that some pages appear in Google search results with titles that do not match the page titles defined in the Magento backend. In short, we cannot work out whether these indexing problems are caused by the sitemap, the robots.txt or something else.
Has anybody had the same kind of problem? Thank you all for your time and consideration.
Riccardo
-
Hi Dan!
Thank you very much for your help and suggestions. I will try to follow your guidelines as well.
Riccardo
-
Thank you Linda!
We will give it a try and see what happens.
Riccardo
-
However, you should allow Google to crawl your JavaScript and CSS, which is currently blocked; Google recommends making these resources crawlable so it can render your pages properly.
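For illustration only (the paths below are assumptions based on a typical Magento 1 setup, not taken from this site's actual robots.txt), explicitly allowing the static asset folders could look something like this:
User-agent: Googlebot
# Hypothetical example - check these paths against the site's real theme and asset directories
Allow: /skin/
Allow: /js/
Allow: /media/
Allow: /*.css$
Allow: /*.js$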
-
Hi Riccardo
Yes, to confirm: the site is indexed and crawlable. Checking the number of URLs from a sitemap that are indexed isn't the most reliable way to see whether your content is indexed. Doing a site: search on your domain in Google is probably one of the most reliable checks (an example follows below). You can also try just crawling the site with a tool like Screaming Frog SEO Spider; if the tool can crawl everything, there may simply be a delay on Google's end. But in your case, all looks good now!
-Dan
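As a quick illustration of the site: check described above (the queries here are only examples, not taken from the original post), you would type something like the following into Google:
site:www.farmaermann.it
site:www.farmaermann.it/integratori.html
The first query shows roughly which pages from the whole domain are indexed; the second narrows the check down to a single URL.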
-
Hi Riccardo,
Since I do not know which pages exist on your site, I cannot be 100% sure. You can, however, remove these lines from your robots.txt and see what happens (in Google Search Console & Bing Webmaster Tools):
Allow: /*?p=
Allow: /catalog/seo_sitemap/category/
Allow: /catalogsearch/result/
Good luck!
-
Hi Linda!
Unfortunately we didn't develop the website, but we have to work on its optimization. You are probably right about the robots.txt, because the sitemap looks fine. I will try to remove the crawl delay. Which disallow rules should I remove, or which changes should I make in particular?
Thank you very much for your help!
Riccardo
-
Hi Josh!
Thank you very much for your help!
So there is probably a delay in the Webmaster Tools data. Unfortunately we didn't develop the site, we only work on its optimization, so we are a little confused by these figures.
-
Hi Riccardo,
Your home page is indexed.
Your problems are most likely caused by your robots.txt: http://www.farmaermann.it/robots.txt
1. You set a crawl delay of 10 seconds for all bots, which is quite long:
User-agent: *
Crawl-delay: 10
2. Some of the pages in your menu are not allowed to be crawled, for example http://www.farmaermann.it/integratori.html and http://www.farmaermann.it/contraccettivi-e-gravidanza.html. Your robots.txt contains these Allow rules:
Allow: /*?p=
Allow: /catalog/seo_sitemap/category/
Allow: /catalogsearch/result/
My advice is to modify your robots.txt: remove the crawl delay (and check whether your server can handle that) and make sure the pages in your menu can be crawled (a rough sketch follows below).
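As a rough sketch only (the site's full robots.txt and its Disallow rules are not quoted in this thread, so the exact lines below are assumptions), the modified file could look something like this:
User-agent: *
# Crawl-delay removed (was: Crawl-delay: 10)
# Hypothetical Allow rules so the blocked menu pages can be crawled;
# ideally, adjust or remove the Disallow rules that block them instead
Allow: /integratori.html
Allow: /contraccettivi-e-gravidanza.html
Allow: /*?p=
Allow: /catalog/seo_sitemap/category/
Allow: /catalogsearch/result/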
Related Questions
-
Someone redirected his website to ours
Hi all, I have a strange issue: someone redirected the website http://bukmachers.pl to ours, https://legalnibukmacherzy.pl, and we don't know exactly what to do about it. I checked the backlinks and the website had some links which now redirect to us. I also checked this website on the Wayback Machine: back in 2017 it had some low-quality content, and in 2018 they set up a similar redirect to the current one, but pointing to a different website (our competitor). Can such a redirect be harmful for us? Should we do something about it or leave it, since Google has stopped encouraging the disavowing of low-quality links?
Intermediate & Advanced SEO | Kahuna_Charles
-
If my website uses a CDN, can thousands of 301 redirects harm its performance?
Hi, if my website uses a CDN, can thousands of 301 redirects harm the website's performance? Thanks, Roy
Intermediate & Advanced SEO | kadut
-
Check website update frequency?
Are there any tools out there that can check how frequently a website is updated with new content or products? I'm trying to do an SEO analysis comparing two websites. Thanks in advance, Richard
Intermediate & Advanced SEO | seoman10
-
Question about Indexing of /?limit=all
Hi, I've got your SEO Suite Ultimate installed on my site (www.customlogocases.com). I've got a relatively new Magento site (around 1 year old). We have recently been doing some PR/SEO for the category pages, for example /custom-ipad-cases/. But when I search on Google, it seems that Google has indexed /custom-ipad-cases/?limit=all. This /?limit=all page has no links pointing to it and only has a PA of 1, whereas the standard /custom-ipad-cases/ without the /? query has a much higher PA of 20 and a couple of links pointing towards it. So I would want the standard page to be the one that Google indexes, and by the same logic it really should be able to rank higher than the /?limit=all page. Is my thinking here correct? Should I disallow all the /? URLs now, even though those are the ones that are indexed and the others currently are not? I'd be happy to take the hit while Google figures it out, because the higher-PA pages are what I am ultimately getting links to... Thoughts?
Intermediate & Advanced SEO | RobAus
-
Pages are Indexed but not Cached by Google. Why?
Here's an example: I get a 404 error for this: http://webcache.googleusercontent.com/search?q=cache:http://www.qjamba.com/restaurants-coupons/ferguson/mo/all But a search for qjamba restaurant coupons gives a clear result, as does this: site:http://www.qjamba.com/restaurants-coupons/ferguson/mo/all What is going on? How can this page be indexed but not in the Google cache? I should make clear that the page is not showing up with any kind of error in Webmaster Tools, and Google has been crawling pages just fine. This particular page was fetched by Google yesterday with no problems, and even crawled again twice today by Google. Yet, no cache.
Intermediate & Advanced SEO | friendoffood
-
Removing UpperCase URLs from Indexing
This search - site:www.qjamba.com/online-savings/automotix - gives me this result from Google:
Automotix online coupons and shopping - Qjamba
https://www.qjamba.com/online-savings/automotix
Online Coupons and Shopping Savings for Automotix. Coupon codes for online discounts on Vehicles & Parts products.
Google also tells me there is another one, which is 'very similar'. When I click to see it I get:
Automotix online coupons and shopping - Qjamba
https://www.qjamba.com/online-savings/Automotix
Online Coupons and Shopping Savings for Automotix. Coupon codes for online discounts on Vehicles & Parts products.
This is because I recently changed my program to redirect all URLs with uppercase characters in them to lowercase, as it appears that all-lowercase is strongly recommended. I assume that having 2 indexed URLs for the same content dilutes link juice. Can I safely remove all of my uppercase indexed pages from Google without it affecting the indexing of the lowercase URLs? And if so, what is the best way -- there are thousands.
Intermediate & Advanced SEO | friendoffood
-
How do you de-index and prevent indexation of a whole domain?
I have parts of an online portal displaying in SERPs that definitely shouldn't be there. It's due to thoughtless developers, but I need the whole portal's domain de-indexed and prevented from future indexing. I'm not too tech-savvy, but how is this achieved? Noindex? Robots.txt? Thanks
Intermediate & Advanced SEO | Martin_S
-
Redirecting Canonical 301s and Magento Website
I have an issue with a client's website: it has 3,700+ pages, but roughly half of them are duplicates. Thankfully, the only difference between the originals and the duplicates is the "?print" at the end of each URL (I suppose this is Magento's way of making a printable version of the same page. I don't know, I didn't build it.) My question is, how can I get all the pages like this http://www.mycompany.com/blah.html?print to redirect to pages like this... http://www.mycompany.com/blah.html Also, do they NEED to be canonical, or will a 301 redirect be sufficient? Also, after having done this, if anybody knows, is there a way I can turn that feature off in Magento? We're expanding our product line, and I don't want to have to keep chasing after these "?print" pages after the fact.
Intermediate & Advanced SEO | ClifThompson