Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content (many posts will remain viewable), we have locked both new posts and new replies. More details here.
Recommended Website Monitoring Tools
-
Hi,
I was wondering what people would recommend for website monitoring (i.e. is my website working as it should?).
I need something that will:
1/. Allow multiple page monitoring, not just the homepage
2/. Do header status checking
3/. Do page content checking (i.e. if the page changes massively, or includes the word "error", then we have an issue!)
4/. Multiple alert possibilities.

We currently use www.websitepulse.com and it is a good service that does all of the above; however, it just seems so overly complex that it's hard to understand what is going on, and its complex functionality and features are really a negative in our case.
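For context, the mechanics behind requirements 2 and 3 are straightforward to sketch. Here is a minimal illustration in Python using only the standard library (the error words are a hypothetical list, not anything a particular monitoring service uses):

```python
import urllib.error
import urllib.request

# Words whose presence suggests a broken page (hypothetical list)
ERROR_WORDS = ("error", "exception")

def evaluate(status, body, error_words=ERROR_WORDS):
    """Return a list of problems found in one HTTP response."""
    problems = []
    if status != 200:
        problems.append(f"unexpected status {status}")
    lowered = body.lower()
    for word in error_words:
        if word in lowered:
            problems.append(f"page contains the word '{word}'")
    return problems

def check(url, timeout=10):
    """Fetch one URL and return any problems found (empty list means OK)."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return evaluate(resp.status, resp.read().decode("utf-8", "replace"))
    except urllib.error.URLError as exc:
        return [f"request failed: {exc.reason}"]
```

Running `check()` over a list of pages rather than a single URL covers requirement 1; commercial tools layer scheduling, dashboards and alert routing on top of essentially this logic.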
Thanks
-
We use Pingdom to monitor a lot of client websites. It is great, because we receive SMS messages when something is wrong. The detailed reporting, iPhone app and ability to monitor HTTP statuses are exceptional!
-
I have not, but given how good the free service is, the paid version is likely worth a try since it is more robust. Most of our sites don't have that level of complexity, so it is less of a need for us. Hopefully some of the Mozzers who do more eCommerce will see this and respond. Also, if you have a private question available, you might use that to go straight to Moz and see what they suggest.
-
PS - I had a look at Mon.itor.us - have you tried their paid service, http://portal.monitis.com/ ?
-
Hi Rob,
Essentially we have a pretty complex website with many different sections. The website is constantly being developed, so there will probably be code releases for changes maybe 4-5 times per week. Any one of these changes may end up causing an issue with one of the pages (i.e. pages of a specific type). In addition to this, we can get issues with the DB or server memory which can occasionally cause the website to fail.
All of these issues are pretty disastrous for business, so what I need (or, to be more exact, what our developers need) is to know as soon as an issue occurs (most of the services mentioned will allow you to set a checking interval of, say, every 5 minutes) so it can be fixed, as opposed to waiting for a customer to tell us there is a website issue, or manually checking every page type with every code release.
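That checking interval is really just a polling loop around per-page checks. A minimal sketch in Python, where `check_fn` and `alert_fn` are placeholders for whatever checking logic and notification channel (email, SMS, etc.) you use:

```python
import time

def monitor(urls, check_fn, interval_seconds=300, alert_fn=print, cycles=None):
    """Poll each URL on a fixed interval; fire alert_fn when a check reports problems.

    check_fn(url) should return a list of problem strings (empty means OK).
    cycles=None runs forever; pass an integer to bound the loop (useful for testing).
    """
    ran = 0
    while cycles is None or ran < cycles:
        for url in urls:
            problems = check_fn(url)
            if problems:
                alert_fn(f"ALERT {url}: {'; '.join(problems)}")
        ran += 1
        if cycles is None or ran < cycles:
            time.sleep(interval_seconds)
```

A hosted service adds the important parts on top of this loop: checking from multiple locations, retrying before alerting to avoid false positives, and routing alerts to SMS/email/apps.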
As I say, we do have WebsitePulse at the moment, which is great, but it is also far too complex to easily set up and manage, so I'm just doing research around this area and seeing if anyone has some advice.
Thanks
-
Mon.itor.us works well and is free.
-
It seems you are looking for something that constantly monitors the site and simply alerts you to problems. From my point of view, as an agency that has more than a few sites up, it might be overkill, and I am not sure what the right tool would be. What we do to cover what you are listing is this: we have a Pro Plus Moz membership and do campaign tracking with it. We can see on a weekly basis via email (and daily if we just log in): 4xx and 5xx errors, duplicate page titles, missing page titles, blocked bots, etc., as well as on-page SEO issues, and general robots.txt, rel=canonical, and so on.
For content checking of page changes I am at a loss; error reports are covered as above, and server downtime as below (Mon.itor.us) with good results. The beauty of the SEOmoz campaign for me is that it also tracks rankings, connects to Google Analytics, and provides competitive link analysis (DA, PA, etc.).
For the headers you can use Screaming Frog (I just love that name, and it works).
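For an ad-hoc header status check without running a full crawler, a HEAD request is enough, since it returns the status code and headers without downloading the page body. A minimal sketch in Python (standard library only):

```python
import urllib.error
import urllib.request

def head_status(url, timeout=10):
    """Return the HTTP status code from a HEAD request (no body is downloaded)."""
    req = urllib.request.Request(url, method="HEAD")
    try:
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            return resp.status
    except urllib.error.HTTPError as exc:
        # urllib raises on 4xx/5xx responses; here we just want the code back
        return exc.code
```

Screaming Frog does essentially this across every URL it crawls, which is why it is handy for spotting 404s and redirect chains site-wide.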
Hope that helps.
-
Doing some digging I found a useful list:
http://mashable.com/2010/04/09/free-uptime-monitoring/
Anyone have any feedback/reviews on these specific tools?