Increase in 404 errors after a change of encoding
-
Hello,
We just launched a new version of our website with a new UTF-8 encoding.
The thing is, we use commas as separators in our URLs, and since the new website went live I have seen a massive increase in 404 errors for comma-encoded URLs.
Here is an example:
http://web.bons-de-reduction.com/annuaire%2C321-sticker%2Csite%2Cpromotions%2C5941.html
instead of:
http://web.bons-de-reduction.com/annuaire,321-sticker,site,promotions,5941.html
I checked with Screaming Frog SEO Spider and Xenu, and I can't manage to find any encoded URLs.
Does anyone have a clue how to fix that?
Thanks
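Edit: in case it helps, this is the kind of rule I am considering to 301 the encoded URLs back to their comma versions. It is only a rough sketch, assuming the site runs on Apache with mod_rewrite enabled, and it has not been tested yet:

```apache
RewriteEngine On
# THE_REQUEST holds the raw request line, so an encoded comma is still visible as %2C here
RewriteCond %{THE_REQUEST} \%2C [NC]
# The path matched below has already been URL-decoded, so $1 contains literal commas
RewriteRule ^(.*)$ /$1 [R=301,L]
```

The idea is that once the redirect lands on the comma version, THE_REQUEST no longer contains %2C, so the rule should not loop.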
-
I will take a look at it, but it's not the issue SEOMoz is reporting to me, as that format only concerns images. It's actually a little trick we use to lazy-load images.
The link you pointed out in your example is fine ("/annuaire,amkashop,site,promotion...) as the commas are not encoded.
And in your example I see no issue except capitalization.
I bet this is a Moz problem, because when I fetch as Googlebot I don't find any encoded URLs...
-
Just wanted to give you one more thing that I think would help: http://www.w3schools.com/html5/att_meta_charset.asp
I believe you should clean up your encoding, and that it will not be a big deal.
Sincerely,
Tom
-
I thought this might help as well, because you do have to clean up your source code.
The online quoted-printable encoder tool first encodes the input text in either UTF-8 or ISO-8859-1. The characters are then output according to this scheme:
| Character | Result | Comment |
| --- | --- | --- |
| "=" (0x3D) | =3D | Special handling of the equal sign |
| " " (0x20) to "~" (0x7E) | Unmodified | Printable ASCII (7 bits) |
| Any other | =XX | Hexadecimal character code |

Since quoted-printable does not in itself specify the text character encoding, it is important to specify this correctly when it is used. The online quoted-printable decoder tool attempts to auto-detect the text encoding.
See the Wikipedia article on quoted-printable for more info.
-
I would use a tool similar to this: http://www.percederberg.net/tools/text_converter.html
As you can see, the links for your GIF images are encoded as "data:image/gif;base64".
Please give it a try and tell me if that helps.
Sincerely,
Thomas
-
Hello and thanks for your answer.
No, Microsoft Word was not involved here.
We moved from:
<meta http-equiv="content-type" content="text/html; charset=iso-8859-1" />
to:
<meta charset="utf-8">
Everything is fine except for Mozbot.
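We are also thinking about declaring the charset at the HTTP level, not only in the HTML. A minimal sketch, assuming the site runs on Apache and .htaccess overrides are allowed (not something we have deployed yet):

```apache
# Sketch only: have Apache add "charset=UTF-8" to the Content-Type response
# header so it matches the <meta charset="utf-8"> declaration in the pages.
AddDefaultCharset UTF-8
```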
-
What you need to do is go into your site and clean up the links that were converted and messed up because of the change. Once you clean them up you will have no problem. This is what your links look like:
data:image/gif;base64,R0lGODlhAQABAIAAAP///////yH+A1BTQQAsAAAAAAEAAQAAAgJEAQA7
UTF-8 is definitely the right encoding, and it's very good; you just have to go in and clean things up by looking at your source code:
"
| {"m":2571,"a":"wrap"}" width="108" height="65" data-original="/upload/merchants_logo/108-65/amkashop.jpeg" src="data:image/gif;base64,R0lGODlhAQABAIAAAP///////yH+A1BTQQAsAAAAAAEAAQAAAgJEAQA7"> <noscript></span><img data-merchant="2571" class="merchantLogo lazy" data-out="{"m":2571,"a":"wrap"}" width="108" height="65" src="/upload/merchants_logo/108-65/amkashop.jpeg" alt="Amkashop"><span></noscript> |
| |<a <span="">href</a><a <span="">="</a>/annuaire,amkashop,site,promotions,2685.html" title="Amkashop">Code promo Amkashop"
|
I hope I've been of help to you.
Thomas
-
Did you happen to write it using Microsoft Word and paste the content in? Or are you speaking about a website that you converted from another encoding to UTF-8?
Sincerely,
Thomas