Asynchronous loading of product prices bad for SEO?
-
We are currently looking into improving our TTFB on our ecommerce site.
A huge improvement would be to asynchronously load the product prices on the product list pages. The product detail page – on which the product is ordered – will be left untouched.
The idea is that all content like product data, images and other static content is sent to the browser first (first byte). The product prices depend on a set of user variables like delivery location, VAT inclusive/exclusive, etc., so they would be requested via an AJAX call to reduce the TTFB.
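Roughly, the flow we have in mind looks like this (just a sketch - the endpoint URL, parameter names and response shape below are placeholders for illustration, not our actual API):

```typescript
// Minimal sketch of the idea: the page is served with the product data and a
// price placeholder, and prices are fetched afterwards using the user-specific
// variables. The endpoint URL, parameter names and response shape are
// illustrative placeholders, not our actual API.
interface PriceResponse {
  productId: string;
  price: string; // already formatted server-side (currency, VAT handling)
}

async function loadPrices(productIds: string[]): Promise<void> {
  const params = new URLSearchParams({
    ids: productIds.join(","),
    // user variables that influence the price
    deliveryLocation: "BE",
    vatIncluded: "true",
  });

  const response = await fetch(`/api/prices?${params}`);
  const prices: PriceResponse[] = await response.json();

  for (const { productId, price } of prices) {
    // fill in the placeholder that was rendered in the initial HTML
    const el = document.querySelector(`[data-price-for="${productId}"]`);
    if (el) el.textContent = price;
  }
}

// kick off once the initial (fast) HTML has been parsed
document.addEventListener("DOMContentLoaded", () => {
  const ids = Array.from(document.querySelectorAll("[data-price-for]"))
    .map((el) => el.getAttribute("data-price-for")!);
  if (ids.length > 0) loadPrices(ids);
});
```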
My question is whether Google considers this black hat SEO or not?
-
Thanks for your response. We'll definitely go for this improvement.
But can you please explain what you mean by "an unintuitive UX idea" ?
-
I don't see any reason why this would be seen as black hat. On the contrary, I see it as an unintuitive UX idea and you should definitely do it.
The only information you're withholding (and you're not even cloaking it) is a price that depends on a lot of factors. You're not hiding any content or links, so there's no worry there. Even if you were hiding content it wouldn't be a problem, unless it was completely irrelevant and only there to rank the page.
The only effect this could have is that if you defer elements on the page to improve Time To First Byte, Google may not read them when it crawls, so the content it sees on the page may be thinner, affecting your ability to rank the page. But for something like deferring a price, this isn't relevant at all.
I'd say go for it - I think it would be a great idea for user experience.
-
Definitely not black hat, but it could impact SEO and negate any schema markup you have.
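For example, if your Product/Offer markup includes the price, you'd want to make sure the price still ends up in the structured data once the AJAX call resolves. A rough sketch of client-side JSON-LD injection (field names and values here are placeholders), which is exactly the kind of thing the Fetch as Google check below would confirm Googlebot actually sees:

```typescript
// Rough sketch: after the price arrives via AJAX, inject the Product JSON-LD
// so the offer price isn't missing from the markup Google picks up. Field
// names and values below are placeholders for illustration.
function injectProductSchema(productId: string, name: string, price: string): void {
  const schema = {
    "@context": "https://schema.org",
    "@type": "Product",
    name,
    sku: productId,
    offers: {
      "@type": "Offer",
      priceCurrency: "EUR",
      price, // the value that was fetched asynchronously
      availability: "https://schema.org/InStock",
    },
  };

  const script = document.createElement("script");
  script.type = "application/ld+json";
  script.textContent = JSON.stringify(schema);
  document.head.appendChild(script);
}
```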
I would go to GWT > Crawl > Fetch as Google and see what HTML is received by Googlebot.
If all the async elements are there, you should be gravy.