Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies. More details here.
Does collapsing content impact Google SEO signals?
-
Recently I have been promoting custom long form content development for major brand clients. For UX reasons we collapse the content so only 2-3 sentences of the first paragraph are visible. However there is a "read more" link that expands the entire content piece.
I have believed that the search bots would have no problem crawling, indexing, and applying a positive SEO signal for this content. However, I'm starting to wonder. Is there any evidence that the Google search algorithm could possibly discount or even ignore collapsed content? -
Thanks EGOL. Still looking for additional evidence about this.
-
Well... yup. I know many SEOs who think the collapsible area is just not important enough for Google to consider it.
Good luck
-
If I see a study, I'll post a link here.
-
Yep, I completely agree with your response. Unfortunately I'm in a position where I manage major enterprise accounts with multiple stakeholders (including some people who are not educated in SEO). Every major change we propose needs to be documented, cited, and reviewed. When making an argument for content expansion I would need to use thorough research examples (a Moz study, documentation on Search Engine Land, etc.).
Anyway, thanks for taking the time to share your feedback and advice on this thread. This is not the answer I wanted to hear (i.e. that Google doesn't respect collapsed content), but it's very likely accurate. This is a serious SEO issue that needs to be addressed.
-
Are there any case studies about this issue?
Just the one that I published above. The conclusion is... be prepared to sacrifice 80% of your traffic if you hide your valuable content behind a preview.
I would be asking the UX people to furnish studies that hiding content produces better sales.
We have lots of people raving about the abundance of content on our site, the detailed product descriptions, how much help we give them to decide what to purchase. All of this content is why we dominate the SERPs in our niche and that, in many people's eyes, is a sign of credibility. Lots of people say... "we bought from you because your website is so helpful". However, if we didn't have all of this content in the open these same people would have never even found us.
Nobody has to read this stuff. I would rather land on a website and see my options than land on a website and assume that there was no information because I didn't notice that the links to open it were in faded microfont because the UX guys wanted things to be tidy. I believe that it is a bigger sin to have fantastic content behind a click-through than it is to put valuable information in the open and allow people the opportunity to read it.
Putting our content out in the open is what makes our reputation.
I sure am glad that I am the boss here. I can make the decisions and be paid on the basis of my performance.
-
We are applying 500 to 800+ word custom content blocks to our client landing pages (local landing pages) that show a preview of the first paragraph and a "read more" expansion link. We know that most website visitors only care about the location info on these particular landing pages. We also know that our client UX teams would certainly not approve an entirely visible content block on these pages.
Are there any case studies about this issue? I'm trying to find a bona fide research project to help back up our argument. -
It was similar to a Q&A. There was a single sentence question and a paragraph of hidden answer. This page had a LOT of questions and a tremendous amount of keywords in the hidden content. Thousands of words.
The long tail traffic tanked. Then, when we opened the content again the traffic took months to start coming back. The main keywords held in the SERPs. The longtail accounted for the 80% loss.
-
How collapsed was your content? Did you hide the entire block? Only show a few sentences? I'm trying to find a research article about this. This is a MAJOR issue to consider for our SEO campaigns.
-
Yes, that is a very legitimate concern of mine. We have invested significant resources into custom long-form content for our clients, and we are very concerned this was all for nothing... or possibly worse (the content being discounted).
-
Recently I had a related issue with a top-ranking website for very competitive queries.
Unfortunately, the product department made some changes to the content (UI only) without consulting the SEO department. The only change worth mentioning was moving the first two paragraphs into a collapsible DIV showing only the first 3 lines + a "read more" button. The text in the collapsible div was crawlable and visible to search engines (it's also worth mentioning that these paragraphs...)
But the site lost its major keyword positions 2-3 days later. Of course we reverted the changes, but two months later the keywords are still only slowly moving back to their "original" positions.
For years I believed what Google stated: that you can use collapsible content as long as you are not trying to inject keywords or inflate the amount of content. Not anymore.
I believe that by placing content under a collapsible div element, we are actually signaling to Google that this piece of content is not that important (that's why it is hidden, right? Otherwise it would be in plain sight). So why should we expect Google to count this content as a major part of our ranking weight?
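For reference, the pattern being discussed in this thread typically looks something like the sketch below: the full text ships in the initial HTML (so crawlers can read it), but CSS hides everything past the preview until the visitor clicks "read more". The class names, the id, and the copy are all invented for illustration.

```html
<!-- Hypothetical sketch of a collapsed content block. The full text is in
     the HTML source (crawlable), but visually hidden until expanded.
     Class names, the id, and the copy are invented for illustration. -->
<style>
  #intro-copy.collapsed .rest { display: none; }
</style>

<div id="intro-copy" class="collapsed">
  <p>The first two or three sentences shown as the preview...</p>
  <div class="rest">
    <p>The remaining 500 to 800+ words of the content block...</p>
  </div>
  <a href="#" onclick="document.getElementById('intro-copy')
      .classList.remove('collapsed'); return false;">Read more</a>
</div>
```

Note that in this setup the hidden text is fully present in the source; the experiences reported in this thread suggest Google may still weight it less than content that is visible by default.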
-
About two years ago I had collapsed content on some important pages. Their longtail traffic went into a steady slide, but the head traffic held. I take this as a sign that the collapsed content was discounted, removing or lowering its ability to count in the rankings for longtail queries.
I expanded the page, making all content visible. A few months later, longtail traffic started to slowly rise. It took many months to climb back to previous levels.
Since then, every word of my content is in the open.
Related Questions
-
Will I be flagged for duplicate content by Google?
Hi Moz community, Had a question regarding duplicate content that I can't seem to find the answer to on Google. My agency is working on a large number of franchisee websites (over 40) for one client, a print franchise, that wants a refresh of new copy and SEO. Each print shop has their own 'microsite', though all services and products are the same, the only difference being the location. Each microsite has its own unique domain. To avoid writing the same content over and over in 40+ variations, would all the websites be flagged by Google for duplicate content if we were to use the same base copy, with the only changes being to the store locations (i.e. where we mention Toronto print shop on one site may change to Kelowna print shop on another)? Since the print franchise owns all the domains, I'm wondering if that would be a problem since the sites aren't really competing with one another. Any input would be greatly appreciated. Thanks again!
-
SEO on Jobs sites: how to deal with expired listings with "Google for Jobs" around
Dear community, When dealing with expired job offers on job sites from an SEO perspective, most practitioners recommend implementing 301 redirects to category pages in order to keep the positive ranking signals of incoming links. Is it necessary to rethink this recommendation now that "Google for Jobs" is around? Google's recommendations on how to handle expired job postings do not include 301 redirects: "To remove a job posting that is no longer available: Remove the job posting from your sitemap. Do one of the following: remove the JobPosting markup from the page; remove the page entirely (so that requesting it returns a 404 status code); or add a noindex meta tag to the page. Note: Do NOT just add a message to the page indicating that the job has expired without also doing one of the above actions." Will implementing 301 redirects hurt the chances to appear in "Google for Jobs"? What do you think?
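As a sketch, the noindex option from Google's list would look something like this on the expired posting's page, assuming the JobPosting structured data has also been deleted from the page body (the title is invented for illustration):

```html
<!-- Expired job posting kept live for visitors, but removed from the index:
     a noindex meta tag tells crawlers to drop the URL. Invented example. -->
<head>
  <title>Senior Widget Engineer - position filled</title>
  <meta name="robots" content="noindex">
</head>
```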
-
Can Google read content that is hidden under a "Read More" area?
For example, when a person first lands on a given page, they see a collapsed paragraph but if they want to gather more information they press the "read more" and it expands to reveal the full paragraph. Does Google crawl the full paragraph or just the shortened version? In the same vein, what if you have a text box that contains three different tabs. For example, you're selling a product that has a text box with overview, instructions & ingredients tabs all housed under the same URL. Does Google crawl all three tabs? Thanks for your insight!
-
Content Below the Fold
Hi, I wondered what the view is on content below the fold? We have the H1, product listings, and then some written content under the products - will Google just ignore this? I can't hide it under a tab or put a lot of content above the products, so I'm not sure what the other option is? Thank you
-
Having 2 brands with the same content - will this work from an SEO perspective
Hi All, I would love it if someone could help and provide some insights on this. We're a financial institution and have a set of products that we offer. We have recently joined with another brand and will now be offering all our products to their customers. What we are looking to do is have 1 site that masks the content for both sites, so it appears as though there are 2 separate brands with different content - in fact we have a main site and then a sister brand that offers the same products. Is there any way to do this so that when someone searches for Credit Card from Brand A it is indexed under Brand A, and when someone searches for Credit Card from Brand B it is indexed under Brand B? The one thing is we would not want to rel=canonical the pages, nor be penalised by Google's latest PR algorithm. Hope someone can help! Thanks Dave
-
SEO Impact of External links in JS tag
We have our JS tag and iframe tag being used by over 100 leading websites. What would be the SEO impact if we added a follow link in the iframe? Would it have any negative impact? Vivek
-
Membership/subscriber (/customer) only content and SEO best practice
Hello Mozzers, I was wondering whether there's any best practice guidance out there re: how to deal with membership/subscriber (existing customer) only content on a website, from an SEO perspective - what is best practice? A few SEOs have told me to make some of the content visible to Google, for SEO purposes, yet I'm really not sure whether this is acceptable / manipulative, and I don't want to upset Google (or users for that matter!) Thanks in advance, Luke
-
Google crawling different content--ever ok?
Here are a couple of scenarios I'm encountering where Google will crawl different content than my users see on an initial visit to the site - and which I think should be OK. Of course, it is normally NOT OK; I'm here to find out if Google is flexible enough to allow these situations:
1. My mobile-friendly site has users select a city, and then it displays the location options div, which includes an explanation for why they may want to have the program use their GPS location. The user must choose GPS, the entire city, a zip code, or a suburb of the city, which then goes to the link chosen. On the other hand, it is programmed so that if it is a Google bot, it doesn't get a meaningless 'choose further' page; rather, the crawler sees the page of results for the entire city (as you would expect from the URL). So basically the program defaults to the entire-city results for Googlebot, but the user first gets the ability to choose GPS.
2. A user comes to mysite.com/gps-loc/city/results. The site, seeing the literal words 'gps-loc' in the URL, goes out and fetches the GPS for his location and returns results dependent on his location. If Googlebot comes to that URL, there is no way the program will return the same results, because the program wouldn't be able to get the same longitude and latitude as that user.
So, what do you think? Are these scenarios a concern for getting penalized by Google? Thanks, Ted