Does collapsing content impact Google SEO signals?
-
Recently I have been promoting custom long-form content development for major brand clients. For UX reasons we collapse the content so that only 2-3 sentences of the first paragraph are visible; a "read more" link expands the entire content piece.
I have believed that searchbots would have no problem crawling, indexing, and applying a positive SEO signal for this content. However, I'm starting to wonder. Is there any evidence that the Google search algorithm could possibly discount or even ignore collapsed content?
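For reference, the pattern we use looks roughly like the sketch below. It's a minimal illustration, assuming the full copy is server-rendered and merely hidden with CSS; all class names here are hypothetical.

```typescript
// Minimal sketch of the "read more" collapse pattern under discussion.
// Key point: the full copy is present in the initial HTML and only
// hidden visually, so crawlers receive it before any click.
// (Class names and selectors are hypothetical.)

function initReadMore(blockSelector: string): void {
  document.querySelectorAll<HTMLElement>(blockSelector).forEach((block) => {
    const body = block.querySelector<HTMLElement>(".content-body");
    const toggle = block.querySelector<HTMLElement>(".read-more");
    if (!body || !toggle) return;

    // Collapse by default; the CSS might be:
    // .is-collapsed { max-height: 4.5em; overflow: hidden; }
    body.classList.add("is-collapsed");

    toggle.addEventListener("click", () => {
      body.classList.toggle("is-collapsed");
      const collapsed = body.classList.contains("is-collapsed");
      toggle.textContent = collapsed ? "Read more" : "Read less";
    });
  });
}

initReadMore(".long-form-content");
```

If the expanded copy were instead fetched from the server only when the link is clicked, it would never appear in the initial HTML at all, which is a different and clearly worse situation for crawling.
-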
Thanks EGOL. Still looking for additional evidence about this.
-
Well... yup. I know many SEOs who think that the collapsible area is just not important enough for Google to consider.
Good luck
-
If I see a study, I'll post a link here.
-
Yep, I completely agree with your response. Unfortunately I'm in a position where I manage major enterprise accounts with multiple stakeholders (including some people who are not educated in SEO). Every major change we propose needs to be documented, cited, and reviewed. When making an argument for content expansion I would need to use a thorough research example (a Moz study, documentation on Search Engine Land, etc.).
Anyway, thanks for taking the time to share your feedback and advice on this thread. This is not the answer I wanted to hear (i.e., that Google doesn't respect collapsed content), but it's very likely accurate. This is a serious SEO issue that needs to be addressed.
-
Are there any case studies about this issue?
Just the one that I published above. The conclusion is... be prepared to sacrifice 80% of your traffic if you hide your valuable content behind a preview.
I would be asking the UX people to furnish studies showing that hiding content produces better sales.
We have lots of people raving about the abundance of content on our site, the detailed product descriptions, how much help we give them to decide what to purchase. All of this content is why we dominate the SERPs in our niche and that, in many people's eyes, is a sign of credibility. Lots of people say... "we bought from you because your website is so helpful". However, if we didn't have all of this content in the open these same people would have never even found us.
Nobody has to read this stuff. I would rather land on a website and see my options than land on a website and assume that there was no information because I didn't notice that the links to open it were in a faded microfont, because the UX guys wanted things to be tidy. I believe that it is a bigger sin to have fantastic content behind a clickthrough than it is to put valuable information in the open and allow people the opportunity to read it.
Putting our content out in the open is what makes our reputation.
I sure am glad that I am the boss here. I can make the decisions and be paid on the basis of my performance.
-
We are applying 500 to 800+ word custom content blocks to our client landing pages (local landing pages) that show a preview of the first paragraph and a "read more" expansion link. We know that most website visitors only care about the location info on these particular landing pages. We also know that our client UX teams would certainly not approve an entirely visible content block on these pages.
Are there any case studies about this issue? I'm trying to find a bona fide research project to help back up our argument.
-
It was similar to a Q&A. There was a single-sentence question and a paragraph of hidden answer. This page had a LOT of questions and a tremendous number of keywords in the hidden content. Thousands of words.
The long-tail traffic tanked. Then, when we opened the content again, the traffic took months to start coming back. The main keywords held in the SERPs. The long tail accounted for the 80% loss.
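For anyone who wants to replicate this kind of before/after analysis, a rough sketch of bucketing a query-level export (e.g. from Search Console) into head vs. long-tail by word count is below. The three-word threshold and the record shape are assumptions for illustration, not part of what happened here.

```typescript
// Rough sketch: split a query-level traffic export into "head" and
// "long-tail" buckets so a drop can be attributed to one or the other.
// The three-word threshold and record shape are illustrative assumptions.

interface QueryRow {
  query: string;
  clicks: number;
}

function clicksByBucket(rows: QueryRow[]): { head: number; longTail: number } {
  let head = 0;
  let longTail = 0;
  for (const row of rows) {
    const wordCount = row.query.trim().split(/\s+/).length;
    if (wordCount <= 3) {
      head += row.clicks;
    } else {
      longTail += row.clicks;
    }
  }
  return { head, longTail };
}

// Hypothetical before/after periods: head traffic holds while
// long-tail collapses, matching the pattern described above.
const before = clicksByBucket([
  { query: "blue widgets", clicks: 900 },
  { query: "how to choose blue widgets for a boat", clicks: 400 },
]);
const after = clicksByBucket([
  { query: "blue widgets", clicks: 880 },
  { query: "how to choose blue widgets for a boat", clicks: 60 },
]);
console.log(before, after);
```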
-
How collapsed was your content? Did you hide the entire block? Only show a few sentences? I'm trying to find a research article about this. This is a MAJOR issue to consider for our SEO campaigns.
-
Yes, that is a very legitimate concern of mine. We have invested significant resources into custom long-form content for our clients, and we are very concerned this was all for nothing... or possibly worse (the content being discounted).
-
Recently I had a related issue with a top-ranking website for very competitive queries.
Unfortunately the product department made some changes to the content (UI only) without consulting the SEO department. The only change worth mentioning was moving the first two paragraphs into a collapsible DIV showing only the first 3 lines plus a "read more" button. The text in the collapsible div was crawlable and visible to search engines (it's also worth mentioning that these paragrap
But the site lost its major keyword positions 2-3 days later. Of course we reverted the changes, but two months later the keywords are still only very slowly moving back to their "original" positions.
For years I believed what Google stated: that you can use collapsible content as long as you are not trying to inject keywords or inflate the amount of content. Not anymore.
I believe that by placing content under a collapsible div element we are actually signaling to Google that this piece of content is not that important (that's why it is hidden, right? Otherwise it would be in plain sight). So why should we expect Google to count this content as a major part of our ranking weight?
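As a quick sanity check on whether collapsed copy is even served to crawlers (as it was in our case), one can look for a known sentence from the collapsed block in the raw HTML response. A minimal sketch, assuming Node 18+ (global fetch); the URL and sample sentence are hypothetical:

```typescript
// Sketch: confirm that collapsed copy is at least present in the raw
// HTML a crawler receives on the initial fetch. If the text only
// arrives via a request fired by the "read more" click, this check
// fails. Assumes Node 18+; URL and sentence below are hypothetical.

async function isServedInInitialHtml(url: string, sampleSentence: string): Promise<boolean> {
  const res = await fetch(url, { headers: { "User-Agent": "content-audit-sketch" } });
  const html = await res.text();
  return html.includes(sampleSentence);
}

isServedInInitialHtml(
  "https://example.com/locations/springfield",
  "Our Springfield branch has served the area since"
).then((present) => {
  console.log(present ? "Copy is in the initial HTML." : "Copy not found - likely loaded on demand.");
});
```

Note that passing this check only means the copy is served; the experience above suggests that even copy that is served but visually hidden can be given less weight.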
-
About two years ago I had collapsed content on some important pages. Their long-tail traffic went into a steady slide, but the head traffic held. I take this as a sign that the collapsed content was discounted, removing it from, or lowering its weight in, the rankings for long-tail queries.
I expanded the pages, making all content visible. A few months later, long-tail traffic slowly started to rise. It took many months to climb back to previous levels.
Since then, every word of my content has been out in the open.