To hide Webflow CMS item pages from Google search results, the recommended method is the robots.txt file, not a nofollow meta tag.
1. Why the robots.txt File Is Recommended
- Webflow's built-in settings do not allow per-collection meta tag customization, so you can't easily add a noindex meta tag to just the CMS item pages.
- A nofollow tag is not effective for hiding pages from search: it only tells Google not to follow the page's links or pass link equity through them, and it does not prevent indexing (see the comparison after this list).
- The robots.txt file provides a global way to block Googlebot and other search engines from crawling specific CMS page paths.
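For reference, the two directives look alike in markup but do different jobs. A quick comparison using standard meta robots syntax (these tags belong in a page's `<head>`):
```
<!-- nofollow: tells crawlers not to follow links on this page
     or pass link equity through them; the page can still be indexed -->
<meta name="robots" content="nofollow">

<!-- noindex: tells crawlers to drop this page from search results;
     this is the directive that actually hides a page -->
<meta name="robots" content="noindex">
```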
2. How to Use robots.txt in Webflow
Go to Project Settings → SEO tab → Custom Robots.txt
Insert a rule like the following to block all CMS item pages (replace `collection-slug` with your CMS collection's slug):
```
User-agent: *
Disallow: /collection-slug/
```
This blocks search engines from crawling every dynamic CMS page under that path, for example /blog/my-post-title when the collection slug is blog.
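If you need to hide several collections, list one Disallow rule per path. A sketch with hypothetical collection slugs blog and team (substitute your own):
```
# Disallow rules are prefix matches, so /blog/ also covers /blog/my-post-title
User-agent: *
Disallow: /blog/
Disallow: /team/
```
Pages outside the listed paths remain crawlable as usual.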
3. Limitations of robots.txt
- robots.txt only prevents crawling, not indexing. If another site links to a blocked CMS item page, Google may still index the URL, though it typically appears in results without a description (Search Console reports this as "Indexed, though blocked by robots.txt").
- For stronger control, you'd need a noindex meta tag on each CMS item page, but Webflow does not support conditional per-item meta tag insertion without custom code or third-party tools. Note that noindex only works if Google can crawl the page, so you should not combine it with a robots.txt Disallow on the same path.
4. Workarounds Using Custom Code (Advanced)
- You can add a `<meta name="robots" content="noindex">` tag in the Collection Template Page's custom code settings (inside the `<head>` tag).
- However, Webflow injects this tag on every item in the collection, so you lose per-item flexibility unless you add manual logic; one approach is sketched below.
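One workaround for that manual logic: add a plain-text field to the collection and insert it into the template page's head custom code, so each item supplies its own directive. A minimal sketch, assuming a hypothetical CMS field called robots-directive whose value per item is either noindex or all (the {{...}} placeholder stands in for Webflow's "+ Add Field" insertion in page settings):
```
<!-- Collection Template Page settings → Custom Code → Inside <head> tag.
     Each CMS item fills in its own robots-directive value. -->
<meta name="robots" content="{{robots-directive}}">
```
Items whose field is set to all stay indexable, while items set to noindex are excluded, giving per-item control without third-party tools.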
Summary
To hide all CMS item pages from Google in Webflow, use the robots.txt file to disallow their paths. A nofollow meta tag is not suitable for this purpose, and fine-grained control via noindex requires custom code or external workarounds.