Is anyone experiencing issues with their robots.txt file in Webflow? I'm having trouble submitting my sitemap to Google and testing the site for rich results because Google reports the site as blocked by robots.txt. I've tried editing the robots.txt file and deleting text from the SEO tab, but neither has worked. Even manually requesting indexing for each URL in Google Search Console doesn't resolve the issue. Any suggestions?

TL;DR
  • Clear or properly configure robots.txt in Webflow's SEO settings and publish changes to your live custom domain.
  • Verify the live robots.txt file, ensure correct domain and SSL setup, then test and resubmit in Google Search Console for indexing.

You're encountering issues where Google cannot access your site or sitemap due to entries in your robots.txt file on Webflow. Editing or clearing the file in Webflow’s interface hasn't resolved the issue.

1. Locate and Edit robots.txt in Webflow

  • Go to your Webflow project dashboard.
  • Click on “Project Settings.”
  • Navigate to the SEO tab.
  • Scroll down to the robots.txt section.
  • Ensure the field is completely empty or contains only the rules you actually need (a safe example follows this list).
  • Common mistake: a blanket Disallow: / rule, which blocks the entire site. If you see it, delete it.
  • Click Save Changes, then publish your site again.
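
For reference, a minimal, safe robots.txt that allows all crawlers and points them at Webflow's auto-generated sitemap might look like this (yourdomain.com is a placeholder for your own domain):

    User-agent: *
    Allow: /

    Sitemap: https://yourdomain.com/sitemap.xml

An empty file behaves the same as allowing everything; the explicit version just makes the intent obvious.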

2. Verify robots.txt Is Updating Properly on the Live URL

  • Visit the robots.txt file on your published domain:
    https://yourdomain.com/robots.txt
  • The file should either be empty or contain only safe rules like:
      User-agent: *
      Allow: /
  • If old blocking rules still appear, clear your browser cache or use an incognito window to rule out caching issues.
  • Make sure you're checking the correct domain, e.g., www vs non-www (a quick script to check both variants follows this list).
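
If you want to check both domain variants in one go, a small Python sketch like the one below fetches each robots.txt and flags a blanket Disallow (yourdomain.com is a placeholder; the check is deliberately naive and ignores user-agent grouping):

    import urllib.request

    # Placeholder hosts; substitute your own domain.
    for host in ("https://yourdomain.com", "https://www.yourdomain.com"):
        url = host + "/robots.txt"
        try:
            with urllib.request.urlopen(url, timeout=10) as resp:
                body = resp.read().decode("utf-8", errors="replace")
        except Exception as exc:
            print(f"{url}: fetch failed ({exc})")
            continue
        # Naive check: only flags a blanket "Disallow: /" anywhere in the file.
        blocked = any(line.strip().lower() == "disallow: /" for line in body.splitlines())
        print(f"{url}: {'still blocking everything' if blocked else 'no blanket Disallow found'}")

A fetch failure on one variant can also point to the domain-connection issues covered in step 4.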

3. Check Environment-Specific Publishing

  • Webflow publishes separately to the staging subdomain and to any custom domains, so the two can serve different content (a quick comparison sketch follows this list).
  • If you only published to the staging subdomain (e.g., yoursite.webflow.io), changes might not appear on the custom domain.
  • Go back to Webflow and click "Publish to Selected Domains" → Ensure your custom domain is checked.
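
To see at a glance whether your latest publish reached the custom domain, you can compare the staging and live files with a sketch like this (both URLs are placeholders; note that Webflow can intentionally serve stricter rules on the webflow.io subdomain if subdomain indexing is disabled, so a mismatch is a prompt to investigate, not proof of a problem):

    import urllib.request

    def fetch(url: str) -> str:
        # Return the response body as text.
        with urllib.request.urlopen(url, timeout=10) as resp:
            return resp.read().decode("utf-8", errors="replace")

    staging = fetch("https://yoursite.webflow.io/robots.txt")  # placeholder staging URL
    live = fetch("https://yourdomain.com/robots.txt")          # placeholder custom domain
    if staging == live:
        print("Staging and live robots.txt match.")
    else:
        print("Staging and live robots.txt differ; confirm you published to the custom domain.")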

4. Confirm Domain Connection and SSL Settings

  • Go to Project Settings > Hosting.
  • Make sure your custom domain is connected properly.
  • If using HTTPS (recommended), make sure SSL is turned on.
  • Mismatched SSL or domain settings can cause robots.txt fetch failures (see the handshake check below).
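
As a quick sanity check on the SSL side, this sketch attempts a verified TLS handshake against your custom domain (yourdomain.com is a placeholder; a certificate error here usually means SSL isn't fully provisioned in Webflow or DNS still points elsewhere):

    import socket
    import ssl

    host = "yourdomain.com"  # placeholder; use your custom domain
    context = ssl.create_default_context()  # verifies certificate chain and hostname
    with socket.create_connection((host, 443), timeout=10) as sock:
        with context.wrap_socket(sock, server_hostname=host) as tls:
            cert = tls.getpeercert()
            print("TLS handshake OK")
            print("Certificate expires:", cert.get("notAfter"))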

5. Verify robots.txt and Resubmit the Sitemap in Google Search Console

  • In Google Search Console, open your property and go to Settings > robots.txt report to confirm Google has fetched the updated file (the legacy robots.txt Tester has been retired).
  • Then go to Sitemaps in GSC and resubmit your sitemap (e.g., /sitemap.xml); a quick validity check follows this list.
  • If robots.txt is still blocking crawling, GSC will show a warning or error.
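
Before resubmitting, it's worth confirming the sitemap itself is reachable and well-formed. A minimal sketch, assuming Webflow's default /sitemap.xml path and a placeholder domain:

    import urllib.request
    import xml.etree.ElementTree as ET

    sitemap_url = "https://yourdomain.com/sitemap.xml"  # placeholder domain; default Webflow path
    with urllib.request.urlopen(sitemap_url, timeout=10) as resp:
        root = ET.fromstring(resp.read())

    # Count <loc> entries regardless of namespace prefix.
    locs = [el.text for el in root.iter() if el.tag.endswith("loc")]
    print(f"Sitemap parsed OK with {len(locs)} URL entries")

If this raises a parse error or returns zero entries, fix the sitemap before asking Google to fetch it again.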

6. Test URLs with Rich Results & Indexing Tools

  • Use Google’s URL Inspection Tool in Search Console.
  • Paste a URL and click “Test Live URL” to ensure Google can crawl it now.
  • If it passes, click “Request Indexing.”
  • For structured data, use Google’s Rich Results Test and enter your live page URL.
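
If the URL Inspection test still reports the page as blocked after robots.txt is clean, the block may be coming from a noindex signal instead. This rough sketch checks a page's X-Robots-Tag header and looks for a robots meta tag (the URL is a placeholder and the HTML check is a simple substring scan, not a real parser):

    import urllib.request

    page_url = "https://yourdomain.com/"  # placeholder; use a page that fails inspection
    req = urllib.request.Request(page_url, headers={"User-Agent": "Mozilla/5.0"})
    with urllib.request.urlopen(req, timeout=10) as resp:
        x_robots = resp.headers.get("X-Robots-Tag")
        html = resp.read().decode("utf-8", errors="replace").lower()

    print("X-Robots-Tag header:", x_robots or "(none)")
    # Crude substring scan, not a real HTML parse.
    print("noindex meta tag suspected:", '<meta name="robots"' in html and "noindex" in html)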

Summary

The issue comes from robots.txt blocking crawl access even after your edits. Ensure the file is cleared or safely configured, that the change is published to your live custom domain, and that cached copies aren't misleading you. After verifying, resubmit your sitemap and request indexing again.
