Allow Googlebot crawlers in Cloudflare DNS (Failed: Blocked due to access forbidden (403))

To allow Googlebot crawlers through Cloudflare, you need to make sure Googlebot isn't blocked by Cloudflare's security settings, firewall rules, or other protection features. Here's how to verify that Googlebot can crawl your website properly:


--------------------------------------

Quick fix: Go to Security > WAF > Tools (formerly Firewall > Tools).

In the IP Access Rules section, add Google's ASN (AS15169) and set the action to Allow.
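The same access rule can also be created through Cloudflare's v4 API. The sketch below builds the request without sending it; ZONE_ID and API_TOKEN are placeholders you must supply, and you should confirm the endpoint and field names against Cloudflare's current API documentation.

```python
# Sketch: create a Cloudflare IP Access Rule allowing Google's ASN via the
# v4 API. ZONE_ID and API_TOKEN are placeholders, not real values.
import json
import urllib.request

ZONE_ID = "your_zone_id"      # placeholder
API_TOKEN = "your_api_token"  # placeholder

payload = {
    "mode": "whitelist",  # Cloudflare's API name for the "Allow" action
    "configuration": {"target": "asn", "value": "AS15169"},  # Google's ASN
    "notes": "Allow Googlebot (Google ASN)",
}

req = urllib.request.Request(
    f"https://api.cloudflare.com/client/v4/zones/{ZONE_ID}/firewall/access_rules/rules",
    data=json.dumps(payload).encode(),
    headers={
        "Authorization": f"Bearer {API_TOKEN}",
        "Content-Type": "application/json",
    },
    method="POST",
)
# Uncomment to actually send the request:
# with urllib.request.urlopen(req) as resp:
#     print(resp.read().decode())
```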





-------------------------------

Steps to Allow Googlebot in Cloudflare:

1. Whitelist Googlebot IP Ranges in Cloudflare

You can explicitly allow Googlebot by whitelisting its IP ranges in Cloudflare.

Steps:

  1. Log in to Cloudflare and select your website.

  2. Go to Security > WAF > Tools (formerly Firewall > Tools).

  3. In the IP Access Rules section, add the IP ranges used by Googlebot and set them to Allow.

    Googlebot's IP ranges are published by Google in its official crawler IP list. Some common ranges include:

    • 66.249.64.0/19
    • 64.233.160.0/19
    • 66.102.0.0/20
  4. Select the scope (either for This Website or Global) and add the rule.
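To check whether a given visitor IP actually falls inside the ranges listed above, you can use Python's standard library; a minimal sketch:

```python
# Check whether a client IP falls inside the Googlebot CIDR ranges listed
# above, using only the standard library.
import ipaddress

GOOGLEBOT_RANGES = [
    ipaddress.ip_network(cidr)
    for cidr in ("66.249.64.0/19", "64.233.160.0/19", "66.102.0.0/20")
]

def in_googlebot_range(ip: str) -> bool:
    """Return True if `ip` is inside any of the listed CIDR ranges."""
    addr = ipaddress.ip_address(ip)
    return any(addr in net for net in GOOGLEBOT_RANGES)

print(in_googlebot_range("66.249.66.1"))   # True  (typical Googlebot IP)
print(in_googlebot_range("203.0.113.9"))   # False (documentation IP)
```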

2. Check Firewall Rules

Review your firewall settings to ensure that you're not blocking Googlebot.

Steps:

  1. Go to Security > WAF > Custom rules (formerly the Firewall Rules section) in Cloudflare.
  2. Check if any of the rules block Googlebot's user-agent or IPs.
  3. Modify or delete any rule that might inadvertently block Googlebot.
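If you want to keep strict rules for other traffic, Cloudflare's WAF custom rules can use the built-in cf.client.bot field, which matches verified good bots such as Googlebot. As a sketch, a rule with the expression below and the action Skip (selecting the security features to bypass) lets verified bots through without loosening rules for everyone else:

```plaintext
(cf.client.bot)
```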

3. Disable Bot Fight Mode (If Enabled)

Cloudflare’s Bot Fight Mode may mistakenly block Googlebot. To avoid this, disable Bot Fight Mode.

Steps:

  1. Go to the Security section of Cloudflare.

  2. Under Bots, disable Bot Fight Mode.

    This ensures that Googlebot isn’t blocked alongside harmful bots.

4. Create a Page Rule to Bypass Security for Googlebot

You can create a Page Rule in Cloudflare that allows Googlebot to bypass certain security features like firewalls, browser integrity checks, etc.

Steps:

  1. In Cloudflare, navigate to Page Rules.
  2. Click Create Page Rule and enter your URL pattern (*yourdomain.com/*).
  3. Add settings such as:
    • Security Level: Set to "Essentially Off."
    • Browser Integrity Check: Disable.
    • Cache Level: Set to Bypass (if needed).
  4. Save the rule and deploy it.
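For reference, the page rule above can also be expressed as a JSON body for Cloudflare's v4 API (POST /zones/{zone_id}/pagerules). This is only a sketch; treat the exact action ids (security_level, browser_check, cache_level) as assumptions to double-check against Cloudflare's current API documentation.

```python
# Sketch of the JSON body for creating the page rule above via Cloudflare's
# v4 API. The action ids are assumptions; verify them against current docs.
page_rule = {
    "targets": [
        {
            "target": "url",
            "constraint": {"operator": "matches", "value": "*yourdomain.com/*"},
        }
    ],
    "actions": [
        {"id": "security_level", "value": "essentially_off"},
        {"id": "browser_check", "value": "off"},
        {"id": "cache_level", "value": "bypass"},
    ],
    "status": "active",
}
```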

5. Check Robots.txt File

Make sure that your robots.txt file allows Googlebot to crawl your site.

Example of allowing Googlebot:

```plaintext
User-agent: Googlebot
Disallow:
```

Ensure there’s no Disallow rule preventing Googlebot from accessing important parts of your site.
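You can sanity-check your robots.txt rules offline with Python's standard-library parser. A minimal sketch (example.com and the /private/ path are illustrative):

```python
# Verify robots.txt rules offline with the standard library's parser.
from urllib import robotparser

rp = robotparser.RobotFileParser()
rp.parse([
    "User-agent: Googlebot",
    "Disallow: /private/",   # illustrative: only /private/ is blocked
])

print(rp.can_fetch("Googlebot", "https://example.com/page"))       # True
print(rp.can_fetch("Googlebot", "https://example.com/private/x"))  # False
```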

6. Use Google Search Console to Test

After making changes, use the URL Inspection tool in Google Search Console (with its "Test Live URL" option) to verify that Googlebot can crawl your website.
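Because IP lists can change, Google recommends verifying Googlebot via DNS: reverse-resolve the visitor's IP, check that the hostname ends in googlebot.com or google.com, then forward-resolve that hostname and confirm it maps back to the same IP. A minimal sketch (the example hostname is illustrative):

```python
# Verify a visitor claiming to be Googlebot the way Google recommends:
# reverse DNS lookup, domain check, then forward lookup confirmation.
import socket

def is_google_hostname(hostname: str) -> bool:
    """True if the hostname belongs to Google's crawler domains."""
    host = hostname.rstrip(".").lower()
    return host.endswith(".googlebot.com") or host.endswith(".google.com")

def verify_googlebot(ip: str) -> bool:
    """Network check: reverse lookup, domain check, forward confirmation."""
    try:
        hostname = socket.gethostbyaddr(ip)[0]
    except (socket.herror, socket.gaierror):
        return False
    if not is_google_hostname(hostname):
        return False
    try:
        # Forward-resolve and confirm the original IP is among the results.
        return ip in socket.gethostbyname_ex(hostname)[2]
    except socket.gaierror:
        return False

print(is_google_hostname("crawl-66-249-66-1.googlebot.com"))  # True
print(is_google_hostname("fake.example.com"))                 # False
```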

By following these steps, you will ensure that Googlebot is allowed to crawl your site without being blocked by Cloudflare DNS settings or security features.
