GoogleOther


GoogleOther is a general-purpose crawler used by Google for internal research, large-scale data analysis, and non–Search-related fetching. It is part of Google’s secondary crawling infrastructure, designed to offload tasks that don’t require the full capabilities or strict policies of Googlebot. GoogleOther typically performs broad but lower-priority fetches, such as machine learning dataset generation or internal experiments. Its activity is generally lightweight compared to Googlebot and is separate from indexing operations that directly influence Google Search results. RobotSense.io verifies GoogleOther using Google’s official validation methods, ensuring only genuine GoogleOther traffic is identified.
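Google's documented verification method is a reverse-DNS lookup on the client IP, a check that the resulting hostname sits in a Google-owned crawl domain, and a forward-DNS lookup to confirm the hostname resolves back to the same IP. A minimal sketch in Python (function names are illustrative; the domain suffixes are the ones Google publishes for its crawlers):

```python
import socket

# Hostname suffixes Google documents for its crawler infrastructure.
GOOGLE_CRAWL_DOMAINS = (".googlebot.com", ".google.com")

def is_google_hostname(host: str) -> bool:
    """True if the hostname sits in a Google-owned crawl domain."""
    return host.endswith(GOOGLE_CRAWL_DOMAINS)

def verify_google_crawler(ip: str) -> bool:
    """Reverse-DNS / forward-DNS check, per Google's published method."""
    try:
        host, _, _ = socket.gethostbyaddr(ip)       # step 1: reverse lookup
    except OSError:
        return False
    if not is_google_hostname(host):                # step 2: domain check
        return False
    try:
        forward = socket.gethostbyname_ex(host)[2]  # step 3: forward lookup
    except OSError:
        return False
    return ip in forward  # hostname must round-trip to the original IP
```

The forward-confirmation step matters: an attacker can set any reverse-DNS record on an IP they control, but they cannot make `googlebot.com` resolve back to it.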

This bot does not honor the Crawl-delay directive.

User Agent Examples

Mozilla/5.0 (Linux; Android 6.0.1; Nexus 5X Build/MMB29P) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/W.X.Y.Z Mobile Safari/537.36 (compatible; GoogleOther)

Mozilla/5.0 AppleWebKit/537.36 (KHTML, like Gecko; compatible; GoogleOther) Chrome/W.X.Y.Z Safari/537.36
Example user agent strings for GoogleOther
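The `GoogleOther` product token in these strings makes a cheap first-pass filter, but a User-Agent header alone is not proof of identity, since anyone can send it. A minimal sketch (the function name is illustrative):

```python
def looks_like_googleother(user_agent: str) -> bool:
    """Cheap first-pass check for the GoogleOther product token.
    User-Agent strings are trivially spoofed, so confirm any match
    with the reverse-DNS / IP verification described above before
    trusting it."""
    return "GoogleOther" in user_agent
```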

Robots.txt Configuration for GoogleOther

Robots.txt User-agent: GoogleOther

Use this identifier in your robots.txt User-agent directive to target GoogleOther.

Recommended Configuration

Our recommended robots.txt configuration for GoogleOther:

User-agent: GoogleOther
Allow: /

Completely Block GoogleOther

Prevent this bot from crawling your entire site:

User-agent: GoogleOther
Disallow: /

Completely Allow GoogleOther

Allow this bot to crawl your entire site:

User-agent: GoogleOther
Allow: /

Block Specific Paths

Block this bot from specific directories or pages:

User-agent: GoogleOther
Disallow: /private/
Disallow: /admin/
Disallow: /api/

Allow Only Specific Paths

Block everything but allow specific directories:

User-agent: GoogleOther
Disallow: /
Allow: /public/
Allow: /blog/
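You can sanity-check a rule set like this with Python's standard-library `urllib.robotparser` before deploying it. One caveat: Python's parser applies rules in file order, whereas Google's parser picks the most specific match, so in this sketch the Allow lines are listed before the blanket Disallow (either ordering works for Google's crawlers themselves):

```python
from urllib import robotparser

# Allow lines first: Python's robotparser applies rules in file order,
# while Google's parser uses the most specific matching rule.
ROBOTS_TXT = """\
User-agent: GoogleOther
Allow: /public/
Allow: /blog/
Disallow: /
"""

rp = robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

print(rp.can_fetch("GoogleOther", "/blog/post"))     # allowed path -> True
print(rp.can_fetch("GoogleOther", "/private/page"))  # blocked path -> False
```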

Set Crawl Delay

Limit how frequently GoogleOther can request pages (in seconds):

User-agent: GoogleOther
Allow: /
Crawl-delay: 10

Note: Google does not officially document support for the Crawl-delay directive, so GoogleOther will ignore this rule.
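Because Crawl-delay is ignored, any request pacing has to happen server-side. A minimal token-bucket sketch (class name and parameters are illustrative; production setups usually rate-limit in the web server or a middleware layer instead):

```python
import time

class TokenBucket:
    """Minimal per-client rate limiter: a server-side alternative to
    the Crawl-delay directive, which GoogleOther does not honor."""

    def __init__(self, rate: float, capacity: float):
        self.rate = rate          # tokens refilled per second
        self.capacity = capacity  # maximum burst size
        self.tokens = capacity
        self.last = time.monotonic()

    def allow(self) -> bool:
        """Spend one token for the request, or reject it."""
        now = time.monotonic()
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False
```

A `rate` of 0.1 with `capacity` 1 approximates "one request every 10 seconds", the effect the Crawl-delay example above intends.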