Googlebot
Verify Googlebot IP Address
Verify whether an IP address truly belongs to Google using Google's official verification methods. Enter both the IP address and the User-Agent string from your logs for the most accurate bot verification.
Googlebot is Google's primary web crawler, responsible for discovering, fetching, and updating content across the public internet for inclusion in Google Search. It operates at massive scale, continuously revisiting sites based on their importance, freshness, and user demand. Googlebot uses a distributed crawling infrastructure that balances crawl frequency against server load, aiming to gather the most useful and up-to-date information without overwhelming websites.

Googlebot identifies itself with the Googlebot user-agent family and is fully transparent about its behavior. Genuine Googlebot traffic can be verified through Google's published reverse-DNS method, which confirms whether an IP truly belongs to Google's crawling network. Beyond standard HTML pages, Googlebot can render JavaScript, interpret structured data, and evaluate mobile friendliness, all of which directly influence how pages appear in search results. Googlebot has two variants, Googlebot Smartphone and Googlebot Desktop; with mobile-first indexing, Google crawls most sites primarily with Googlebot Smartphone.

RobotSense.io verifies Googlebot using Google's official validation methods, ensuring only genuine Googlebot traffic is identified.
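As an illustration of that reverse-DNS method, the sketch below performs a forward-confirmed reverse DNS check in Python: resolve the IP to a hostname, require a googlebot.com or google.com suffix, then resolve the hostname back and require the original IP. The sample address 66.249.66.1 appears in Google's own documentation; real deployments would add caching and broader error handling.

import socket

GOOGLE_SUFFIXES = (".googlebot.com", ".google.com")

def is_real_googlebot(ip: str) -> bool:
    """Forward-confirmed reverse DNS, per Google's documented verification method."""
    try:
        hostname, _, _ = socket.gethostbyaddr(ip)  # step 1: IP -> hostname
    except socket.herror:
        return False  # no PTR record, so not Googlebot
    if not hostname.endswith(GOOGLE_SUFFIXES):  # step 2: must be a Google crawl domain
        return False
    try:
        # step 3: the hostname must resolve back to the original address
        return ip in {info[4][0] for info in socket.getaddrinfo(hostname, None)}
    except socket.gaierror:
        return False

print(is_real_googlebot("66.249.66.1"))  # True for a genuine Googlebot address

Google also publishes Googlebot's IP ranges as JSON (developers.google.com/search/apis/ipranges/googlebot.json), which can serve as a faster first-pass filter before the DNS check.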
User Agent Examples
Googlebot Smartphone: Mozilla/5.0 (Linux; Android 6.0.1; Nexus 5X Build/MMB29P) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/W.X.Y.Z Mobile Safari/537.36 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)
Googlebot Desktop: Mozilla/5.0 AppleWebKit/537.36 (KHTML, like Gecko; compatible; Googlebot/2.1; +http://www.google.com/bot.html) Chrome/W.X.Y.Z Safari/537.36
Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)
Googlebot/2.1 (+http://www.google.com/bot.html)
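Any client can send these strings, so a user-agent match only shows that a request claims to be Googlebot; genuine traffic must also pass the reverse-DNS check above. A minimal matching sketch in Python (the regular expression is our own, not an official pattern):

import re

# Every Googlebot user-agent string above carries a "Googlebot/<version>" token.
GOOGLEBOT_UA = re.compile(r"\bGooglebot/\d+\.\d+\b")

def claims_googlebot(user_agent: str) -> bool:
    """True when a UA string self-identifies as Googlebot (spoofable on its own)."""
    return bool(GOOGLEBOT_UA.search(user_agent))

print(claims_googlebot("Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"))  # True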
Robots.txt Configuration for Googlebot
Googlebot
Use this identifier in your robots.txt User-agent directive to target Googlebot.
Recommended Configuration
Our recommended robots.txt configuration for Googlebot:
User-agent: Googlebot
Allow: /
Completely Block Googlebot
Prevent this bot from crawling your entire site:
User-agent: Googlebot
Disallow: /
Completely Allow Googlebot
Allow this bot to crawl your entire site:
User-agent: Googlebot
Allow: /
Block Specific Paths
Block this bot from specific directories or pages:
User-agent: Googlebot
Disallow: /private/
Disallow: /admin/
Disallow: /api/
Allow Only Specific Paths
Block everything but allow specific directories. Google resolves conflicting Allow and Disallow rules by the most specific (longest) matching path, so the Allow lines below override Disallow: /:
User-agent: Googlebot
Disallow: /
Allow: /public/
Allow: /blog/
Set Crawl Delay
Limit how frequently Googlebot can request pages (in seconds):
User-agent: Googlebot
Allow: /
Crawl-delay: 10
Note: Googlebot does not honor the Crawl-delay directive; Google's crawlers ignore this rule, so manage Googlebot's crawl rate through server-side controls instead. Other crawlers may still respect it. A quick way to test the rules above locally is shown below.
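As a sketch, Python's standard urllib.robotparser can evaluate a configuration against sample URLs before you deploy it. Note that Python's parser applies rules in file order rather than Google's longest-match precedence, so overlapping Allow/Disallow combinations (such as the "Allow Only Specific Paths" example) may evaluate differently than Googlebot would. The rules below merge the "Block Specific Paths" and "Set Crawl Delay" examples from this page.

from urllib.robotparser import RobotFileParser

rules = """\
User-agent: Googlebot
Disallow: /private/
Disallow: /admin/
Disallow: /api/
Crawl-delay: 10
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

for path in ("/blog/post-1", "/private/report", "/api/v1/users"):
    allowed = parser.can_fetch("Googlebot", "https://example.com" + path)
    print(path, "->", "allowed" if allowed else "blocked")
# /blog/post-1 -> allowed; the /private/ and /api/ URLs -> blocked

# crawl_delay() reads the directive, which bots other than Googlebot may honor.
print(parser.crawl_delay("Googlebot"))  # 10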