
Google-Safety


Verify Google-Safety IP Address

Verify whether an IP address truly belongs to Google, using official verification methods. Enter both the IP address and the User-Agent from your logs for the most accurate bot verification.

Google-Safety is Google’s crawler used to support security, malware detection, and Safe Browsing evaluations across the web. It performs targeted checks for harmful content, phishing signals, unwanted software, and compromised pages. These scans feed Google Safe Browsing warnings and protect users across Chrome, Search, and other Google products.

This bot does not respect robots.txt, including the global user-agent (*) rule, since honoring it would limit Google’s ability to assess site safety accurately. Crawl activity is typically lightweight and periodic, triggered by risk indicators, user reports, or automated systems monitoring for changes in a site’s security posture.

RobotSense.io verifies Google-Safety using Google’s official validation methods, ensuring only genuine Google-Safety traffic is identified. A sketch of that verification flow follows.
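Google’s documented verification method is a two-step DNS round trip: run a reverse (PTR) lookup on the client IP, check that the resulting hostname falls under a Google domain, then run a forward lookup on that hostname and confirm it resolves back to the same IP. A minimal Python sketch, assuming the google.com/googlebot.com suffixes Google documents for its crawlers; the function name and sample IP are illustrative, not part of any RobotSense.io API:

import socket

# Hostname suffixes Google documents for its crawlers' reverse-DNS records.
GOOGLE_SUFFIXES = (".google.com", ".googlebot.com")

def verify_google_ip(ip: str) -> bool:
    """Reverse-then-forward DNS check: the PTR record must point to a
    Google hostname, and that hostname must resolve back to the same IP."""
    try:
        hostname, _, _ = socket.gethostbyaddr(ip)             # reverse (PTR) lookup
    except socket.herror:
        return False                                          # no PTR record at all
    if not hostname.endswith(GOOGLE_SUFFIXES):
        return False                                          # not a Google hostname
    try:
        _, _, addresses = socket.gethostbyname_ex(hostname)   # forward (A) lookup
    except socket.gaierror:
        return False
    return ip in addresses                                    # must round-trip exactly

# Example: an address from Google's published crawler ranges should pass.
print(verify_google_ip("66.249.66.1"))

The forward lookup matters because PTR records are controlled by whoever owns the IP block: a spoofer cannot make a Google hostname resolve back to their own address.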

This bot does not honor the Crawl-delay directive.

User Agent Examples

Contains: Google-Safety

Detection is substring-based: any request whose User-Agent header contains the token Google-Safety is attributed to this bot.
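As a sketch, the match above reduces to a plain substring test on the User-Agent header; how you extract that header depends on your log format:

# Matching predicate for the "Contains" rule above.
def is_google_safety(user_agent: str) -> bool:
    return "Google-Safety" in user_agent

print(is_google_safety("Google-Safety"))   # True: the documented token itself
print(is_google_safety("Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"))  # False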

Robots.txt Configuration for Google-Safety

Robots.txt User-agent: Google-Safety

Use this identifier in your robots.txt User-agent directive to target Google-Safety.

Recommended Configuration

Our recommended robots.txt configuration for Google-Safety:

# Google documents that Google-Safety ignores robots.txt rules, so this entry is advisory only.
User-agent: Google-Safety
Allow: /

Completely Block Google-Safety

Prevent this bot from crawling your entire site. Note that Google-Safety ignores robots.txt, so this directive alone will not stop it; see the server-side sketch below the block:

User-agent: Google-Safety
Disallow: /
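Because the directive above is not honored, refusing these requests has to happen at the server. A minimal sketch, assuming a handler that sees the request’s User-Agent; the handler shape is illustrative, not any documented API:

# Refuse Google-Safety requests at the application layer, since this bot
# does not read robots.txt. Returns an HTTP status code.
def handle(user_agent: str) -> int:
    if "Google-Safety" in user_agent:
        return 403   # Forbidden
    return 200       # serve the page normally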

Completely Allow Google-Safety

Allow this bot to crawl your entire site:

User-agent: Google-Safety
Allow: /

Block Specific Paths

Block this bot from specific directories or pages:

User-agent: Google-Safety
Disallow: /private/
Disallow: /admin/
Disallow: /api/

Allow Only Specific Paths

Block everything but allow specific directories:

User-agent: Google-Safety
Disallow: /
Allow: /public/
Allow: /blog/
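This pattern relies on Google’s documented precedence rule: among all rules that match a path, the longest (most specific) one wins, and Allow wins a length tie. A toy evaluator hard-coding the rules above; it simplifies the full robots.txt spec (no wildcard handling):

# Longest-match precedence: longest matching prefix wins, Allow wins ties.
RULES = [("disallow", "/"), ("allow", "/public/"), ("allow", "/blog/")]

def allowed(path: str) -> bool:
    matches = [(len(prefix), kind == "allow")
               for kind, prefix in RULES if path.startswith(prefix)]
    if not matches:
        return True                    # no rule matches: crawling is permitted
    _, is_allow = max(matches)         # longest prefix wins; True (allow) breaks ties
    return is_allow

print(allowed("/public/page.html"))    # True:  Allow /public/ beats Disallow /
print(allowed("/private/data"))        # False: only Disallow / matches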

Set Crawl Delay

Limit how frequently Google-Safety can request pages (in seconds):

User-agent: Google-Safety
Allow: /
Crawl-delay: 10

Note: Google does not document Crawl-delay support for this bot, and Google-Safety ignores robots.txt entirely, so this directive will have no effect.
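If request pacing matters, it has to be enforced server-side. A minimal per-IP throttle sketch, mirroring the 10-second interval above; the names are illustrative, and a production setup would rely on the web server’s own rate limiting:

import time

# Per-IP request spacing enforced at the application layer.
MIN_INTERVAL = 10.0                    # seconds, mirroring Crawl-delay: 10
_last_seen: dict[str, float] = {}

def allow_request(ip: str) -> bool:
    now = time.monotonic()
    last = _last_seen.get(ip)
    if last is not None and now - last < MIN_INTERVAL:
        return False                   # too soon: e.g. respond 429 Too Many Requests
    _last_seen[ip] = now
    return True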