Verify Meta-ExternalAds IP Address

Verify whether an IP address truly belongs to Meta / Facebook, using official verification methods. Enter both the IP address and the User-Agent from your logs for the most accurate bot verification.
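
Meta publishes the IP ranges its crawlers use, so verification usually means checking the requesting IP against that list. As a minimal illustration, the Python sketch below assumes you maintain a current copy of Meta's announced prefixes (the three prefixes shown are placeholders, not an authoritative list):

import ipaddress

# Illustrative prefixes only -- refresh them from Meta's officially published
# IP ranges (for example, the route list announced for its autonomous system)
# before relying on this check.
META_PREFIXES = [
    "31.13.24.0/21",
    "66.220.144.0/20",
    "69.171.224.0/19",
]

def is_meta_ip(ip: str) -> bool:
    """Return True if the address falls inside one of the listed Meta prefixes."""
    addr = ipaddress.ip_address(ip)
    return any(addr in ipaddress.ip_network(prefix) for prefix in META_PREFIXES)

print(is_meta_ip("66.220.144.10"))  # True with the sample list above
print(is_meta_ip("203.0.113.7"))    # False (documentation-range address)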

Meta-ExternalAds is Meta’s crawler used to evaluate landing pages associated with ads running on Facebook, Instagram, and other Meta platforms. It performs targeted checks to assess page load behavior, policy compliance, redirects, content quality, and overall ad safety. These fetches are ad-driven, not general web crawling, and help Meta determine whether landing pages meet advertising standards. Blocking it may affect ad review accuracy or eligibility. Crawl activity is focused, low-volume, and typically triggered when advertisers submit new ads, update creatives, or when ads undergo automated policy reviews. It ignores the global user-agent (*) rule in robots.txt, so directives must target it by name. RobotSense.io verifies Meta-ExternalAds using Meta’s official validation methods, ensuring only genuine Meta-ExternalAds traffic is identified.

This bot does not honor the Crawl-delay rule.

User Agent Examples

Contains: meta-externalads/1.1 (+https://developers.facebook.com/docs/sharing/webmasters/crawler)

Contains: meta-externalads/1.1

Example user agent strings for Meta-ExternalAds
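
For a quick server-side check of the User-Agent header, here is a minimal Python sketch based on the token shown above (the helper name and sample strings are illustrative):

def is_meta_externalads_ua(user_agent: str) -> bool:
    """Case-insensitive substring match against the meta-externalads token."""
    return "meta-externalads" in user_agent.lower()

ua = "meta-externalads/1.1 (+https://developers.facebook.com/docs/sharing/webmasters/crawler)"
print(is_meta_externalads_ua(ua))                                         # True
print(is_meta_externalads_ua("Mozilla/5.0 (compatible; Googlebot/2.1)"))  # False

A User-Agent match alone is easy to spoof, so pair it with IP verification as described above.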

Robots.txt Configuration for Meta-ExternalAds

Robots.txt User-agent: Meta-ExternalAds

Use this identifier in your robots.txt User-agent directive to target Meta-ExternalAds.

Recommended Configuration

Our recommended robots.txt configuration for Meta-ExternalAds:

User-agent: Meta-ExternalAds
Allow: /

Completely Block Meta-ExternalAds

Prevent this bot from crawling your entire site:

User-agent: Meta-ExternalAds
Disallow: /

Completely Allow Meta-ExternalAds

Allow this bot to crawl your entire site:

User-agent: Meta-ExternalAds
Allow: /

Block Specific Paths

Block this bot from specific directories or pages:

User-agent: Meta-ExternalAds
Disallow: /private/
Disallow: /admin/
Disallow: /api/

Allow Only Specific Paths

Block everything but allow specific directories:

User-agent: Meta-ExternalAds
Disallow: /
Allow: /public/
Allow: /blog/

Set Crawl Delay

Limit how frequently Meta-ExternalAds can request pages (in seconds):

User-agent: Meta-ExternalAds
Allow: /
Crawl-delay: 10

Note: Meta does not officially state that this bot honors the Crawl-delay rule.