Applebot
Verify Applebot IP Address
Verify whether an IP address truly belongs to Apple, using Apple's official verification methods. Enter both the IP address and the User-Agent string from your logs for the most accurate bot verification.
Applebot is Apple's official web crawler, used to power search and content features across Apple services such as Siri, Spotlight Suggestions, and Safari. It crawls webpages to discover content, metadata, and structured information that enhance on-device and cloud-based search experiences. Crawl activity is generally moderate and focused on high-quality, publicly accessible content. Its purpose is to improve search relevance, answers, and suggestions across Apple's ecosystem; Apple does not operate a standalone public web search engine. Data crawled by Applebot may also be used by Apple to train its foundation models. Apple lets site owners opt out of having their content used for generative model training by disallowing Applebot-Extended in robots.txt. RobotSense.io verifies Applebot using Apple's official validation methods, ensuring only genuine Applebot traffic is identified.
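Apple's documented verification is a forward-confirmed reverse DNS check: the IP's PTR record must resolve to a hostname under applebot.apple.com, and that hostname must resolve back to the same IP. A minimal sketch in Python's standard library, paired with a User-Agent pre-filter (the injectable `reverse`/`forward` parameters are illustrative additions so the logic can be exercised without live DNS):

```python
import socket

def looks_like_applebot(user_agent: str) -> bool:
    """Cheap pre-filter: Applebot announces an 'Applebot/<version>' token.
    UA strings are trivially spoofed, so a match still needs DNS proof."""
    return "applebot/" in user_agent.lower()

def verify_applebot_ip(ip: str, reverse=None, forward=None) -> bool:
    """Forward-confirmed reverse DNS check.

    1. Reverse-resolve the IP to a hostname (PTR lookup).
    2. Require the hostname to sit under .applebot.apple.com.
    3. Forward-resolve the hostname and confirm it maps back to the IP.

    `reverse` and `forward` default to the stdlib resolver; they are
    injectable purely so the logic can be tested without network access.
    """
    reverse = reverse or (lambda addr: socket.gethostbyaddr(addr)[0])
    forward = forward or (lambda host: socket.gethostbyname_ex(host)[2])
    try:
        hostname = reverse(ip)
        if not hostname.endswith(".applebot.apple.com"):
            return False                 # PTR points outside Apple's crawler zone
        return ip in forward(hostname)   # forward confirmation closes the loop
    except OSError:
        return False                     # no PTR record, or lookup failed
```

A real deployment would cache positive results, since performing two DNS lookups on every request is expensive.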
User Agent Examples
Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/605.1.15 (KHTML, like Gecko) Version/17.4 Safari/605.1.15 (Applebot/0.1; +http://www.apple.com/go/applebot)
Mozilla/5.0 (iPhone; CPU iPhone OS 17_4_1 like Mac OS X) AppleWebKit/605.1.15 (KHTML, like Gecko) Version/17.4.1 Mobile/15E148 Safari/604.1 (Applebot/0.1; +http://www.apple.com/go/applebot)

Robots.txt Configuration for Applebot
Applebot

Use this identifier in your robots.txt User-agent directive to target Applebot.
Recommended Configuration
Our recommended robots.txt configuration for Applebot:
User-agent: Applebot
Allow: /
User-agent: Applebot-Extended
Allow: /

Completely Block Applebot
Prevent this bot from crawling your entire site:
User-agent: Applebot
Disallow: /

Completely Allow Applebot
Allow this bot to crawl your entire site:
User-agent: Applebot
Allow: /

Block Specific Paths
Block this bot from specific directories or pages:
User-agent: Applebot
Disallow: /private/
Disallow: /admin/
Disallow: /api/

Allow Only Specific Paths
Block everything but allow specific directories:
User-agent: Applebot
Disallow: /
Allow: /public/
Allow: /blog/

Set Crawl Delay
Limit how frequently Applebot can request pages (in seconds):
User-agent: Applebot
Allow: /
Crawl-delay: 10

Note: This bot officially honors the Crawl-delay directive.
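Before deploying any of the configurations above, you can sanity-check them locally with Python's standard-library robots.txt parser. A quick sketch using the "block specific paths" rules from this page (example.com URLs are placeholders):

```python
from urllib.robotparser import RobotFileParser

# The "Block Specific Paths" rules from above, plus a crawl delay.
ROBOTS_TXT = """\
User-agent: Applebot
Disallow: /private/
Disallow: /admin/
Disallow: /api/
Crawl-delay: 10
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# Applebot may fetch ordinary pages but not the disallowed directories.
print(parser.can_fetch("Applebot", "https://example.com/blog/post"))  # True
print(parser.can_fetch("Applebot", "https://example.com/private/x"))  # False

# crawl_delay() returns the Crawl-delay value for the matching group.
print(parser.crawl_delay("Applebot"))  # 10
```

This only confirms that the file says what you intend; whether a given crawler obeys each directive is up to the crawler.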