
MicrosoftPreview


Verify MicrosoftPreview IP Address

Verify whether an IP address truly belongs to Microsoft using official verification methods. Enter both the IP address and the User-Agent from your logs for the most accurate bot verification.
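
As a rough illustration of what such a check involves, the sketch below performs forward-confirmed reverse DNS on an IP address: the IP is resolved to a hostname, the hostname is checked against a Microsoft crawler domain, and the hostname is resolved back to confirm it returns the same IP. The search.msn.com suffix is the domain Microsoft documents for Bingbot verification; treating it as valid for MicrosoftPreview is an assumption, so confirm the applicable domain list (or use Microsoft's published IP ranges) before relying on this check.

import socket

# Domain suffixes Microsoft documents for verifying its crawlers (Bingbot uses
# search.msn.com; assumed here to apply to MicrosoftPreview as well).
MICROSOFT_CRAWLER_SUFFIXES = (".search.msn.com",)

def is_microsoft_crawler_ip(ip: str) -> bool:
    """Forward-confirmed reverse DNS: IP -> hostname -> back to the same IP."""
    try:
        hostname, _, _ = socket.gethostbyaddr(ip)            # reverse lookup
    except OSError:
        return False
    if not hostname.endswith(MICROSOFT_CRAWLER_SUFFIXES):
        return False
    try:
        forward_ips = socket.gethostbyname_ex(hostname)[2]   # forward lookup
    except OSError:
        return False
    return ip in forward_ips

# Replace with an IP address taken from your own access logs.
print(is_microsoft_crawler_ip("203.0.113.7"))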

MicrosoftPreview is a Microsoft crawler used to render webpages in a browser-like environment for testing, feature evaluation, and content understanding across Microsoft services. It performs fetches that simulate modern browser behavior, including JavaScript execution, layout rendering, and metadata extraction. Unlike Bingbot, it is not used for core indexing but for assessing how pages display in Microsoft products. Activity is moderate and focused on pages relevant to rendering quality checks. Its purpose is to enhance visual accuracy and user experience across Microsoft’s platforms. RobotSense.io verifies MicrosoftPreview using Microsoft’s official validation methods, ensuring only genuine MicrosoftPreview traffic is identified.

This bot does not honor the Crawl-delay rule.

User Agent Examples

Mozilla/5.0 AppleWebKit/537.36 (KHTML, like Gecko; compatible; MicrosoftPreview/2.0; +https://aka.ms/MicrosoftPreview) Chrome/W.X.Y.Z Safari/537.36
	
Mozilla/5.0 (Linux; Android 6.0.1; Nexus 5X Build/MMB29P) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/W.X.Y.Z Mobile Safari/537.36  (compatible; MicrosoftPreview/2.0; +https://aka.ms/MicrosoftPreview)
Example user agent strings for MicrosoftPreview
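
If you filter access logs by User-Agent, the product token shown in these examples ("MicrosoftPreview/2.0") is the stable part to match. A minimal sketch of such a check follows; note that a User-Agent string can be spoofed, so pair it with IP verification.

import re

# Matches the MicrosoftPreview product token from the example strings above,
# e.g. "MicrosoftPreview/2.0"; any version number is accepted.
MICROSOFT_PREVIEW_RE = re.compile(r"\bMicrosoftPreview/\d+(?:\.\d+)*", re.IGNORECASE)

def claims_microsoft_preview(user_agent: str) -> bool:
    """True if the User-Agent string claims to be MicrosoftPreview."""
    return bool(MICROSOFT_PREVIEW_RE.search(user_agent))

ua = ("Mozilla/5.0 AppleWebKit/537.36 (KHTML, like Gecko; compatible; "
      "MicrosoftPreview/2.0; +https://aka.ms/MicrosoftPreview) "
      "Chrome/W.X.Y.Z Safari/537.36")
print(claims_microsoft_preview(ua))  # True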

Robots.txt Configuration for MicrosoftPreview

Robots.txt User-Agent: MicrosoftPreview

Use this identifier in your robots.txt User-agent directive to target MicrosoftPreview.

Recommended Configuration

Our recommended robots.txt configuration for MicrosoftPreview:

User-agent: MicrosoftPreview
Allow: /

Completely Block MicrosoftPreview

Prevent this bot from crawling your entire site:

User-agent: MicrosoftPreview
Disallow: /

Completely Allow MicrosoftPreview

Allow this bot to crawl your entire site:

User-agent: MicrosoftPreview
Allow: /

Block Specific Paths

Block this bot from specific directories or pages:

User-agent: MicrosoftPreview
Disallow: /private/
Disallow: /admin/
Disallow: /api/

Allow Only Specific Paths

Block everything but allow specific directories:

User-agent: MicrosoftPreview
Disallow: /
Allow: /public/
Allow: /blog/
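
One subtlety with allow-lists like the one above: under the current robots.txt standard (RFC 9309), the most specific (longest) matching rule wins, so the relative order of Allow and Disallow lines does not matter to compliant crawlers. The quick check below uses Python's standard-library parser, which instead applies rules in file order, so the Allow lines are listed first in the sketch; the site URL and paths are placeholders.

from urllib.robotparser import RobotFileParser

# Same policy as above; Allow lines come first because urllib.robotparser
# returns the first matching rule, whereas compliant crawlers use longest match.
rules = """\
User-agent: MicrosoftPreview
Allow: /public/
Allow: /blog/
Disallow: /
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

for path in ("/public/page.html", "/blog/post-1", "/checkout"):
    verdict = rp.can_fetch("MicrosoftPreview", "https://example.com" + path)
    print(path, "->", "allowed" if verdict else "blocked")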

Set Crawl Delay

Limit how frequently MicrosoftPreview may request pages (delay in seconds):

User-agent: MicrosoftPreview
Allow: /
Crawl-delay: 10

Note: This bot does not officially state that it honors the Crawl-delay rule.