YandexSitelinks

Verify YandexSitelinks IP Address

Verify if an IP address truly belongs to Yandex, using official verification methods. Enter both IP address and User-Agent from your logs for the most accurate bot verification.

YandexSitelinks is a Yandex crawler used to verify the availability and accessibility of pages selected as sitelinks in Yandex Search results. It performs targeted checks to confirm that sitelink URLs return valid responses and remain accessible. The bot does not crawl sites broadly or index new content; its role is strictly validation. Its crawl activity is lightweight and periodic, triggered when sitelinks are generated, updated, or rechecked. Its purpose is to ensure that sitelinks shown in search results lead to functioning, reliable pages for users. It honors the global robots.txt user-agent (*) rule. RobotSense.io verifies YandexSitelinks using Yandex's official validation methods, ensuring only genuine YandexSitelinks traffic is identified.
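Yandex's documented verification method is a reverse DNS lookup on the requesting IP, a check that the resulting hostname belongs to a Yandex domain (yandex.ru, yandex.net, or yandex.com), and a forward DNS lookup to confirm that hostname resolves back to the same IP. A minimal sketch of that check in Python follows; the `reverse_lookup` and `forward_lookup` parameters are hooks added here for illustration so the logic can be exercised without live DNS:

```python
import socket

# Domains Yandex documents for its crawlers' reverse-DNS hostnames.
YANDEX_DOMAINS = (".yandex.ru", ".yandex.net", ".yandex.com")

def is_yandex_ip(ip, reverse_lookup=None, forward_lookup=None):
    """Verify an IP using the reverse/forward DNS method.

    reverse_lookup(ip) -> hostname; forward_lookup(host) -> list of IPs.
    Defaults use the socket module, which requires network access.
    """
    reverse_lookup = reverse_lookup or (lambda addr: socket.gethostbyaddr(addr)[0])
    forward_lookup = forward_lookup or (lambda host: socket.gethostbyname_ex(host)[2])
    try:
        host = reverse_lookup(ip)
    except OSError:
        return False
    # Step 1: the reverse-DNS hostname must end in a Yandex-owned domain.
    if not host.endswith(YANDEX_DOMAINS):
        return False
    # Step 2: forward DNS for that hostname must include the original IP,
    # which defeats spoofed PTR records.
    try:
        return ip in forward_lookup(host)
    except OSError:
        return False
```

Both steps are required: a PTR record alone can be spoofed, so a hostname that fails to resolve back to the original IP must be rejected.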

This bot does not honor the Crawl-Delay rule.

User Agent Examples

Mozilla/5.0 (compatible; YandexSitelinks; Dyatel; +http://yandex.com/bots)
Example user agent strings for YandexSitelinks
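When analyzing logs, a first-pass filter can match the "YandexSitelinks" product token in the User-Agent string shown above. A small sketch (the function name is illustrative):

```python
def is_yandexsitelinks_ua(user_agent: str) -> bool:
    # First-pass filter only: User-Agent strings are trivially spoofed,
    # so a positive match should still be confirmed with IP verification.
    return "yandexsitelinks" in user_agent.lower()
```

Because anyone can send this header, treat a match as a candidate to verify, never as proof of identity.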

Robots.txt Configuration for YandexSitelinks

Robots.txt User-Agent: YandexSitelinks

Use this identifier in your robots.txt User-agent directive to target YandexSitelinks.

Recommended Configuration

Our recommended robots.txt configuration for YandexSitelinks:

User-agent: YandexSitelinks
Allow: /

Completely Block YandexSitelinks

Prevent this bot from crawling your entire site:

User-agent: YandexSitelinks
Disallow: /

Completely Allow YandexSitelinks

Allow this bot to crawl your entire site:

User-agent: YandexSitelinks
Allow: /

Block Specific Paths

Block this bot from specific directories or pages:

User-agent: YandexSitelinks
Disallow: /private/
Disallow: /admin/
Disallow: /api/
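Before deploying a configuration like this, you can sanity-check it with Python's standard-library robots.txt parser; the paths below are the ones from the snippet above:

```python
from urllib import robotparser

# The "Block Specific Paths" configuration from above.
rules = """\
User-agent: YandexSitelinks
Disallow: /private/
Disallow: /admin/
Disallow: /api/
"""

parser = robotparser.RobotFileParser()
parser.parse(rules.splitlines())

agent = "YandexSitelinks"
print(parser.can_fetch(agent, "/blog/post"))     # paths outside the blocked directories stay allowed
print(parser.can_fetch(agent, "/private/data"))  # blocked
```

This catches typos such as a missing trailing slash, which would change which URLs a Disallow prefix actually matches.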

Allow Only Specific Paths

Block everything but allow specific directories:

User-agent: YandexSitelinks
Disallow: /
Allow: /public/
Allow: /blog/

Set Crawl Delay

Limit how frequently YandexSitelinks can request pages (in seconds):

User-agent: YandexSitelinks
Allow: /
Crawl-delay: 10

Note: Yandex does not officially document whether this bot honors the Crawl-delay rule.