Robots.txt

This file tells search engine crawlers which parts of the site they may crawl. Note that robots.txt controls crawling, not indexing: a page blocked here can still be indexed if other sites link to it.

# Robots.txt for AtoZ Logistic Solutions
# https://atoz-logistic-solutions.getlandingsite.com/

# Allow all search engines to crawl the entire site
User-agent: *
Allow: /

# Disallow crawling of admin or private areas (if any)
# Disallow: /admin/
# Disallow: /private/

# Sitemap location
Sitemap: https://atoz-logistic-solutions.getlandingsite.com/sitemap.xml

# Crawl delay (optional - helps prevent server overload)
Crawl-delay: 1
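# Note: Crawl-delay is non-standard. Bingbot and Slurp honor it,
# but Googlebot ignores it entirely, and some parsers only apply it
# when it appears inside a User-agent group.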

# Specific instructions for major search engines
User-agent: Googlebot
Allow: /
Crawl-delay: 1

User-agent: Bingbot
Allow: /
Crawl-delay: 1

User-agent: Slurp
Allow: /
Crawl-delay: 1
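
As a quick sanity check, these rules can be read back with Python's standard-library urllib.robotparser. The sketch below assumes the file is served at the root of the site shown above:

    from urllib import robotparser

    # Point the parser at the live robots.txt and download it
    rp = robotparser.RobotFileParser()
    rp.set_url("https://atoz-logistic-solutions.getlandingsite.com/robots.txt")
    rp.read()

    # The blanket Allow: / means any agent may fetch any path
    print(rp.can_fetch("*", "https://atoz-logistic-solutions.getlandingsite.com/"))

    # Per-agent crawl delay (None if the directive is absent)
    print(rp.crawl_delay("Bingbot"))

    # Sitemap URLs declared in the file (available since Python 3.8)
    print(rp.site_maps())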

Robots.txt Configuration

Current Settings:

  • All crawlers are allowed to access the entire site
  • Sitemap location declared for automatic discovery
  • Crawl delay of 1 second requested (not honored by Google; see the note in the file)
  • Explicit rule groups for Googlebot, Bingbot, and Slurp (Yahoo)

Benefits:

  • More efficient crawling and indexing
  • Controlled server load from well-behaved crawlers
  • Automatic sitemap discovery (see the reachability check below)
  • Clear crawling guidelines for every bot
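
To confirm that crawlers can actually reach both files, here is a short reachability check using only the Python standard library; the sitemap path below is taken from the Sitemap directive above, so adjust it if the file lives elsewhere:

    import urllib.request

    BASE = "https://atoz-logistic-solutions.getlandingsite.com"

    # Both files must return HTTP 200 for crawlers to use them
    for path in ("/robots.txt", "/sitemap.xml"):
        with urllib.request.urlopen(BASE + path) as resp:
            print(path, resp.status)  # expect 200 for both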

SEO Impact

This robots.txt file helps search engines crawl and index your logistics website efficiently, improving SEO performance in three ways:

  • Faster indexing: guides crawlers to important pages
  • Better rankings: optimizes crawl budget usage
  • Global reach: works with all major search engines

This robots.txt file was configured on December 19, 2024 for optimal SEO performance.