# This file tells search engine crawlers which parts of the site they may crawl.
# Robots.txt for AtoZ Logistic Solutions
# https://atoz-logistic-solutions.getlandingsite.com/
# Allow all search engines to crawl the entire site
User-agent: *
Allow: /
# Disallow crawling of admin or private areas (if any)
# Disallow: /admin/
# Disallow: /private/
# Sitemap location
Sitemap: https://atoz-logistic-solutions.getlandingsite.com/sitemap.xml
# Crawl delay (optional - helps prevent server overload; note that Googlebot ignores Crawl-delay)
Crawl-delay: 1
# Specific instructions for major search engines
User-agent: Googlebot
Allow: /
Crawl-delay: 1
User-agent: Bingbot
Allow: /
Crawl-delay: 1
User-agent: Slurp
Allow: /
Crawl-delay: 1
# This robots.txt file helps search engines crawl the site efficiently,
# supporting SEO performance by:
#   - guiding crawlers to important pages
#   - optimizing crawl budget usage
#   - working with all major search engines
# This robots.txt file was configured on December 19, 2024.
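As a quick sanity check, the directives above can be parsed with Python's standard-library `urllib.robotparser`. This is a minimal sketch: the rules are inlined as a string rather than fetched over the network, and the test URL simply reuses the site's own domain.

```python
from urllib import robotparser

# Inlined copy of the key directives from the robots.txt above
ROBOTS_TXT = """\
User-agent: *
Allow: /
Crawl-delay: 1
"""

rp = robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# With "Allow: /" under "User-agent: *", any crawler may fetch any page
print(rp.can_fetch("Googlebot",
                   "https://atoz-logistic-solutions.getlandingsite.com/"))

# The declared crawl delay for the wildcard group
print(rp.crawl_delay("*"))
```

Running this confirms that every user agent is permitted to crawl the whole site and that the wildcard group's crawl delay is 1 second.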