A robots.txt file is a plain-text file that tells web robots (also known as spiders or crawlers) which pages on your website they may crawl and which they should ignore. Before crawling a site, a well-behaved robot first requests the robots.txt file and follows the instructions it finds there; note that compliance is voluntary, so the file is a set of directives, not an enforcement mechanism.
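As a minimal sketch of how these instructions work, the rules below use a hypothetical site (`example.com`) and block everything under `/private/` for all robots. Python's standard-library `urllib.robotparser` can check whether a given URL is crawlable under such a rule set:

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt: all robots ("*") must skip /private/,
# but everything else remains crawlable.
rules = """\
User-agent: *
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# can_fetch(user_agent, url) reports whether the rules permit the fetch.
print(parser.can_fetch("*", "https://example.com/about.html"))        # True
print(parser.can_fetch("*", "https://example.com/private/notes.html"))  # False
```

The same check is what polite crawlers perform before requesting any page from a site.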