Robots.txt, also known as the ‘Robots Exclusion Protocol’, is a text file that contains crawling instructions for bots.
The robots.txt file is important because it tells search engines which pages or sections of a website should or should not be crawled. (Note that it controls crawling, not indexing; a page blocked in robots.txt can still be indexed if other sites link to it.) It is a simple text file placed in the root directory of a website.
A search engine bot first looks for a robots.txt file on the server before it crawls the other pages of the website. You can put special instructions for search engines in this robots.txt file.
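As an illustration, a minimal robots.txt might look like the sketch below. The `/admin/` directory and the sitemap URL are placeholders, not part of any real site:

```
# Apply these rules to all crawlers
User-agent: *

# Keep crawlers out of a private section (hypothetical path)
Disallow: /admin/

# Everything else may be crawled
Allow: /

# Point crawlers to the sitemap (hypothetical URL)
Sitemap: https://www.example.com/sitemap.xml
```

Each `User-agent` line starts a group of rules for the named crawler (`*` means all crawlers), and `Disallow`/`Allow` lines list URL path prefixes that the crawler should skip or may fetch.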