by » MOC » Thu, 7 May 2020, 11:27 am | #2 of 4
Hi,
A robots.txt file tells search engine bots (individually or collectively) which pages on your website you would like them to crawl and which ones you would like them to skip for search engine indexing purposes. Most reputable bots will follow the directives in your robots.txt file.
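As a rough sketch, a robots.txt file placed at the root of your site might look like the lines below (the paths are made up for illustration; Googlebot is just one example of a bot you could address individually):

# Rules for all bots (collective)
User-agent: *
Disallow: /admin/
Disallow: /tmp/

# Rule for one specific bot (individual)
User-agent: Googlebot
Disallow: /private/

Sitemap: https://www.example.com/sitemap.xml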
by » RH-Calvin » Tue, 26 May 2020, 10:55 am | #3 of 4
Robots.txt is a text file containing instructions for search engine robots. It lists which webpages the robots are allowed to crawl and which ones they are disallowed from crawling.
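For example (the directory and file names here are hypothetical), allowed and disallowed pages can be listed like this; note that the Allow directive is not part of the original standard but is honoured by major crawlers such as Googlebot:

User-agent: *
# block the whole /search/ section
Disallow: /search/
# but still permit this one page inside it
Allow: /search/help.html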