Robots.txt is a plain-text file placed in the root directory of a website. It tells search engine crawlers which directories and files of the site they may crawl and include in their index. Of course, search engines don't reveal many details of how their ranking algorithms use this information.
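As a rough illustration (the domain example.com and the /private/ path below are placeholders, not taken from any real site), a minimal robots.txt might look like this:

    User-agent: *
    Disallow: /private/
    Allow: /

A quick way to check how a crawler would interpret such rules is Python's standard urllib.robotparser module; this is just a sketch against the hypothetical file above:

    from urllib.robotparser import RobotFileParser

    # Load and parse the site's robots.txt (hypothetical URL)
    rp = RobotFileParser()
    rp.set_url("https://example.com/robots.txt")
    rp.read()

    # Ask whether a generic crawler ("*") may fetch a given page
    print(rp.can_fetch("*", "https://example.com/private/page.html"))  # False under the rules above
    print(rp.can_fetch("*", "https://example.com/index.html"))         # True under the rules above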