robots.txt is a text file placed on a website to tell search engine robots which pages on the site should not be crawled and indexed in the search engine database. It is generally used to keep webpages with sensitive content out of search results. Listing those pages in the robots.txt file prevents search engine crawlers from accessing them.
The file must be placed in the root directory of the website, so that crawlers can easily find it and follow the instructions written in it.
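As a sketch, a minimal robots.txt might look like this (the directory names shown are hypothetical examples, not paths from any specific site):

```
# Rules below apply to all crawlers
User-agent: *

# Hypothetical example directories to keep out of search engines
Disallow: /private/
Disallow: /admin/

# Everything else may be crawled
Allow: /
```

Each `User-agent` line names a crawler (here `*` matches all of them), and the `Disallow` lines list the paths that crawler should not fetch. Note that well-behaved crawlers honor these rules voluntarily; robots.txt is not an access-control mechanism.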