
⟩ Please explain what robots.txt is.

Robots.txt is a way of telling search engine bots which pages on your website you do not want them to visit.

Robots.txt is useful for keeping crawlers away from parts of a site that the owner does not want crawled, such as admin pages, internal search results, or duplicate content.

If you want to block all search engine robots from crawling your website, add the following directives to your robots.txt file:
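A minimal robots.txt that blocks every crawler honoring the standard: `User-agent: *` matches all bots, and `Disallow: /` covers every path on the site.

```
User-agent: *
Disallow: /
```

Note that well-behaved crawlers obey these rules voluntarily; robots.txt is not an access control mechanism and will not stop bots that ignore it.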

If you want to block only Google from crawling your website, add the following directives to your robots.txt file:
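To target Google specifically, name its crawler in the `User-agent` line (`Googlebot` is the token Google's main web crawler identifies itself with); other bots are unaffected by this rule:

```
User-agent: Googlebot
Disallow: /
```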

The location of robots.txt matters: the file must be placed in the root directory of the domain (for example, https://www.example.com/robots.txt). Crawlers only look for it there, so a robots.txt placed in a subdirectory will simply be ignored.

