Robots.txt

The robots exclusion standard, also known as the robots exclusion protocol or simply robots.txt, is a standard used by websites to communicate with web crawlers and other web robots. The standard specifies how to inform the web robot about which areas of the website should not be processed or scanned. Robots are often used by search engines to categorize websites. Not all robots cooperate with the standard; email harvesters, spambots, malware, and robots that scan for security vulnerabilities may even start with the portions of the website where they have been told to stay out. The standard is different from, but can be used in conjunction with, Sitemaps, a robot inclusion standard for websites.
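
For illustration, a minimal robots.txt file placed at the root of a site might look like the sketch below. The paths and the example.com domain are placeholders, not part of any real site:

    # Apply the following rules to all crawlers
    User-agent: *
    # Ask crawlers not to scan this (hypothetical) directory
    Disallow: /private/
    # Everything else remains open to crawling
    Allow: /

    # Optional: point crawlers at the site's Sitemap (placeholder URL)
    Sitemap: https://example.com/sitemap.xml

Each record names the crawler it applies to (User-agent) followed by the paths it is asked to avoid (Disallow) or permitted to visit (Allow); because compliance is voluntary, these directives exclude only well-behaved robots.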
