For these files to work, they must be plain text and comply with the Robots Exclusion Standard. A robots.txt file can contain one or more groups, and each group can block or allow crawler access to specific file paths on a website.
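As an illustration, a minimal robots.txt with two groups might look like this (the path and bot name are hypothetical examples, not taken from any real site):

```
# Group 1: rules that apply only to Google's crawler
User-agent: Googlebot
Disallow: /private/

# Group 2: rules for every other crawler
User-agent: *
Disallow: /
```

Each group starts with one or more User-agent lines naming the crawlers it applies to, followed by the Disallow (or Allow) rules for those crawlers.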
5 Examples of robots.txt
- User-agent: * with Disallow: /: this combination blocks every crawler from the entire site. The asterisk in User-agent matches all robots, and Disallow: / covers every path, so compliant crawlers cannot access any content.
- User-agent: * with an empty Disallow: when the Disallow directive is left empty, nothing is blocked, so all robots may access any content on the site.
- Disallow: /html: this rule blocks crawlers from the /html path, so that directory and any file beneath it will not be crawled.
- User-agent: Googlebot: this directive addresses Google's crawler specifically. Combined with a second group that disallows all other robots, it lets only Googlebot crawl the site.
- robots.txt is not binding: the file is only advisory, and misbehaving crawlers often ignore it, so it should not be relied on to protect sensitive content.
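To see how these rules behave in practice, the examples above can be checked with Python's standard-library robots.txt parser. This is a sketch using hypothetical rules that allow only Googlebot and block everyone else; the URLs are placeholders:

```python
import urllib.robotparser

# Hypothetical robots.txt: only Googlebot may crawl the site,
# every other crawler is blocked from all paths.
rules = """
User-agent: Googlebot
Disallow:

User-agent: *
Disallow: /
""".splitlines()

parser = urllib.robotparser.RobotFileParser()
parser.parse(rules)

# Googlebot matches the first group, whose empty Disallow blocks nothing.
print(parser.can_fetch("Googlebot", "https://example.com/page.html"))
# Any other bot falls into the wildcard group and is blocked site-wide.
print(parser.can_fetch("SomeOtherBot", "https://example.com/page.html"))
```

Note that `can_fetch` only reports what a well-behaved crawler would do; as mentioned above, nothing forces a crawler to obey these rules.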