EXPLANATION OF ROBOTS.TXT

Besides the sitemap, robots.txt is an important part of a website's structure. This article discusses what robots.txt is: a file whose function is to restrict the access of search engine robots that crawl the website. Before they access web pages, crawlers check whether a robots.txt file exists and whether it contains rules that prevent them from accessing certain pages.
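
As a simple illustration, here is a minimal robots.txt sketch for a hypothetical site (the /admin/ and /private/ paths and the example.com domain are placeholders, not taken from any real configuration). The file sits at the root of the site, for example https://www.example.com/robots.txt:

    # Rules apply to every crawler ("*" matches all user agents)
    User-agent: *
    # Keep crawlers out of these two hypothetical directories
    Disallow: /admin/
    Disallow: /private/

    # Optionally, point crawlers to the sitemap as well
    Sitemap: https://www.example.com/sitemap.xml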

We need a robots.txt file so that robots do not index pages we do not want indexed; it can be risky if something confidential ends up indexed by search engines and is found by someone else. On the other hand, if we want all the content on our website to be indexed by search engines, we do not need a robots.txt file at all.
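
To see how a well-behaved crawler interprets these rules, here is a small sketch using Python's standard-library urllib.robotparser (the domain and paths below are hypothetical examples chosen only for illustration):

    from urllib.robotparser import RobotFileParser

    # Load and parse the site's robots.txt (hypothetical domain).
    rp = RobotFileParser("https://www.example.com/robots.txt")
    rp.read()

    # A polite crawler asks before fetching each URL.
    # Returns False if the path is disallowed for this user agent.
    print(rp.can_fetch("*", "https://www.example.com/private/secret.html"))
    # Returns True if the path is not blocked by any rule.
    print(rp.can_fetch("*", "https://www.example.com/blog/post.html"))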

Although Google and other search engines will not crawl or index the content of pages blocked by robots.txt, the URLs themselves may still end up in the index. This happens because those URLs can be found on other sites that link to them. As a result, a blocked URL can still become public information and may appear in search results.
