Robots.txt:-
The Robots Exclusion Standard is also known as the Robots Exclusion Protocol or the robots.txt protocol. You need a robots.txt file only if your site contains content that you do not want search engines to index, such as your site's admin panel. For example, if you do not want the admin panel of your website indexed, you add a Disallow rule for that path in robots.txt (for example, User-agent: * followed by Disallow: /admin/). If you do not want a particular web page or another section of your site indexed, you list its path in the same way.
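As a minimal sketch (the /admin/ path here is only an illustrative example, not taken from your actual site), a robots.txt that blocks every crawler from the admin area could look like this:

    # Block all crawlers from the admin area (example path)
    User-agent: *
    Disallow: /admin/

Everything not listed under a Disallow rule remains crawlable by default.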
If you want only Google not to crawl a section, you target Googlebot, because Googlebot is the name of Google's crawler; in the same way you can restrict any particular crawler by its user-agent name. After making robots.txt, keep the file in your root directory so it is reachable at a link like www.domainname.com/robots.txt. You can also control indexing with meta tags, for example <meta name="robots" content="nofollow"> and <meta name="googlebot" content="noindex">.
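As a sketch of a crawler-specific rule (again using a hypothetical /private/ path for illustration), blocking only Googlebot while leaving other crawlers unrestricted could look like this:

    # Applies only to Google's crawler
    User-agent: Googlebot
    Disallow: /private/

    # All other crawlers: no restrictions
    User-agent: *
    Disallow:

An empty Disallow line means that user-agent is allowed to crawl everything, so the second group leaves the rest of the crawlers untouched.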