Robots.txt Generator


Default - All Robots are:
Crawl-Delay:
Sitemap: (leave blank if you don't have one)
Search Robots:
    Google
    Google Image
    Google Mobile
    MSN Search
    Yahoo
    Yahoo MM
    Yahoo Blogs
    Ask/Teoma
    GigaBlast
    DMOZ Checker
    Nutch
    Alexa/Wayback
    Baidu
    Naver
    MSN PicSearch
Restricted Directories: each path is relative to the root and must end with a trailing slash "/" (see the sample output below).
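Once you pick these options, the tool produces plain robots.txt directives. For reference only, a generated file typically looks something like the sample below; the directory paths and sitemap URL are placeholders, not output from this tool:

    # Example only: replace the paths and sitemap URL with your own
    User-agent: *
    Disallow: /cgi-bin/
    Disallow: /tmp/
    Crawl-delay: 10
    Sitemap: https://www.example.com/sitemap.xml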



Now, create a 'robots.txt' file in the root directory of your site (so that it is reachable at a URL such as https://www.example.com/robots.txt), copy the text above, and paste it into that file.


A robots.txt file is extremely important from an SEO perspective. It holds the instructions that tell crawlers how to crawl your site, and it is the standard websites use to tell bots which parts of the site should be indexed. The file lets you specify which areas crawlers should not process, such as pages still under development or duplicate content.

The file is organized by user agent, and under each user agent you can add directives such as Crawl-delay, Allow, and Disallow. Writing all of this by hand takes a long time: a single rule can require several lines in the one file, and a single wrong line can cause serious problems. Our Robots.txt Generator is here to make the task easier for you.
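As a brief illustration (the restricted path here is only an example, not a recommendation), directives are grouped under the user agent they apply to, so different crawlers can be given different rules:

    # Example only: per-crawler rules
    User-agent: Googlebot
    Allow: /

    User-agent: Baiduspider
    Disallow: /drafts/
    Crawl-delay: 5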

