Robots.txt Generator

Generate a robots.txt file to upload to your website for free.



1) Select whether to allow bots to crawl your site. If your site is still in development, set this to disallow; otherwise, leave it set to allow.

2) Select a crawl delay. This is the minimum time a bot should wait between fetching one page and the next. A delay can reduce the load on your server, but it slows down how quickly new pages and changes to your site are indexed. We recommend keeping this at zero. Note that Crawl-delay is a non-standard directive: some crawlers honor it, but others, including Googlebot, ignore it.

3) Add the URL of your sitemap. This is optional, but it makes it easier for bots to find all of your pages.

4) In the next area you can set the access level for each of the major bots. If you are not sure, leave the defaults in place.

5) In the last area you can list subdirectories on your server that you do not want robots to crawl. Each path is relative to the root and must end with a trailing slash "/".
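Taken together, the steps above produce a file like the following. This is a hypothetical example: the sitemap URL, the directory names, and the choice of bots are placeholders, not output from any particular generator.

```
# Rules for all crawlers
User-agent: *
Crawl-delay: 10
Disallow: /admin/
Disallow: /tmp/

# Block one specific bot entirely
User-agent: Googlebot-Image
Disallow: /

Sitemap: https://www.example.com/sitemap.xml
```

Each `User-agent` line starts a group of rules, and a bot obeys the most specific group that matches its name; the `Sitemap` directive stands on its own and applies to all crawlers.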

Why use a Robots.txt generator?

A robots.txt generator is an easy-to-use tool for creating a robots.txt file for your website. It lets you configure access settings for a variety of web crawlers, including Googlebot and Bingbot. By specifying which directories and files these bots may access, you can help ensure that your website is indexed correctly and that only the content you want indexed appears in search results. A robots.txt generator also gives you finer control over how your website is crawled. Keep in mind, though, that robots.txt is advisory: well-behaved crawlers honor it, but it is not a security mechanism and will not keep private content hidden from a bot that chooses to ignore it.

What does a robots.txt file do?

A robots.txt file is a text file placed at the root of a website to tell web robots (typically search engine crawlers) which pages should not be visited or indexed. This is done for a variety of reasons: to keep certain pages out of search results, to prevent a server from being overloaded by too many requests, or to keep certain content from being indexed by search engines.
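A minimal sketch of how a well-behaved crawler interprets such a file, using Python's standard urllib.robotparser module. The rules and URLs below are hypothetical examples, not part of the generator itself.

```python
from urllib.robotparser import RobotFileParser

# A small hypothetical robots.txt: the wildcard group blocks /private/
# and asks for a 10-second delay, while Googlebot gets its own group
# with no restrictions.
rules = """
User-agent: *
Disallow: /private/
Crawl-delay: 10

User-agent: Googlebot
Disallow:
""".splitlines()

parser = RobotFileParser()
parser.parse(rules)

# Any bot may fetch the homepage, but /private/ is off-limits
# to bots covered by the wildcard group.
print(parser.can_fetch("SomeBot", "https://example.com/"))           # True
print(parser.can_fetch("SomeBot", "https://example.com/private/x"))  # False

# Googlebot matches its own group, whose empty Disallow allows everything.
print(parser.can_fetch("Googlebot", "https://example.com/private/x"))  # True

# The wildcard group's crawl delay applies to unnamed bots.
print(parser.crawl_delay("SomeBot"))  # 10
```

This also illustrates why the trailing slash matters in disallowed paths: the rule is a simple prefix match, so "Disallow: /private/" blocks everything under that directory.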