
Using the Robots.txt Tool in All in One SEO

Are you looking to customize the robots.txt on your site? This article will help.

The robots.txt module in All in One SEO lets you manage the robots.txt that WordPress creates.

This enables you to have greater control over the instructions you give web crawlers about your site.

Tutorial Video

Here’s a video on how to use the Robots.txt tool in All in One SEO:

About the Robots.txt in WordPress

First, it’s important to understand that WordPress generates a dynamic robots.txt for every WordPress site.

This default robots.txt contains the standard rules for any site running on WordPress.

Second, because WordPress generates a dynamic robots.txt, there is no static file to be found on your server. The content of the robots.txt is stored in your WordPress database and displayed when you visit /robots.txt in a web browser. This is perfectly normal and is much better than using a physical file on your server.

Lastly, All in One SEO doesn’t generate its own robots.txt; it simply provides you with an easy way to add custom rules to the default robots.txt that WordPress generates.

Using the Robots.txt Editor in All in One SEO

To get started, click on Tools in the All in One SEO menu.

Tools menu item in the All in One SEO menu

You should see the Robots.txt Editor and the first setting will be Enable Custom Robots.txt. Click the toggle to enable the custom robots.txt editor.

Click the Enable Custom Robots.txt toggle in the Robots.txt Editor

You should see the Custom Robots.txt Preview section at the bottom of the screen, which shows the default rules added by WordPress.

Robots.txt Preview section in the Robots.txt Editor

Default Robots.txt Rules in WordPress

The default rules that show in the Custom Robots.txt Preview section (shown in the screenshot above) ask robots not to crawl your core WordPress files. It’s unnecessary for search engines to access these files directly because they don’t contain any relevant site content.
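On a typical site, those default rules look something like this (the exact output can vary with your WordPress version and settings, and a Sitemap line may also appear):

  User-agent: *
  Disallow: /wp-admin/
  Allow: /wp-admin/admin-ajax.php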

If for some reason you want to remove the default rules that are added by WordPress then you’ll need to use the robots_txt filter hook in WordPress.
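For example, here is a minimal sketch of using that filter hook from a small plugin or your theme’s functions.php file. The replacement rules shown are placeholders for illustration only; whatever string the filter returns is what WordPress serves at /robots.txt.

  <?php
  // Minimal sketch: override the default rules in WordPress's dynamic robots.txt.
  // The 'robots_txt' filter passes the generated content and whether the site is public.
  add_filter( 'robots_txt', function ( $output, $public ) {
      // Placeholder rules for illustration only.
      $output = "User-agent: *\nDisallow: /wp-admin/\n";
      return $output;
  }, 10, 2 );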

Adding Rules Using the Rule Builder

The rule builder is used to add your own custom rules for specific paths on your site.

For example, if you would like to add a rule to block all robots from a temp directory, you can use the rule builder to add it.
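Assuming the directory is /temp/, a Block rule for all robots would end up in your robots.txt as a Disallow line like this:

  User-agent: *
  Disallow: /temp/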

Adding a rule in the robots.txt rule builder

To add a rule, click the Add Rule button and then complete the fields which are described below.

User Agent

First, enter the user agent in the User Agent field.

For example, if you want to specify Google’s crawler then enter “Googlebot” in the User Agent field.

If you want a rule that applies to all user agents then enter * in the User Agent field.

Directive

Next, select the rule type in the Directive drop-down. There are four rule types you can select from (sample output for each is shown after this list):

  • Allow will allow crawlers with the specified user agent access to the directory or file in the Value field.
  • Block will prevent crawlers with the specified user agent from accessing the directory or file in the Value field.
  • Clean-param lets you exclude pages with URL parameters that can serve the same content at a different URL. Yandex, the only search engine that currently supports this directive, has a good explanation with examples here.
  • Crawl-delay tells crawlers how frequently they can crawl your content. For example, a crawl delay of 10 tells crawlers not to crawl your content more often than once every 10 seconds. Currently this directive is only supported by Bing, Yahoo and Yandex. You can change the crawl rate of Google’s crawler in Google Search Console.
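As a rough sketch, here is how each rule type looks once it’s written into a robots.txt (the user agents and paths below are made up for illustration; a Block rule is output as a Disallow line):

  User-agent: Googlebot
  Allow: /wp-content/uploads/

  User-agent: Bingbot
  Disallow: /private/
  Crawl-delay: 10

  User-agent: Yandex
  Clean-param: ref /articles/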

Value

Next, enter the directory path or filename in the Value field.

You can enter a directory path such as /wp-content/backups/ or a file path such as /wp-content/backups/temp.png.

You can also use * as a wildcard, for example /wp-content/backup-*.
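Written as Block rules for all user agents, the example values above would appear in your robots.txt like this:

  User-agent: *
  Disallow: /wp-content/backups/
  Disallow: /wp-content/backups/temp.png
  Disallow: /wp-content/backup-*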

If you want to add more rules, then click the Add Rule button and repeat the steps above.

When you’re finished, click the Save Changes button.

Your rules will appear in the Custom Robots.txt Preview section and in your robots.txt which you can view by clicking the Open Robots.txt button.

Completed custom robots.txt

Editing Rules Using the Rule Builder

To edit any rule you’ve added, just change the details in the rule builder and click the Save Changes button.

Editing a custom robots.txt rule in the rule editor

Deleting a Rule in the Rule Builder

To delete a rule you’ve added, click the trash icon to the right of the rule.

Deleting a custom robots.txt rule in the rule editor

Changing the Order of Rules in the Rule Builder

You can easily change the order in which your custom rules appear in your robots.txt by dragging and dropping the entries in the rule builder.

Click and hold the drag and drop icon to the right of the rule and move the rule to where you want it to appear as seen below.

Changing the order of custom rules in the Robots.txt editor

Google has a good explanation here of why the order in which you place your rules is important.
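As a simple illustration (the paths here are made up), consider two overlapping rules:

  User-agent: *
  Allow: /wp-content/uploads/
  Disallow: /wp-content/

Google resolves overlaps like this using the most specific (longest) matching rule rather than the order of the lines, but some other crawlers apply the first rule that matches, so it’s safest to keep the more specific Allow rule ahead of the broader Disallow rule.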

Importing Your Own Robots.txt into All in One SEO

You can import your own robots.txt or rules from another source very easily.

First, click the Import button to open the Import Robots.txt window.

Import button shown in the rule builder in All in One SEO

In the Import Robots.txt window, you can either import from a URL by entering the URL of a robots.txt in the Import from URL field, or paste the contents of a robots.txt into the Paste Robots.txt text field.

Import Robots.txt window showing the Import from URL field and the Paste Robots.txt text

Once you’ve done this, click the Import button.

Using Advanced Rules in the Rule Builder

The Robots.txt Rule Builder also supports the use of advanced rules. This includes regex patterns as well as URL parameters.

Here are three examples of how advanced rules can be used:

  • /search$ – this uses regex to allow access to the exact path “/search”
  • /search/ – this blocks access to paths that start with “/search/” but are not an exact match
  • /?display=wide – this allows access to the homepage with the matching URL parameter
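In robots.txt form, those three advanced rules would look something like this, assuming they apply to all user agents:

  User-agent: *
  Allow: /search$
  Disallow: /search/
  Allow: /?display=wide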

Advanced rules such as these give you granular control over your site’s robots.txt file and over how user agents access your website.

Robots.txt Editor for WordPress Multisite

There is also a Robots.txt Editor for Multisite Networks. Details can be found in our documentation on the Robots.txt Editor for Multisite Networks here.

NOTE: While the robots.txt generated by All in One SEO is a dynamically generated page and not a static text file on your server, care should be taken when creating a large robots.txt for two reasons:

  1. A large robots.txt indicates a potentially complex set of rules, which could be hard to maintain.
  2. Google has proposed a maximum file size of 512KB to alleviate strain on servers from long connection times.
