As we continue our All in One SEO Pack series, we will now show you how to create a robots.txt file in WordPress with the All in One SEO Pack plugin. This lets you control which parts of your site search engines crawl, which can improve your site's performance and search ranking. First, we will walk you through activating the Robots.txt feature, then show you how to generate a robots.txt file. For more information on robots.txt files, see our full article How to Stop Search Engines from Crawling your Website.
Create robots.txt file
- Log into your WordPress Dashboard.
- Click All in One SEO, then Feature Manager, in the navigation menu.
- You will see Robots.txt listed. Click the Activate button.
- Click the Robots.txt link in the All in One SEO section menu.
- You will then see a section where you can Create a Robots.txt File. Choose your rules as needed. Below is a description of the options.
| Option | Description |
|---|---|
| Rule Type | Choose whether to Allow or Block a bot from crawling. |
| User Agent | Enter a User Agent to allow or block. Popular search engines you typically want to allow: Googlebot, Yahoo! Slurp, bingbot. Commonly blocked agents: AhrefsBot, Baiduspider, Ezooms, MJ12bot, YandexBot. |
| Directory Path | Enter the path to the directory you want to block or allow access to. |
| Add Rule | Click this to add your new rule to the robots.txt file. |
| Save Robots.txt File | Click this button to save any changes you have made to the robots.txt file. |
| Delete Robots.txt File | Click this to erase your existing robots.txt file. |
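For reference, a file built from a few of the rules above might look like the following. This is only an illustrative sketch; your actual rules and paths will depend on the choices you make in the plugin, and the `/wp-admin/` path here is just a common example.

```
# Block a commonly unwanted crawler from the whole site
User-agent: AhrefsBot
Disallow: /

# Allow all other bots, but keep them out of the admin area
User-agent: *
Disallow: /wp-admin/
```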
There is an Optimize button at the bottom of this page. According to All in One SEO Pack, you can "Click the Optimize button below and All in One SEO Pack will analyze your Robots.txt file to make sure it complies with the standards for Robots.txt files. The results will be displayed in a table below."
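Beyond the plugin's Optimize check, you can also test how crawlers would interpret your saved rules yourself. Here is a minimal sketch using Python's standard-library `urllib.robotparser`; the user agents and the `/wp-admin/` and `/blog/post` paths are just example values, not anything the plugin generates for you.

```python
from urllib.robotparser import RobotFileParser

# Example rules, mirroring the kind of file the plugin might produce
rules = """\
User-agent: AhrefsBot
Disallow: /

User-agent: *
Disallow: /wp-admin/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# AhrefsBot is blocked from the entire site
print(parser.can_fetch("AhrefsBot", "/blog/post"))      # False

# Other bots are blocked only from /wp-admin/
print(parser.can_fetch("Googlebot", "/wp-admin/page"))  # False
print(parser.can_fetch("Googlebot", "/blog/post"))      # True
```

This is handy for confirming that a new rule does what you expect before search engines pick up the change.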
Congratulations, now you know how to create a robots.txt file in WordPress with the All in One SEO Pack plugin!