The do_robots WordPress PHP action fires when the template loader determines that the current request is for the site's robots.txt file. Callbacks hooked to it can print additional directives into the generated output.
Usage
add_action('do_robots', 'custom_robots_txt');
function custom_robots_txt() {
    // your custom code here
}
Parameters
- None
More information
See WordPress Developer Resources: do_robots
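For context, WordPress core wires this hook up roughly as follows: the template loader fires the action for robots.txt requests, and core's built-in do_robots() function is registered as the default callback, which sends the Content-Type header and prints the default rules. The sketch below is a simplified illustration of that flow, not the exact core source:

```php
<?php
// Simplified sketch of how WordPress core uses this hook (not verbatim core code).

// In wp-includes/default-filters.php, the default handler is registered:
add_action( 'do_robots', 'do_robots' );

// In the template loader, the action fires for robots.txt requests:
if ( is_robots() ) {
    do_action( 'do_robots' );
}

// The default do_robots() handler sends the text/plain header and echoes
// the default rules (which also pass through the 'robots_txt' filter),
// so output from your own callbacks is combined with those defaults.
```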
Examples
Add custom rules to robots.txt
To append custom rules to the generated robots.txt output, echo them from your callback:
add_action('do_robots', 'add_custom_robots_txt_rules');
function add_custom_robots_txt_rules() {
    // Printed in addition to the default rules WordPress outputs on this action.
    echo "Disallow: /private-area/\n";
}
Allow only specific user agents
To ask all crawlers except specific user agents (here, Googlebot) not to crawl your site, use this code:
add_action('do_robots', 'allow_specific_user_agents');
function allow_specific_user_agents() {
    echo "User-agent: Googlebot\n";
    echo "Allow: /\n";
    echo "\n";
    echo "User-agent: *\n";
    echo "Disallow: /\n";
}
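Because WordPress's built-in do_robots() handler prints its default rules on this same action, an allow-list like the one above is mixed with that default output. If you need full control over the file, one approach, sketched here as an assumption rather than taken from this page (the callback name output_full_robots_txt is ours), is to unhook the default handler and emit the entire file yourself:

```php
<?php
// Unhook WordPress's default robots.txt handler so only our output remains.
remove_action( 'do_robots', 'do_robots' );

add_action( 'do_robots', 'output_full_robots_txt' );
function output_full_robots_txt() {
    // The default handler normally sends this header; since we removed it,
    // we send the header ourselves before printing the rules.
    header( 'Content-Type: text/plain; charset=utf-8' );
    echo "User-agent: Googlebot\n";
    echo "Allow: /\n\n";
    echo "User-agent: *\n";
    echo "Disallow: /\n";
}
```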
Block specific user agents
To ask specific user agents (here, a hypothetical BadBot) not to crawl your site, use this code:
add_action('do_robots', 'block_specific_user_agents');
function block_specific_user_agents() {
    echo "User-agent: BadBot\n";
    echo "Disallow: /\n";
}
Disallow access to specific directories
To ask all crawlers to skip specific directories, use this code:
add_action('do_robots', 'disallow_specific_directories');
function disallow_specific_directories() {
    echo "Disallow: /private/\n";
    echo "Disallow: /confidential/\n";
}
Disallow access to specific files
To ask crawlers not to fetch specific files, use this code:
add_action('do_robots', 'disallow_specific_files');
function disallow_specific_files() {
    echo "Disallow: /private-file.pdf\n";
}
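As an alternative to echoing extra lines on this action, WordPress also provides the robots_txt filter, which receives the generated output plus the "Search engine visibility" (blog_public) setting; appending rules there keeps them inside the default output rather than after it. A hedged sketch, with a callback name of our own choosing:

```php
<?php
// Append a rule via the 'robots_txt' filter instead of the do_robots action.
// $output is the robots.txt text WordPress has built so far; $public reflects
// the blog_public option.
add_filter( 'robots_txt', 'append_private_area_rule', 10, 2 );
function append_private_area_rule( $output, $public ) {
    $output .= "Disallow: /private-area/\n";
    return $output;
}
```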