robots.txt

Overview #

WordPress.com uses the core robots.txt file and adds a number of default entries, such as the sitemap, to optimize your site. To modify the file, you can hook into the do_robotstxt action, or filter the generated output by hooking into the robots_txt filter.
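For reference, a minimal sketch of the filter approach. The robots_txt filter passes the generated output and the site's "public" setting; the function name and directory path below are placeholders:

function my_filter_robots_txt( $output, $public ) {
	// Append an extra rule to the generated robots.txt output.
	$output .= 'Disallow: /path/to/your/directory/' . PHP_EOL;
	return $output;
}
add_filter( 'robots_txt', 'my_filter_robots_txt', 10, 2 );

Returning the modified string (rather than echoing) is what distinguishes this from the do_robotstxt action shown below.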

Example: Disallow a directory #

function my_disallow_directory() {
	echo "User-agent: *" . PHP_EOL;
	echo "Disallow: /path/to/your/directory/" . PHP_EOL;
}
add_action( 'do_robotstxt', 'my_disallow_directory' );
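With this hooked in, the generated robots.txt would contain an entry like the following (alongside the default entries WordPress.com adds):

User-agent: *
Disallow: /path/to/your/directory/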

Caching #

Note that we cache robots.txt for long periods of time, so after any change you'll need to force the cache to clear. You can do this by going to Settings > Reading in your Dashboard and toggling the privacy settings.