Overview

The platform uses the core robots.txt file and adds a number of default entries, such as the sitemap, to optimize your site. To modify the file, you can hook into the do_robotstxt action, or filter the output by hooking into the robots_txt filter.
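For example, extra rules can be appended through the robots_txt filter. This is a minimal sketch: the function name and the /private/ path are hypothetical, and the function_exists() guard is only there so the snippet can run outside of WordPress.

```php
<?php
// Hypothetical example: append a rule via the robots_txt filter.
// $output is the generated robots.txt body; $public reflects the
// site's search engine visibility setting.
function my_filter_robots( $output, $public ) {
	$output .= "Disallow: /private/" . PHP_EOL; // hypothetical path
	return $output;
}

// Register the filter when running inside WordPress.
if ( function_exists( 'add_filter' ) ) {
	add_filter( 'robots_txt', 'my_filter_robots', 10, 2 );
}
```

The filter receives the complete generated output, so it is the right hook when you need to change or remove default entries rather than only append new ones.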


Example: Disallow a directory

function my_disallow_directory() {
	echo "User-agent: *" . PHP_EOL;
	echo "Disallow: /path/to/your/directory/" . PHP_EOL;
}
add_action( 'do_robotstxt', 'my_disallow_directory' );
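With this action in place, the generated robots.txt will include the lines the function prints:

```
User-agent: *
Disallow: /path/to/your/directory/
```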


Caching

Note that the robots.txt output is cached for long periods of time. After making any changes, you'll need to force the cache to clear by going to Settings > Reading in your Dashboard and toggling the privacy settings.


On domains

On any subdomain of the platform, the robots.txt output is hard-coded to return a "Disallow for all user agents" result. This prevents search engines from indexing content hosted on development/staging sites.
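A "Disallow for all user agents" result corresponds to the standard robots.txt block that blocks every crawler from the entire site:

```
User-agent: *
Disallow: /
```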
