
Best WordPress robots.txt template (example)

A well-optimized robots.txt file helps search engines understand which parts of your WordPress site to crawl—and which to skip. Get it wrong, and you could block important pages or open up private areas to indexing. 

Here’s the best robots.txt template for WordPress and everything you need to know to customize it the right way.


The best robots.txt template for WordPress

A solid robots.txt file tells search engines what to crawl and what to ignore. The goal is to improve your site’s SEO without accidentally blocking key content.

Recommended code example
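The snippet itself is missing from this copy of the article. The template most widely recommended for WordPress, and consistent with the description that follows, looks like this (yourdomain.com is a placeholder, and the sitemap filename varies by SEO plugin):

```
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php

Sitemap: https://yourdomain.com/sitemap_index.xml
```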

What each line means

- User-agent: * applies the rules that follow to all crawlers.
- Disallow: /wp-admin/ keeps bots out of the admin area, which has no value in search results.
- Allow: /wp-admin/admin-ajax.php re-opens the one admin file that many themes and plugins call from the front end, so blocking it can break how Google renders your pages.
- Sitemap: points crawlers to your XML sitemap so they can discover your content quickly.

This basic format strikes the right balance for most WordPress sites: it hides unnecessary files while making it easy for search engines to index your content.

Where to place your robots.txt file

Your robots.txt file must live in the root directory of your domain, where search engines expect to find it. For example:

https://yourdomain.com/robots.txt

Don’t place it in subdirectories or plugin folders. If search engines can’t find it in the root, they’ll assume it doesn’t exist and crawl your site freely.

How to create or edit your robots.txt file

There are two main ways to manage your robots.txt file in WordPress: manually or through a plugin.

Manually via FTP or file manager

This method gives you full control, but also comes with more risk if you make a mistake. The process is straightforward:

1. Connect to your server with an FTP/SFTP client or your host's file manager.
2. Open your site's root directory (often public_html, or the folder containing wp-config.php).
3. Create or edit robots.txt there as a plain-text file.
4. Save the file and confirm it loads at https://yourdomain.com/robots.txt.

Using SEO plugins (Yoast, Rank Math, AIOSEO)

Most top SEO plugins include a robots.txt editor. For example (exact menu locations may vary by plugin version):

- Yoast SEO: Tools → File editor
- Rank Math: General Settings → Edit robots.txt
- All in One SEO (AIOSEO): Tools → Robots.txt Editor

These tools allow you to modify the file safely from your WordPress dashboard without needing FTP access.

When to block specific URLs or content types

The basic template works for most WordPress installs, but sometimes you’ll want to block search engines from crawling additional parts of your site.

Examples of common disallow rules
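The original examples are missing from this copy. Rules like the following often appear in WordPress robots.txt files; treat the paths as illustrations rather than a checklist, since whether each one makes sense depends on your site:

```
User-agent: *
Disallow: /wp-admin/        # admin area (re-allow admin-ajax.php if your theme needs it)
Disallow: /wp-login.php     # login page
Disallow: /?s=              # internal search result pages
Disallow: /search/          # search results under pretty permalinks
Disallow: /trackback/       # legacy trackback URLs
```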

You can add as many Disallow: lines as needed, but be sure you’re not blocking valuable content.

When not to disallow

Some site owners mistakenly block folders like /wp-content/uploads/, thinking it improves SEO. In reality, this stops your images from being indexed, which can hurt your visibility in Google Images and reduce overall traffic.

How to test and validate your robots.txt file

After editing your robots.txt file, it’s important to make sure it works as intended.

Google Search Console robots.txt tester

In Google Search Console, the robots.txt report (under Settings) shows when Google last fetched your file, which version it is using, and any rules it couldn’t parse. Check it after every edit.

Manual checks
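Beyond loading https://yourdomain.com/robots.txt in a browser and reading it line by line, you can sanity-check your rules locally. Here’s a small sketch using Python’s built-in robots.txt parser; no network access is needed, and yourdomain.com is just a placeholder:

```python
# Sanity-check robots.txt rules locally with Python's built-in parser.
# Note: Python applies rules in file order (first match wins), while Google
# uses longest-match precedence, so the Allow line is listed first here.
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: *
Allow: /wp-admin/admin-ajax.php
Disallow: /wp-admin/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# The admin area is blocked, but admin-ajax.php and normal content are not.
print(parser.can_fetch("*", "https://yourdomain.com/wp-admin/"))               # False
print(parser.can_fetch("*", "https://yourdomain.com/wp-admin/admin-ajax.php")) # True
print(parser.can_fetch("*", "https://yourdomain.com/blog/my-post/"))           # True
```

A quick script like this catches the worst mistakes (such as blocking your posts) before any crawler sees the file.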

SEO implications of robots.txt settings

Small mistakes in your robots.txt file can have big SEO consequences. It’s important to understand how these settings affect crawling and indexing.

Crawl budget efficiency

Blocking low-value pages like /wp-admin/ or certain archives can free up crawl budget for more important content. This is especially useful for large sites.

Avoiding accidental SEO damage

Many site owners have unknowingly blocked their entire site with a single line:

Disallow: /

This tells search engines to ignore every URL, including your homepage. Always double-check your rules—especially before a site launch or redesign.
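Python’s built-in robots.txt parser makes the effect easy to see (a quick sketch; yourdomain.com is a placeholder):

```python
# Demonstrates why "Disallow: /" is so dangerous: the rule matches every URL.
from urllib.robotparser import RobotFileParser

parser = RobotFileParser()
parser.parse(["User-agent: *", "Disallow: /"])

print(parser.can_fetch("*", "https://yourdomain.com/"))           # False
print(parser.can_fetch("*", "https://yourdomain.com/any/page/"))  # False
```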

Robots.txt vs noindex meta tags

While both tools influence search engine behavior, they work differently:

- robots.txt controls crawling: it asks bots not to fetch certain URLs at all. A blocked URL can still appear in search results (without a description) if other pages link to it.
- A noindex meta tag (or X-Robots-Tag header) controls indexing: crawlers must be able to fetch the page to see the tag, which tells them not to show it in results.

Use robots.txt for sections crawlers shouldn’t waste time fetching (like login pages), and noindex for content you want crawlers to see but not index (like thin tag pages or duplicate thank-you pages). Don’t combine them on the same URL: if robots.txt blocks the page, crawlers never see the noindex tag.

Advanced rules for different user agents (optional)

If you want more control, you can write rules for specific bots.

Example:
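The example block is missing from this copy; a per-bot rule looks like this (AhrefsBot is the user agent Ahrefs’ crawler identifies itself with):

```
User-agent: AhrefsBot
Disallow: /
```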

This blocks only AhrefsBot while allowing other crawlers. Use this approach to:

- Reduce server load from aggressive or low-value bots
- Keep third-party SEO tools from crawling your content and link data
- Apply stricter rules to specific crawlers without touching the defaults for everyone else

Just don’t block Googlebot unless you’re absolutely sure what you’re doing.

Bonus: How to prevent image indexing (and why you might not want to)

Some site owners want to block Google from indexing their images, often for privacy or branding reasons.

Blocking image folders in robots.txt

To prevent search engines from indexing your media uploads, you can add:

Disallow: /wp-content/uploads/

This blocks access to all image, video, and PDF uploads.

When image indexing helps SEO

For most WordPress sites, image indexing is a good thing. Optimized images with descriptive file names and alt text can drive traffic from Google Images. Blocking uploads reduces visibility—and can prevent featured images from showing up in SERPs.

Unless you have a specific reason, it’s better to leave /uploads/ open.

Additional resources

Comprehensive guide to securing WordPress with ModSecurity →

This guide explains how to use ModSecurity to enhance the security of your WordPress site.

WordPress SEO services guide: 7 best, how to choose, and more →

Explore essential WordPress SEO services to improve site visibility, rankings, and organic traffic.

Why security matters for WordPress enterprise hosting →

Use the blog as your guide to attacks to watch out for, security best practices, and steps to improve the WordPress protection you already have.

Originally from Albuquerque, New Mexico, Leslie Bowman has hopped around the country since graduating from undergrad. She focused on English and Latin American History in college and eventually attended graduate school in New York City. There, she developed a passion for short, persuasive arguments. Bowman found a love of copywriting and is now a Senior Copywriter at Liquid Web.