Best WordPress robots.txt template (example)
A well-optimized robots.txt file helps search engines understand which parts of your WordPress site to crawl—and which to skip. Get it wrong, and you could block important pages or open up private areas to indexing.
Here’s the best robots.txt template for WordPress and everything you need to know to customize it the right way.
The best robots.txt template for WordPress
A solid robots.txt file tells search engines what to crawl and what to ignore. The goal is to improve your site’s SEO without accidentally blocking key content.
Recommended code example
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
Sitemap: https://yourdomain.com/wp-sitemap.xml

What each line means
- User-agent: * — This applies the rules to all bots and crawlers.
- Disallow: /wp-admin/ — Blocks access to the admin backend, which has no SEO value.
- Disallow: /wp-includes/ — Blocks WordPress core files that should never be crawled.
- Sitemap: — Helps crawlers discover your site’s content more efficiently. Replace the example URL with your actual sitemap location; WordPress core generates /wp-sitemap.xml by default, while SEO plugins often use their own URL (Yoast SEO uses /sitemap_index.xml, for example).
This basic format strikes the right balance for most WordPress sites: it hides unnecessary files while making it easy for search engines to index your content.
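One common refinement: some themes and plugins load front-end features through admin-ajax.php, which sits inside /wp-admin/. If that applies to your site, you can re-allow just that one file while keeping the rest of the admin area blocked (Google and most major crawlers honor the Allow directive):

User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
Disallow: /wp-includes/
Sitemap: https://yourdomain.com/wp-sitemap.xml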
Where to place your robots.txt file
Your robots.txt file must live in the root directory of your domain, where search engines expect to find it. For example:
https://yourdomain.com/robots.txt
Don’t place it in subdirectories or plugin folders. If search engines can’t find it in the root, they’ll assume it doesn’t exist and crawl your site freely.
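In your host’s file manager, the root usually corresponds to a folder such as public_html or www; the exact name varies by provider, so treat this layout as an illustration only:

public_html/           (your web root; the folder name varies by host)
  wp-admin/
  wp-content/
  wp-includes/
  wp-config.php
  robots.txt           (the file goes here)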
How to create or edit your robots.txt file
There are two main ways to manage your robots.txt file in WordPress: manually or through a plugin.
Manually via FTP or file manager
- Open a plain text editor (like Notepad).
- Add your robots.txt rules.
- Save the file as robots.txt.
- Upload it to the root directory of your site using FTP or your hosting control panel.
This method gives you full control, but also comes with more risk if you make a mistake.
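If you prefer to script the upload, here’s a minimal sketch using Python’s built-in ftplib. The hostname, credentials, and web-root folder are placeholders, and many hosts require SFTP instead, in which case an SFTP client or your control panel’s file manager is the simpler route.

# Minimal sketch: upload robots.txt over plain FTP (placeholder credentials).
from ftplib import FTP

with FTP("ftp.yourdomain.com") as ftp:               # placeholder hostname
    ftp.login(user="your-username", passwd="your-password")
    ftp.cwd("public_html")                           # your web root; name varies by host
    with open("robots.txt", "rb") as local_file:
        ftp.storbinary("STOR robots.txt", local_file)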
Using SEO plugins (Yoast, Rank Math, AIOSEO)
Most top SEO plugins include a robots.txt editor. For example:
- Yoast SEO: Go to SEO > Tools > File Editor.
- Rank Math: Head to General Settings > Edit robots.txt.
- AIOSEO: Go to Tools > Robots.txt Editor.
These tools allow you to modify the file safely from your WordPress dashboard without needing FTP access.
When to block specific URLs or content types
The basic template works for most WordPress installs, but sometimes you’ll want to keep crawlers out of additional parts of your site.
Examples of common disallow rules
- Disallow: /wp-content/plugins/ — Blocks crawlers from accessing plugin directories.
- Disallow: /category/uncategorized/ — Prevents indexing of unwanted category archives.
- Disallow: /thank-you/ — Hides low-value or duplicate thank-you pages from search engines.
You can add as many Disallow: lines as needed, but be sure you’re not blocking valuable content.
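Combining these optional rules with the base template might look like this; adjust the paths to match your own site:

User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: /wp-content/plugins/
Disallow: /category/uncategorized/
Disallow: /thank-you/
Sitemap: https://yourdomain.com/wp-sitemap.xml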
When not to disallow
Some site owners mistakenly block folders like /wp-content/uploads/, thinking it improves SEO. In reality, this stops your images from being indexed, which can hurt your visibility in Google Images and reduce overall traffic.
How to test and validate your robots.txt file
After editing your robots.txt file, it’s important to make sure it works as intended.
Google Search Console robots.txt tester
- Log in to Google Search Console.
- Go to your property and click on robots.txt Tester (under “Legacy tools”).
- Paste your robots.txt content or load your live file.
- Test specific URLs to confirm whether they’re allowed or blocked.
Manual checks
- Open https://yourdomain.com/robots.txt in your browser to view the live file.
- Use site:yourdomain.com searches in Google to check whether blocked URLs are still indexed.
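For a quick scripted check, Python’s standard library includes a robots.txt parser. This sketch fetches your live file and reports whether specific URLs are crawlable; yourdomain.com and the sample paths are placeholders.

# Spot-check the live robots.txt with Python's built-in parser.
from urllib.robotparser import RobotFileParser

parser = RobotFileParser()
parser.set_url("https://yourdomain.com/robots.txt")   # placeholder domain
parser.read()                                          # fetch and parse the live file

# can_fetch() returns True if the user agent may crawl the URL, False if it is blocked.
print(parser.can_fetch("*", "https://yourdomain.com/wp-admin/"))      # expect False
print(parser.can_fetch("*", "https://yourdomain.com/sample-page/"))   # expect True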
SEO implications of robots.txt settings
Small mistakes in your robots.txt file can have big SEO consequences. It’s important to understand how these settings affect crawling and indexing.
Crawl budget efficiency
Blocking low-value pages like /wp-admin/ or certain archives can free up crawl budget for more important content. This is especially useful for large sites.
Avoiding accidental SEO damage
Many site owners have unknowingly blocked their entire site with a single line:
Disallow: /
This tells search engines to ignore every URL, including your homepage. Always double-check your rules—especially before a site launch or redesign.
Robots.txt vs noindex meta tags
While both tools influence search engine behavior, they work differently:
- robots.txt blocks crawlers from accessing a page at all.
- noindex allows crawling, but tells search engines not to include the page in search results.
Use robots.txt for sensitive or non-public sections of your site (like login pages), and noindex for content you want crawlers to see but not index (like thin tag pages or duplicate thank-you pages).
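In practice, noindex is applied in the page’s HTML head rather than in robots.txt; most SEO plugins add the tag for you when you mark a page as noindex, and it looks like this:

<meta name="robots" content="noindex, follow">

Keep in mind that crawlers can only see this tag if the page isn’t blocked in robots.txt, so don’t disallow a URL you’re trying to noindex.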
Advanced rules for different user agents (optional)
If you want more control, you can write rules for specific bots.
Example:
User-agent: AhrefsBot
Disallow: /

User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/

This blocks only AhrefsBot while allowing other crawlers. Use this approach to:
- Reduce server load from aggressive bots
- Block scrapers or low-value crawlers
Just don’t block Googlebot unless you’re absolutely sure what you’re doing.
Bonus: How to prevent image indexing (and why you might not want to)
Some site owners want to block Google from indexing their images, often for privacy or branding reasons.
Blocking image folders in robots.txt
To prevent search engines from indexing your media uploads, you can add:
Disallow: /wp-content/uploads/
This blocks access to all image, video, and PDF uploads.
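A narrower option, if your main concern is Google Images specifically, is to target Google’s image crawler while leaving everything else crawlable:

User-agent: Googlebot-Image
Disallow: /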
When image indexing helps SEO
For most WordPress sites, image indexing is a good thing. Optimized images with descriptive file names and alt text can drive traffic from Google Images. Blocking uploads reduces visibility—and can prevent featured images from showing up in SERPs.
Unless you have a specific reason, it’s better to leave /uploads/ open.
Next steps for optimizing your WordPress robots.txt
Your robots.txt file can guide search engines toward your most valuable content—or accidentally lock them out. With a strong template and some strategic tweaks, you’re setting your site up for better SEO performance.
Start by testing your current file, reviewing your index coverage in Google Search Console, and optimizing your sitemap strategy.
Ready to upgrade your WordPress experience? Professional hosting improves speed, security, and reliability, giving you a website and a brand that people find engaging and trustworthy.
Don’t want to deal with server management and maintenance either? Our fully managed hosting for WordPress is the best in the industry. Our team members are not only server IT experts but WordPress hosting experts as well. Your server couldn’t be in better hands.
Click below to explore options or start a chat with one of our WordPress hosting experts now to get answers to your questions and further guidance.
Additional resources
Comprehensive guide to securing WordPress with ModSecurity →
This guide provides a comprehensive overview of how to use ModSecurity to enhance the security of your WordPress site.
WordPress SEO services guide: 7 best, how to choose, and more →
Explore essential WordPress SEO services to improve site visibility, rankings, and organic traffic.
Why security matters for WordPress enterprise hosting →
Use this blog post as your guide to the attacks to watch out for, security best practices, and steps to improve the WordPress protection you already have.
Originally from Albuquerque, New Mexico, Leslie Bowman has hopped around the country since graduating from undergrad. She focused on English and Latin American History in college and eventually attended graduate school in New York City. There, she developed a passion for short, persuasive arguments. Bowman found a love of copywriting and is now a Senior Copywriter at Liquid Web.