WordPress robots.txt — a complete beginner’s guide
If you want better SEO and tighter control over how search engines crawl your WordPress site, robots.txt is your first stop. Even though it’s just a small text file, it plays a big role in how search engines see (and ignore) parts of your site.
Let’s walk through what it does, how to edit it, and the best practices to follow—whether you’re running a blog, a business site, or something in between.
What is robots.txt in WordPress?
The robots.txt file is a plain text file that gives search engine bots instructions on what parts of your site they’re allowed to crawl. These bots, such as Googlebot, crawl your website so its pages can be indexed and shown in search results.
In WordPress, you don’t automatically get a physical robots.txt file when you install the software. Instead, WordPress generates a virtual version of it behind the scenes. You can see it by visiting:
yourdomain.com/robots.txt
This virtual file lives in memory, not in your file system. That means you can view it, but you can’t directly edit it unless you create a physical version yourself or use a plugin that replaces it.
Why robots.txt matters for SEO and site performance
The robots.txt file is important because it tells bots where to go and (just as importantly) where not to go.
Here’s why that matters:
- Avoid indexing junk pages. You can prevent bots from indexing login pages, admin sections, and plugin files that have no SEO value.
- Control crawl budget. Google doesn’t crawl every page on every visit. By focusing bots on the right pages, you help them find and index the most important content.
- Protect private areas. While it’s not a security tool, robots.txt can discourage bots from visiting sensitive areas like staging sites or gated content.
- Improve speed and efficiency. Bots that aren’t wasting time crawling unnecessary files can index your site faster and more accurately.
Default WordPress robots.txt file and its limitations
Out of the box, WordPress includes a virtual robots.txt file with very basic rules. A typical version looks like this:
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php

Here’s what it means:
- User-agent: * targets all bots.
- Disallow: /wp-admin/ tells bots not to crawl the admin dashboard.
- Allow: /wp-admin/admin-ajax.php is an exception that allows access to a file WordPress uses for features like live updates and contact forms.
This setup is fine for basic sites, but it doesn’t give you control over things like plugin folders, media files, or SEO-specific settings. That’s why most site owners eventually choose to customize their own.
How to view and test your current robots.txt file
Before editing anything, it helps to see what you’re starting with.
- Visit your current robots.txt file. Open your browser and go to https://yourdomain.com/robots.txt. If you’ve never created one, this will show the default virtual version.
- Test in Google Search Console. If your site is verified in Search Console, use the URL inspection tool to see how Google views your robots.txt rules.
- Use SEO tools or plugins. Some SEO plugins like Yoast and Rank Math let you preview and test your file from the dashboard.
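You can also test rules locally before publishing them. Here’s a minimal sketch using Python’s standard-library robots.txt parser on rules like the WordPress defaults (example.com is a placeholder). Note that Python’s parser applies rules top-down and stops at the first match, unlike Google’s most-specific-match behavior, so the Allow exception is listed before the broader Disallow here:

```python
from urllib import robotparser

# Rules similar to the WordPress defaults. In practice you would call
# rp.set_url("https://yourdomain.com/robots.txt") and rp.read() instead
# of parsing a local string.
rules = """\
User-agent: *
Allow: /wp-admin/admin-ajax.php
Disallow: /wp-admin/
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# The admin area is blocked, the AJAX endpoint exception works,
# and unlisted paths are crawlable by default.
print(rp.can_fetch("*", "https://example.com/wp-admin/"))                # False
print(rp.can_fetch("*", "https://example.com/wp-admin/admin-ajax.php"))  # True
print(rp.can_fetch("*", "https://example.com/blog/hello-world/"))        # True
```

This is a convenient sanity check before you upload a new file, but always confirm the live file in Search Console as well, since Google interprets some directives differently.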
How to create or edit a robots.txt file in WordPress
You can edit robots.txt using a plugin, through cPanel/File Manager, or with FTP. Let’s go over each method.
Option 1: Using an SEO plugin like Yoast or Rank Math
Plugins make editing robots.txt beginner-friendly and reduce the risk of file errors.
With Yoast SEO:
- From your WordPress dashboard, go to SEO > Tools.
- Click File editor.
- If a robots.txt file doesn’t exist, Yoast will create one for you.
- Make your edits directly in the box, then click Save changes to robots.txt.
With Rank Math:
- Go to Rank Math > General Settings > Edit robots.txt.
- Make your changes and click Save changes.
Plugins will automatically create a physical robots.txt file in your website’s root directory that overrides the virtual one.
Option 2: Manually via FTP or File Manager
If you’re comfortable with file access, you can create the file directly.
- Open your site using FTP (via FileZilla) or your hosting control panel’s File Manager.
- Navigate to the root directory of your WordPress site (usually called public_html).
- Create a new file named robots.txt.
- Edit it using a plain text editor (like Notepad) or the built-in editor in File Manager.
- Save and upload it.
Make sure the file is readable by search engine bots (permission 644 is standard).
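If you prefer to script this step (for example, before uploading over SFTP), here’s a minimal sketch using Python’s standard library. It writes the file to the current directory (your real target is the site root, such as public_html) and sets the standard 644 permission; the rules shown are just the WordPress defaults:

```python
import os
import stat

# Minimal robots.txt content; adjust the rules for your site.
rules = (
    "User-agent: *\n"
    "Disallow: /wp-admin/\n"
    "Allow: /wp-admin/admin-ajax.php\n"
)

with open("robots.txt", "w") as f:
    f.write(rules)

# 644 = owner read/write, group and others read-only,
# which is what search engine bots (via your web server) need.
os.chmod("robots.txt", 0o644)

mode = stat.S_IMODE(os.stat("robots.txt").st_mode)
print(oct(mode))  # 0o644
```

Whichever method you use, the result should be a world-readable robots.txt in the root directory, reachable at https://yourdomain.com/robots.txt.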
Option 3: Using cPanel File Manager
- Log in to your hosting account and open cPanel.
- Click File Manager, then open the public_html folder.
- Click + File to create a new file, and name it robots.txt.
- Right-click it, then choose Edit to open the text editor.
- Add your custom rules and save the file.
Best practices for WordPress robots.txt configuration
Follow these tips to make sure your robots.txt file helps, rather than hurts, your SEO.
- Don’t block CSS or JavaScript files. Google needs these files to render your pages correctly. Blocking them can lead to poor indexing and layout issues.
- Keep important URLs crawlable. Pages like your homepage, blog posts, or shop pages should not be disallowed.
- Disallow sensitive areas. Block /wp-admin/ and /wp-login.php unless you have a reason to let bots crawl them.
- Add your sitemap. Always include a line that points to your sitemap, for example: Sitemap: https://yourdomain.com/sitemap.xml
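Putting these practices together, a reasonable starting point might look like the following (the domain is a placeholder, and you should adjust the rules to your own site’s structure):

```
User-agent: *
Allow: /wp-admin/admin-ajax.php
Disallow: /wp-admin/
Disallow: /wp-login.php

Sitemap: https://yourdomain.com/sitemap.xml
```

Notice that CSS, JavaScript, and media folders are left crawlable, and the only blocked paths are ones with no SEO value.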
How to submit your robots.txt file to Google
If you’ve made big changes to your robots.txt file, it’s a good idea to tell Google.
- Log in to Google Search Console.
- Use the URL Inspection tool to test any affected pages.
- In Settings, under Crawl stats, you can see if Google is hitting your new robots.txt file.
- You don’t need to submit the file itself—but submitting your sitemap afterward is a good idea.
Troubleshooting common robots.txt issues
Watch out for these beginner mistakes:
- Blocked CSS or JS files. If your site looks weird in Google’s preview, make sure you’re not blocking folders like /wp-includes/ or /wp-content/.
- Pages marked ‘Indexed, though blocked by robots.txt.’ That means Google found the page through a link but can’t crawl it. Consider removing the block or using a noindex meta tag instead.
- Conflicts with plugins. If you’re using multiple SEO or performance plugins, they might try to edit robots.txt at the same time. Always test after making changes.
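For the “Indexed, though blocked by robots.txt” case, the usual fix is the opposite of what beginners expect: let Google crawl the page, and use a noindex directive to keep it out of search results. For example, a meta tag in the page’s head (SEO plugins like Yoast and Rank Math can add this per page), or the equivalent X-Robots-Tag HTTP header:

```
<meta name="robots" content="noindex, follow">
```

Remember that a noindex directive only works if the page is crawlable; if robots.txt blocks the URL, Google never sees the tag.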
Getting started with WordPress robots.txt
The robots.txt file may seem small, but it plays a big role in your site’s crawlability and SEO health. Whether you’re running a blog or an online store, the right configuration can help bots find what matters — and ignore what doesn’t.
Start by checking your current setup, then use one of the simple editing methods to fine-tune your robots.txt file for better control and performance.
Ready to upgrade your WordPress experience? Professional hosting improves speed, security, and reliability, helping you build a website and a brand that people find engaging and trustworthy.
Don’t want to deal with server management and maintenance either? Our fully managed hosting for WordPress is the best in the industry. Our team members are not only server IT experts but WordPress hosting experts as well. Your server couldn’t be in better hands.
Click below to explore options or start a chat with one of our WordPress hosting experts now to get answers to your questions and further guidance.
Additional resources
Comprehensive guide to securing WordPress with ModSecurity →
This guide provides a comprehensive overview of how to use ModSecurity to enhance the security of your WordPress site.
Optimizing Largest Contentful Paint (LCP) in WordPress →
Understand how Largest Contentful Paint (LCP) impacts your WordPress site’s performance and how to optimize it for better user experience and SEO.
Why security matters for WordPress enterprise hosting →
Use the blog as your guide to attacks to watch out for, security best practices, and steps to improve the WordPress protection you already have.
Originally from Albuquerque, New Mexico, Leslie Bowman has hopped around the country since graduating from undergrad. She focused on English and Latin American History in college and eventually attended graduate school in New York City. There, she developed a passion for short, persuasive arguments. Bowman found a love of copywriting and is now a Senior Copywriter at Liquid Web.