Error: Robots.txt in WordPress stuck on Disallow (and how to fix it)
Your site looks great. Your content is ready. But somehow, your pages aren’t showing up in Google. Before you start blaming algorithms or SEO strategy, check one small but powerful file: robots.txt.
A single line in this file—Disallow: /—can stop search engines from crawling your entire site. If you’re seeing this directive and can’t get rid of it, don’t panic. You’re not alone, and the fix is easier than you think.
What does “Disallow” mean in robots.txt?
In robots.txt, a Disallow directive tells search engines not to crawl specific parts of your website. The file doesn't hide content from visitors, but blocking crawlers can keep those pages out of search results.
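For example, this minimal robots.txt tells every crawler to skip one directory while leaving the rest of the site open (the /private/ path is just an illustration):

User-agent: *
Disallow: /private/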
The role of robots.txt in WordPress SEO
Every major search engine checks robots.txt before it tries to crawl your site. If the file says Disallow: /, that’s a full stop—it blocks everything. WordPress sometimes creates this directive automatically when privacy settings are enabled.
This might be helpful if your site is under construction, but it’s a big problem once you’re ready to be found online.
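When that privacy setting is enabled, the virtual file WordPress serves looks like this:

User-agent: *
Disallow: /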
Common Disallow mistakes
Here are a few examples of Disallow directives and what they do:
- Disallow: / — Blocks everything, including your homepage.
- Disallow: /wp-content/ — Blocks images, styles, and other media.
- Disallow: /wp-admin/ — Common and usually fine. It keeps crawlers out of admin pages, which you don’t want showing up in search anyway.
Even well-meaning directives can cause indexing issues if you’re not careful. Let’s look at how to identify and fix the problem.
How to check if robots.txt is stuck on Disallow
You don’t need technical tools to check your current robots.txt. A browser and your WordPress dashboard are enough to get started.
Use your browser or an online tool
- Open a browser tab and go to https://yourdomain.com/robots.txt.
- If you see Disallow: / near the top, that’s your problem.
- Optional: Check the robots.txt report in Google Search Console to see how Google interprets your rules.
If you don’t have a physical robots.txt in your root folder but the URL still returns rules, WordPress is generating a virtual one.
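If you’re comfortable in a terminal, curl gives you the same check without a browser (replace yourdomain.com with your own domain):

# Fetch the live robots.txt that visitors and crawlers see
curl -s https://yourdomain.com/robots.txt

# Show only the Disallow rules
curl -s https://yourdomain.com/robots.txt | grep 'Disallow'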
Check your WordPress settings
- Log into your WordPress admin dashboard.
- Go to Settings > Reading.
- Find the option labeled Discourage search engines from indexing this site.
- If this box is checked, uncheck it and save.
This setting tells WordPress to serve a virtual robots.txt containing Disallow: /, and it also adds a noindex robots meta tag to your pages. A physical robots.txt file takes precedence over the virtual one, but leave this box unchecked either way.
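If you manage the site with WP-CLI, you can inspect and flip the same setting from the command line; the checkbox maps to the blog_public option, where 0 means search engines are discouraged:

# 0 = "Discourage search engines" is checked; 1 = site is visible
wp option get blog_public

# Make the site visible to search engines again
wp option update blog_public 1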
How to fix a robots.txt file stuck on Disallow
Once you’ve identified the issue, you can fix it in one of two ways: using a plugin, or editing the file directly.
Method 1 – Edit robots.txt using an SEO plugin
Most popular SEO plugins let you edit robots.txt from the dashboard:
Yoast SEO
- Go to SEO > Tools > File Editor.
- If a robots.txt file exists, you’ll see the contents.
- Remove any Disallow: / lines that are blocking content.
- Click Save changes to robots.txt.
Rank Math
- Navigate to Rank Math > General Settings > Edit robots.txt.
- Modify or remove problematic Disallow rules.
- Click Save Changes.
All in One SEO
- Go to All in One SEO > Tools > Robots.txt Editor.
- Enable the custom robots.txt option if it isn’t already on.
- Edit the rules as needed and save.
Example of a clean robots.txt:
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
Sitemap: https://yourdomain.com/sitemap.xml
Method 2 – Edit robots.txt manually via FTP or file manager
If you prefer direct control:
- Use an FTP client like FileZilla or your hosting provider’s file manager.
- Navigate to your site’s root directory (usually public_html).
- Find the robots.txt file. If it doesn’t exist, create a new one.
- Open the file in a text editor.
- Replace or remove the line Disallow: /.
- Save and reupload the file to the same root directory.
Need help creating the file? Just open Notepad, paste in your rules, save it as robots.txt, and upload it via FTP.
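With shell access to your server, you can also create the file in one step. This sketch assumes your document root is public_html and reuses the clean template from above:

cat > public_html/robots.txt <<'EOF'
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
Sitemap: https://yourdomain.com/sitemap.xml
EOF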
Still not working? Advanced troubleshooting steps
Sometimes the issue goes deeper than what you see in the dashboard or file editor. Here’s what else to check.
Your site may be using a virtual robots.txt
If you don’t have a physical robots.txt file in your root folder, WordPress may be generating one automatically based on internal settings. This virtual file can be influenced by:
- The “Discourage search engines” checkbox in Settings > Reading.
- Certain themes or plugins that override SEO behavior.
Tip: Create and upload a physical robots.txt to override the virtual one.
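A quick way to tell which kind you have: look for the file on disk, then compare with what the URL serves (again assuming a public_html document root):

# No file here, but the URL below still returns rules? It's virtual.
ls -l public_html/robots.txt
curl -s https://yourdomain.com/robots.txt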
Plugin or theme conflicts
Some plugins can alter or replace your robots.txt rules without warning:
- SEO plugins like Yoast or AIOSEO
- Security plugins like Wordfence
- Maintenance mode or under-construction plugins
Try temporarily disabling these one at a time and rechecking your robots.txt file. If the file changes, you’ve found the culprit.
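WP-CLI makes this round-trip quick; the wordfence slug below is just an example, so substitute whichever plugin you’re testing:

# List active plugins, then toggle one off and on while you recheck robots.txt
wp plugin list --status=active
wp plugin deactivate wordfence
wp plugin activate wordfence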
File permission issues
Make sure your server can read and update robots.txt. The recommended permission is 644: the owner can write, and everyone (including the web server) can read. You can set this in your FTP client or file manager, or with chmod, as shown after the steps below.
How to check:
- Right-click the file in your FTP client or file manager.
- Select File permissions or Change Permissions.
- Update the numeric value to 644. Avoid 666; it makes the file writable by anyone on the server.
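From a shell, the same fix is a one-liner (run it from your site’s root directory):

chmod 644 robots.txt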
Caching issues
Robots.txt changes can take time to appear if your site uses caching:
- Browser cache: Try opening your robots.txt in an Incognito or Private window.
- Server-side cache: Clear your WordPress cache using plugins like WP Rocket or W3 Total Cache.
- Search engine cache: Google caches robots.txt for up to 24 hours. Use Google Search Console’s URL Inspection tool to request recrawling of key pages.
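For the server-side step, WP-CLI users can flush the object cache directly; note that page-cache plugins usually ship their own purge commands, so you may still need the plugin’s own Clear Cache button:

wp cache flush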
How to confirm your site is no longer blocked
Once you’ve made changes, it’s important to verify that search engines can now access your site.
Use Google Search Console
- Log into Google Search Console.
- Use the URL Inspection tool to test a few pages.
- If they’re crawlable, you’re good to go.
- Check the robots.txt report to make sure your file doesn’t block important paths.
Use a third-party SEO crawler
These tools can help simulate how search engines view your site:
- Screaming Frog SEO Spider (free up to 500 URLs)
- Ahrefs Site Audit
- SEMrush Site Audit
They’ll let you know if parts of your site are still blocked or inaccessible.
Prevent robots.txt errors in the future
A few proactive steps can help you avoid running into this problem again.
Set up alerts and tracking
- Monitor indexing coverage in Google Search Console.
- Use tools like UptimeRobot or Jetpack to watch for downtime or status changes.
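For a lightweight homegrown alert, a scheduled job can watch the file itself. This sketch (with a placeholder domain) prints a warning whenever a site-wide Disallow reappears:

# Run from cron; stays quiet unless the blanket rule is present
curl -s https://yourdomain.com/robots.txt | grep -qx 'Disallow: /' && echo "WARNING: robots.txt is blocking the whole site"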
Create a clean and simple robots.txt file
Here’s a template that works for most WordPress sites:
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
Sitemap: https://yourdomain.com/sitemap.xml
This setup blocks admin pages (which shouldn’t be public anyway) but allows all other content and includes your sitemap.
Document plugin responsibilities
If you have multiple plugins managing SEO, maintenance mode, or performance, note which ones touch robots.txt. Keeping overlapping tools to a minimum prevents unexpected overrides.
Next steps for fixing WordPress robots.txt errors
A single Disallow line can block your entire site from search engines, often without you even realizing it. Fortunately, WordPress gives you several ways to diagnose and fix the issue quickly—if you know where to look.
Start by checking the virtual settings inside WordPress and editing your robots.txt using your preferred method—plugin, FTP, or file manager.
Ready to upgrade your WordPress experience? Professional hosting improves speed, security, and reliability, giving you a website and a brand that people find engaging and trustworthy.
Don’t want to deal with server management and maintenance either? Our fully managed hosting for WordPress is the best in the industry. Our team members are not only server IT experts but WordPress hosting experts as well. Your server couldn’t be in better hands.
Click through below to explore all of our hosting for WordPress options, or chat with a WordPress expert right now to get answers and advice.
Additional resources
Diagnosing WordPress errors on your site →
Even more common errors, how to troubleshoot them, and how to solve them
WordPress Multisite domain mapping →
A step-by-step guide to setting up domain mapping
What is managed WordPress hosting? →
What it means, what it includes, and how to decide if it’s right for you
Haritha Jacob is a knowledgeable System Engineer with extensive experience in resolving customers’ complaints and issues. She has worked with various programming languages and operating systems, enterprise backup and recovery procedures, system performance-monitoring tools, and more.