
Error: Robots.txt in WordPress stuck on Disallow (and how to fix it)

Your site looks great. Your content is ready. But somehow, your pages aren’t showing up in Google. Before you start blaming algorithms or SEO strategy, check one small but powerful file: robots.txt.

A single line in this file—Disallow: /—can stop search engines from crawling your entire site. If you’re seeing this directive and can’t get rid of it, don’t panic. You’re not alone, and the fix is easier than you think.

What does “Disallow” mean in robots.txt?

In robots.txt, a Disallow directive tells search engines not to crawl specific parts of your website. It doesn’t hide content from users, but it can keep blocked pages from appearing in search results.

The role of robots.txt in WordPress SEO

Every major search engine checks robots.txt before it tries to crawl your site. If the file says Disallow: /, that’s a full stop—it blocks everything. WordPress sometimes creates this directive automatically when privacy settings are enabled.

This might be helpful if your site is under construction, but it’s a big problem once you’re ready to be found online.

Common Disallow mistakes

Here are a few directives that come up again and again, and what they do: Disallow: / blocks the entire site, Disallow: /wp-admin/ blocks only the admin area, and a bare Disallow: (with nothing after it) blocks nothing at all.

Even well-meaning directives can cause indexing issues if you’re not careful; the short sketch after this paragraph shows how each rule behaves in practice. After that, let’s look at how to identify and fix the problem.
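
If you want to see the effect of a rule before you publish it, Python’s built-in urllib.robotparser can evaluate robots.txt rules locally. This is only a rough illustration; yourdomain.com and the sample paths are placeholders.

from urllib.robotparser import RobotFileParser

def blocked(rules: str, url: str) -> bool:
    """Return True if the given robots.txt rules block the URL for all crawlers."""
    parser = RobotFileParser()
    parser.parse(rules.splitlines())
    return not parser.can_fetch("*", url)

site = "https://yourdomain.com"  # placeholder domain

# "Disallow: /" blocks everything on the site.
print(blocked("User-agent: *\nDisallow: /", f"{site}/"))                    # True
print(blocked("User-agent: *\nDisallow: /", f"{site}/blog/my-post/"))       # True

# "Disallow: /wp-admin/" blocks only the admin area.
print(blocked("User-agent: *\nDisallow: /wp-admin/", f"{site}/"))           # False
print(blocked("User-agent: *\nDisallow: /wp-admin/", f"{site}/wp-admin/"))  # True

# A bare "Disallow:" blocks nothing.
print(blocked("User-agent: *\nDisallow:", f"{site}/"))                      # False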

How to check if robots.txt is stuck on Disallow

You don’t need technical tools to check your current robots.txt. A browser and your WordPress dashboard are enough to get started.

Use your browser or an online tool

The quickest check is to open https://yourdomain.com/robots.txt in your browser (swap in your own domain) and look for a Disallow: / line. If the file is empty or doesn’t exist, WordPress might be generating a virtual one.
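
If you’d rather script the check, here is a minimal sketch using Python’s urllib.robotparser to fetch the live file and test whether the homepage is crawlable; yourdomain.com is a placeholder.

from urllib.robotparser import RobotFileParser

site = "https://yourdomain.com"  # replace with your own domain

parser = RobotFileParser()
parser.set_url(f"{site}/robots.txt")
parser.read()  # downloads and parses the live robots.txt

if parser.can_fetch("*", f"{site}/"):
    print("Crawlers are allowed to reach the homepage.")
else:
    print("The homepage is blocked - look for a Disallow: / line.")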

Check your WordPress settings

The culprit is usually Settings → Reading → “Discourage search engines from indexing this site.” When it’s enabled, WordPress creates a virtual robots.txt with Disallow: /, and even if you upload a physical file, this setting can override it. Make sure the box is unchecked before you go any further.

How to fix a robots.txt file stuck on Disallow

Once you’ve identified the issue, you can fix it in one of two ways: using a plugin, or editing the file directly.

Method 1 – Edit robots.txt using an SEO plugin

Most popular SEO plugins let you edit robots.txt from the dashboard (menu names vary slightly by version):

Yoast SEO – typically under Yoast SEO → Tools → File editor

Rank Math – typically under Rank Math → General Settings → Edit robots.txt

All in One SEO – typically under All in One SEO → Tools → Robots.txt Editor

Find the editor, remove the Disallow: / line (or replace the contents entirely), and save.

Example of a clean robots.txt:

User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
Sitemap: https://yourdomain.com/sitemap.xml

Method 2 – Edit robots.txt manually via FTP or file manager

If you prefer direct control, connect to your server with an FTP client or your host’s file manager, open the site’s root directory (often public_html, or the folder that contains wp-config.php), and look for robots.txt. Open the file, remove the Disallow: / line or paste in the clean example above, then save and re-upload it.

Need help creating the file? Just open a plain-text editor such as Notepad, paste in your rules, save it as robots.txt, and upload it to the root directory via FTP.
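
If you’d rather script the upload, Python’s built-in ftplib can handle it. This is only a rough sketch; the host name, credentials, and remote folder below are placeholders you’d replace with your own FTP details.

from ftplib import FTP

with FTP("ftp.yourdomain.com") as ftp:                        # placeholder FTP host
    ftp.login(user="your-user", passwd="your-password")       # placeholder credentials
    ftp.cwd("public_html")                                    # adjust to your site's root folder
    with open("robots.txt", "rb") as f:
        ftp.storbinary("STOR robots.txt", f)                  # upload the local robots.txt
    ftp.retrlines("RETR robots.txt")                          # print it back to confirm the upload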

Still not working? Advanced troubleshooting steps

Sometimes the issue goes deeper than what you see in the dashboard or file editor. Here’s what else to check.

Your site may be using a virtual robots.txt

If you don’t have a physical robots.txt file in your root folder, WordPress may be generating one automatically based on internal settings. This virtual file can be influenced by the “Discourage search engines” option under Settings → Reading and by SEO, maintenance-mode, or coming-soon plugins that hook into WordPress’s robots.txt output.

Tip: Create and upload a physical robots.txt to override the virtual one. The sketch below shows one way to confirm the server is really serving your file.
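
As a rough check, you can compare the live file against your local copy. This sketch assumes the file you uploaded also sits next to the script as robots.txt, and yourdomain.com stands in for your domain.

from urllib.request import urlopen
from pathlib import Path

site = "https://yourdomain.com"  # placeholder domain

served = urlopen(f"{site}/robots.txt").read().decode("utf-8")
local = Path("robots.txt").read_text(encoding="utf-8")  # the file you uploaded

if served.strip() == local.strip():
    print("The server is returning your physical robots.txt.")
else:
    print("The served file differs - a virtual file or a plugin may be overriding it.")
    print(served)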

Plugin or theme conflicts

Some plugins can alter or replace your robots.txt rules without warning. Common culprits include SEO plugins, maintenance-mode or coming-soon plugins, and some security and multilingual plugins.

Try temporarily disabling these one at a time and rechecking your robots.txt file. If the file changes, you’ve found the culprit.

File permission issues

Make sure your server can read and update robots.txt. The recommended permission is 644 (the owner can read and write; everyone else, including the web server, can read).

How to check: in your FTP client or file manager, right-click robots.txt and open its permissions (sometimes labeled CHMOD), or run ls -l robots.txt over SSH. If the value isn’t 644, change it and reload the file in your browser.
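
If you have SSH access and prefer to script it, here is a minimal Python sketch, run from the site’s root directory on the server, that reads the permission bits and can reset them if you uncomment the last line.

import os
import stat

mode = os.stat("robots.txt").st_mode
print("Current permissions:", oct(stat.S_IMODE(mode)))  # expect 0o644

# Uncomment to restore the recommended permissions (owner read/write, others read):
# os.chmod("robots.txt", 0o644)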

Caching issues

Robots.txt changes can take time to appear if your site uses caching. Clear the page cache in your caching plugin, purge any server-level or CDN cache through your host or CDN dashboard, and reload https://yourdomain.com/robots.txt in a private/incognito window to make sure you’re seeing the fresh file.
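
To see whether a CDN or proxy is still serving an old copy, you can fetch the file and look at cache-related response headers. A rough sketch; the header names vary by host and CDN, and yourdomain.com is a placeholder.

from urllib.request import Request, urlopen

site = "https://yourdomain.com"  # placeholder domain
req = Request(f"{site}/robots.txt", headers={"Cache-Control": "no-cache"})

with urlopen(req) as resp:
    body = resp.read().decode("utf-8")
    # Common cache headers; your host or CDN may use different ones.
    for name in ("Age", "X-Cache", "CF-Cache-Status"):
        if resp.headers.get(name):
            print(f"{name}: {resp.headers[name]}")

print(body)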

How to confirm your site is no longer blocked

Once you’ve made changes, it’s important to verify that search engines can now access your site.

Use Google Search Console

In Search Console, run the URL Inspection tool on your homepage and a few important pages. If a page is still reported as “Blocked by robots.txt,” the old rules are still live or cached; once inspection shows the page can be crawled, you can request indexing to speed up the recrawl.

Use a third-party SEO crawler

These tools can help simulate how search engines view your site. Crawlers such as Screaming Frog SEO Spider, Sitebulb, or the site audit tools in Ahrefs and Semrush will let you know if parts of your site are still blocked or inaccessible.
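
You can also rerun a quick local check against the live file, this time across a handful of key URLs. Again a minimal sketch; the domain and paths are placeholders.

from urllib.robotparser import RobotFileParser

site = "https://yourdomain.com"  # placeholder domain
pages = ["/", "/blog/", "/sample-post/", "/sitemap.xml"]  # placeholder paths

parser = RobotFileParser()
parser.set_url(f"{site}/robots.txt")
parser.read()

for page in pages:
    status = "allowed" if parser.can_fetch("*", f"{site}{page}") else "BLOCKED"
    print(f"{page}: {status}")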

Prevent robots.txt errors in the future

A few proactive steps can help you avoid running into this problem again.

Set up alerts and tracking
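
One lightweight option is a small scheduled script (for example, a daily cron job) that fetches your live robots.txt and warns you if a blanket Disallow: / ever reappears. This is just a sketch; yourdomain.com is a placeholder, and in practice you’d swap the print for an email or chat notification.

from urllib.request import urlopen

SITE = "https://yourdomain.com"  # placeholder domain

def robots_blocks_everything(site: str) -> bool:
    """Return True if the live robots.txt contains a blanket Disallow: / rule."""
    body = urlopen(f"{site}/robots.txt").read().decode("utf-8")
    rules = [line.split("#")[0].strip().lower() for line in body.splitlines()]
    return "disallow: /" in rules

if robots_blocks_everything(SITE):
    print("WARNING: robots.txt is blocking the whole site.")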

Create a clean and simple robots.txt file

Here’s a template that works for most WordPress sites:

User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
Sitemap: https://yourdomain.com/sitemap.xml

This setup blocks admin pages (which shouldn’t be public anyway) but allows all other content and includes your sitemap.

Document plugin responsibilities

If you have multiple plugins managing SEO, maintenance mode, or performance, note which ones touch robots.txt. Keeping overlapping tools to a minimum prevents unexpected overrides.

Additional resources

Diagnosing WordPress errors on your site →

Even more common errors, how to troubleshoot them, and how to solve them

WordPress Multisite domain mapping →

A step-by-step guide to setting up domain mapping

What is managed WordPress hosting? →

What it means, what it includes, and how to decide if it’s right for you

Haritha Jacob is a knowledgeable System Engineer with extensive experience in resolving customers’ complaints and issues. She has experience with various programming languages and operating systems, enterprise backup and recovery procedures, system performance-monitoring tools, and more.