The Perfect WordPress robots.txt File (2026 SEO Edition)
WordPress powers 43% of the internet, which means Googlebot spends nearly half its life crawling WordPress sites. But out of the box, WordPress's default robots.txt is... basic.
In 2026, an unoptimized robots.txt file bleeds "Crawl Budget." If you are letting Google waste time crawling your admin themes, plugin readmes, or internal search result pages, you are actively hurting your rankings. This guide gives you the exact copy-paste code you need for a modern, high-ranking WordPress site.
1. The Perfect Template (Copy-Paste)
This template is designed for 99% of WordPress sites (Blogs, Portfolios, Business sites).
```
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
Disallow: /wp-includes/
Allow: /wp-includes/js/
Allow: /wp-includes/images/
Allow: /wp-includes/css/

# Block Internal Search (High Priority)
Disallow: /?s=
Disallow: /search/

# Block User Profiles (Usually Low Value)
Disallow: /author/  # Ignore this if you are a multi-author news site!

Sitemap: https://yourdomain.com/sitemap_index.xml
```
*Note: Replace `yourdomain.com` with your actual domain.*
2. Why These Rules Matter
Disallow: /wp-admin/
Standard security. There is simply no reason for Google to be sniffing around your login page or backend dashboard. It's a waste of their time and your server resources.
Allow: /wp-admin/admin-ajax.php
CRITICALLY IMPORTANT. Many plugins (sliders, contact forms, live filters) use AJAX to render content on the front end. If you block the admin folder without this exception, you inadvertently block this file, and Google may render your pages as "broken." Always explicitly allow it.
Disallow: /?s=
This blocks internal search result pages. Without it, spambots can generate infinite URLs like domain.com/?s=casino, which Google may crawl and even index, wasting crawl budget and diluting your site's authority.
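You can sanity-check these rules locally with Python's standard-library `urllib.robotparser`. One caveat worth knowing: Python's parser applies rules first-match-wins, while Google uses the most-specific (longest) match, so in this trimmed test copy the `Allow` line is placed before the `Disallow` it carves an exception from. The `example.com` URLs are placeholders.

```python
from urllib import robotparser

# Trimmed copy of the template; Allow listed first because Python's
# parser is first-match-wins (Google instead picks the longest match).
rules = """\
User-agent: *
Allow: /wp-admin/admin-ajax.php
Disallow: /wp-admin/
Disallow: /?s=
Disallow: /search/
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

print(rp.can_fetch("*", "https://example.com/wp-admin/"))                # blocked
print(rp.can_fetch("*", "https://example.com/wp-admin/admin-ajax.php"))  # allowed
print(rp.can_fetch("*", "https://example.com/?s=casino"))                # blocked
print(rp.can_fetch("*", "https://example.com/blog/hello-world/"))        # allowed
```

Note that `urllib.robotparser` treats `*` inside a path literally, so it is only useful for checking plain prefix rules like these, not wildcard patterns.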
3. Virtual vs. Physical Files
"I can't find the robots.txt file in my FTP / File Manager!"
That's normal! WordPress creates a Virtual robots.txt file dynamically. It doesn't actually exist on the server disk unless you create it manually.
Best Practice 2026: Let plugins like RankMath or Yoast manage the virtual file. Don't upload a physical text file unless you are a developer who needs raw control. Physical files override virtual ones and can confuse plugins.
4. Editing with RankMath & Yoast
Using RankMath (Recommended)
- Go to RankMath Dashboard > General Settings.
- Click on Edit robots.txt.
- If you see a text box, simply paste the template from above.
- If locked, ensure no physical file exists on your server.
- Save changes.
Using Yoast SEO
- Go to Yoast SEO > Tools.
- Click File Editor.
- The top box is typically for `.htaccess`, the bottom one is for `robots.txt`.
- If you don't see it, click "Create robots.txt file".
- Paste and Save.
5. Special Rules for WooCommerce
Online stores have unique "junk" pages like carts and checkouts that should be blocked from crawling to save crawl budget. (Remember: robots.txt prevents crawling; it does not by itself guarantee de-indexing.)
```
# WooCommerce Specifics
Disallow: /cart/
Disallow: /checkout/
Disallow: /my-account/
Disallow: /*?add-to-cart=*
Disallow: /*?orderby=*
Disallow: /*?price-min=*
```
Why block filters (orderby, price-min)? Faceted navigation creates millions of low-quality URLs (e.g., "Sort by low price + Red + Size M"). Blocking these parameters prevents Google from getting lost in infinite variations.
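The `*` patterns above rely on a wildcard extension supported by Google and Bing; plain robots.txt matching is prefix-only. A minimal sketch of that matching logic (the function name and sample URLs are illustrative, not part of any library):

```python
import re

def wildcard_match(pattern: str, path: str) -> bool:
    """Match a robots.txt rule the way major crawlers do:
    '*' matches any run of characters, a trailing '$' anchors the end,
    and the pattern is implicitly anchored at the start of the path."""
    if pattern.endswith("$"):
        body, tail = pattern[:-1], "$"
    else:
        body, tail = pattern, ""
    regex = ".*".join(re.escape(part) for part in body.split("*")) + tail
    return re.match(regex, path) is not None

print(wildcard_match("/*?add-to-cart=*", "/product/blue-shirt/?add-to-cart=42"))  # True
print(wildcard_match("/*?orderby=*", "/shop/?orderby=price"))                     # True
print(wildcard_match("/cart/", "/blog/cart-abandonment-tips/"))                   # False
```

The last case shows why the leading `/` matters: `Disallow: /cart/` only blocks paths that *start* with `/cart/`, not blog posts that merely mention the word.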
6. Dangerous Mistakes to Avoid
Blocking /wp-content/
Mistake! This folder holds your themes and plugins (CSS/JS). If you block it, Google sees your site as unstyled text, which is terrible for "Mobile Usability" scores.
Blocking /category/ or /tag/
Controversial. Generally, you want category pages indexed (they make good topical hubs), while tag archives are often thin. To de-index tags, use a noindex meta tag instead of blocking them here: a robots.txt-blocked page can never be crawled, so Google would never even see the noindex, and internal link equity stops flowing through it.
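SEO plugins like RankMath and Yoast let you set this per taxonomy, and what they emit under the hood is a standard robots meta tag in the page's `<head>`:

```html
<!-- On a tag archive you want out of the index; "follow" lets
     crawlers keep passing link equity through the page. -->
<meta name="robots" content="noindex, follow">
```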