What is robots.txt? (And Why You Better Not Screw It Up)

Okay — let’s not pretend this is exciting. It’s not.
But if you want Google to treat your website like a grown-up, you need to understand this one little file that sits quietly at the root of your domain and basically says:

“Hey Google, here’s what you can and can’t crawl. Play nice.”

Let’s go.


🔹 robots.txt = Your Website’s Doorman

Robots.txt is a plain text file. You put it at:
yourwebsite.com/robots.txt

Search engines check it first before crawling anything on your site.

That’s it. No drama. Just rules.

Think of it like this:

🚪 Door 1: “Come on in!”
🚫 Door 2: “Get lost.”


🔹 What’s It Look Like?

It’s this simple:

User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php

Translation:

  • User-agent: * → applies to all bots

  • Disallow: /wp-admin/ → don’t crawl admin stuff

  • Allow: /wp-admin/admin-ajax.php → okay, crawl this one thing

That’s it. This whole file might be 5 lines long. Still important.


🔹 Why Should You Give a Damn?

  1. Google doesn’t crawl everything. You have a “crawl budget” (yes, it’s a thing). Don’t waste it on useless crap like tag pages or old test folders.

  2. Keeps private junk private. Login pages, thank-you pages, weird old redirects — block ’em.

  3. Cleaner search results. Nobody wants your /filter-by/price pages on Google (quick example after this list).
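
For a concrete picture, a few throwaway lines cover all three of the points above. This is a rough sketch: the paths (/tag/, /filter-by/, /thank-you/) are placeholders, so swap in whatever junk your own site actually has.

User-agent: *
Disallow: /tag/
Disallow: /filter-by/
Disallow: /thank-you/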


🔹 What NOT to Do (Seriously)

❌ Disallow: /

Don’t ever do this unless you hate traffic.

That one line tells Google: “Don’t crawl ANYTHING.” Ever. Goodbye rankings.
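
By the way, one character is all that separates “block everything” from “block nothing”:

# Blocks the ENTIRE site. Never do this.
User-agent: *
Disallow: /

# Empty Disallow = block nothing, crawl everything
User-agent: *
Disallow: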

❌ Block your CSS/JS

If Google can’t load your styles or scripts, your site looks broken to them. You look dumb. Don’t do it.
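
And if you’ve blocked a folder that also happens to hold stylesheets or scripts, you can punch holes back open with Allow. The folder name below is made up, so adjust it for your own setup (Google and Bing both understand the * wildcard):

# Keep the folder blocked, but let bots fetch the assets inside it
User-agent: *
Disallow: /private-stuff/
Allow: /private-stuff/*.css
Allow: /private-stuff/*.js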

❌ Think it’s for security

It’s not. Bad bots ignore it. People can still visit hidden URLs. You wanna protect stuff? Use passwords, not robots.txt.


🔹 How to Create One (In 3 Seconds)

  1. Open Notepad

  2. Write your rules (harmless starter below)

  3. Save as robots.txt

  4. Upload to your site root
    (Yep, just drop it in public_html or wherever your site lives)
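
Staring at a blank Notepad window at step 2? Here’s a harmless starter you can paste and tweak (swap in your own domain; more on that Sitemap line in a minute):

# Starter file: blocks nothing, points bots at your sitemap
User-agent: *
Disallow:

Sitemap: https://yourwebsite.com/sitemap.xml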

If you’re on WordPress:
→ Yoast SEO plugin does it for you.

Done.


🔹 Should You Even Bother?

Yes. Unless you like Google crawling garbage. Here’s a fast rulebook:

  • Blog posts → ✅ Let it crawl
  • /wp-admin/ → 🚫 Block it
  • /checkout/ → 🚫 Block it
  • Images → 🤷 Depends
  • JS/CSS files → ✅ Let it crawl

🔹 Bonus: Drop Your Sitemap

At the end of the file, just add this:

Sitemap: https://yourwebsite.com/sitemap.xml

Easy. Now bots know where your good stuff lives.


🔹 Quick Example for Bloggers

Here’s one you can steal:

User-agent: *
Disallow: /wp-admin/
Disallow: /tag/
Disallow: /author/
Allow: /wp-admin/admin-ajax.php
Sitemap: https://yourblog.com/sitemap.xml

Blocks the stuff you don’t need on Google. Keeps the focus on content.


🔹 Final Word (Straight Up)

Don’t overthink this.

Make the file. Block the crap. Let Google crawl your best stuff.
That’s all robots.txt is. A few lines. Big results. And if you mess it up, you will tank your traffic. So get it right.


Want a WordPress-friendly template? Or one for Shopify / Ghost / Wix / Blogger? Drop a comment and I’ll post it.
