Local SEO | September 29, 2025 | 4 min read

ROBOTS.TXT FOR PLUMBING WEBSITES. TELL GOOGLE WHAT TO CRAWL.

Your robots.txt file is a tiny text file with big power. It tells Google what to look at and what to ignore on your plumbing website. Here's how to set it up right.

There's a file on your website you've probably never seen.

It's called robots.txt. It's tiny. Usually just a few lines of text.

But it's one of the first things Google reads when it visits your site. And if it's set up wrong, it could be telling Google to ignore your most important pages.

Yeah. Your website might literally be telling Google "don't look at me."

Let's make sure that's not happening.

What Is Robots.txt?

In the simplest terms, robots.txt is a set of instructions for search engines.

It lives at the root of your website: `yoursite.com/robots.txt`

When Google's crawler (called Googlebot) visits your website, one of the first things it does is check this file. The file tells Googlebot:

- Which pages it's allowed to crawl (look at and index)
- Which pages it should skip
- Where your sitemap is located

Think of it like a bouncer at a club. "You can go in there. Don't go in there. Here's the map of the place."

If your robots.txt file is missing, misconfigured, or blocking important pages, it directly affects your rankings.

How to Check Your Robots.txt Right Now

Easy. Type this into your browser:

`https://www.yoursite.com/robots.txt`

Replace "yoursite.com" with your actual domain. Hit enter.

You'll see one of three things:

1. A simple text file. Usually a few lines. This means you have one. Read the lines carefully (I'll explain what they mean in a second).

2. A 404 error (page not found). This means you don't have a robots.txt file. That's not a catastrophe, but it's not ideal either. Without one, Google will crawl everything, which is mostly fine, but you lose the ability to guide it.

3. A file that says "Disallow: /" followed by nothing else. This is the nightmare scenario. It means your entire website is blocked from Google. Googlebot reads this and says "okay, I won't crawl any of your pages." And you'll rank for absolutely nothing.

I've seen this happen. Usually on sites that were built on staging servers and never had the robots.txt updated for production. The developer blocked Google during development (normal) and then forgot to unblock it (disaster).
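You can watch this rule in action with Python's built-in `urllib.robotparser` module, which applies robots.txt allow/disallow rules the same basic way crawlers do. A quick sketch; the domain and page paths are placeholders:

```python
from urllib.robotparser import RobotFileParser

# The "nightmare" robots.txt: every crawler blocked from everything.
blocked = RobotFileParser()
blocked.parse([
    "User-agent: *",
    "Disallow: /",
])

# Every page is refused, even the homepage.
print(blocked.can_fetch("Googlebot", "https://www.yoursite.com/"))                 # False
print(blocked.can_fetch("Googlebot", "https://www.yoursite.com/services/drains"))  # False

# The healthy version: everything is crawlable.
healthy = RobotFileParser()
healthy.parse([
    "User-agent: *",
    "Allow: /",
])
print(healthy.can_fetch("Googlebot", "https://www.yoursite.com/services/drains"))  # True
```

Same website, one changed line, completely different outcome.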

Understanding the Basics

Here's what a healthy robots.txt file looks like for a plumbing website:

```
User-agent: *
Allow: /

Sitemap: https://www.yoursite.com/sitemap.xml
```

That's it. Three lines. Let me break them down:

User-agent: * means "these instructions apply to all search engines" (Google, Bing, Yahoo, etc.).

Allow: / means "you can crawl everything on the site."

Sitemap: tells Google where to find your sitemap (a file that lists all your pages). This helps Google discover and index all your pages faster.

If your robots.txt looks like this, you're in good shape. Google can access everything and knows where to find your sitemap.

What to Block (And What Not To)

For most plumbing websites, you want Google to crawl everything. Your homepage, service pages, blog posts, about page, contact page... all of it.

Pages you might want to block:

- Admin or login pages (`/wp-admin/` for WordPress)
- Thank you / confirmation pages
- Internal search results pages
- Staging or test pages
- Duplicate pages or printer-friendly versions

One caveat: robots.txt stops crawling, not indexing. For pages you truly never want showing up in search results, a noindex tag is the more reliable tool.

Pages you should NEVER block:

- Your homepage
- Any service page
- Any area/location page
- Your blog posts
- Your about or contact page
- Any page you want to rank on Google

Common mistake: Blocking your CSS and JavaScript files. Some old robots.txt configurations block `/wp-includes/` or `/wp-content/` on WordPress sites. This prevents Google from rendering your site properly. Google needs access to your CSS and JS to understand your page layout. Don't block these.
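Putting those rules together, a WordPress-style robots.txt might look like this. The `/thank-you/` and `/?s=` paths are examples only; swap in whatever your site actually uses:

```
User-agent: *
# Block the admin area, but keep admin-ajax.php open
# (WordPress uses it for some front-end features).
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
# Example paths: confirmation pages and internal search results.
Disallow: /thank-you/
Disallow: /?s=

Sitemap: https://www.yoursite.com/sitemap.xml
```

Notice what isn't here: no `Disallow` on `/wp-content/` or `/wp-includes/`, so Google can still load your CSS and JavaScript.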

The "Disallow: /" Disaster

I mentioned this earlier but it's worth repeating because it's that devastating.

If your robots.txt has this line:

```
User-agent: *
Disallow: /
```

Your entire website is invisible to Google. Every page. Every blog post. Every service page. All blocked.

Google won't crawl any of it. You'll get zero organic traffic. Zero rankings. Zero calls from search.

And here's the scary part. Your website will look perfectly fine to you. You can visit it. Your customers can visit it (if they have the URL). Everything works.

You just won't appear in any search results. Ever.

I've seen plumbing businesses go months without realizing this is the problem. They think their SEO is bad. They blame their content. They blame their reviews.

Nope. Their robots.txt was just telling Google to go away.

Check yours. Right now. `yoursite.com/robots.txt`

How to Fix Your Robots.txt

On WordPress:

Option 1: Install Yoast SEO or Rank Math. Both let you edit your robots.txt directly from the WordPress admin panel (Yoast SEO > Tools > File editor in Yoast, or Rank Math > General Settings > Edit robots.txt).

Option 2: Create a file called `robots.txt` and upload it to your site's root directory via FTP.

On Squarespace:

Squarespace generates robots.txt automatically and doesn't let you edit it directly. The good news: its default file is sensible for most sites, so this usually isn't something you need to fix.

On Wix:

Wix also generates it automatically. You can edit it in SEO Tools > Robots.txt Editor.

What to put in it:

For most plumbing websites, keep it simple:

```
User-agent: *
Allow: /
Disallow: /wp-admin/

Sitemap: https://www.yoursite.com/sitemap.xml
```

That allows Google to crawl everything except your admin panel (which it doesn't need to see anyway) and tells it where your sitemap is.

Don't Forget the Sitemap Line

The `Sitemap:` line is important and often missing.

Your sitemap is a file that lists every page on your website. It helps Google discover pages faster and more efficiently.

If you're on WordPress with Yoast, your sitemap is usually at `yoursite.com/sitemap_index.xml`.

Make sure that URL is included in your robots.txt. It's a small thing that can speed up how quickly Google discovers new pages you add.

Test Before You Publish

Before you save any changes to your robots.txt, test it.

Google Search Console has a free robots.txt report. Go to Search Console > Settings > robots.txt to see the exact file Google last fetched and any errors it found. (Google retired its old paste-and-test tool, so to check whether a specific URL is blocked, run it through the URL Inspection tool instead.)

Always test. One wrong character can accidentally block your entire site. A test takes 30 seconds and prevents a potential disaster.
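If you want a second check outside Search Console, Python's built-in `urllib.robotparser` can run a proposed file against the URLs you care about before you upload anything. A rough sketch; the URLs are examples:

```python
from urllib.robotparser import RobotFileParser

# A proposed robots.txt, one directive per line.
# Note: Python's parser applies the FIRST matching rule, so list
# specific Disallows before a blanket "Allow: /". (Google itself
# uses the most specific match, so order matters less there.)
proposed = [
    "User-agent: *",
    "Disallow: /wp-admin/",
    "Allow: /",
]

rp = RobotFileParser()
rp.parse(proposed)

# Example URLs -- swap in your real money pages.
must_stay_crawlable = [
    "https://www.yoursite.com/",
    "https://www.yoursite.com/services/water-heater-repair",
]
should_be_blocked = [
    "https://www.yoursite.com/wp-admin/",
]

for url in must_stay_crawlable:
    if rp.can_fetch("Googlebot", url):
        print(f"ok: {url} is crawlable")
    else:
        print(f"PROBLEM: {url} is blocked -- fix before publishing")

for url in should_be_blocked:
    if rp.can_fetch("Googlebot", url):
        print(f"check: {url} is still crawlable")
    else:
        print(f"ok: {url} is blocked, as intended")
```

Thirty seconds of checking beats months of invisible pages.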

The 2-Minute Audit

Here's your action plan:

  1. Visit yoursite.com/robots.txt (30 seconds)
  2. Check for "Disallow: /". A bare slash with nothing after it means your whole site is blocked. Fix it. (30 seconds)
  3. Check for a Sitemap line. If it's missing, add one. (30 seconds)
  4. Make sure important pages aren't blocked. (30 seconds)

Four steps. Two minutes. And you'll know whether your robots.txt is helping or hurting your rankings.

Get your free website audit and we'll check your robots.txt along with 50 other technical SEO factors that affect how Google sees your plumbing website.

This is one of those "boring but important" things that can make or break your rankings. Don't ignore it.

---

P.S. Every website we build ships with a properly configured robots.txt from day one. It's one of a hundred little technical details we handle so you don't have to think about them. See our pricing and get a website where nothing slips through the cracks.
