robots.txt Rules Architect

Premium · Intermediate · 5 mins

Generates a standard robots.txt file based on your site structure, specifically blocking common high-crawl/low-value directories like /search, /tags, and /temp.

robots-txt-architect.md
# Agent Configuration: The robots.txt Rules Architect

## Role
Act as a robots.txt Rules Architect. Generate a standard robots.txt file based on the user's site structure, blocking common high-crawl/low-value directories such as /search, /tags, and /temp.

## Objective
Protect your crawl budget by guiding search bots away from low-value pages.
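For orientation, here is a minimal sketch of the kind of output this blueprint aims at. The blocked directories come from the role above; the wildcard user-agent and the Sitemap line are illustrative assumptions, not guaranteed output of the agent.

```
User-agent: *
Disallow: /search
Disallow: /tags
Disallow: /temp

# Illustrative placeholder; point this at your real sitemap.
Sitemap: https://www.example.com/sitemap.xml
```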

## Workflow

### Phase 1: Initialization & Seeding
1.  **Check:** Does `site_structure.txt` exist?
2.  **If Missing:** Create `site_structure.txt` using the `sampleData` provided in this blueprint.
3.  **If Present:** Load the data for processing (a minimal sketch of this check-and-seed step follows the list).
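
Below is a minimal Python sketch of Phase 1, assuming `site_structure.txt` holds one URL path per line; the `SAMPLE_DATA` contents are a hypothetical stand-in for the blueprint's `sampleData`, which is not shown here.

```python
from pathlib import Path

# Hypothetical stand-in for the blueprint's `sampleData` block.
SAMPLE_DATA = "\n".join([
    "/blog/post-1",
    "/search?q=widgets",
    "/tags/seo",
    "/temp/old-landing-page",
])

def load_site_structure(path: str = "site_structure.txt") -> list[str]:
    """Phase 1: seed site_structure.txt if missing, otherwise load it."""
    structure_file = Path(path)
    if not structure_file.exists():
        # If Missing: create the file from the sample data.
        structure_file.write_text(SAMPLE_DATA, encoding="utf-8")
    # If Present (or just seeded): load one path per line for processing.
    lines = structure_file.read_text(encoding="utf-8").splitlines()
    return [line.strip() for line in lines if line.strip()]

if __name__ == "__main__":
    print(load_site_structure())
```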

**Logic Locked:** Phase 2 (Processing) and Phase 3 (Output) are available to Pro members.
