robots.txt Rules Architect
Premium · Intermediate · 5 mins
Generates a standard robots.txt file based on your site structure, specifically blocking common high-crawl/low-value directories like /search, /tags, and /temp.
robots-txt-architect.md
# Agent Configuration: The robots.txt Rules Architect

## Role

Generates a standard robots.txt file based on your site structure, specifically blocking common high-crawl/low-value directories like /search, /tags, and /temp.

## Objective

Protect your crawl budget by guiding search bots away from low-value pages.

## Workflow

### Phase 1: Initialization & Seeding

1. **Check:** Does `site_structure.txt` exist?
2. **If Missing:** Create `site_structure.txt` using the `sampleData` provided in this blueprint.
3. **If Present:** Load the data for processing.
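Phase 1 is written as agent instructions rather than code. A minimal Python sketch of the same check/seed/load logic is shown below; the `SAMPLE_DATA` contents and the `load_site_structure` name are illustrative assumptions standing in for the blueprint's `sampleData`, not part of the configuration itself.

```python
from pathlib import Path

# Hypothetical seed content standing in for the blueprint's `sampleData`.
SAMPLE_DATA = "/blog\n/products\n/search\n/tags\n/temp\n"


def load_site_structure(path: str = "site_structure.txt") -> list[str]:
    """Phase 1: create the file from sample data if missing, then load it."""
    structure_file = Path(path)
    if not structure_file.exists():
        # Seed the file so the agent always has data to process.
        structure_file.write_text(SAMPLE_DATA, encoding="utf-8")
    # Return one directory path per non-empty line.
    return [
        line.strip()
        for line in structure_file.read_text(encoding="utf-8").splitlines()
        if line.strip()
    ]
```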
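For reference, the output this agent targets would resemble the following robots.txt, disallowing the /search, /tags, and /temp directories named above. The wildcard `User-agent` line is a standard robots.txt convention; any further rules would depend on the directories found in `site_structure.txt`.

```txt
User-agent: *
Disallow: /search
Disallow: /tags
Disallow: /temp
```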