The model sees a constructed input pipeline, not one giant static string

What You’ll Learn

  • Why a single static system prompt doesn’t scale
  • How to build a system prompt pipeline with multiple stages
  • How to inject dynamic context based on current state

The Problem

A single monolithic system prompt tries to cover everything: role, tools, conventions, memory, skills. It gets big, stale, and hard to maintain. Adding one more rule means scrolling through 2000 lines.

The Solution

A system prompt pipeline that assembles the prompt from composable stages:

Pipeline stages:
+------------------+
| 1. Identity      |  "You are a coding agent"
+------------------+
| 2. Environment   |  "Working directory: /project"
+------------------+
| 3. Tools         |  "Available: bash, read, write..."
+------------------+
| 4. Memory        |  "User prefers pytest"
+------------------+
| 5. Skills (names)|  "Skills: git, test, code-review"
+------------------+
| 6. Rules         |  "Always commit after changes"
+------------------+
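Each box above can be written as a small stage function. A minimal sketch, assuming each stage takes a shared context dict and returns a prompt fragment or None (the function names and context keys here are illustrative, not from the repo):

```python
# Hypothetical stage functions. Each takes the shared context dict and
# returns a string fragment, or None when it has nothing to contribute.

def identity_stage(context: dict):
    return "You are a coding agent."

def environment_stage(context: dict):
    cwd = context.get("cwd")
    return f"Working directory: {cwd}" if cwd else None

def memory_stage(context: dict):
    notes = context.get("memory", [])
    if not notes:
        return None
    return "Memory:\n" + "\n".join(f"- {note}" for note in notes)
```

Because a stage can return None, whole sections (like Memory) simply disappear from the prompt when there is nothing to say.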

How It Works

  1. Each stage is a function that returns a string (or nothing).

  2. A SystemPromptBuilder chains stages together.

from typing import Callable, Optional

# A stage maps the shared context to a prompt fragment, or None to skip.
Stage = Callable[[dict], Optional[str]]

class SystemPromptBuilder:
    def __init__(self):
        self.stages: list[Stage] = []

    def add(self, stage_fn: Stage):
        self.stages.append(stage_fn)

    def build(self, context: dict) -> str:
        parts = []
        for stage in self.stages:
            result = stage(context)
            if result:  # skip stages that return None or ""
                parts.append(result)
        return "\n\n".join(parts)
  3. Context (current file, git branch, memory) flows through the pipeline, and each stage can use it.
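Putting the three steps together, a minimal end-to-end sketch (the builder is restated inline; the stage functions and context keys are illustrative):

```python
class SystemPromptBuilder:
    def __init__(self):
        self.stages = []

    def add(self, stage_fn):
        self.stages.append(stage_fn)

    def build(self, context: dict) -> str:
        # Run each stage against the context; drop empty results.
        parts = []
        for stage in self.stages:
            result = stage(context)
            if result:
                parts.append(result)
        return "\n\n".join(parts)

builder = SystemPromptBuilder()
builder.add(lambda ctx: "You are a coding agent.")
builder.add(lambda ctx: f"Working directory: {ctx['cwd']}" if "cwd" in ctx else None)
builder.add(lambda ctx: "Always commit after changes.")

prompt = builder.build({"cwd": "/project"})
print(prompt)
```

The same builder produces a different prompt for a different context dict, which is what makes per-turn reassembly cheap.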

What Changed From s09

  Component       Before (s09)       After (s10)
  System prompt   Static string      Composable pipeline
  Maintenance     Edit giant block   Add/remove stages
  Context         Fixed at start     Dynamic per turn
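The "Dynamic per turn" row is the key difference: the prompt is rebuilt from fresh context on every turn, so state changes show up automatically. A small sketch of that idea, using a hypothetical branch-aware stage:

```python
def branch_stage(context: dict):
    # Hypothetical stage: only contributes once a git branch is known.
    branch = context.get("git_branch")
    return f"Current git branch: {branch}" if branch else None

def build_prompt(context: dict) -> str:
    parts = [p for p in ("You are a coding agent.", branch_stage(context)) if p]
    return "\n\n".join(parts)

# Turn 1: no repo info yet, so the branch stage contributes nothing.
print(build_prompt({}))
# Turn 2: the agent checked out a branch; the rebuilt prompt reflects it.
print(build_prompt({"git_branch": "feature/pipeline"}))
```

With a static string, surfacing the branch would mean editing the prompt by hand; with a pipeline, it is just another stage reading current state.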

Try It

cd learn-claude-code
python agents/s10_system_prompt.py
  1. Ask: "What do you know about this project?" (check how the system prompt was assembled)
  2. Add a new rule stage and verify it appears in the prompt
  3. Load a skill and verify the system prompt updated

Key Takeaway

The model sees a constructed input pipeline, not one giant static string. Compose your system prompt from independent, context-aware stages.