Long AI sessions get heavy. The more tokens you accumulate, the worse the model behaves: slower, less precise, and more prone to losing the thread.
The usual reaction is to start a new session and re-explain everything. That works, but it’s slow and it kills momentum.
There’s a better way in Windsurf: use rules as a context bridge.
## The idea
Windsurf rules are usually treated as static project guidelines. But they can also act as portable context snapshots that you carry between sessions.
The pattern is simple:
- Create a rule called something like `current-context`.
- Inside it, write the relevant state of what you’re working on.
- When a session gets bloated, open a new one and reference it with `@current-context`.
- Keep working without re-explaining anything.
That’s it. No plugins, no external memory tooling.
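The whole pattern boils down to writing one markdown file. As a sketch in shell (the `.windsurf/rules/` location is an assumption — check where your Windsurf setup stores project rules — and the placeholder contents are illustrative):

```shell
# Sketch: capture the current session state as a reusable rule.
# The .windsurf/rules/ path is an assumption; adjust to your setup.
mkdir -p .windsurf/rules

cat > .windsurf/rules/current-context.md <<'EOF'
# Conversation Context

Goal: [what you are working on]
Decided: [decisions made so far]
Pending: [what is left to do]
EOF
```

In the next session, mentioning `@current-context` in your first prompt pulls that state back in without re-explaining anything.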
Treat long AI sessions like long-lived branches. At some point you want to cut them and start fresh.
## Why this matters
There’s a real cost to keeping a session alive past its useful point:
- Responses become less accurate.
- The model starts contradicting earlier decisions.
- You pay an attention tax re-reading older messages.
- Debugging the conversation becomes harder than the code.
Starting clean is healthier. The blocker has always been the friction of rebuilding context. Rules remove that friction.

## Scaling it with feature-scoped rules
The interesting part comes when you stop thinking about one single current-context rule and start treating rules as a library of context fragments.
A few examples that work well in practice:
- `@context-auth-refactor`: ongoing work on the auth module
- `@context-design-system`: decisions around tokens and primitives
- `@context-perf-audit`: findings from a performance pass
- `@context-migration-v18`: state of an Angular upgrade
Each rule captures:
- what the goal is
- what’s already decided
- what’s pending
- relevant files or constraints
Then you mix and match depending on the task. Working on a perf issue inside the auth flow? `@context-auth-refactor @context-perf-audit`. Done.
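On disk, one of those feature-scoped rules might look like this. A sketch only: the `.windsurf/rules/` path, the file name, and every detail in the contents (including the file paths it mentions) are illustrative assumptions, not real project state:

```shell
# Sketch: a feature-scoped rule capturing goal, decisions, pending
# work, and relevant files. Path and contents are illustrative.
mkdir -p .windsurf/rules

cat > .windsurf/rules/context-auth-refactor.md <<'EOF'
# Context: auth refactor

Goal: replace the legacy session store with token-based auth.
Decided: refresh tokens stay server-side; the interceptor owns retries.
Pending: migrate the logout flow; update interceptor tests.
Files: src/app/auth/ (especially the HTTP interceptor).
EOF
```

Each file stays small and single-purpose, which is what makes mixing them cheap.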
This connects with something I’ve mentioned before about keeping AI workflows tight and intentional — the value isn’t in giving the model more, it’s in giving it the right slice at the right moment.
## A minimal rule template
Nothing fancy works best. Something like:

```markdown
# Conversation Context

This rule only provides high-level context.

User session context:

[WRITE CURRENT CONVERSATION CONTEXT HERE]

Do not assume implementation details from this file.
Do not generate code based only on these instructions.
You MUST inspect the existing project structure, patterns,
architecture and source code before making decisions or writing code.
```
Short, structured, easy to update. The model gets exactly what it needs and nothing else.
## Small habits that make it work
A few things I’ve found useful:
- Update the rule before closing a heavy session. Future-you will thank present-you.
- Keep rules short. If they grow too much, split them by feature.
- Date the decisions when they might change. Context rots.
- Don’t dump logs or diffs into the rule. Summarize.
The rule is a memory aid, not a transcript.
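A tiny check can back up the “keep rules short” habit. This sketch flags context rules that have grown past an arbitrary threshold; the `.windsurf/rules/` path, the `context-*` naming, and the 40-line cutoff are all assumptions:

```shell
# Sketch: warn when a context rule has grown long enough that it
# probably needs splitting by feature. Path and threshold are assumptions.
check_rule_sizes() {
  for f in .windsurf/rules/context-*.md; do
    [ -e "$f" ] || continue        # skip if the glob matched nothing
    lines=$(wc -l < "$f")
    if [ "$lines" -gt 40 ]; then
      echo "$f: $lines lines, consider splitting it by feature"
    fi
  done
}
check_rule_sizes
```

Run it before closing a heavy session, alongside updating the rule itself.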
## The real win
This isn’t a productivity hack. It’s closer to applying basic engineering hygiene to AI workflows: small units, clear boundaries, explicit state.
Sessions become disposable. Context becomes durable. You stop fearing the moment a conversation gets too long, because starting over costs almost nothing.
And once you have a few feature-scoped rules in place, switching between tasks feels less like restarting and more like checking out a branch.
That’s the part that actually changes how you work.
