Amplifier Bundle
context-managed
Intelligent Context Management for Amplifier
Active
April 2026
The Problem
Context dies silently
When sessions grow long, old conversation history gets mechanically chopped. The model suddenly loses track of everything.
🔍
Files forgotten
The model forgets which files it already read, re-reads them, or invents content it never saw.
✂️
Decisions lost
Critical design decisions from early in the session vanish. The model contradicts its own prior reasoning.
💭
Hallucination to fill gaps
Tool outputs and return values disappear from context. The model confabulates results it never received.
The Solution
LLM-powered rolling summaries
Instead of chopping history, the bundle uses the LLM itself to create structured summaries as context grows. The model always sees a gradient: compressed older context, full-detail recent turns.
- Older turns compressed into structured summaries preserving decisions, files, and tool results
- Recent turns kept at full verbatim fidelity
- The model knows what's been compressed and how to recover details
- No more silent gaps — no more hallucinations from missing context
How It Works
The context gradient
Every turn lives on a spectrum from compressed to verbatim. Older history is always available — just denser.
Compressed
Structured summaries
Moderate
Condensed detail
Verbatim
Full recent turns
- Summaries include what was done, which files were touched, what tools returned
- Summaries are created proactively, with headroom before limits
- The model always has orientation — it never wakes up confused
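The gradient described above can be sketched in a few lines of Python. This is an illustrative sketch only, not the bundle's implementation: the `Turn` class, field names, and window sizes are hypothetical, chosen to show how each turn's age maps to one of the three fidelity bands.

```python
from dataclasses import dataclass

@dataclass
class Turn:
    index: int
    text: str
    fidelity: str = ""  # "compressed" | "moderate" | "verbatim"

def assign_fidelity(turns, verbatim_window=4, moderate_window=8):
    """Map each turn onto the gradient: the newest turns stay verbatim,
    a middle band is condensed, and everything older is compressed into
    structured summaries. Window sizes here are illustrative."""
    total = len(turns)
    for i, turn in enumerate(turns):
        age = total - 1 - i  # 0 = most recent turn
        if age < verbatim_window:
            turn.fidelity = "verbatim"
        elif age < verbatim_window + moderate_window:
            turn.fidelity = "moderate"
        else:
            turn.fidelity = "compressed"
    return turns
```

With a 15-turn session and the defaults above, the last four turns stay verbatim, the next eight are condensed, and the oldest three are summarized; older history is still present, just denser.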
Budget-Aware
Threshold cascade
Four thresholds ensure the model never gets close enough to the limit to behave erratically.
60%
LLM Summarization
Proactive — compress old turns into smart summaries with headroom to spare
70%
Budget Pressure Warning
Model is alerted that context is growing and adjusts behavior accordingly
92%
Emergency Mechanical Fallback
Fast, deterministic trimming with explanatory markers if LLM summarization couldn't keep up
100%
Hard Cap
Never exceeded — the budget is the budget
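The cascade above is straightforward to picture as a single decision function. The thresholds come from the text; the function and action names are hypothetical, a sketch of the dispatch logic rather than the bundle's actual code.

```python
def cascade_action(usage_ratio: float) -> str:
    """Pick an action from the four-threshold cascade, given the
    fraction of the context budget currently in use."""
    if usage_ratio >= 1.00:
        return "hard-cap"         # 100%: the budget is never exceeded
    if usage_ratio >= 0.92:
        return "mechanical-trim"  # 92%: deterministic fallback trimming
    if usage_ratio >= 0.70:
        return "pressure-warn"    # 70%: alert the model to growing context
    if usage_ratio >= 0.60:
        return "llm-summarize"    # 60%: proactive LLM summarization
    return "noop"                 # below 60%: plenty of headroom
```

Because summarization fires at 60%, there is a 32-point band of headroom before the mechanical fallback is ever needed.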
Recovery
On-demand verbatim history
When a summary isn't enough, the model can pull the exact original conversation back.
- Persistent JSONL transcript with format versioning
- Full tool inputs and outputs preserved — nothing discarded
- The model decides when it needs more detail and retrieves it on demand
- Session resume support — pick up where you left off
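A persistent, versioned JSONL transcript like the one described above can be sketched as an append-and-slice pair of helpers. Field names (`v`, `role`, `content`, `tool_result`) and the version constant are assumptions for illustration, not the bundle's actual record format.

```python
import json

FORMAT_VERSION = 1  # hypothetical: a version stamp on every record

def append_turn(path, role, content, tool_result=None):
    """Append one turn to the JSONL transcript. Appending (never
    rewriting) is what lets the transcript survive crashes."""
    record = {"v": FORMAT_VERSION, "role": role, "content": content}
    if tool_result is not None:
        record["tool_result"] = tool_result
    with open(path, "a", encoding="utf-8") as f:
        f.write(json.dumps(record) + "\n")

def recover_turns(path, start, end):
    """Pull a range of exact original turns back on demand,
    tool outputs included."""
    with open(path, encoding="utf-8") as f:
        return [json.loads(line) for line in f][start:end]
```

Because every record is stamped with a format version, a future reader can detect and handle older transcripts instead of misparsing them.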
Integration
Drop-in replacement
The bundle includes its own root bundle.md with all foundation behaviors built in. No changes to your existing tools, providers, or agents.
📦
Add the bundle
One command installs the bundle and its modules from the GitHub repo.
🔌
Set it active
Use it as your main active bundle. Your tools, providers, and agents stay exactly the same.
amplifier bundle add git+https://github.com/microsoft/amplifier-bundle-context-managed
amplifier bundle use context-managed
Quality
Tested in production conditions
329
Tests across 2 modules
10
Bugs found & fixed in E2E
2
Modules (context + transcript tool)
v1
JSONL format versioning
- Real end-to-end testing in Docker containers — not just unit tests
- Session resume support for interrupted workflows
- Persistent transcript survives crashes and restarts
- Format versioning ensures future compatibility
Get Started
Try it now
Add the bundle, set it active. Your sessions get smarter context management immediately.
Next: create your own custom bundle taking inspiration from foundation, amplifier-dev, and more.
View on GitHub
github.com/microsoft/amplifier-bundle-context-managed