🔄 Prosthetic Tool
context_recovery
Full re-entry brief when returning to a dormant topic. A “previously on...” for your thinking.
When to use this
Use context_recovery when returning to a domain after days or weeks away. It provides a richer brief than tunnel_state, with full summaries of recent conversations to help you re-enter the context.
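The brief comes back as plain text, so it is easy to feed into a session checklist. Below is a minimal sketch of pulling the unresolved questions out of a brief; the embedded sample mirrors the response format shown on this page, and the parsing logic is an assumption about that plain-text layout, not an official API.

```python
# Sketch: extract the open questions from a context_recovery brief so they
# can seed a session TODO list. BRIEF is a trimmed sample in the format
# this page documents; real output may differ.
BRIEF = """\
🔄 Context Recovery: data-engineering
Last active: 12 days ago | Stage: crystallizing
❓ Top unresolved questions:
- Incremental update strategy for parquet
- Batch size optimization for embeddings
✅ Key decisions still in effect:
- DuckDB over pandas for transforms
"""

def open_questions(brief: str) -> list[str]:
    """Collect the bullet lines under the unresolved-questions header."""
    questions, collecting = [], False
    for line in brief.splitlines():
        if line.startswith("❓"):
            collecting = True
        elif collecting and line.startswith("- "):
            questions.append(line[2:])
        elif collecting:
            break  # next section reached
    return questions
```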
Quick Example
mcporter call brain.context_recovery domain="data-engineering"
Response
🔄 Context Recovery: data-engineering
Last active: 12 days ago | Stage: crystallizing
📋 Recent Summary (5 conversations):
1. "ETL Pipeline Redesign" (3 weeks ago)
→ Decided: Move from pandas to DuckDB for transformations
→ Open: How to handle incremental updates?
2. "Parquet Partitioning Strategy" (3 weeks ago)
→ Decided: Partition by year/month for balance
→ Breakthrough: Predicate pushdown saves 10x on reads
3. "Embedding Pipeline Optimization" (2 weeks ago)
→ Open: Batch size vs memory tradeoff
→ Open: Should we pre-filter noise before embedding?
❓ Top unresolved questions:
- Incremental update strategy for parquet
- Batch size optimization for embeddings
- Error handling for malformed conversation imports
✅ Key decisions still in effect:
- DuckDB over pandas for transforms
- Parquet with year/month partitioning
- nomic-embed-text-v1.5 as embedding model
Parameters
| Name | Type | Required | Default | Description |
|---|---|---|---|---|
| domain | string | Yes | — | The domain to recover context for |
| summary_count | integer | No | 5 | Number of recent summaries to include in the recovery brief |
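If you script these calls, the parameters map directly onto the `key=value` argument style used in the examples on this page. A small sketch of composing the invocation, taking the mcporter CLI syntax shown here as given (the helper name is hypothetical):

```python
import shlex

# Sketch: build the mcporter command line from the documented parameters.
# summary_count defaults to 5 per the table above, so it is only emitted
# when it differs from that default.
def recovery_command(domain: str, summary_count: int = 5) -> str:
    args = [f"domain={domain}"]
    if summary_count != 5:
        args.append(f"summary_count={summary_count}")
    quoted = " ".join(shlex.quote(a) for a in args)
    return f"mcporter call brain.context_recovery {quoted}"
```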
Examples
Basic recovery
mcporter call brain.context_recovery domain="ai-dev"
Deeper recovery with more summaries
mcporter call brain.context_recovery domain="business-strategy" summary_count=10
context_recovery vs tunnel_state
| | context_recovery | tunnel_state |
|---|---|---|
| Use case | Returning after extended absence | Quick check at start of session |
| Detail level | Full conversation summaries | Aggregated state snapshot |
| Response size | Larger (includes narrative) | Compact (key metrics) |
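The distinction above can be reduced to a simple session-start heuristic: reach for the richer brief only after a real absence. A minimal sketch, where the 3-day threshold is an assumption of this example rather than anything the tools define:

```python
# Sketch: pick the re-entry tool based on how long the domain has been
# dormant. Short gaps only need the compact tunnel_state snapshot; longer
# gaps warrant the full context_recovery brief.
def pick_reentry_tool(days_since_active: int) -> str:
    return "context_recovery" if days_since_active >= 3 else "tunnel_state"
```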