
Conversation

@blinkagent (bot) commented on Dec 16, 2025

Summary

Adds conversation compaction support to the scout-agent package to prevent context overflow errors during long conversations.

Problem

During long conversations, the accumulated messages can exceed the model's context window and the request errors out. scout-agent had no mechanism to handle this gracefully.

Solution

Adds a compaction module that provides:

Token Counting

  • Uses ai-tokenizer for accurate, model-specific token counting
  • Supports Claude, GPT, Gemini, and other models
  • Async API with lazy-loaded tokenizer modules
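
As a rough illustration, the token-counting export might be called like this; the exact signature isn't quoted in this PR, so the `(messages, modelName)` shape and the `Promise<number>` return type are assumptions:

```ts
import type { ModelMessage } from "ai";
// "./compaction" is the module path the commits below import from.
import { countConversationTokens } from "./compaction";

async function logTokenUsage(messages: ModelMessage[]): Promise<void> {
  // Assumed shape: (messages, modelName) => Promise<number>; the async API
  // lets the model-specific tokenizer module be lazy-loaded on first use.
  const tokens = await countConversationTokens(messages, "claude-sonnet-4");
  console.log(`conversation is roughly ${tokens} tokens`);
}
```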

Compaction Tool

  • compact_conversation tool that accepts detailed summaries
  • Summaries should include: topics, decisions, code changes, file paths, action items, technical details
  • The tool result is detected in subsequent messages and used to apply the compaction (sketched below)
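
A minimal sketch of that flow, assuming `createCompactionTool` returns an AI SDK tool and that `findCompactionSummary`/`applyCompaction` operate on the message array (the parameter shapes are guesses, not taken from the code):

```ts
import type { ModelMessage } from "ai";
import { createCompactionTool, findCompactionSummary, applyCompaction } from "./compaction";

// Expose the tool so the model can call it with a detailed summary.
const tools = { compact_conversation: createCompactionTool() };

function maybeCompact(messages: ModelMessage[]): ModelMessage[] {
  // Assumed: scans the messages for a compact_conversation tool result
  // and returns its summary, or undefined if none exists yet.
  const summary = findCompactionSummary(messages);
  if (!summary) return messages;
  // Assumed: replaces the pre-compaction messages with the summary,
  // keeping everything after the compaction point intact.
  return applyCompaction(messages, summary);
}
```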

blink-so bot added 6 commits December 16, 2025 10:31
Adds conversation compaction support to prevent context overflow errors.

Features:
- Token counting using ai-tokenizer with model-specific accuracy
- Compaction tool (compact_conversation) for summarizing conversations
- Warning message generation when approaching token limits
- Context length error detection for various providers
- Emergency compaction for when the compaction request itself would exceed the context window

Exports:
- countConversationTokens - Count tokens in ModelMessage[]
- shouldCompact - Check if compaction is needed
- findCompactionSummary - Find existing compaction in messages
- applyCompaction - Replace pre-compaction messages with summary
- createCompactionTool - Create the compact_conversation tool
- createCompactionWarningMessage - Generate warning message
- isContextLengthError - Detect context overflow errors
- calculateEmergencyCompactionConfig - Plan emergency compaction
- createEmergencyCompactionMessage - Generate emergency request
- prepareEmergencyCompactionMessages - Split messages for emergency compaction

Includes 21 tests covering all functionality.
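
To make the error-recovery exports above concrete, here is one way they might fit together; every signature below is inferred from the one-line descriptions, not from the code:

```ts
import type { ModelMessage } from "ai";
import {
  isContextLengthError,
  calculateEmergencyCompactionConfig,
  createEmergencyCompactionMessage,
  prepareEmergencyCompactionMessages,
} from "./compaction";

async function callWithEmergencyCompaction(
  messages: ModelMessage[],
  call: (msgs: ModelMessage[]) => Promise<string>,
): Promise<string> {
  try {
    return await call(messages);
  } catch (err) {
    // Only recover from provider errors that signal a blown context window.
    if (!isContextLengthError(err)) throw err;
    // Assumed: plan how much history must be summarized so that the
    // compaction request itself fits in the context window.
    const config = calculateEmergencyCompactionConfig(messages);
    // Assumed field names for the split result.
    const { toSummarize, toKeep } = prepareEmergencyCompactionMessages(messages, config);
    // Assumed: build the summarization request, get the summary back, and
    // retry with the summary standing in for the older messages.
    const request = createEmergencyCompactionMessage(toSummarize);
    const summary = await call([request]);
    return call([{ role: "user", content: summary }, ...toKeep]);
  }
}
```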
This commit:
- Imports applyCompaction and createCompactionTool from ./compaction
- Applies compaction to messages before processing in buildStreamTextParams
- Logs when compaction is applied (showing message count reduction)
- Adds the compaction tool to the tools object so the model can call it
- Uses compacted messages for Slack metadata detection and model conversion
- Fixes isolated declarations issue in createCompactionTool by adding return type
- Fixes test assertion for execute method on Tool type

This commit adds:
- New compaction option in BuildStreamTextParamsOptions (set to false to disable compaction features) that configures:
  - warningThreshold: token count that triggers the compaction warning (default: 80% of max)
  - maxTokenThreshold: maximum tokens for context (default: 100k)
  - modelName: model name for token counting
- Token counting using ai-tokenizer after message conversion
- Automatic injection of compaction warning message when threshold exceeded
- Logging for token thresholds and warning injection
- Tests for:
  - Compaction tool is included by default
  - Existing compaction summaries are applied
  - Warning message is injected when threshold exceeded
  - Compaction can be disabled with compaction: false
  - Custom thresholds are respected

Note: core.test.ts tests may not run locally due to bun/HTTPParser
incompatibility, but work in CI.
maxTokenThreshold was only used for display in the warning message.
Simplified to just use warningThreshold for both triggering and display.
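
After this simplification, the option might be used roughly as follows; the field names come from the commit messages, but the call shape, module path, and overall `BuildStreamTextParamsOptions` structure shown here are assumptions:

```ts
import type { ModelMessage } from "ai";
import { buildStreamTextParams } from "./core"; // hypothetical module path

declare const messages: ModelMessage[];

// Compaction enabled: warn (and expose the tool) once the conversation
// passes ~80k tokens, counted with the tokenizer for the named model.
const params = await buildStreamTextParams(messages, {
  compaction: {
    warningThreshold: 80_000,
    modelName: "claude-sonnet-4", // example model name
  },
});

// Or opt out of compaction entirely.
const paramsWithoutCompaction = await buildStreamTextParams(messages, {
  compaction: false,
});
```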

The compact_conversation tool is now only available when the token
threshold is exceeded and the warning message is injected. This keeps
the tool list clean when compaction is not needed.
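
A sketch of that gating logic, assuming `shouldCompact` compares a token count with the threshold and that the warning helper returns a message that can be appended to the conversation:

```ts
import type { ModelMessage, Tool } from "ai";
import {
  countConversationTokens,
  shouldCompact,
  createCompactionTool,
  createCompactionWarningMessage,
} from "./compaction";

async function addCompactionIfNeeded(
  messages: ModelMessage[],
  tools: Record<string, Tool>,
  warningThreshold: number,
  modelName: string,
): Promise<void> {
  const tokens = await countConversationTokens(messages, modelName);
  // Assumed shape: shouldCompact(tokenCount, threshold) => boolean.
  if (!shouldCompact(tokens, warningThreshold)) return;
  // Only now expose the tool and inject the warning, so the tool list
  // stays clean when compaction is not needed.
  tools.compact_conversation = createCompactionTool();
  messages.push(createCompactionWarningMessage(tokens, warningThreshold));
}
```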

Updated tests to verify:
- Tool is NOT available when under threshold
- Tool IS available when warning is injected
- Tool is NOT available when compaction is disabled
@hugodutka force-pushed the feat/conversation-compaction branch 6 times, most recently from f3d4ce8 to 909fa77 on December 16, 2025 at 17:10
@hugodutka force-pushed the feat/conversation-compaction branch from 909fa77 to c02246f on December 16, 2025 at 17:36
@hugodutka merged commit a162ad8 into main on Dec 16, 2025
4 checks passed