add MockLLMClient for testing without API credentials #11
falloficarus22 merged 2 commits into main
Conversation
falloficarus22
commented
Feb 28, 2026
- The mock client allows running tests without needing real OpenAI API keys
- Supports configurable responses, latency simulation and call history tracking
- Can be enabled with --features mock-llm
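The features above can be sketched as a minimal, synchronous stand-in (the real client in `agent-tui/src/llm/mod.rs` is async and its exact API may differ; all names here are illustrative):

```rust
use std::time::Duration;

// Hypothetical sketch of the mock's shape: configurable canned responses,
// simulated latency, and call-history tracking.
struct MockLLMClient {
    responses: Vec<String>, // canned responses, returned in order (cycling)
    latency: Duration,      // simulated per-call latency
    history: Vec<String>,   // every prompt the client has received
    next: usize,
}

impl MockLLMClient {
    fn new() -> Self {
        Self {
            responses: Vec::new(),
            latency: Duration::ZERO,
            history: Vec::new(),
            next: 0,
        }
    }

    fn push_response(&mut self, r: &str) {
        self.responses.push(r.to_string());
    }

    fn set_latency(&mut self, ms: u64) {
        self.latency = Duration::from_millis(ms);
    }

    fn send_message(&mut self, prompt: &str) -> String {
        std::thread::sleep(self.latency);      // latency simulation
        self.history.push(prompt.to_string()); // call-history tracking
        if self.responses.is_empty() {
            String::new()
        } else {
            let r = self.responses[self.next % self.responses.len()].clone();
            self.next += 1;
            r
        }
    }
}
```

A test can then push responses, drive the client, and assert against `history` without any API key.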
@codex review
💡 Codex Review
Here are some automated review suggestions for this pull request.
Reviewed commit: 861e0dfdda
ℹ️ About Codex in GitHub
Your team has set up Codex to review pull requests in this repo. Reviews are triggered when you
- Open a pull request for review
- Mark a draft as ready
- Comment "@codex review".
If Codex has suggestions, it will comment; otherwise it will react with 👍.
Codex can also answer questions or update the PR. Try commenting "@codex address that feedback".
agent-tui/src/llm/mod.rs
Outdated
```rust
pub async fn send_message_streaming(
    &self,
    messages: &[Message],
) -> Result<Pin<Box<dyn Stream<Item = Result<String>> + Send>>> {
    self.record_call(messages, true);
```
Respect streaming toggle in mock streaming API
The set_streaming(false) configuration is never enforced: send_message_streaming does not read streaming_enabled and always records is_streaming as true while returning chunked output. In tests that try to disable streaming to validate non-streaming behavior, this mock will still behave as streaming and can mask regressions in code paths that depend on the toggle.
Useful? React with 👍 / 👎.
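One way to fix this is to read the flag before recording the call and branch on it. A minimal sketch, using a simplified synchronous stand-in for the async mock (all names assumed, not the PR's actual code):

```rust
// Hypothetical simplified mock demonstrating the fix:
// record_call should use the streaming_enabled flag, not a hardcoded `true`.
struct MockLLMClient {
    streaming_enabled: bool,
    response: String,
    calls: Vec<bool>, // is_streaming flag per recorded call
}

impl MockLLMClient {
    fn new(response: &str) -> Self {
        Self {
            streaming_enabled: true,
            response: response.to_string(),
            calls: Vec::new(),
        }
    }

    fn set_streaming(&mut self, enabled: bool) {
        self.streaming_enabled = enabled;
    }

    // Stand-in for send_message_streaming: honors the toggle.
    fn send_message_streaming(&mut self) -> Vec<String> {
        let streaming = self.streaming_enabled;
        self.calls.push(streaming); // record the actual mode, not `true`
        if streaming {
            // streaming: chunked output, one chunk per word
            self.response
                .split_whitespace()
                .map(|s| s.to_string())
                .collect()
        } else {
            // non-streaming: a single chunk with the full response
            vec![self.response.clone()]
        }
    }
}
```

With this shape, a test that calls `set_streaming(false)` observes both a single-chunk response and `is_streaming == false` in the call history.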
agent-tui/src/llm/mod.rs
Outdated
```rust
if latency > 0 {
    // For simplicity, we don't add per-character latency in this mock
    // but the total stream could be delayed if needed
}
```
Apply configured latency to streaming mock responses
The streaming path reads latency_ms but the if latency > 0 block is a no-op, so set_latency(...) has no effect for streaming calls. This makes latency- or timeout-sensitive tests unreliable when they use send_message_streaming, because the mock cannot reproduce delayed streaming behavior.
Useful? React with 👍 / 👎.
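One simple remedy is to spread the configured latency across the emitted chunks so `set_latency(...)` observably delays streaming calls. A sketch using a blocking sleep for illustration (the real async path would presumably use something like `tokio::time::sleep` instead; the helper name is hypothetical):

```rust
use std::thread;
use std::time::{Duration, Instant};

// Hypothetical helper: divide the total configured latency evenly
// across the chunks, sleeping before yielding each one.
fn stream_with_latency(chunks: &[&str], latency_ms: u64) -> Vec<String> {
    let per_chunk = if chunks.is_empty() {
        0
    } else {
        latency_ms / chunks.len() as u64
    };
    chunks
        .iter()
        .map(|c| {
            if per_chunk > 0 {
                // delay each chunk so the whole stream takes ~latency_ms
                thread::sleep(Duration::from_millis(per_chunk));
            }
            c.to_string()
        })
        .collect()
}
```

This keeps latency-sensitive tests meaningful: a stream configured with 20 ms of latency actually takes at least that long to drain, while `latency_ms == 0` remains instantaneous.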