@kevalmahajan kevalmahajan commented Oct 27, 2025

🐛 Bug-fix PR

📌 Summary

Closes #1239 and #1348

  1. Added support for the IBM watsonx.ai LLM provider.
  2. Added in-depth documentation for LLM Chat, covering setup, usage, architecture, and more.
  3. Updated Redis configuration requirements for session persistence and chat consistency.
  4. Added integration tests for the LLM Chat endpoints, covering scenarios such as multi-worker coordination, Redis failure cases, chat history persistence, and more.
  5. Moved environment-variable configuration into the settings class for consistency.
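As a rough illustration of item 5, the sketch below shows one way environment variables with an `LLMCHAT_` prefix could be centralized behind a settings loader. The function name and exact variable names are illustrative (only `LLMCHAT_SESSION_TTL` and the defaults mentioned later in this PR are taken from the source), not the PR's actual code:

```python
import os

# Hypothetical sketch: load LLM Chat settings from LLMCHAT_-prefixed
# environment variables, falling back to the defaults this PR documents
# (TTLs of 3600 seconds, 50-message history cap, granite-13b chat model).
def load_llmchat_settings(env=None):
    env = os.environ if env is None else env
    return {
        "session_ttl": int(env.get("LLMCHAT_SESSION_TTL", "3600")),
        "chat_history_ttl": int(env.get("LLMCHAT_CHAT_HISTORY_TTL", "3600")),
        "chat_history_max_messages": int(
            env.get("LLMCHAT_CHAT_HISTORY_MAX_MESSAGES", "50")
        ),
        "model_id": env.get("LLMCHAT_MODEL_ID", "ibm/granite-13b-chat-v2"),
    }
```

Keeping all reads behind one loader (or a settings class) means every worker resolves the same defaults, which matters once sessions are shared through Redis.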

🧪 Verification

| Check | Command | Status |
| --- | --- | --- |
| Lint suite | `make lint` | Pass |
| Unit tests | `make test` | Pass |
| Coverage ≥ 90 % | `make coverage` | |
| Manual regression (steps / screenshots) | | no longer fails |

📐 MCP Compliance (if relevant)

  • Matches current MCP spec
  • No breaking change to MCP clients

✅ Checklist

  • Code formatted (make black isort pre-commit)
  • No secrets/credentials committed

@kevalmahajan kevalmahajan marked this pull request as draft October 27, 2025 16:57
@kevalmahajan kevalmahajan marked this pull request as ready for review October 28, 2025 05:46
kevalmahajan and others added 18 commits October 28, 2025 21:14
Signed-off-by: Keval Mahajan <[email protected]>
- Fix typo: "tempurature" → "temperature" in README
- Fix wrong default model: "granite3b" → "ibm/granite-13b-chat-v2"
- Fix model_id type: int → string
- Fix duplicate in variable name: LLMCHAT_SESSION_SESSION_TTL → LLMCHAT_SESSION_TTL
- Fix incorrect defaults for Redis settings (CHAT_HISTORY_TTL: 0.2 → 3600, CHAT_HISTORY_MAX_MESSAGES: 0.2 → 50)
- Add missing LLMCHAT_ prefix to env vars in llm-chat.md to match config.py
- Improve table formatting for consistency

Signed-off-by: Mihai Criveti <[email protected]>
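
As a rough illustration of how the corrected Redis history settings (TTL 3600, 50-message cap) might be applied, here is a minimal sketch. The function name, key layout, and client wiring are illustrative assumptions, not the PR's actual code; it assumes a redis-py-style client exposing `rpush`/`ltrim`/`expire`/`lrange`:

```python
# Hypothetical sketch: append a message to a Redis-backed chat history,
# capping the list at the most recent `max_messages` entries and
# refreshing the session TTL on every write.
def append_chat_message(client, session_id, message, ttl=3600, max_messages=50):
    key = f"llmchat:history:{session_id}"   # illustrative key scheme
    client.rpush(key, message)              # append to the end of the list
    client.ltrim(key, -max_messages, -1)    # keep only the newest N messages
    client.expire(key, ttl)                 # expire idle sessions after `ttl` s
    return client.lrange(key, 0, -1)        # return the current history
```

Because the history lives in Redis rather than in worker memory, any gunicorn worker can serve the next turn of a session, which is the multi-worker coordination scenario the integration tests exercise.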
@crivetimihai crivetimihai force-pushed the add_watsonx_provider_support branch from 0d9c484 to 974e161 Compare October 28, 2025 22:18
Rename documentation images from generic "Untitled diagram-*" to
descriptive names without spaces:
- Untitled diagram-2025-10-27-134134.png → llm-chat-workflow-architecture.png
- Untitled diagram-2025-10-27-134308.png → llm-chat-session-management.png

Update image references in llm-chat.md with proper alt text.

Signed-off-by: Mihai Criveti <[email protected]>
@crivetimihai crivetimihai merged commit 44ee927 into main Oct 28, 2025
45 checks passed
@crivetimihai crivetimihai deleted the add_watsonx_provider_support branch October 28, 2025 22:35

Development

Successfully merging this pull request may close these issues.

LLMChat Multi-Worker: Add Documentation and Integration Tests (PR #1236 Follow-up)

3 participants