
Commit 974e161

docs: fix typos and inconsistencies in LLM Chat documentation
- Fix typo: "tempurature" → "temperature" in README
- Fix wrong default model: "granite3b" → "ibm/granite-13b-chat-v2"
- Fix model_id type: int → string
- Fix duplicate in variable name: LLMCHAT_SESSION_SESSION_TTL → LLMCHAT_SESSION_TTL
- Fix incorrect defaults for Redis settings (CHAT_HISTORY_TTL: 0.2 → 3600, CHAT_HISTORY_MAX_MESSAGES: 0.2 → 50)
- Add missing LLMCHAT_ prefix to env vars in llm-chat.md to match config.py
- Improve table formatting for consistency

Signed-off-by: Mihai Criveti <[email protected]>
1 parent 48d83bc commit 974e161
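For anyone upgrading an existing deployment, the renames in this commit can be applied to an old `.env` file in one pass. A hedged sketch: the old and new variable names come straight from the commit message above, while the `.env` path and the in-place `sed` backup suffix are assumptions.

```bash
# Rename the old unprefixed LLM Chat variables to their new LLMCHAT_ names.
# Writes a .env.bak backup before editing in place (works with GNU and BSD sed).
sed -i.bak \
  -e 's/^SESSION_TTL=/LLMCHAT_SESSION_TTL=/' \
  -e 's/^SESSION_LOCK_TTL=/LLMCHAT_SESSION_LOCK_TTL=/' \
  -e 's/^SESSION_LOCK_RETRIES=/LLMCHAT_SESSION_LOCK_RETRIES=/' \
  -e 's/^SESSION_LOCK_WAIT=/LLMCHAT_SESSION_LOCK_WAIT=/' \
  -e 's/^CHAT_HISTORY_TTL=/LLMCHAT_CHAT_HISTORY_TTL=/' \
  -e 's/^CHAT_HISTORY_MAX_MESSAGES=/LLMCHAT_CHAT_HISTORY_MAX_MESSAGES=/' \
  .env
```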

3 files changed: +109 −110 lines changed

README.md

Lines changed: 10 additions & 10 deletions
````diff
@@ -1239,13 +1239,13 @@ The LLM Chat MCP Client allows you to interact with MCP servers using conversati


 **IBM WatsonX AI**
-| Setting                 | Description                      | Default      | Options          |
-| ----------------------- | -------------------------------- | ------------ | ---------------- |
-| `WATSONX_URL`           | watsonx url                      | (none)       | string           |
-| `WATSONX_APIKEY`        | API key                          | (none)       | string           |
-| `WATSONX_PROJECT_ID`    | Project Id for WatsonX           | (none)       | string           |
-| `WATSONX_MODEL_ID`      | Watsonx model id                 | `granite3b`  | int              |
-| `WATSONX_TEMPERATURE`   | tempurature (optional)           | `0.7`        | float (0.0-1.0)  |
+| Setting                 | Description                      | Default                   | Options          |
+| ----------------------- | -------------------------------- | ------------------------- | ---------------- |
+| `WATSONX_URL`           | watsonx url                      | (none)                    | string           |
+| `WATSONX_APIKEY`        | API key                          | (none)                    | string           |
+| `WATSONX_PROJECT_ID`    | Project Id for WatsonX           | (none)                    | string           |
+| `WATSONX_MODEL_ID`      | Watsonx model id                 | `ibm/granite-13b-chat-v2` | string           |
+| `WATSONX_TEMPERATURE`   | temperature (optional)           | `0.7`                     | float (0.0-1.0)  |


 **Ollama Configuration:**
@@ -1277,12 +1277,12 @@ The LLM Chat MCP Client allows you to interact with MCP servers using conversati

 | Setting                              | Description                                | Default | Options |
 | ------------------------------------ | ------------------------------------------ | ------- | ------- |
-| `LLMCHAT_SESSION_SESSION_TTL`        | Seconds for active_session key TTL        | `300`   | int     |
+| `LLMCHAT_SESSION_TTL`                | Seconds for active_session key TTL        | `300`   | int     |
 | `LLMCHAT_SESSION_LOCK_TTL`           | Seconds for lock expiry                   | `30`    | int     |
 | `LLMCHAT_SESSION_LOCK_RETRIES`       | How many times to poll while waiting      | `10`    | int     |
 | `LLMCHAT_SESSION_LOCK_WAIT`          | Seconds between polls                     | `0.2`   | float   |
-| `LLMCHAT_CHAT_HISTORY_TTL`           | Seconds for chat history expiry           | `0.2`   | float   |
-| `LLMCHAT_CHAT_HISTORY_MAX_MESSAGES`  | Maximum message history to store per user | `0.2`   | float   |
+| `LLMCHAT_CHAT_HISTORY_TTL`           | Seconds for chat history expiry           | `3600`  | int     |
+| `LLMCHAT_CHAT_HISTORY_MAX_MESSAGES`  | Maximum message history to store per user | `50`    | int     |

 **Documentation:**
 - [LLM Chat Guide](https://ibm.github.io/mcp-context-forge/using/clients/llm-chat) - Complete LLM Chat setup and provider configuration
````
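Put together, the corrected WatsonX rows translate to an environment block like the following. This is an illustrative sketch, not part of the diff: the endpoint URL and credential placeholders are assumptions, while the variable names, default model, and temperature range come from the table above.

```bash
# IBM WatsonX AI provider settings (placeholder values)
WATSONX_URL=https://us-south.ml.cloud.ibm.com   # region endpoint (assumed example)
WATSONX_APIKEY=<your-ibm-cloud-api-key>
WATSONX_PROJECT_ID=<your-watsonx-project-id>
WATSONX_MODEL_ID=ibm/granite-13b-chat-v2        # corrected default; a string, not an int
WATSONX_TEMPERATURE=0.7                         # float in 0.0-1.0
```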

docs/docs/using/clients/llm-chat.md

Lines changed: 8 additions & 8 deletions
````diff
@@ -87,33 +87,33 @@ OLLAMA_TEMPERATURE=0.7

 ```bash
 # ===== Redis Connection =====
-CACHE_TYPE=redis                     # Enable Redis
+CACHE_TYPE=redis                            # Enable Redis
 REDIS_URL=redis://localhost:6379/0          # Redis connection string

 # ===== Session Management =====
-SESSION_TTL=300                             # Active session ownership TTL (seconds)
+LLMCHAT_SESSION_TTL=300                     # Active session ownership TTL (seconds)
 # Default: 300 (5 minutes)
 # Higher values: More stable sessions, slower failover
 # Lower values: Faster failover, more frequent TTL refreshes

 # ===== Lock Configuration =====
-SESSION_LOCK_TTL=30                         # Lock expiry time (seconds)
+LLMCHAT_SESSION_LOCK_TTL=30                 # Lock expiry time (seconds)
 # Default: 30 seconds
 # Should be > typical initialization time

-SESSION_LOCK_RETRIES=10                     # Lock acquisition retry attempts
+LLMCHAT_SESSION_LOCK_RETRIES=10             # Lock acquisition retry attempts
 # Default: 10 attempts
 # Total wait time = RETRIES × WAIT

-SESSION_LOCK_WAIT=0.2                       # Wait between retry attempts (seconds)
+LLMCHAT_SESSION_LOCK_WAIT=0.2               # Wait between retry attempts (seconds)
 # Default: 0.2 seconds (200ms)
 # Total max wait: 10 × 0.2 = 2 seconds

 # ===== Chat History =====
-CHAT_HISTORY_TTL=3600                       # History expiry time (seconds)
+LLMCHAT_CHAT_HISTORY_TTL=3600               # History expiry time (seconds)
 # Default: 3600 (1 hour)

-CHAT_HISTORY_MAX_MESSAGES=50                # Maximum messages per user
+LLMCHAT_CHAT_HISTORY_MAX_MESSAGES=50        # Maximum messages per user
 # Default: 50
 # Older messages automatically trimmed
 ```
@@ -153,7 +153,7 @@ After selecting a server, expand the **LLM Configuration** section and choose yo
 5. IBM watsonx.ai
 6. Ollama

-Select your preferred provider. If youve already configured LLM details like API key, URL, etc., in the environment variables, you can leave the next fields blank.
+Select your preferred provider. If you've already configured LLM details like API key, URL, etc., in the environment variables, you can leave the next fields blank.


 ### Step 3: 🔗 Connect to the Chat Session
````
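As a quick sanity check on the renamed settings, the lock-timing math from the comments can be verified and the Redis side inspected with standard tools. A hedged sketch, assuming a local Redis at the `REDIS_URL` above; `active_session` is the key name the README table refers to, and its exact form in Redis is an assumption.

```bash
# Total lock wait = LLMCHAT_SESSION_LOCK_RETRIES × LLMCHAT_SESSION_LOCK_WAIT = 10 × 0.2 = 2 seconds
awk 'BEGIN { print 10 * 0.2 " seconds max lock wait" }'

# Confirm Redis is reachable and see how long the current session claim lives
redis-cli -u redis://localhost:6379/0 PING                 # expect: PONG
redis-cli -u redis://localhost:6379/0 TTL active_session   # remaining seconds, or -2 if no such key
```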
