* LLM support can be installed with new extras "llm" or "all"
* conditionally import "llm" and "llm.cli"
* show an alternative message when the user attempts to use \llm without
  the dependencies available. Unlike the ssh extras, there is no need to
  exit on failure.
* cache the list of possible CLI commands for performance, avoiding a
regression
* provide environment variable to turn off LLM support even in the
presence of the llm dependency
* update quickstart to recommend installing with the "all" extra
* update changelog
* update doc/llm.md
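
The bullets above describe a single import-time gate. A minimal sketch of how such a gate could fit together, assuming a module-level guard; apart from `MYCLI_LLM_OFF`, the `llm` and `llm.cli` modules, and the `\llm`/`\ai` commands, every name below (`LLM_HINT`, `special_commands`, `handle_llm`) is illustrative rather than mycli's actual internals:

```python
import os
from functools import lru_cache

# Conditionally import "llm" and "llm.cli", honoring the opt-out variable.
try:
    if os.environ.get("MYCLI_LLM_OFF"):
        llm = None  # user turned LLM support off explicitly
    else:
        import llm
        import llm.cli
except ImportError:
    llm = None  # dependency missing; \llm degrades to a hint, no exit

# Hypothetical message shown instead of exiting (unlike the ssh extras).
LLM_HINT = "LLM support requires the llm package: pip install 'mycli[llm]'"

@lru_cache(maxsize=1)
def special_commands() -> tuple:
    """Build the list of possible CLI commands once and cache it, so
    completion does not recompute it on every prompt."""
    return ("\\llm", "\\ai") if llm is not None else ()

def handle_llm(args: str) -> None:
    # Illustrative dispatcher: print the hint rather than exiting.
    if llm is None:
        print(LLM_HINT)
        return
    ...  # forward args to llm.cli
```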
doc/llm.md (21 additions, 2 deletions):

@@ -8,13 +8,22 @@ Alias: `\ai` works the same as `\llm`.
 
 ## Quick Start
 
-1) Configure your API key (only needed for remote providers like OpenAI):
+1) Make sure mycli is installed with the `[llm]` extras, like
+```bash
+pip install 'mycli[llm]'
+```
+or that the `llm` dependency is installed separately:
+```bash
+pip install llm
+```
+
+2) From the mycli prompt, configure your API key (only needed for remote providers like OpenAI):
 
 ```text
 \llm keys set openai
 ```
 
-2) Ask a question. The response’s SQL (inside a ```sql fenced block) is extracted and pre-filled at the prompt:
+3) Ask a question. The response’s SQL (inside a ```sql fenced block) is extracted and pre-filled at the prompt:
 
 ```text
 World> \llm "Capital of India?"
@@ -165,6 +174,16 @@ World> \llm templates show mycli-llm-template
 - Data sent: Contextual questions send schema (table/column names and types) and a single sample row per table. Review your data sensitivity policies before using remote models; prefer local models (such as ollama) if needed.
 - Help: Running `\llm` with no arguments shows a short usage message.
 
+## Turning Off LLM Support
+
+To turn off LLM support even when the `llm` dependency is installed, set the `MYCLI_LLM_OFF` environment variable:
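
With that last hunk applied, users who have the `llm` package installed but want plain mycli behavior can launch with the variable set, e.g. `MYCLI_LLM_OFF=1 mycli`; the diff is truncated before the doc's own example, so treating any non-empty value as "off" is an assumption here.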