Commit 2be3acd

make LLM support optional

* LLM support can be installed with the new extras "llm" or "all"
* conditionally import "llm" and "llm.cli"
* show an alternative message when the user attempts to use \llm without the dependencies available; unlike the ssh extras, there is no need to exit on failure
* cache the list of possible CLI commands for performance, avoiding a regression
* provide an environment variable to turn off LLM support even in the presence of the llm dependency
* update the quickstart to recommend installing with the "all" extra
* update changelog
* update doc/llm.md

1 parent 45b439a commit 2be3acd
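The guarded-import pattern this commit introduces (import the optional dependency only if it is installed and an environment-variable kill switch is unset) can be sketched in isolation. This is a minimal illustration; `try_optional_import` is a hypothetical helper, not part of mycli, and a stdlib module stands in for the real `llm` package:

```python
import importlib
import os


def try_optional_import(module_name: str, off_env_var: str) -> bool:
    """Return True only if the module is importable and the kill switch is unset."""
    if os.environ.get(off_env_var):
        return False  # user explicitly disabled the feature
    try:
        importlib.import_module(module_name)
        return True
    except ImportError:
        return False  # optional dependency not installed; degrade gracefully


# A stdlib module stands in for the real "llm" dependency here.
assert try_optional_import("json", "MYCLI_LLM_OFF") is True
assert try_optional_import("not_a_real_module_xyz", "MYCLI_LLM_OFF") is False
```

As in the commit, a missing dependency yields a flag the caller can branch on later, rather than a hard failure at import time.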

File tree

5 files changed (+88, -13 lines)


README.md

Lines changed: 1 addition & 1 deletion

@@ -20,7 +20,7 @@ If you already know how to install Python packages, then you can install it via
 You might need sudo on Linux.
 
 ```bash
-pip install -U mycli
+pip install -U 'mycli[all]'
 ```
 
 or

changelog.md

Lines changed: 5 additions & 0 deletions

@@ -1,6 +1,11 @@
 Upcoming (TBD)
 ==============
 
+Features
+--------
+* Make LLM dependencies an optional extra.
+
+
 Internal
 --------
 * Add mypy to Pull Request template.

doc/llm.md

Lines changed: 21 additions & 2 deletions

@@ -8,13 +8,22 @@ Alias: `\ai` works the same as `\llm`.
 
 ## Quick Start
 
-1) Configure your API key (only needed for remote providers like OpenAI):
+1) Make sure mycli is installed with the `[llm]` extra, like
+```bash
+pip install 'mycli[llm]'
+```
+or that the `llm` dependency is installed separately:
+```bash
+pip install llm
+```
+
+2) From the mycli prompt, configure your API key (only needed for remote providers like OpenAI):
 
 ```text
 \llm keys set openai
 ```
 
-2) Ask a question. The response’s SQL (inside a ```sql fenced block) is extracted and pre-filled at the prompt:
+3) Ask a question. The response’s SQL (inside a ```sql fenced block) is extracted and pre-filled at the prompt:
 
 ```text
 World> \llm "Capital of India?"
@@ -165,6 +174,16 @@ World> \llm templates show mycli-llm-template
 - Data sent: Contextual questions send schema (table/column names and types) and a single sample row per table. Review your data sensitivity policies before using remote models; prefer local models (such as ollama) if needed.
 - Help: Running `\llm` with no arguments shows a short usage message.
 
+## Turning Off LLM Support
+
+To turn off LLM support even when the `llm` dependency is installed, set the `MYCLI_LLM_OFF` environment variable:
+```bash
+export MYCLI_LLM_OFF=1
+```
+
+This may be desirable for faster startup times.
+
+
 ---
 
 ## Learn More

mycli/packages/special/llm.py

Lines changed: 49 additions & 7 deletions

@@ -1,4 +1,5 @@
 import contextlib
+import functools
 import io
 import logging
 import os
@@ -10,15 +11,30 @@
 from typing import Optional, Tuple
 
 import click
-import llm
-from llm.cli import cli
+
+try:
+    if not os.environ.get('MYCLI_LLM_OFF'):
+        import llm
+
+        LLM_IMPORTED = True
+    else:
+        LLM_IMPORTED = False
+except ImportError:
+    LLM_IMPORTED = False
+try:
+    if not os.environ.get('MYCLI_LLM_OFF'):
+        from llm.cli import cli
+
+        LLM_CLI_IMPORTED = True
+    else:
+        LLM_CLI_IMPORTED = False
+except ImportError:
+    LLM_CLI_IMPORTED = False
 
 from mycli.packages.special.main import Verbosity, parse_special_command
 
 log = logging.getLogger(__name__)
 
-LLM_CLI_COMMANDS = list(cli.commands.keys())
-MODELS = {x.model_id: None for x in llm.get_models()}
 LLM_TEMPLATE_NAME = "mycli-llm-template"
 
 
@@ -67,7 +83,7 @@ def build_command_tree(cmd):
     if isinstance(cmd, click.Group):
         for name, subcmd in cmd.commands.items():
             if cmd.name == "models" and name == "default":
-                tree[name] = MODELS
+                tree[name] = {x.model_id: None for x in llm.get_models()}
             else:
                 tree[name] = build_command_tree(subcmd)
     else:
@@ -76,7 +92,7 @@ def build_command_tree(cmd):
 
 
 # Generate the command tree for autocompletion
-COMMAND_TREE = build_command_tree(cli) if cli else {}
+COMMAND_TREE = build_command_tree(cli) if LLM_CLI_IMPORTED is True else {}
 
 
 def get_completions(tokens, tree=COMMAND_TREE):
@@ -120,7 +136,25 @@ def __init__(self, results=None):
 # Plugins directory
 # https://llm.datasette.io/en/stable/plugins/directory.html
 """
+
+NEED_DEPENDENCIES = """
+To enable LLM features you need to install mycli with LLM support:
+
+    pip install 'mycli[llm]'
+
+or
+
+    pip install 'mycli[all]'
+
+or install LLM libraries separately
+
+    pip install llm
+
+This is required to use the \\llm command.
+"""
+
 _SQL_CODE_FENCE = r"```sql\n(.*?)\n```"
+
 PROMPT = """
 You are a helpful assistant who is a MySQL expert. You are embedded in a mysql
 cli tool called mycli.
@@ -159,8 +193,16 @@ def ensure_mycli_template(replace=False):
     return
 
 
+@functools.cache
+def cli_commands() -> list[str]:
+    return list(cli.commands.keys())
+
+
 def handle_llm(text, cur) -> Tuple[str, Optional[str], float]:
     _, verbosity, arg = parse_special_command(text)
+    if not LLM_IMPORTED:
+        output = [(None, None, None, NEED_DEPENDENCIES)]
+        raise FinishIteration(output)
     if not arg.strip():
         output = [(None, None, None, USAGE)]
         raise FinishIteration(output)
@@ -176,7 +218,7 @@ def handle_llm(text, cur) -> Tuple[str, Optional[str], float]:
         capture_output = False
         use_context = False
         restart = True
-    elif parts and parts[0] in LLM_CLI_COMMANDS:
+    elif parts and parts[0] in cli_commands():
         capture_output = False
         use_context = False
     elif parts and parts[0] == "--help":
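The `@functools.cache` decorator applied to `cli_commands()` above is what avoids the startup-performance regression: the command list is computed on first use and served from the cache afterwards. A minimal sketch of that behavior, with illustrative command names rather than the real `llm` CLI:

```python
import functools

calls = 0


@functools.cache
def cli_commands() -> list[str]:
    """Stand-in for the cached helper: the body runs at most once."""
    global calls
    calls += 1
    return ["keys", "models", "templates"]  # illustrative names only


cli_commands()
cli_commands()
assert calls == 1  # the second call was served from the cache
```

A side effect worth noting: because the import is deferred behind a function, the (possibly slow) work only happens when the user actually invokes `\llm`, not at mycli startup.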

pyproject.toml

Lines changed: 12 additions & 3 deletions

@@ -21,9 +21,6 @@ dependencies = [
     "pyperclip >= 1.8.1",
     "pycryptodomex",
     "pyfzf >= 0.3.1",
-    "llm>=0.19.0",
-    "setuptools",  # Required by llm commands to install models
-    "pip",
 ]
 
 [build-system]
@@ -35,6 +32,15 @@ build-backend = "setuptools.build_meta"
 
 [project.optional-dependencies]
 ssh = ["paramiko", "sshtunnel"]
+llm = [
+    "llm>=0.19.0",
+    "setuptools",  # Required by llm commands to install models
+    "pip",
+]
+all = [
+    "mycli[ssh]",
+    "mycli[llm]",
+]
 dev = [
     "behave>=1.2.6",
     "coverage>=7.2.7",
@@ -46,6 +52,9 @@ dev = [
     "pdbpp>=0.10.3",
     "paramiko",
     "sshtunnel",
+    "llm>=0.19.0",
+    "setuptools",  # Required by llm commands to install models
+    "pip",
 ]
 
 [project.scripts]
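With `llm` now an optional extra, code (or a user debugging an install) can probe whether the extra is present without triggering a full import. A minimal sketch using only the stdlib; `has_llm_support` is a hypothetical helper, not part of mycli:

```python
import importlib.util


def has_llm_support() -> bool:
    """Cheaply check whether the optional 'llm' package is importable."""
    return importlib.util.find_spec("llm") is not None


# find_spec returns None for an absent top-level package instead of raising.
print("llm extra installed:", has_llm_support())
```

This mirrors the runtime behavior the commit implements with try/except around the import: availability is a boolean the rest of the code branches on.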
