From 33f99f235048d4a0ff654854f585c5f4f4000883 Mon Sep 17 00:00:00 2001
From: Agusti Fernandez <6601142+agustif@users.noreply.github.com>
Date: Fri, 25 Apr 2025 20:54:30 +0200
Subject: [PATCH 1/2] adds lm studio in plugin directory

---
 docs/plugins/directory.md | 1 +
 1 file changed, 1 insertion(+)

diff --git a/docs/plugins/directory.md b/docs/plugins/directory.md
index d816fddc..a662c210 100644
--- a/docs/plugins/directory.md
+++ b/docs/plugins/directory.md
@@ -40,6 +40,7 @@ These plugins can be used to interact with remotely hosted models via their API:
 - **[llm-deepseek](https://github.com/abrasumente233/llm-deepseek)** adds support for the [DeepSeek](https://deepseek.com)'s DeepSeek-Chat and DeepSeek-Coder models.
 - **[llm-lambda-labs](https://github.com/simonw/llm-lambda-labs)** provides access to models hosted by [Lambda Labs](https://docs.lambdalabs.com/public-cloud/lambda-chat-api/), including the Nous Hermes 3 series.
 - **[llm-venice](https://github.com/ar-jan/llm-venice)** provides access to uncensored models hosted by privacy-focused [Venice AI](https://docs.venice.ai/), including Llama 3.1 405B.
+- **[llm-lmstudio](https://github.com/agustif/llm-lmstudio)** provides access to local models using [LM Studio](https://lmstudio.ai/), If an API model host provides an OpenAI-compatible API you can also [configure LLM to talk to it](https://llm.datasette.io/en/stable/other-models.html#openai-compatible-models) without needing an extra plugin.
From bf311af926527a29bd73ea1d37b7c9e5b31c8c00 Mon Sep 17 00:00:00 2001
From: Agusti Fernandez <6601142+agustif@users.noreply.github.com>
Date: Fri, 25 Apr 2025 20:58:37 +0200
Subject: [PATCH 2/2] adds llm-plugin-pdf in plugin directory

---
 docs/plugins/directory.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/docs/plugins/directory.md b/docs/plugins/directory.md
index a662c210..c121fd0e 100644
--- a/docs/plugins/directory.md
+++ b/docs/plugins/directory.md
@@ -40,7 +40,6 @@ These plugins can be used to interact with remotely hosted models via their API:
 - **[llm-deepseek](https://github.com/abrasumente233/llm-deepseek)** adds support for the [DeepSeek](https://deepseek.com)'s DeepSeek-Chat and DeepSeek-Coder models.
 - **[llm-lambda-labs](https://github.com/simonw/llm-lambda-labs)** provides access to models hosted by [Lambda Labs](https://docs.lambdalabs.com/public-cloud/lambda-chat-api/), including the Nous Hermes 3 series.
 - **[llm-venice](https://github.com/ar-jan/llm-venice)** provides access to uncensored models hosted by privacy-focused [Venice AI](https://docs.venice.ai/), including Llama 3.1 405B.
-- **[llm-lmstudio](https://github.com/agustif/llm-lmstudio)** provides access to local models using [LM Studio](https://lmstudio.ai/), If an API model host provides an OpenAI-compatible API you can also [configure LLM to talk to it](https://llm.datasette.io/en/stable/other-models.html#openai-compatible-models) without needing an extra plugin.
@@ -69,6 +68,7 @@ If an API model host provides an OpenAI-compatible API you can also [configure L
 - **[llm-templates-fabric](https://github.com/simonw/llm-templates-fabric)** provides access to the [Fabric](https://github.com/danielmiessler/fabric) collection of prompts: `cat setup.py | llm -t fabric:explain_code`.
 - **[llm-fragments-github](https://github.com/simonw/llm-fragments-github)** can load entire GitHub repositories in a single operation: `llm -f github:simonw/files-to-prompt 'explain this code'`.
 - **[llm-hacker-news](https://github.com/simonw/llm-hacker-news)** imports conversations from Hacker News as fragments: `llm -f hn:43615912 'summary with illustrative direct quotes'`.
+- **[llm-plugin-pdf](https://github.com/agustif/llm-plugin-pdf)** provides a `-f pdf:` loader that can load local or remote PDF files as fragments.
 
 ## Just for fun