Using Local LLMs #163
nicholas-alonzo started this conversation in General
Replies: 1 comment 1 reply
nicholas-alonzo:
Just found out about OpenSpec; it looks great and I want to give it a try. Skimming through the README, I didn't see any mention of using local LLMs. Did I miss it, or would that be a new feature? I typically use LM Studio or Ollama to download and run models. Thanks!
Reply:
This is a skill pack (kind of like an MCP), so think of it the same way as Claude Flow, BMAD, and SuperClaude, but without the MCP. You will need to hook it into an IDE or CLI agent that already has LLMs built in: https://github.com/Fission-AI/OpenSpec?tab=readme-ov-file#native-slash-commands
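
Nothing in the thread specifies a wiring recipe, but many IDE and CLI agents that could host a skill pack like this speak the OpenAI-compatible API, and both Ollama and LM Studio expose local servers in that format. A minimal sketch of talking to such a local endpoint directly, assuming Ollama is serving on its default port and a llama3.1 model has been pulled (the model name and prompt are illustrative):

```python
# Minimal sketch: chat with a local model through Ollama's
# OpenAI-compatible endpoint. Assumes `ollama serve` is running
# and a model has been pulled with `ollama pull llama3.1`.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:11434/v1",  # LM Studio's default is http://localhost:1234/v1
    api_key="ollama",  # the client requires a key, but Ollama ignores it
)

resp = client.chat.completions.create(
    model="llama3.1",  # illustrative; any locally available model name works
    messages=[{"role": "user", "content": "Summarize this OpenSpec change proposal."}],
)
print(resp.choices[0].message.content)
```

Whether a given agent lets you override its base URL and model this way depends on that agent's own configuration, so check its docs for local-endpoint support.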