[ChatGPT Quick Actions] Custom API Base URL setting (and model switcher based on /v1/models) #14540
Labels
extension: chatgpt-quick-actions
Issues related to the chatgpt-quick-actions extension
extension
Issues related to one of the extensions in the Store
feature request
New feature or improvement
status: stalled
Stalled due to inactivity
Extension
https://www.raycast.com/alanzchen/chatgpt-quick-actions
Description
It would be nice to be able to use Ollama or other OpenAI-compatible endpoints (the LiteLLM proxy being a popular example) with this extension. A custom API Base URL setting already exists in the "ChatGPT" extension (mentioned here only as a UX example), and I use that extension to chat with local LLMs comfortably. However, since "ChatGPT Quick Actions" provides the excellent "Execute" and "Transform" commands that the other extension lacks, I would love to configure it the same way.
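For illustration, here is a minimal sketch of how the extension could build its OpenAI client from two new preferences. The preference names (`apiBaseUrl`, `apikey`) and the helper name are assumptions for the sake of the example, not the extension's actual identifiers:

```typescript
import OpenAI from "openai";
import { getPreferenceValues } from "@raycast/api";

// Hypothetical preference names; the extension's real manifest may differ.
interface Preferences {
  apikey?: string;      // optional once a custom endpoint is configured
  apiBaseUrl?: string;  // e.g. http://localhost:11434/v1 for Ollama
}

export function makeClient(): OpenAI {
  const prefs = getPreferenceValues<Preferences>();
  return new OpenAI({
    // Fall back to the official endpoint when no custom base URL is set.
    baseURL: prefs.apiBaseUrl || "https://api.openai.com/v1",
    // Local OpenAI-compatible servers typically ignore the key, but the SDK
    // requires a non-empty string, so a placeholder is passed when it is unset.
    apiKey: prefs.apikey || "placeholder",
  });
}
```

With defaults like these, existing users who only set an API key would keep the current behavior, while a custom base URL would redirect all requests to the local or proxied endpoint.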
Who will benefit from this feature?
Adding a Base URL setting, making the API key optional, and making the model switcher universal would let tinkerers and privacy-focused users point this extension at alternative LLM providers. That is valuable for use cases that do not require "big" LLMs: code refactoring is one example, where a MacBook with 32 GB of RAM can run capable models without slowing the system down, improving privacy, latency, and sometimes even accuracy.
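To make the model switcher universal, the dropdown could be populated from the endpoint's /v1/models listing instead of a hard-coded list of OpenAI model names. A hedged sketch, assuming the openai v4 SDK and a client configured as in the snippet above:

```typescript
import OpenAI from "openai";

// /v1/models is part of the OpenAI-compatible API surface that Ollama and the
// LiteLLM proxy both implement, so the same call works for any provider.
export async function listModelIds(client: OpenAI): Promise<string[]> {
  const page = await client.models.list();
  return page.data.map((model) => model.id);
}
```

The returned IDs could then feed the extension's model preference or a runtime dropdown, so whatever models the configured endpoint serves become selectable.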
Anything else?
No response