### Description

I propose integrating DeepSeek-V3 as a new AI provider in OpenCommit. DeepSeek-V3 is a powerful AI model that offers high performance and speed, making it a strong alternative to existing providers such as OpenAI, Anthropic, or Ollama.

This integration would let users generate commit messages with DeepSeek-V3, giving them more options and flexibility when choosing an AI model.
### Suggested Solution

- Add DeepSeek-V3 as a supported AI provider in the OpenCommit configuration.
- Allow users to set DeepSeek-V3 as the default provider via the `OCO_AI_PROVIDER` config.
- Support DeepSeek-V3's API endpoint and model options in the configuration.
Example configuration:

```sh
oco config set OCO_AI_PROVIDER=deepseek-v3
oco config set OCO_API_KEY=<DeepSeek_API_Key>
oco config set OCO_API_URL=<DeepSeek_API_URL>
oco config set OCO_MODEL=deepseek-v3
```
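Since DeepSeek exposes an OpenAI-compatible chat completions API, a provider engine could likely build requests in that format. The sketch below is illustrative only: `DeepSeekConfig` and `buildCommitRequest` are hypothetical names, not OpenCommit's actual internals, and the endpoint path is an assumption based on the OpenAI-compatible convention.

```typescript
// Hypothetical sketch of a DeepSeek provider request builder.
// Assumes an OpenAI-compatible /chat/completions endpoint.

interface DeepSeekConfig {
  apiKey: string; // value set via OCO_API_KEY
  apiUrl: string; // value set via OCO_API_URL
  model: string;  // value set via OCO_MODEL
}

interface ChatRequest {
  url: string;
  headers: Record<string, string>;
  body: {
    model: string;
    messages: { role: 'system' | 'user'; content: string }[];
  };
}

function buildCommitRequest(cfg: DeepSeekConfig, diff: string): ChatRequest {
  return {
    // Strip a trailing slash so the path joins cleanly.
    url: `${cfg.apiUrl.replace(/\/$/, '')}/chat/completions`,
    headers: {
      'Content-Type': 'application/json',
      Authorization: `Bearer ${cfg.apiKey}`,
    },
    body: {
      model: cfg.model,
      messages: [
        { role: 'system', content: 'Generate a concise commit message for the diff.' },
        { role: 'user', content: diff },
      ],
    },
  };
}
```

The actual integration would plug into OpenCommit's existing engine abstraction; this only shows the request shape such an engine would produce.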
### Alternatives
_No response_
### Additional Context
_No response_
DeepSeek has implemented a Context Caching feature (see: https://api-docs.deepseek.com/news/news0802). This should help reduce API usage costs. Therefore, I support adding this feature.
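Per the linked announcement, DeepSeek's context caching reuses identical prompt prefixes across requests. An integration could benefit by keeping the (long) system prompt byte-for-byte stable and placing the varying diff last, so the cached prefix is hit on every call. The sketch below is an assumption about how to structure messages for cache hits; the names are illustrative.

```typescript
// Hedged sketch: keep the system prompt constant across calls so DeepSeek's
// automatic prefix caching can reuse it; only the user message (the diff) varies.

const SYSTEM_PROMPT =
  'You are a commit message generator. Follow the Conventional Commits spec.';

function buildMessages(diff: string): { role: string; content: string }[] {
  return [
    { role: 'system', content: SYSTEM_PROMPT }, // stable prefix -> cacheable
    { role: 'user', content: diff },            // varying suffix
  ];
}
```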