Support Venice.ai API #3660

Closed
julienrbrt opened this issue Jan 9, 2025 · 4 comments
Comments

@julienrbrt

julienrbrt commented Jan 9, 2025

Validations

  • I believe this is a way to improve. I'll try to join the Continue Discord for questions
  • I'm not able to find an open issue that requests the same enhancement

Problem

Venice.ai isn't supported. It has a free API for pro users and respects privacy.
I personally use it as an alternative to Ollama on a less powerful machine.
They say they are compatible with the OpenAI API, so maybe it is just a matter of changing the API endpoint? (https://docs.venice.ai/welcome/about-venice#key-features)
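
If it really is OpenAI-compatible, a quick way to check is to point a standard OpenAI client at their base URL. A minimal sketch, assuming the usual chat completions schema and reusing the llama-3.3-70b model id and base URL mentioned later in this thread (neither verified here):

# Minimal sketch: probe the Venice.ai API through the standard OpenAI client,
# assuming it is OpenAI-compatible. Model id and base URL are assumptions
# taken from this thread, not verified.
from openai import OpenAI

client = OpenAI(
    api_key="VENICE_API_KEY",                  # replace with a real key
    base_url="https://api.venice.ai/api/v1",   # Venice's OpenAI-compatible endpoint
)

response = client.chat.completions.create(
    model="llama-3.3-70b",
    messages=[{"role": "user", "content": "Hello"}],
)
print(response.choices[0].message.content)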

Solution

Support the Venice.ai API.

@dosubot dosubot bot added area:configuration Relates to configuration options kind:enhancement Indicates a new feature request, improvement, or extension labels Jan 9, 2025
@julienrbrt
Author

I should have read the docs better; I managed to configure it for chat using https://docs.continue.dev/customize/model-providers/openai#openai-compatible-servers--apis.

@tkeitzl

tkeitzl commented Jan 10, 2025

Which values did you choose for provider and model, and potentially for completionOptions, @julienrbrt?

{
  "models": [
    {
      "title": "Venice.AI",
      "provider": "ollama",
      "model": "llama-3.3-70b",
      "apiKey": XXX,
      "apiBase": "https://api.venice.ai/api/v1",
      "completionOptions": {
        "endpoint": ""
      }
    }
  ]
}

Thank you for sharing!

@julienrbrt
Copy link
Author

There is no completions API AFAICT. You can find the models by querying the API (https://www.postman.com/veniceai/venice-ai-workspace/request/9ujtbqt/list-models).
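
For reference, since the endpoint is OpenAI-compatible, listing the available models should also work through the standard OpenAI client. A minimal sketch, assuming the usual /models route (not verified against Venice.ai here):

from openai import OpenAI

client = OpenAI(
    api_key="VENICE_API_KEY",                  # replace with a real key
    base_url="https://api.venice.ai/api/v1",
)

# Print the model ids exposed by the API, e.g. to find values
# for the "model" field of the Continue config below.
for model in client.models.list():
    print(model.id)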

I have

{
  "models": [
    {
      "title": "Llama 3.1 405b (Venice.ai)",
      "provider": "openai",
      "model": "llama-3.1-405b",
      "apiKey": "redacted",
      "apiBase": "https://api.venice.ai/api/v1"
    },
    {
      "title": "Qwen Coder 2.5 (Venice.ai)",
      "provider": "openai",
      "model": "qwen32b",
      "apiKey": "redacted",
      "apiBase": "https://api.venice.ai/api/v1"
    }
  ]
}

@tomfotherby

I created a pull request to get Venice added to the documentation: #4013
