This repository was archived by the owner on Sep 18, 2025. It is now read-only.

Conversation

@kujtimiihoxha (Collaborator)

No description provided.

@snipeship (Contributor)

@kujtimiihoxha – proposal: look into LiteLLM (repo)

  • One proxy, every vendor – Anthropic, OpenAI, Groq, Bedrock, Azure, Vertex, and many more, all routed through the same endpoint.
  • Auto-maintained model catalog – pricing, context windows, reasoning support, etc., for every model/provider pair (https://models.litellm.ai/).

We could potentially embed the proxy or run it as a supervised child process from Go (docs, endpoints).
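For illustration, a minimal sketch of the supervise-from-Go idea, assuming the `litellm` CLI is installed on PATH, listens on its default port 4000, and exposes an OpenAI-compatible `/v1/chat/completions` route; the model string, flags, and error handling here are placeholders, not a definitive integration:

```go
package main

import (
	"bytes"
	"fmt"
	"io"
	"net/http"
	"os"
	"os/exec"
	"time"
)

func main() {
	// Start the LiteLLM proxy as a supervised child process.
	// Assumes `litellm` is on PATH and 4000 is the port we want.
	proxy := exec.Command("litellm", "--port", "4000")
	proxy.Stdout = os.Stdout
	proxy.Stderr = os.Stderr
	if err := proxy.Start(); err != nil {
		fmt.Fprintln(os.Stderr, "failed to start proxy:", err)
		os.Exit(1)
	}
	defer proxy.Process.Kill()

	// Give the proxy a moment to come up; a real supervisor would poll a health endpoint.
	time.Sleep(3 * time.Second)

	// Every vendor is reached through the same OpenAI-compatible endpoint;
	// only the model string changes (the model name below is illustrative).
	body := []byte(`{
		"model": "anthropic/claude-3-5-sonnet-20241022",
		"messages": [{"role": "user", "content": "hello"}]
	}`)
	req, _ := http.NewRequest("POST", "http://localhost:4000/v1/chat/completions", bytes.NewReader(body))
	req.Header.Set("Content-Type", "application/json")

	resp, err := http.DefaultClient.Do(req)
	if err != nil {
		fmt.Fprintln(os.Stderr, "request failed:", err)
		os.Exit(1)
	}
	defer resp.Body.Close()
	out, _ := io.ReadAll(resp.Body)
	fmt.Println(string(out))
}
```

The upside of the supervised-subprocess approach is that the Go side never needs per-vendor SDKs; swapping providers is just a different model string sent to the same local endpoint.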

Thoughts?

@Kreijstal

Yes, and doesn't this project currently lack support for DeepSeek or OpenRouter? Or Ollama?
