A Rust-powered client library for interacting with large language models (LLMs). Supports streaming and non-streaming conversations with various API providers.
This is a work in progress and is not yet ready for production use. Some providers may not be fully supported yet.
See `src/models/models.rs` for the supported models and `src/api/providers.rs` for the supported providers.
- Async-first implementation using the Tokio runtime
- Multi-provider support (Deepseek, etc.)
- Streaming response handling with backpressure support
- Configurable API endpoints and rate limiting
- Strong type safety with Rust enums and structs
- Conversation memory management
- Comprehensive error handling
Add to your `Cargo.toml`:
[dependencies]
llmhub = { git = "https://github.com/akirco/llmhub" }
Check out the examples:
cargo run --example llmhub_test
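For orientation, a hypothetical usage sketch is shown below. The crate's actual API is defined in `src/api/providers.rs` and the examples directory; the `Client`, `Provider`, and `chat` names here are assumptions for illustration only, not the library's confirmed interface.

```rust
// Hypothetical sketch — identifiers below (Client, Provider, chat) are
// assumed for illustration; consult examples/llmhub_test for the real API.
use llmhub::{Client, Provider};

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Construct a client for one of the supported providers (e.g. Deepseek),
    // reading the API key from the environment.
    let client = Client::new(Provider::Deepseek, std::env::var("DEEPSEEK_API_KEY")?);

    // Send a single non-streaming chat request and print the reply.
    let reply = client.chat("Hello, world!").await?;
    println!("{reply}");
    Ok(())
}
```

Run the bundled example above (`cargo run --example llmhub_test`) to see the actual calling conventions.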
Feel free to open issues or pull requests.
MIT License