Description
Problem Description:
When configuring the api_base in config.yaml with provider: "openai", the current implementation automatically appends /v1 to the URL.
However, most LLM providers (e.g., SiliconFlow, OpenAI, DeepSeek) document full API endpoints that already include the version suffix (e.g., https://api.siliconflow.cn/v1). Users intuitively copy and paste these full URLs, so the current behavior produces an invalid double suffix (e.g., .../v1/v1), causing connection failures (404 Not Found).
Steps to Reproduce
- Open config.yaml.
- Set provider to "openai".
- Set api_base to a standard full URL, such as https://api.siliconflow.cn/v1.
- Run mini-agent and attempt to send a message.
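The misconfiguration above can be reproduced with a config fragment like the following (the exact key layout in mini-agent's config.yaml may differ; this sketch assumes top-level keys as described in this report):

```yaml
# config.yaml sketch (key layout assumed)
provider: "openai"
# Full URL copied from the provider's docs; it already ends in /v1
api_base: "https://api.siliconflow.cn/v1"
```

With this config, the client currently appends /v1 again at initialization time, yielding a base URL of https://api.siliconflow.cn/v1/v1.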
Actual Behavior
The agent fails to connect to the LLM provider. The logs show a 404 Not Found error because the client appends /v1 to a URL that already contains it and requests an invalid path (e.g., https://api.siliconflow.cn/v1/v1/chat/completions).
Expected Behavior
The LLMClient initialization logic should check whether api_base already ends with /v1 and, if so, use the URL as-is without appending it again. This is the more intuitive workflow for users who copy the URL straight from their provider's documentation.
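The expected normalization could be sketched as a small helper like the one below (the function name and its placement are hypothetical; mini-agent's actual LLMClient code may structure this differently):

```python
def normalize_api_base(api_base: str) -> str:
    """Return api_base with exactly one trailing /v1 suffix.

    Leaves URLs that already end in /v1 untouched, so users can paste
    the full endpoint from their provider's docs without getting /v1/v1.
    """
    base = api_base.rstrip("/")  # tolerate a trailing slash
    if base.endswith("/v1"):
        return base
    return base + "/v1"
```

This keeps backward compatibility: bare hosts like https://api.siliconflow.cn still get /v1 appended, while full endpoints pass through unchanged.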
Environment Information
- Python version: 3.11
- Operating system: macOS (Apple Silicon)
- Mini-Agent version: Current main branch (Dec 2025)