
Conversation

honix (Owner) commented Dec 31, 2025

Updated the Pluggable LLM section to include BYOK or local inference options.
honix marked this pull request as ready for review December 31, 2025 08:02
honix merged commit 9570dd2 into main Dec 31, 2025
1 check passed
claude bot commented Dec 31, 2025

PR Review: Enhance Pluggable LLM description in README

Summary

This PR adds clarification to the README that the LLM integration supports BYOK (Bring Your Own Key) and local inference services. The change is simple, clear, and accurate.

✅ Code Quality & Best Practices

  • Clear communication: The addition makes it explicit that users can bring their own API keys or run local inference
  • Concise: Keeps the feature list brief while adding important information
  • Consistent style: Matches the existing README format

✅ Accuracy Verification

I verified the claim by examining the codebase:

  • OpenRouter adapter (backend/ai/adapters/openrouter.py:38-42): Accepts a custom api_key and constructs the client with a base_url pointing to OpenRouter (not configurable; see the issue below)
  • Configuration (backend/config.py): Uses environment variables for API keys and model selection
  • Environment example (backend/.env.example): Documents OPENROUTER_API_KEY configuration

Finding: The BYOK claim is accurate: users can provide their own OpenRouter API key. The "local inference services" claim, however, needs clarification.
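
To make the verified BYOK flow concrete, here is a minimal sketch of the pattern the adapter follows. This is not verbatim repo code; the class and variable names are illustrative, with OPENROUTER_API_KEY taken from backend/.env.example:

# Hypothetical sketch of the BYOK flow; names are illustrative, not verbatim repo code
import os
from openai import AsyncOpenAI

# backend/config.py reads the default key from the environment
DEFAULT_API_KEY = os.getenv("OPENROUTER_API_KEY")

class OpenRouterAdapter:
    def __init__(self, api_key: str | None = None):
        # BYOK: a caller-supplied key takes precedence over the env default
        self.client = AsyncOpenAI(
            base_url="https://openrouter.ai/api/v1",
            api_key=api_key or DEFAULT_API_KEY,
        )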

⚠️ Minor Issue: Local Inference Support

The OpenRouter adapter has a hardcoded base_url:

# backend/ai/adapters/openrouter.py:38-42
self.client = AsyncOpenAI(
    base_url="https://openrouter.ai/api/v1",  # hardcoded; not overridable via env
    api_key=api_key or DEFAULT_API_KEY,
)

Impact: Users cannot currently point to local inference services (e.g., Ollama, LocalAI, vLLM) without code modification. The README implies this is possible, but it's not configurable via environment variables.

💡 Recommendations

  1. Short-term: Consider softening the language slightly:

    • Current: "BYOK or connect to your local inference services"
    • Suggested: "BYOK (Bring Your Own Key)" or "Supports custom API keys"
  2. Long-term (future enhancement): Make base_url configurable via environment variable to truly support local inference:

    # In config.py
    import os
    OPENROUTER_BASE_URL = os.getenv("OPENROUTER_BASE_URL", "https://openrouter.ai/api/v1")

    # In openrouter.py, pass the configurable URL instead of the hardcoded literal
    self.client = AsyncOpenAI(
        base_url=base_url or OPENROUTER_BASE_URL,
        api_key=api_key or DEFAULT_API_KEY,
    )
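
With that change, switching to a local inference service becomes an environment-level setting. As an illustration, assuming the service exposes an OpenAI-compatible endpoint (as Ollama does at /v1), a user's backend/.env might look like:

    # Hypothetical backend/.env entries for a local Ollama instance
    OPENROUTER_BASE_URL=http://localhost:11434/v1
    # Ollama ignores the key's value, but the OpenAI client requires a non-empty one
    OPENROUTER_API_KEY=ollama

vLLM and LocalAI expose the same OpenAI-compatible surface, so the same two variables would cover them as well.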

🎯 Verdict

Approve with minor suggestion: The PR improves documentation clarity and the BYOK claim is accurate. The local inference capability would require a small code change to be fully accurate, but this is a minor issue that could be addressed in a follow-up PR if desired.

The change is safe, improves user understanding, and aligns with the project's goal of being pluggable.


Note: No security concerns, performance impacts, or test coverage issues with this documentation-only change.
