A Flask-based proxy server that lets Claude Code work with OpenRouter models by translating API requests between Anthropic's format and OpenRouter's OpenAI-compatible format. It resolves litellm API key issues, does not require a separate ccr service (as "Claude Code Router" does), and provides seamless integration between Claude Code and OpenRouter's diverse model ecosystem.
OpenRouter now officially supports Claude Code; see https://openrouter.ai/docs/guides/guides/claude-code-integration
- Overview
- Architecture
- Features
- Prerequisites
- Installation
- Configuration
- Usage
- API Endpoints
- Model Mapping
- Deployment
- Troubleshooting
- Security Considerations
- Contributing
- License
The Claude Code Proxy Server addresses a critical limitation where Claude Code requires Anthropic API keys, but users may want to leverage OpenRouter's extensive model catalog and competitive pricing. This proxy server acts as a middleware layer that:
- Intercepts Claude Code's Anthropic-format API requests
- Transforms them to OpenRouter's OpenAI-compatible format
- Forwards requests to OpenRouter with proper authentication
- Converts responses back to Anthropic's expected format
- Returns responses to Claude Code seamlessly
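The request-side transformation can be sketched as a small pure function. This is a simplified illustration, not the proxy's actual code; the helper name and field handling are assumptions:

```python
def anthropic_to_openai(payload, target_model):
    """Sketch: convert an Anthropic /v1/messages payload to an
    OpenAI chat-completions payload (hypothetical helper)."""
    messages = []
    # Anthropic keeps the system prompt top-level; OpenAI expects a system message.
    if payload.get("system"):
        messages.append({"role": "system", "content": payload["system"]})
    for msg in payload.get("messages", []):
        content = msg["content"]
        # Anthropic content may be a list of typed blocks; flatten the text blocks.
        if isinstance(content, list):
            content = "".join(
                block.get("text", "") for block in content if block.get("type") == "text"
            )
        messages.append({"role": msg["role"], "content": content})
    openai_payload = {"model": target_model, "messages": messages}
    # Pass through the parameters both APIs share.
    for key in ("max_tokens", "temperature", "stream"):
        if key in payload:
            openai_payload[key] = payload[key]
    return openai_payload
```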
- Bypass API Key Limitations: Use OpenRouter models without Anthropic API keys
- Cost Optimization: Access free and low-cost models through OpenRouter
- Model Diversity: Leverage OpenRouter's extensive model catalog
- Seamless Integration: Works transparently with existing Claude Code installations
- Streaming Support: Full support for real-time streaming responses
- Easy Deployment: Simple Flask application with minimal dependencies
graph TD
A[Claude Code] -->|Anthropic API Format| B[Proxy Server]
B -->|Request Transformation| C[OpenRouter API]
C -->|OpenAI Format Response| B
B -->|Response Transformation| A
subgraph "Proxy Server Components"
D[Request Handler]
E[Model Mapper]
F[Format Transformer]
G[Streaming Handler]
end
B --> D
D --> E
D --> F
D --> G
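The Format Transformer's response side can be sketched the same way: an OpenAI chat completion is rewrapped as an Anthropic message. The finish_reason-to-stop_reason mapping below is an assumption based on the two APIs' documented values, not code taken from the proxy:

```python
# Assumed mapping from OpenAI finish_reason to Anthropic stop_reason.
STOP_REASONS = {"stop": "end_turn", "length": "max_tokens"}

def openai_to_anthropic(resp, requested_model):
    """Sketch: rewrap an OpenAI chat completion as an Anthropic message."""
    choice = resp["choices"][0]
    usage = resp.get("usage", {})
    return {
        "id": resp.get("id", "msg_unknown"),
        "type": "message",
        "role": "assistant",
        "content": [{"type": "text", "text": choice["message"]["content"]}],
        # Echo the Claude name the client asked for, not the upstream model.
        "model": requested_model,
        "stop_reason": STOP_REASONS.get(choice.get("finish_reason"), "end_turn"),
        "stop_sequence": None,
        "usage": {
            "input_tokens": usage.get("prompt_tokens", 0),
            "output_tokens": usage.get("completion_tokens", 0),
        },
    }
```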
The proxy server resolves authentication limitations by:
- Accepting Claude Code's requests with Anthropic-style authentication
- Replacing authentication headers with OpenRouter API keys
- Transforming request payloads between API formats
- Maintaining compatibility with Claude Code's expected response structure
This approach eliminates the need for:
- Direct Anthropic API access
- Complex litellm configuration
- Multiple authentication systems
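The header replacement step might look like the following sketch; the exact set of headers dropped is an assumption about what Claude Code sends, not a guarantee:

```python
def build_openrouter_headers(incoming, api_key):
    """Sketch: strip Anthropic-style auth headers and substitute
    an OpenRouter Bearer token."""
    # Headers assumed to be Anthropic- or transport-specific.
    drop = {"x-api-key", "authorization", "anthropic-version", "host", "content-length"}
    headers = {k: v for k, v in incoming.items() if k.lower() not in drop}
    headers["Authorization"] = f"Bearer {api_key}"
    headers["Content-Type"] = "application/json"
    return headers
```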
- ✅ Format Translation: Seamless conversion between Anthropic and OpenRouter API formats
- ✅ Streaming Support: Real-time response streaming for interactive sessions
- ✅ Model Mapping: Intelligent mapping between Claude model names and OpenRouter equivalents
- ✅ Error Handling: Comprehensive error handling and reporting
- ✅ Health Checks: Built-in health check endpoint for monitoring
- ✅ Verbose Logging: Optional detailed logging for debugging
- ✅ Production Ready: Gunicorn-compatible for production deployments
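The streaming feature can be illustrated with a generator that re-emits OpenAI stream deltas as Anthropic-style content_block_delta events. This is a simplified sketch: a production handler also emits message_start, content_block_start, and message_stop events, which are omitted here:

```python
import json

def to_anthropic_sse(openai_chunks):
    """Sketch: turn OpenAI streaming chunks into Anthropic-style SSE lines."""
    for chunk in openai_chunks:
        delta = chunk["choices"][0].get("delta", {})
        text = delta.get("content")
        if text:
            event = {
                "type": "content_block_delta",
                "index": 0,
                "delta": {"type": "text_delta", "text": text},
            }
            # SSE framing: event name, data payload, blank-line terminator.
            yield f"event: content_block_delta\ndata: {json.dumps(event)}\n\n"
```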
- Python 3.7 or higher
- pip package manager
- OpenRouter API account and key
- Claude Code installation (optional, for testing)
- Visit OpenRouter.ai
- Sign up for a free account
- Navigate to the API Keys section
- Generate a new API key
- Copy the key for configuration
If you haven't already installed Claude Code:
# Install via pip
pip install claude-code
# Or via npm (official package)
npm install -g @anthropic-ai/claude-code
# Clone the repository
git clone <repository-url>
cd openrouter-claude-proxy-server
# Or download the Python file directly
curl -O https://raw.githubusercontent.com/<repo>/openrouter_proxy_server_claude_code_v1.0.py
# Create virtual environment
python3 -m venv claude-proxy-env
# Activate virtual environment
# On macOS/Linux:
source claude-proxy-env/bin/activate
# On Windows:
claude-proxy-env\Scripts\activate
# Install required packages
pip install flask requests
# Or install from requirements.txt (if available)
pip install -r requirements.txt
# Check that Python and required packages are installed
python --version
pip list | grep -E "(flask|requests)"
Create a .env file or set environment variables in your shell:
# Required: Your OpenRouter API key
export OPENROUTER_API_KEY="sk-or-v1-your-api-key-here"
# Optional: Enable verbose logging for debugging
export PROXY_VERBOSE="true"
# Optional: Custom port (default: 8000)
export PROXY_PORT="8000"
Add these to your shell profile (~/.zshrc, ~/.bashrc, etc.):
# OpenRouter Configuration
export OPENROUTER_API_KEY="sk-or-v1-your-actual-api-key"
export PROXY_VERBOSE="false" # Set to "true" for debugging
Then reload your shell:
# On macOS/Linux
source ~/.zshrc # or ~/.bashrc
Configure Claude Code to use the proxy server:
# Set the API base URL to point to your proxy
export ANTHROPIC_API_BASE_URL="http://localhost:8000/anthropic"
# Set a dummy API key (won't be used, but required by Claude Code)
export ANTHROPIC_API_KEY="dummy-key-for-claude-code"
# Start the server in development mode
python openrouter_proxy_server_claude_code_v1.0.py
# Start with Gunicorn (recommended for production)
gunicorn --bind 0.0.0.0:8000 --workers 4 --timeout 120 openrouter_proxy_server_claude_code_v1.0:app
# Test that the server is running
curl http://localhost:8000/health
Expected response:
{"status": "healthy", "service": "claude-code-proxy"}
# Test the proxy with a simple request
curl -X POST http://localhost:8000/v1/messages \
-H "Content-Type: application/json" \
-d '{
"model": "claude-sonnet",
"messages": [{"role": "user", "content": "Hello, world!"}],
"max_tokens": 100
}'
# Test with Claude Code (if installed)
claude-code --model claude-sonnet "Hello, can you help me with Python?"
import requests
# Send a message through the proxy
response = requests.post(
    "http://localhost:8000/v1/messages",
    headers={"Content-Type": "application/json"},
    json={
        "model": "claude-sonnet",
        "messages": [
            {"role": "user", "content": "Explain quantum computing in simple terms"}
        ],
        "max_tokens": 500
    }
)
print(response.json())
import requests
import json
# Stream a response
response = requests.post(
    "http://localhost:8000/v1/messages",
    headers={"Content-Type": "application/json"},
    json={
        "model": "claude-sonnet",
        "messages": [
            {"role": "user", "content": "Write a short story about AI"}
        ],
        "max_tokens": 1000,
        "stream": True
    },
    stream=True
)
for line in response.iter_lines():
    if line:
        decoded_line = line.decode('utf-8')
        if decoded_line.startswith('data: '):
            data_str = decoded_line[6:]
            if data_str != '[DONE]':
                try:
                    data = json.loads(data_str)
                    print(data.get('delta', {}).get('text', ''), end='')
                except json.JSONDecodeError:
                    continue
Method: POST
Description: Main proxy endpoint that transforms requests between Claude Code and OpenRouter.
Request Body:
{
    "model": "claude-sonnet",
    "messages": [
        {
            "role": "user",
            "content": "Your message here"
        }
    ],
    "max_tokens": 1000,
    "temperature": 0.7,
    "stream": false
}
Response:
{
    "id": "msg_abc123def456...",
    "type": "message",
    "role": "assistant",
    "content": [
        {
            "type": "text",
            "text": "The assistant's response"
        }
    ],
    "model": "claude-sonnet",
    "stop_reason": "end_turn",
    "stop_sequence": null,
    "usage": {
        "input_tokens": 10,
        "output_tokens": 50
    }
}
Method: GET
Description: Health check endpoint for monitoring.
Response:
{
    "status": "healthy",
    "service": "claude-code-proxy"
}
The proxy server intelligently maps Claude model names to OpenRouter equivalents:
| Claude Model | OpenRouter Model | Description |
|---|---|---|
| claude-sonnet-4-5-20250929 | openai/gpt-oss-120b:free | Free high-capacity model |
| claude-haiku-4-5-20251001 | openai/gpt-oss-120b:free | Free model for quick tasks |
| claude-sonnet | openai/gpt-oss-120b:free | Default Sonnet mapping |
| claude-opus | openai/gpt-oss-20b:free | Free Opus-equivalent |
| claude-haiku | moonshotai/kimi-k2:free | Free Haiku-equivalent |
You can modify the model mapping in the code by editing the model_mapping dictionary:
model_mapping = {
    "claude-sonnet": "anthropic/claude-3-sonnet",
    "claude-opus": "anthropic/claude-3-opus",
    "claude-haiku": "anthropic/claude-3-haiku",
    # Add your custom mappings here
}
# Simple Flask development server
python openrouter_proxy_server_claude_code_v1.0.py
# Install Gunicorn
pip install gunicorn
# Start the server
gunicorn --bind 0.0.0.0:8000 --workers 4 --timeout 120 openrouter_proxy_server_claude_code_v1.0:app
Create a gunicorn.conf.py file:
# gunicorn.conf.py
bind = "0.0.0.0:8000"
workers = 4
worker_class = "sync"
worker_connections = 1000
timeout = 120
keepalive = 2
max_requests = 1000
max_requests_jitter = 100
preload_app = True
Then start with:
gunicorn --config gunicorn.conf.py openrouter_proxy_server_claude_code_v1.0:app
Create a systemd service file /etc/systemd/system/claude-proxy.service:
[Unit]
Description=Claude Code Proxy Server
After=network.target
[Service]
Type=exec
User=your-username
Group=your-group
WorkingDirectory=/path/to/openrouter-claude-proxy-server
Environment=PATH=/path/to/openrouter-claude-proxy-server/claude-proxy-env/bin
Environment=OPENROUTER_API_KEY=your-api-key-here
ExecStart=/path/to/openrouter-claude-proxy-server/claude-proxy-env/bin/gunicorn --config gunicorn.conf.py openrouter_proxy_server_claude_code_v1.0:app
Restart=always
RestartSec=10
[Install]
WantedBy=multi-user.target
Enable and start the service:
sudo systemctl enable claude-proxy
sudo systemctl start claude-proxy
sudo systemctl status claude-proxy
Create a Dockerfile:
FROM python:3.9-slim
WORKDIR /app
COPY openrouter_proxy_server_claude_code_v1.0.py .
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
EXPOSE 8000
ENV OPENROUTER_API_KEY=""
ENV PROXY_VERBOSE="false"
CMD ["gunicorn", "--bind", "0.0.0.0:8000", "--workers", "4", "openrouter_proxy_server_claude_code_v1.0:app"]
Create a docker-compose.yml:
version: '3.8'
services:
  claude-proxy:
    build: .
    ports:
      - "8000:8000"
    environment:
      - OPENROUTER_API_KEY=${OPENROUTER_API_KEY}
      - PROXY_VERBOSE=${PROXY_VERBOSE:-false}
    restart: unless-stopped
Deploy with:
docker-compose up -d
Problem: The proxy server can't find your OpenRouter API key.
Solution:
# Set the environment variable
export OPENROUTER_API_KEY="sk-or-v1-your-actual-api-key"
# Add to your shell profile for persistence
echo 'export OPENROUTER_API_KEY="sk-or-v1-your-actual-api-key"' >> ~/.zshrc
source ~/.zshrc
Problem: The proxy server isn't running or is on a different port.
Solution:
# Check if the server is running
ps aux | grep openrouter_proxy_server
# Start the server
python openrouter_proxy_server_claude_code_v1.0.py
# Check if port is in use
lsof -i :8000
Problem: Claude Code isn't configured to use the proxy server.
Solution:
# Verify Claude Code configuration
echo $ANTHROPIC_API_BASE_URL
echo $ANTHROPIC_API_KEY
# Set correct configuration
export ANTHROPIC_API_BASE_URL="http://localhost:8000/anthropic"
export ANTHROPIC_API_KEY="dummy-key"
Problem: Requests are taking too long to complete.
Solution:
# Enable verbose logging to diagnose
export PROXY_VERBOSE="true"
# Check OpenRouter API status
curl https://openrouter.ai/api/v1/models
# Consider using a closer OpenRouter endpoint or CDN
Problem: Streaming responses are not functioning properly.
Solution:
# Check if your client supports streaming
# Test with curl
curl -X POST http://localhost:8000/v1/messages \
-H "Content-Type: application/json" \
-d '{"model": "claude-sonnet", "messages": [{"role": "user", "content": "test"}], "stream": true}' \
--no-buffer
Problem: The wrong model is being used or model isn't found.
Solution:
# Check available OpenRouter models
curl -H "Authorization: Bearer $OPENROUTER_API_KEY" \
https://openrouter.ai/api/v1/models
# Verify your model mapping in the code
grep -A 10 "model_mapping" openrouter_proxy_server_claude_code_v1.0.py
Enable verbose logging for detailed troubleshooting:
export PROXY_VERBOSE="true"
python openrouter_proxy_server_claude_code_v1.0.py
This will show:
- Incoming request details
- Transformation process
- OpenRouter API calls
- Response transformations
- Error details
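Wiring verbose logging to the PROXY_VERBOSE variable only needs the standard logging module. A sketch, assuming nothing about the proxy's actual logging code (the function name is hypothetical):

```python
import logging
import os

def configure_logging():
    """Sketch: honor PROXY_VERBOSE -- DEBUG when truthy, INFO otherwise."""
    verbose = os.environ.get("PROXY_VERBOSE", "false").lower() in ("1", "true", "yes")
    logging.basicConfig(
        level=logging.DEBUG if verbose else logging.INFO,
        format="%(asctime)s %(levelname)s %(message)s",
        force=True,  # reconfigure on repeated calls (Python 3.8+)
    )
    return logging.getLogger("claude-proxy")
```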
Check the server logs for common patterns:
# Monitor logs in real-time
tail -f /var/log/claude-proxy.log
# Look for errors
grep -i error /var/log/claude-proxy.log
# Check API response times
grep "OPENROUTER RESPONSE STATUS" /var/log/claude-proxy.log
- Never commit API keys to version control
- Use environment variables instead of hardcoding keys
- Rotate API keys regularly
- Monitor API usage for unusual activity
- Use HTTPS in production environments
- Implement rate limiting to prevent abuse
- Consider authentication for the proxy server itself
- Use firewalls to restrict access
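For rate limiting, flask-limiter (listed under optional dependencies) is the usual choice. Purely to illustrate the idea, a minimal in-process token bucket might look like the sketch below; it is not a substitute for flask-limiter or an upstream gateway:

```python
import time

class TokenBucket:
    """Sketch of a token-bucket rate limiter: tokens refill at a
    fixed rate up to a burst capacity; each request consumes one."""

    def __init__(self, rate_per_sec, burst):
        self.rate = rate_per_sec
        self.capacity = burst
        self.tokens = float(burst)
        self.last = time.monotonic()

    def allow(self):
        now = time.monotonic()
        # Refill proportionally to elapsed time, capped at capacity.
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False
```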
# Set appropriate file permissions
chmod 600 .env
chmod 700 scripts/
# Use non-root user for deployment
useradd -r -s /bin/false claude-proxy
# Configure reverse proxy with SSL (nginx example)
server {
    listen 443 ssl;
    server_name your-domain.com;
    ssl_certificate /path/to/cert.pem;
    ssl_certificate_key /path/to/key.pem;
    location / {
        proxy_pass http://127.0.0.1:8000;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
    }
}
Create a requirements.txt file:
Flask==2.3.3
requests==2.31.0
gunicorn==21.2.0
- Python: 3.7 or higher
- Memory: Minimum 512MB RAM
- Disk: 100MB free space
- Network: Internet connection for OpenRouter API access
# For production monitoring
pip install prometheus-client
# For enhanced logging
pip install structlog
# For API rate limiting
pip install flask-limiter
We welcome contributions! Please follow these guidelines:
- Fork the repository
- Create a feature branch
- Make your changes
- Add tests if applicable
- Submit a pull request
# Clone your fork
git clone https://github.com/your-username/openrouter-claude-proxy-server.git
# Create development environment
python -m venv dev-env
source dev-env/bin/activate
pip install -r requirements.txt
pip install -r requirements-dev.txt # if available
# Run tests
python -m pytest
# Check code style
flake8 openrouter_proxy_server_claude_code_v1.0.py
This project is licensed under the MIT License. See the LICENSE file for details.
- Issues: Report bugs and feature requests on GitHub
- Documentation: Check this README and inline code comments
- Community: Join our discussions for community support
- Initial release
- Basic proxy functionality
- Model mapping support
- Streaming support
- Health check endpoint
- Production deployment support
Note: This proxy server is designed to bridge Claude Code with OpenRouter's API. Please ensure you comply with both OpenRouter's and Anthropic's terms of service when using this software.