WebSocket proxy server that bridges HTTP API clients to Claude Code CLI using the reversed SDK protocol.
Use any Anthropic-compatible client (Cursor, Continue, etc.) to talk to Claude Code — with full access to every model in your Claude Code subscription.
```
┌──────────────┐     HTTP POST      ┌──────────────────┐     WebSocket      ┌──────────────────┐
│ HTTP Client  │ ─────────────────▶ │ claude-ws-proxy  │ ◀────────────────▶ │ Claude Code CLI  │
│ (Cursor,     │   /v1/messages     │                  │   SDK protocol     │  (subprocess)    │
│  Continue,   │ ◀───────────────── │ :3456 HTTP API   │    (reversed)      │                  │
│  curl)       │    SSE stream      │ :8765 WebSocket  │                    │                  │
└──────────────┘                    └──────────────────┘                    └──────────────────┘
```
The proxy:
- Accepts standard Anthropic `/v1/messages` HTTP requests
- Spawns Claude Code CLI as a subprocess
- Communicates with it over WebSocket using the reversed SDK protocol
- Streams responses back as SSE (Server-Sent Events)
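The SSE stream follows the Anthropic Messages API streaming event format (`message_start`, `content_block_delta`, `message_stop`, …). As an illustrative sketch (the sample payloads below are assumptions based on that standard format, not captured proxy output), a client can pull the text deltas out of the raw stream like this:

```python
import json

def iter_text_deltas(sse_lines):
    """Yield text fragments from Anthropic-style SSE 'data:' lines."""
    for line in sse_lines:
        if not line.startswith("data:"):
            continue
        event = json.loads(line[len("data:"):].strip())
        if event.get("type") == "content_block_delta":
            yield event["delta"].get("text", "")

# Sample of what the stream's lines might look like (assumed format):
sample = [
    'event: content_block_delta',
    'data: {"type": "content_block_delta", "index": 0, "delta": {"type": "text_delta", "text": "Hel"}}',
    'event: content_block_delta',
    'data: {"type": "content_block_delta", "index": 0, "delta": {"type": "text_delta", "text": "lo!"}}',
    'event: message_stop',
    'data: {"type": "message_stop"}',
]
print("".join(iter_text_deltas(sample)))  # Hello!
```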
```bash
# Clone
git clone https://github.com/sioakim/claude-ws-proxy.git
cd claude-ws-proxy

# Start
docker compose up -d

# Open the dashboard
open http://localhost:3456

# Click "Login" — you'll be redirected to claude.ai to authenticate
# Approve access, and you're done!
```

```bash
# Prerequisites: Node.js 22+, Claude Code CLI
npm install -g @anthropic-ai/claude-code
claude auth login   # one-time auth

# Clone and run
git clone https://github.com/sioakim/claude-ws-proxy.git
cd claude-ws-proxy
npm install
npm start

# Dashboard at http://localhost:3456
```

Any model available in your Claude Code subscription works. Pass the exact model ID in your API request.
| Model | ID | Released |
|---|---|---|
| Claude Opus 4.6 | `claude-opus-4-6` | Feb 2026 |
| Claude Sonnet 4.6 | `claude-sonnet-4-6` | Feb 2026 |
| Claude Sonnet 4.5 | `claude-sonnet-4-5-20250929` | Sep 2025 |
| Claude Opus 4 | `claude-opus-4-20250514` | May 2025 |
| Claude Sonnet 4 | `claude-sonnet-4-20250514` | May 2025 |
| Claude Haiku 4.5 | `claude-haiku-4-5-20251001` | Oct 2025 |
| Claude Haiku 3.5 | `claude-3-5-haiku-20241022` | Oct 2024 |
Note: New models are automatically supported when Anthropic adds them to Claude Code — no proxy update needed. Just use the model ID in your request. The model list returned by `/v1/models` is configurable via the `models` array in `config.json`.
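The exact schema of the `models` array isn't shown here; as a sketch (field names are assumptions loosely mirroring Anthropic's `/v1/models` response — check the shipped `config.json` for the real shape), an entry might look like:

```json
{
  "models": [
    { "id": "claude-sonnet-4-5-20250929", "display_name": "Claude Sonnet 4.5" },
    { "id": "claude-haiku-4-5-20251001", "display_name": "Claude Haiku 4.5" }
  ]
}
```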
- Authentication is handled via Claude Code's OAuth flow
- Tokens persist in a Docker volume (`claude-ws-proxy-auth`) across container restarts
- Auto-refresh: Claude Code handles token refresh automatically
- Re-authenticate: visit `http://localhost:3456/auth/login` or click "Login" on the dashboard
- Check status: `curl http://localhost:3456/auth/status`
All settings via `config.json` or environment variables (env takes precedence):
| Setting | Env Var | Default | Description |
|---|---|---|---|
| `httpPort` | `HTTP_PORT` | `3456` | HTTP API port |
| `wsPort` | `WS_PORT` | `8765` | Internal WebSocket port (not exposed) |
| `claudePath` | `CLAUDE_PATH` | `claude` | Path to Claude Code CLI binary |
| `defaultModel` | `DEFAULT_MODEL` | `claude-sonnet-4-20250514` | Default model |
| `autoApproveTool` | — | `true` | Auto-approve tool use |
| `maxSessions` | `MAX_SESSIONS` | `5` | Max concurrent sessions |
| `sessionTimeoutMs` | `SESSION_TIMEOUT_MS` | `300000` | Session idle timeout (ms) |
| `authToken` | `AUTH_TOKEN` | `""` | Bearer token for API auth (empty = no auth, legacy) |
| `requireApiKey` | — | `false` | If true, require a valid API key |
| `apiKey` | — | `""` | Expected API key when `requireApiKey` is true |
| `models` | — | (see config.json) | Array of models returned by `/v1/models` |
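The precedence rule (environment variable over `config.json` over built-in default) can be sketched as follows. This is illustrative logic, not the proxy's actual source:

```python
import json
import os

# Built-in defaults and their env-var names (subset, for illustration)
DEFAULTS = {"httpPort": 3456, "wsPort": 8765}
ENV_MAP = {"httpPort": "HTTP_PORT", "wsPort": "WS_PORT"}

def resolve(setting, file_config):
    """Env var > config.json > built-in default."""
    env_val = os.environ.get(ENV_MAP.get(setting, ""))
    if env_val is not None:
        return int(env_val)
    if setting in file_config:
        return file_config[setting]
    return DEFAULTS[setting]

file_config = json.loads('{"httpPort": 4000}')  # stands in for config.json
os.environ["WS_PORT"] = "9000"

print(resolve("httpPort", file_config))  # 4000 (config.json overrides default)
print(resolve("wsPort", file_config))    # 9000 (env var wins)
```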
Most third-party apps require an API key field — they won't let you proceed without one. The proxy handles this gracefully:

- Default (`requireApiKey: false`): the proxy accepts any value (or no value) for the `x-api-key` / `Authorization: Bearer ...` headers. Just put `sk-dummy` or any string in your app's API key field.
- Validated (`requireApiKey: true`, `apiKey: "sk-proxy-12345"`): the proxy checks that the provided key matches. Useful if you expose the proxy on a network and want basic access control.

Both the `x-api-key` header and the `Authorization: Bearer <key>` header are supported.
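The acceptance rule above can be sketched as a small check (illustrative pseudologic, not the proxy's actual implementation):

```python
def is_authorized(headers, require_api_key=False, api_key=""):
    """Accept any (or no) key unless requireApiKey is enabled."""
    if not require_api_key:
        return True  # default mode: any value, or none at all, passes
    # The key may arrive via either supported header
    provided = headers.get("x-api-key")
    auth = headers.get("authorization", "")
    if provided is None and auth.lower().startswith("bearer "):
        provided = auth[len("Bearer "):]
    return provided == api_key

# Default mode: everything passes
print(is_authorized({}))  # True
# Validated mode: the key must match
print(is_authorized({"authorization": "Bearer sk-proxy-12345"},
                    require_api_key=True, api_key="sk-proxy-12345"))  # True
print(is_authorized({"x-api-key": "wrong"},
                    require_api_key=True, api_key="sk-proxy-12345"))  # False
```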
`GET /`: Dashboard. Shows auth status, active sessions, and login controls.

`POST /v1/messages`: Anthropic Messages API compatible endpoint. Supports streaming via SSE.

```bash
curl -X POST http://localhost:3456/v1/messages \
  -H "Content-Type: application/json" \
  -d '{
    "model": "claude-sonnet-4-20250514",
    "max_tokens": 4096,
    "stream": true,
    "messages": [
      {"role": "user", "content": "Hello!"}
    ]
  }'
```

`GET /v1/models`: Lists available models (configurable via `config.json`).

```bash
curl http://localhost:3456/v1/models
```

`GET /health`: Health check. Returns auth status and active session count.

```bash
curl http://localhost:3456/health
# {"status":"ok","version":"1.1.0","authenticated":true,"activeSessions":0}
```

`GET /auth/status`: Check authentication status.

```bash
curl http://localhost:3456/auth/status
# {"authenticated":true,"expired":false,"expiresAt":"2026-03-01T00:00:00.000Z"}
```

`GET /auth/login`: Initiate OAuth login. Returns a URL to visit for authentication.

```bash
curl http://localhost:3456/auth/login
# {"url":"https://claude.ai/oauth/...","message":"Visit the URL to authenticate"}
```

`POST /auth/logout`: Delete stored credentials.

```bash
curl -X POST http://localhost:3456/auth/logout
# {"success":true,"message":"Logged out successfully"}
```

Examples for connecting popular apps and tools to the proxy. Use `sk-dummy` (or any string) as the API key.
Python (Anthropic SDK):

```python
import anthropic

client = anthropic.Anthropic(
    base_url="http://localhost:3456",
    api_key="sk-dummy"  # any value works
)

response = client.messages.create(
    model="claude-sonnet-4-20250514",
    max_tokens=1024,
    messages=[{"role": "user", "content": "Hello!"}]
)
```

curl:

```bash
curl http://localhost:3456/v1/messages \
  -H "Content-Type: application/json" \
  -H "x-api-key: sk-dummy" \
  -H "anthropic-version: 2023-06-01" \
  -d '{
    "model": "claude-sonnet-4-20250514",
    "max_tokens": 1024,
    "messages": [{"role": "user", "content": "Hello!"}]
  }'
```

TypeScript (Anthropic SDK):

```typescript
import Anthropic from '@anthropic-ai/sdk';

const client = new Anthropic({
  baseURL: 'http://localhost:3456',
  apiKey: 'sk-dummy',
});

const response = await client.messages.create({
  model: 'claude-sonnet-4-20250514',
  max_tokens: 1024,
  messages: [{ role: 'user', content: 'Hello!' }],
});
```

For apps configured via a JSON config file:

```json
{
  "llm": {
    "provider": "anthropic",
    "apiKey": "sk-dummy",
    "baseUrl": "http://localhost:3456"
  }
}
```

For apps with API settings fields:

- API Base URL: `http://localhost:3456`
- API Key: `sk-dummy` (any value)
- Model: `claude-sonnet-4-20250514`

Or, for apps that ask for an API type:

- API Type: Claude
- API URL: `http://localhost:3456`
- API Key: `sk-dummy`

LiteLLM:

```python
import litellm

response = litellm.completion(
    model="anthropic/claude-sonnet-4-20250514",
    messages=[{"role": "user", "content": "Hello!"}],
    api_base="http://localhost:3456",
    api_key="sk-dummy",
)
```

mem0:

```python
from mem0 import Memory

config = {
    "llm": {
        "provider": "anthropic",
        "config": {
            "model": "claude-sonnet-4-20250514",
            "api_key": "sk-dummy",
            "anthropic_base_url": "http://localhost:3456"
        }
    },
    "embedder": {
        "provider": "ollama",
        "config": {
            "model": "nomic-embed-text",
            "ollama_base_url": "http://localhost:11434"
        }
    },
    "vector_store": {
        "provider": "chroma",
        "config": {
            "collection_name": "memories",
            "path": "./chroma_data"
        }
    }
}

memory = Memory.from_config(config)

# Add a memory
memory.add("I prefer dark mode in all applications", user_id="user1")

# Search memories
results = memory.search("What are the user's UI preferences?", user_id="user1")
```

Visit http://localhost:3456 and click "Login", or run:
```bash
docker exec -it claude-ws-proxy-claude-ws-proxy-1 claude auth login
```

Ensure `@anthropic-ai/claude-code` is installed globally. In Docker, this is done automatically. For a native install, run `npm install -g @anthropic-ai/claude-code`.

Change ports via environment variables or `config.json`:

```bash
HTTP_PORT=4000 npm start
```

Tokens usually auto-refresh. If not, visit `/auth/login` to re-authenticate.

The internal WebSocket port (8765) is used only between the proxy and CLI subprocesses. It should NOT be exposed externally. If sessions fail, check that the `claude` CLI is working: `claude --version`.
This project uses the reversed Claude Code SDK protocol documented by The-Vibe-Company/companion. The proxy spawns `claude` CLI processes and communicates with them over WebSocket to execute queries.