This file provides guidance to Claude Code (claude.ai/code) when working with code in this repository.
OpenProxy is a lightweight, zero-config gateway for Claude Code and Gemini CLI that lets them use OpenAI-compatible APIs. It translates API calls from Gemini and Anthropic clients into OpenAI API calls.
- `npm run dev` - Start development server with hot reload (tsx watch)
- `npm run build` - Compile TypeScript to JavaScript
- `npm run start` - Start production server from compiled code
- `npm test` - Run test suite with Vitest
- Tests are located in `src/**/*.test.ts` files
- Uses Vitest with a Node.js environment
- Run specific tests: `npm test -- src/path/to/test.ts`
- Main Application (`src/index.ts`): Hono-based web server with routing logic
- Server Entry (`src/node-entry.ts`): Node.js server with graceful shutdown handling
- Provider Handlers (`src/providers/`): Gemini and Anthropic-specific request/response mapping
- Client request → Proxy server
- Route matching based on path patterns:
  - Anthropic: `/v1/messages` endpoint
  - Gemini: `:generateContent` and `:streamGenerateContent` endpoints
- Provider handler extracts base URL and API key from request
- Request mapping to OpenAI-compatible format
- Forward to target OpenAI API
- Response mapping back to provider format
- Return to client
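The route-matching step above can be sketched as a small classifier over the request path. This is an illustrative sketch, not the repository's actual routing code: the function name and `Provider` type are assumptions, and only the two endpoint patterns named above are checked.

```typescript
// Illustrative sketch: classify an incoming path as Anthropic, Gemini,
// or unknown. Names are hypothetical, not the repo's real identifiers.
type Provider = "anthropic" | "gemini" | null;

function matchProvider(path: string): Provider {
  // Anthropic clients call the /v1/messages endpoint
  if (path.endsWith("/v1/messages")) return "anthropic";
  // Gemini clients call :generateContent or :streamGenerateContent
  if (/:(streamGenerateContent|generateContent)$/.test(path)) return "gemini";
  return null;
}
```

Because the upstream base URL is embedded in the path, matching on the path suffix (rather than the full path) keeps the classifier independent of the target API's host.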
- Anthropic (`src/providers/anthropic/`):
  - `handler.ts` - Main request routing and error handling
  - `request.ts` - Map Anthropic → OpenAI request format
  - `response.ts` - Map OpenAI → Anthropic response format
  - `sse.ts` - Stream conversion for server-sent events
  - `utils.ts` - URL parsing and API key extraction
- Gemini (`src/providers/gemini/`):
  - Similar structure to Anthropic with Gemini-specific mappings
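To illustrate the kind of mapping `request.ts` performs, here is a minimal sketch of converting an Anthropic Messages payload into an OpenAI chat completion payload. The field names follow the public Anthropic and OpenAI APIs; the function name, interface, and exact handling are assumptions, not the repository's actual implementation.

```typescript
// Illustrative sketch of Anthropic → OpenAI request mapping.
// Types and function name are hypothetical.
interface AnthropicRequest {
  model: string;
  system?: string;
  max_tokens: number;
  messages: { role: "user" | "assistant"; content: string }[];
}

function toOpenAIRequest(req: AnthropicRequest) {
  const messages = [
    // Anthropic carries the system prompt as a top-level field;
    // OpenAI expects it as the first chat message.
    ...(req.system ? [{ role: "system" as const, content: req.system }] : []),
    ...req.messages,
  ];
  return { model: req.model, max_tokens: req.max_tokens, messages };
}
```

The reverse direction (`response.ts`) applies the same idea to response fields such as stop reasons and usage counts.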
- Hono: Web framework for routing and middleware
- TypeScript: Type-safe development with strict mode
- Vitest: Testing framework with Node.js environment
- tsx: TypeScript execution for development
The service is stateless and requires no server-side configuration. All connection information is extracted from incoming requests:
- Base URL: Embedded in request path (e.g., `/https://api.openrouter.ai/v1/...`)
- API Keys: Extracted from headers or query parameters
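The stateless extraction described above can be sketched as follows. This is an assumption-laden illustration: the function name is hypothetical, and the real code (`src/providers/*/utils.ts`) may recognize different headers or query parameters.

```typescript
// Illustrative sketch: recover the upstream base URL from the request
// path and the API key from common auth headers. Hypothetical names.
function extractTarget(path: string, headers: Record<string, string>) {
  // Strip the leading "/" to recover the embedded upstream URL.
  const baseUrl = decodeURIComponent(path.slice(1));
  // Prefer an Authorization bearer token; fall back to x-api-key.
  const auth = headers["authorization"] ?? "";
  const apiKey = auth.startsWith("Bearer ")
    ? auth.slice("Bearer ".length)
    : headers["x-api-key"] ?? "";
  return { baseUrl, apiKey };
}
```

Because every request carries its own target URL and key, the proxy holds no credentials and needs no per-deployment configuration.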
- Local: `npm run start` (port 3000)
- Docker: Pre-built image available
- Cloudflare Workers: Worker deployment supported
- Vercel: Serverless deployment supported
- `GRACEFUL_TIMEOUT_MS`: Graceful shutdown timeout (default: 10000 ms)
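A minimal sketch of how `src/node-entry.ts` might honor this timeout, assuming a Node HTTP server: stop accepting connections, wait for in-flight requests, and force-exit if they outlive `GRACEFUL_TIMEOUT_MS`. The function names are illustrative, not the repository's actual code.

```typescript
import type { Server } from "node:http";

// Parse GRACEFUL_TIMEOUT_MS, falling back to the 10000 ms default.
function parseTimeout(raw: string | undefined): number {
  const n = Number(raw);
  return Number.isFinite(n) && n > 0 ? n : 10_000;
}

// Hypothetical shutdown handler: close() waits for in-flight requests;
// the unref'd timer force-exits if they take too long.
function gracefulShutdown(server: Server): void {
  const timeoutMs = parseTimeout(process.env.GRACEFUL_TIMEOUT_MS);
  server.close(() => process.exit(0)); // exit cleanly once drained
  setTimeout(() => process.exit(1), timeoutMs).unref(); // safety net
}
```

Calling `.unref()` on the timer keeps it from holding the process open if shutdown completes before the deadline.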
- The proxy handles both streaming and non-streaming requests
- CORS is enabled for all routes
- Structured logging is implemented via `src/logger.ts`
- Error handling follows provider-specific error formats
- Streaming uses `TransformStream` for real-time SSE conversion
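The `TransformStream` approach can be sketched as a per-line converter wrapped in a stream. This is an illustrative sketch, not the logic in `sse.ts`: the event names and JSON shapes are simplified assumptions based on the public OpenAI and Anthropic streaming formats.

```typescript
// Illustrative sketch: convert one upstream OpenAI SSE line into an
// Anthropic-style event string, or null for lines to drop.
function convertLine(line: string): string | null {
  if (!line.startsWith("data: ")) return null; // skip comments/blank lines
  const payload = line.slice("data: ".length);
  if (payload === "[DONE]") return "event: message_stop\ndata: {}\n\n";
  const delta = JSON.parse(payload).choices?.[0]?.delta?.content ?? "";
  return `event: content_block_delta\ndata: ${JSON.stringify({ text: delta })}\n\n`;
}

// Wrap the per-line conversion in a TransformStream for streaming use,
// so converted events flow to the client as upstream chunks arrive.
function sseConverter(): TransformStream<string, string> {
  return new TransformStream({
    transform(line, controller) {
      const out = convertLine(line);
      if (out !== null) controller.enqueue(out);
    },
  });
}
```

Keeping the conversion in a pure function makes it unit-testable without constructing streams; the `TransformStream` wrapper only handles plumbing.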