Problem Description
I'm using claude-code-router (CCR), a proxy tool that routes Claude Code requests to different AI models and providers (Gemini, DeepSeek, Ollama, etc.). However, claudecodeui currently doesn't support integration with CCR due to its architecture.
Current Architecture Issue
claudecodeui uses two execution modes:
- Claude SDK Mode (Primary - for chat/file editing):
  - Uses the `@anthropic-ai/claude-agent-sdk` npm package
  - Makes direct API calls to Anthropic
  - Bypasses the `claude` CLI entirely
  - ❌ Cannot be intercepted by CCR or similar proxy tools
- PTY Shell Mode (Secondary - for terminal):
  - Spawns the `claude` command via `node-pty`
  - ✅ Can use the `CLAUDE_CLI_PATH` configuration
  - ✅ Inherits all environment variables
  - ✅ Works with CCR (but only for the shell feature)
The Problem
Since the main chat functionality uses the SDK, it bypasses the claude CLI and connects directly to Anthropic's API, making it impossible to use proxy tools like CCR that work by intercepting CLI calls.
Proposed Solutions
Option 1: CLI Execution Mode (Recommended)
Add a configuration option to use the CLI instead of the SDK for chat functionality:
Configuration Example (.env):

```
# Execution mode: 'sdk' (default) or 'cli'
CLAUDE_EXECUTION_MODE=cli

# CLI path (only used when mode=cli)
CLAUDE_CLI_PATH=ccr code
```

Implementation:
```js
// In server/claude-sdk.js
const executionMode = process.env.CLAUDE_EXECUTION_MODE || 'sdk';

async function queryClaudeSDK(command, options, ws) {
  if (executionMode === 'cli') {
    // Use PTY to spawn the CLI command (similar to the shell implementation)
    return await queryViaCLI(command, options, ws);
  } else {
    // Use the existing SDK implementation
    return await queryViaSDK(command, options, ws);
  }
}
```

Option 2: Custom API Base URL
Allow users to specify a custom API endpoint in the SDK configuration:
Configuration Example (.env):

```
# Custom Anthropic API endpoint (for proxy servers)
ANTHROPIC_BASE_URL=https://my-proxy.com/
ANTHROPIC_API_KEY=<custom-key>
```

Implementation:
```js
// In server/claude-sdk.js
import { query } from '@anthropic-ai/claude-agent-sdk';

const sdkOptions = {
  ...options,
  apiUrl: process.env.ANTHROPIC_BASE_URL,
  apiKey: process.env.ANTHROPIC_API_KEY
};

const queryInstance = query({
  prompt: finalCommand,
  options: sdkOptions
});
```

Option 3: Hybrid Mode
Provide both options and let users choose based on their needs:
```
# Default: SDK mode with custom API endpoint
CLAUDE_EXECUTION_MODE=sdk
ANTHROPIC_BASE_URL=https://my-proxy.com/

# Alternative: CLI mode with custom command
CLAUDE_EXECUTION_MODE=cli
CLAUDE_CLI_PATH=ccr code
```
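For the hybrid mode, the server could resolve these variables once at startup. The helper below is only a sketch (the function name and fallback behavior are hypothetical); it assumes the `CLAUDE_EXECUTION_MODE`, `CLAUDE_CLI_PATH`, `ANTHROPIC_BASE_URL`, and `ANTHROPIC_API_KEY` settings proposed above:

```js
// Hypothetical sketch: resolve the proposed execution settings once at startup.
function resolveExecutionConfig(env = process.env) {
  const mode = env.CLAUDE_EXECUTION_MODE || 'sdk';

  if (mode === 'cli') {
    // CLI mode: route chat through the configured command (e.g. "ccr code"),
    // falling back to the plain claude binary if nothing is configured.
    return { mode, cliCommand: env.CLAUDE_CLI_PATH || 'claude' };
  }

  // SDK mode: optionally point the SDK at a custom endpoint.
  return {
    mode: 'sdk',
    baseUrl: env.ANTHROPIC_BASE_URL, // undefined -> default Anthropic endpoint
    apiKey: env.ANTHROPIC_API_KEY
  };
}
```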
Use Cases
This feature would benefit users who:
- Use CCR to route requests to different AI models (Gemini, DeepSeek, Claude variants, local models via Ollama)
- Need to use custom API endpoints for enterprise deployments
- Want to control costs by switching between different model providers
- Require request logging/monitoring through proxy servers
- Work in environments with restricted access to Anthropic's API
Additional Context
About CCR (claude-code-router)
- GitHub: https://github.com/musistudio/claude-code-router
- Purpose: Routes Claude Code requests to multiple AI providers
- Usage: `ccr code --model <model> "prompt"` instead of `claude "prompt"`
- Managed Mode: Supports hosted proxy services with authentication
Current Workaround Limitations
- The `CLAUDE_CLI_PATH` configuration only affects the shell terminal feature
- Main chat/editing functionality still uses the SDK, bypassing any CLI-based tools
- This limits integration with the broader Claude Code ecosystem
Related Projects
- Similar integration issue exists with opcode (Tauri-based desktop app)
- These tools represent a growing ecosystem around Claude Code that would benefit from proxy support
Expected Behavior
After implementing one of the proposed solutions, users should be able to:
- Configure claudecodeui to work with CCR or similar proxy tools
- Route all requests (chat, file editing, terminal) through custom endpoints
- Switch between different AI model providers seamlessly
- Use hosted/managed proxy services with authentication
Technical Notes
PTY Shell Implementation Reference
The existing shell implementation (/shell WebSocket endpoint) already demonstrates how to properly spawn CLI commands with environment variable inheritance:
```js
shellProcess = pty.spawn(shell, shellArgs, {
  name: 'xterm-256color',
  cols: termCols,
  rows: termRows,
  cwd: projectPath,
  env: process.env // ✅ Inherits all environment variables
});
```

This approach could be adapted for the chat functionality when `CLAUDE_EXECUTION_MODE=cli`.
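As an illustration only, a CLI-backed chat path might look roughly like the sketch below. The function name `queryViaCLI`, the argument handling, and the WebSocket message format are all hypothetical; the only pieces carried over from above are `node-pty`, `CLAUDE_CLI_PATH`, and environment inheritance:

```js
// Hypothetical sketch: run a chat prompt through the configured CLI
// (e.g. "ccr code") instead of the SDK, and stream its output to the client.
import pty from 'node-pty';

function queryViaCLI(command, options, ws) {
  // CLAUDE_CLI_PATH may contain a command plus arguments, e.g. "ccr code"
  const [cli, ...cliArgs] = (process.env.CLAUDE_CLI_PATH || 'claude').split(' ');

  const proc = pty.spawn(cli, [...cliArgs, command], {
    name: 'xterm-256color',
    cols: 120,
    rows: 40,
    cwd: options.cwd,
    env: process.env // proxies such as CCR see the same environment as the shell feature
  });

  // Forward CLI output to the browser over the existing WebSocket connection
  proc.onData((data) => ws.send(JSON.stringify({ type: 'claude-output', data })));

  // Resolve when the CLI exits
  return new Promise((resolve) => proc.onExit(({ exitCode }) => resolve(exitCode)));
}
```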
SDK Flexibility
If the `@anthropic-ai/claude-agent-sdk` package supports custom API endpoints (similar to the OpenAI SDK's `baseURL` option), Option 2 would be the simplest implementation, requiring minimal code changes.
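One way to check this without reading the SDK source is to point `ANTHROPIC_BASE_URL` at a local probe and see whether the SDK's requests arrive there. The snippet below is just such a throwaway probe (the port is arbitrary); it is not a working proxy:

```js
// Throwaway probe (sketch): logs any request the SDK sends to the custom
// base URL, without forwarding it anywhere.
import http from 'node:http';

const server = http.createServer((req, res) => {
  console.log(`SDK request received: ${req.method} ${req.url}`);
  res.statusCode = 503; // not a real proxy, just a probe
  res.end();
});

server.listen(3456, () => {
  console.log('Probe listening on http://127.0.0.1:3456');
  console.log('Start the server with ANTHROPIC_BASE_URL=http://127.0.0.1:3456 to test');
});
```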
Thank you for building claudecodeui! It's an excellent tool, and adding proxy support would make it even more powerful and compatible with the broader Claude Code ecosystem.