
# Feature Request: Support for Claude Code Proxy Tools (CCR Integration) #234

@jiang-yuan

Problem Description

I'm using claude-code-router (CCR), a proxy tool that routes Claude Code requests to different AI models and providers (Gemini, DeepSeek, Ollama, etc.). However, claudecodeui currently doesn't support integration with CCR due to its architecture.

Current Architecture Issue

claudecodeui uses two execution modes:

1. Claude SDK Mode (primary; used for chat and file editing):
   - Uses the `@anthropic-ai/claude-agent-sdk` npm package
   - Makes direct API calls to Anthropic
   - Bypasses the `claude` CLI entirely
   - ❌ Cannot be intercepted by CCR or similar proxy tools
2. PTY Shell Mode (secondary; used for the terminal):
   - Spawns the `claude` command via `node-pty`
   - ✅ Can use the `CLAUDE_CLI_PATH` configuration
   - ✅ Inherits all environment variables
   - ✅ Works with CCR, but only for the shell feature

The Problem

Since the main chat functionality uses the SDK, it bypasses the claude CLI and connects directly to Anthropic's API, making it impossible to use proxy tools like CCR that work by intercepting CLI calls.

Proposed Solutions

Option 1: CLI Execution Mode (Recommended)

Add a configuration option to use the CLI instead of the SDK for chat functionality:

Configuration example (`.env`):

```bash
# Execution mode: 'sdk' (default) or 'cli'
CLAUDE_EXECUTION_MODE=cli

# CLI path (only used when mode=cli)
CLAUDE_CLI_PATH=ccr code
```

Implementation:

```js
// In server/claude-sdk.js
const executionMode = process.env.CLAUDE_EXECUTION_MODE || 'sdk';

async function queryClaudeSDK(command, options, ws) {
  if (executionMode === 'cli') {
    // Use PTY to spawn the CLI command (similar to the shell implementation)
    return await queryViaCLI(command, options, ws);
  } else {
    // Use the existing SDK implementation
    return await queryViaSDK(command, options, ws);
  }
}
```

Option 2: Custom API Base URL

Allow users to specify a custom API endpoint in the SDK configuration:

Configuration example (`.env`):

```bash
# Custom Anthropic API endpoint (for proxy servers)
ANTHROPIC_BASE_URL=https://my-proxy.com/
ANTHROPIC_API_KEY=<custom-key>
```

Implementation:

```js
// In server/claude-sdk.js
import { query } from '@anthropic-ai/claude-agent-sdk';

const sdkOptions = {
  ...options,
  apiUrl: process.env.ANTHROPIC_BASE_URL,  // assumes the SDK exposes such an option
  apiKey: process.env.ANTHROPIC_API_KEY
};

const queryInstance = query({
  prompt: finalCommand,
  options: sdkOptions
});
```

Option 3: Hybrid Mode

Provide both options and let users choose based on their needs:

```bash
# Default: SDK mode with a custom API endpoint
CLAUDE_EXECUTION_MODE=sdk
ANTHROPIC_BASE_URL=https://my-proxy.com/

# Alternative: CLI mode with a custom command
CLAUDE_EXECUTION_MODE=cli
CLAUDE_CLI_PATH=ccr code
```
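One way to implement the hybrid mode is a small helper that validates the mode and returns everything the server needs in one place. A sketch (the function name and returned shape are illustrative):

```js
// Resolve the execution configuration from an env-like object.
// Defaults to SDK mode, matching today's behavior.
function resolveExecutionConfig(env) {
  const mode = env.CLAUDE_EXECUTION_MODE || 'sdk';
  if (mode !== 'sdk' && mode !== 'cli') {
    throw new Error(`Invalid CLAUDE_EXECUTION_MODE: ${mode} (expected 'sdk' or 'cli')`);
  }
  if (mode === 'cli') {
    // CLI mode: only the command matters; CCR handles routing itself.
    return { mode, cliPath: env.CLAUDE_CLI_PATH || 'claude' };
  }
  // SDK mode: pass through the optional proxy endpoint and key.
  return {
    mode,
    baseUrl: env.ANTHROPIC_BASE_URL,  // undefined means the SDK default
    apiKey: env.ANTHROPIC_API_KEY,
  };
}
```

Failing fast on an invalid mode at startup avoids silently falling back to direct Anthropic calls when a user mistypes the setting.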

Use Cases

This feature would benefit users who:

- Use CCR to route requests to different AI models (Gemini, DeepSeek, Claude variants, local models via Ollama)
- Need custom API endpoints for enterprise deployments
- Want to control costs by switching between model providers
- Require request logging/monitoring through proxy servers
- Work in environments with restricted access to Anthropic's API

Additional Context

About CCR (claude-code-router)

- GitHub: https://github.com/musistudio/claude-code-router
- Purpose: routes Claude Code requests to multiple AI providers
- Usage: `ccr code --model <model> "prompt"` instead of `claude "prompt"`
- Managed mode: supports hosted proxy services with authentication

Current Workaround Limitations

- The `CLAUDE_CLI_PATH` configuration only affects the shell terminal feature
- The main chat/editing functionality still uses the SDK, bypassing any CLI-based tools
- This limits integration with the broader Claude Code ecosystem

Related Projects

- A similar integration issue exists with opcode (a Tauri-based desktop app)
- These tools represent a growing ecosystem around Claude Code that would benefit from proxy support

Expected Behavior

After implementing one of the proposed solutions, users should be able to:

1. Configure claudecodeui to work with CCR or similar proxy tools
2. Route all requests (chat, file editing, terminal) through custom endpoints
3. Switch between different AI model providers seamlessly
4. Use hosted/managed proxy services with authentication

Technical Notes

PTY Shell Implementation Reference

The existing shell implementation (the `/shell` WebSocket endpoint) already demonstrates how to spawn CLI commands with environment-variable inheritance:

```js
shellProcess = pty.spawn(shell, shellArgs, {
    name: 'xterm-256color',
    cols: termCols,
    rows: termRows,
    cwd: projectPath,
    env: process.env  // ✅ Inherits all environment variables
});
```

This approach could be adapted for the chat functionality when CLAUDE_EXECUTION_MODE=cli.
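Adapting this for chat largely comes down to passing the same environment through, optionally layered with explicit proxy overrides. A hedged sketch (the override names mirror the `.env` examples above; `spawnChatCli` is hypothetical, and `node-pty` is loaded lazily so the env helper stays usable without it):

```js
// Merge the parent environment with optional overrides, skipping any
// override that is undefined so we never shadow a set variable with it.
function buildChildEnv(baseEnv, overrides = {}) {
  const env = { ...baseEnv };
  for (const [key, value] of Object.entries(overrides)) {
    if (value !== undefined) env[key] = value;
  }
  return env;
}

// Hypothetical PTY launcher for chat: same shape as the shell endpoint,
// but with the proxy-related variables applied explicitly.
function spawnChatCli(command, args, projectPath) {
  const pty = require('node-pty'); // loaded lazily; only needed at spawn time
  return pty.spawn(command, args, {
    name: 'xterm-256color',
    cols: 120,
    rows: 40,
    cwd: projectPath,
    env: buildChildEnv(process.env, {
      ANTHROPIC_BASE_URL: process.env.ANTHROPIC_BASE_URL,
      ANTHROPIC_API_KEY: process.env.ANTHROPIC_API_KEY,
    }),
  });
}
```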

SDK Flexibility

If the `@anthropic-ai/claude-agent-sdk` package supports custom API endpoints (similar to the `baseURL` option in the OpenAI SDK), Option 2 would be the simplest implementation, requiring minimal code changes.


Thank you for building claudecodeui! It's an excellent tool, and adding proxy support would make it even more powerful and compatible with the broader Claude Code ecosystem.
