Closed
Labels: bug (Something isn't working)
Description
What happened?
When trying to add the Codex models from Copilot:
```yaml
- model_name: copilot-gpt-5.1-codex
  litellm_params:
    model: github_copilot/gpt-5.1-codex
    extra_headers: {"Editor-Version": "vscode/1.85.1", "Copilot-Integration-Id": "vscode-chat"}
```
and calling the endpoint, I get:
```
400 Github_copilotException - model gpt-5.1-codex is not accessible via the /chat/completions endpoint.
```
I tried adding `wire_api: responses` or `model: github_copilot/responses/gpt-5.1-codex`,
but both then led to 500 internal errors:
```
Github_copilotException - functools.partial() got multiple values for keyword argument 'acompletion'
```
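For completeness, the two workaround variants I tried look roughly like this in the proxy config (a sketch; the exact placement of `wire_api` under `litellm_params` is my assumption from the docs):

```yaml
# Variant 1: force the Responses API via wire_api (placement assumed)
- model_name: copilot-gpt-5.1-codex
  litellm_params:
    model: github_copilot/gpt-5.1-codex
    wire_api: responses

# Variant 2: route through the responses/ prefix on the model name
- model_name: copilot-gpt-5.1-codex
  litellm_params:
    model: github_copilot/responses/gpt-5.1-codex
```

Both variants produce the same 500 shown in the log below.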
Relevant log output
litellm-1 | 12:54:59 - LiteLLM Proxy:ERROR: common_request_processing.py:707 - litellm.proxy.proxy_server._handle_llm_api_exception(): Exception occured - litellm.APIConnectionError: APIConnectionError: Github_copilotException - functools.partial() got multiple values for keyword argument 'acompletion'. Received Model Group=copilot-gpt-5.1-codex
litellm-1 | Available Model Group Fallbacks=None LiteLLM Retried: 1 times, LiteLLM Max Retries: 2
litellm-1 | Traceback (most recent call last):
litellm-1 | File "/usr/lib/python3.13/site-packages/litellm/main.py", line 584, in acompletion
litellm-1 | func = partial(completion, **completion_kwargs, **kwargs)
litellm-1 | TypeError: functools.partial() got multiple values for keyword argument 'acompletion'
litellm-1 |
litellm-1 | During handling of the above exception, another exception occurred:
litellm-1 |
litellm-1 | Traceback (most recent call last):
litellm-1 | File "/usr/lib/python3.13/site-packages/litellm/proxy/proxy_server.py", line 4782, in chat_completion
litellm-1 | result = await base_llm_response_processor.base_process_llm_request(
litellm-1 | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
litellm-1 | ...<16 lines>...
litellm-1 | )
litellm-1 | ^
litellm-1 | File "/usr/lib/python3.13/site-packages/litellm/proxy/common_request_processing.py", line 502, in base_process_llm_request
litellm-1 | responses = await llm_responses
litellm-1 | ^^^^^^^^^^^^^^^^^^^
litellm-1 | File "/usr/lib/python3.13/site-packages/litellm/router.py", line 1093, in acompletion
litellm-1 | raise e
litellm-1 | File "/usr/lib/python3.13/site-packages/litellm/router.py", line 1069, in acompletion
litellm-1 | response = await self.async_function_with_fallbacks(**kwargs)
litellm-1 | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
litellm-1 | File "/usr/lib/python3.13/site-packages/litellm/router.py", line 4037, in async_function_with_fallbacks
litellm-1 | return await self.async_function_with_fallbacks_common_utils(
litellm-1 | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
litellm-1 | ...<8 lines>...
litellm-1 | )
litellm-1 | ^
litellm-1 | File "/usr/lib/python3.13/site-packages/litellm/router.py", line 3995, in async_function_with_fallbacks_common_utils
litellm-1 | raise original_exception
litellm-1 | File "/usr/lib/python3.13/site-packages/litellm/router.py", line 4029, in async_function_with_fallbacks
litellm-1 | response = await self.async_function_with_retries(*args, **kwargs)
litellm-1 | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
litellm-1 | File "/usr/lib/python3.13/site-packages/litellm/router.py", line 4234, in async_function_with_retries
litellm-1 | raise original_exception
litellm-1 | File "/usr/lib/python3.13/site-packages/litellm/router.py", line 4125, in async_function_with_retries
litellm-1 | response = await self.make_call(original_function, *args, **kwargs)
litellm-1 | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
litellm-1 | File "/usr/lib/python3.13/site-packages/litellm/router.py", line 4245, in make_call
litellm-1 | response = await response
litellm-1 | ^^^^^^^^^^^^^^
litellm-1 | File "/usr/lib/python3.13/site-packages/litellm/router.py", line 1372, in _acompletion
litellm-1 | raise e
litellm-1 | File "/usr/lib/python3.13/site-packages/litellm/router.py", line 1324, in _acompletion
litellm-1 | response = await _response
litellm-1 | ^^^^^^^^^^^^^^^
litellm-1 | File "/usr/lib/python3.13/site-packages/litellm/utils.py", line 1638, in wrapper_async
litellm-1 | raise e
litellm-1 | File "/usr/lib/python3.13/site-packages/litellm/utils.py", line 1484, in wrapper_async
litellm-1 | result = await original_function(*args, **kwargs)
litellm-1 | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
litellm-1 | File "/usr/lib/python3.13/site-packages/litellm/main.py", line 617, in acompletion
litellm-1 | raise exception_type(
litellm-1 | ...<5 lines>...
litellm-1 | )
litellm-1 | File "/usr/lib/python3.13/site-packages/litellm/main.py", line 598, in acompletion
litellm-1 | response = await init_response
litellm-1 | ^^^^^^^^^^^^^^^^^^^
litellm-1 | File "/usr/lib/python3.13/site-packages/litellm/completion_extras/litellm_responses_transformation/handler.py", line 173, in acompletion
litellm-1 | result = await aresponses(
litellm-1 | ^^^^^^^^^^^^^^^^^
litellm-1 | ...<2 lines>...
litellm-1 | )
litellm-1 | ^
litellm-1 | File "/usr/lib/python3.13/site-packages/litellm/utils.py", line 1638, in wrapper_async
litellm-1 | raise e
litellm-1 | File "/usr/lib/python3.13/site-packages/litellm/utils.py", line 1484, in wrapper_async
litellm-1 | result = await original_function(*args, **kwargs)
litellm-1 | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
litellm-1 | File "/usr/lib/python3.13/site-packages/litellm/responses/main.py", line 458, in aresponses
litellm-1 | raise litellm.exception_type(
litellm-1 | ...<5 lines>...
litellm-1 | )
litellm-1 | File "/usr/lib/python3.13/site-packages/litellm/responses/main.py", line 439, in aresponses
litellm-1 | response = await init_response
litellm-1 | ^^^^^^^^^^^^^^^^^^^
litellm-1 | File "/usr/lib/python3.13/site-packages/litellm/responses/litellm_completion_transformation/handler.py", line 117, in async_response_api_handler
litellm-1 | ] = await litellm.acompletion(
litellm-1 | ^^^^^^^^^^^^^^^^^^^^^^^^^^
litellm-1 | **acompletion_args,
litellm-1 | ^^^^^^^^^^^^^^^^^^^
litellm-1 | )
litellm-1 | ^
litellm-1 | File "/usr/lib/python3.13/site-packages/litellm/utils.py", line 1638, in wrapper_async
litellm-1 | raise e
litellm-1 | File "/usr/lib/python3.13/site-packages/litellm/utils.py", line 1484, in wrapper_async
litellm-1 | result = await original_function(*args, **kwargs)
litellm-1 | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
litellm-1 | File "/usr/lib/python3.13/site-packages/litellm/main.py", line 617, in acompletion
litellm-1 | raise exception_type(
litellm-1 | ~~~~~~~~~~~~~~^
litellm-1 | model=model,
litellm-1 | ^^^^^^^^^^^^
litellm-1 | ...<3 lines>...
litellm-1 | extra_kwargs=kwargs,
litellm-1 | ^^^^^^^^^^^^^^^^^^^^
litellm-1 | )
litellm-1 | ^
litellm-1 | File "/usr/lib/python3.13/site-packages/litellm/litellm_core_utils/exception_mapping_utils.py", line 2328, in exception_type
litellm-1 | raise e # it's already mapped
litellm-1 | ^^^^^^^
litellm-1 | File "/usr/lib/python3.13/site-packages/litellm/litellm_core_utils/exception_mapping_utils.py", line 569, in exception_type
litellm-1 | raise APIConnectionError(
litellm-1 | ...<7 lines>...
litellm-1 | )
litellm-1 | litellm.exceptions.APIConnectionError: litellm.APIConnectionError: APIConnectionError: Github_copilotException - functools.partial() got multiple values for keyword argument 'acompletion'. Received Model Group=copilot-gpt-5.1-codex
litellm-1 | Available Model Group Fallbacks=None LiteLLM Retried: 1 times, LiteLLM Max Retries: 2
litellm-1  | INFO:     172.29.0.5:46852 - "POST /v1/chat/completions HTTP/1.1" 500 Internal Server Error
Are you a ML Ops Team?
No
What LiteLLM version are you on ?
v1.79.3-stable
Twitter / LinkedIn details
No response
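Side note for triage: the 500 is ultimately a plain Python `TypeError` from duplicate keyword unpacking, independent of any Copilot specifics. A minimal standalone sketch (hypothetical dicts; no LiteLLM required) of how `litellm/main.py:584` can hit it:

```python
from functools import partial

def completion(**kwargs):
    """Stand-in for litellm.completion; only the kwargs matter here."""
    return kwargs

# Hypothetical dicts standing in for completion_kwargs / kwargs at
# litellm/main.py:584 -- the bug is that 'acompletion' appears in both.
completion_kwargs = {"acompletion": True, "model": "github_copilot/gpt-5.1-codex"}
kwargs = {"acompletion": True}

try:
    # Double **-unpacking with a duplicate key raises before partial() runs.
    func = partial(completion, **completion_kwargs, **kwargs)
except TypeError as err:
    print(err)  # -> functools.partial() got multiple values for keyword argument 'acompletion'
```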