
Can't use the Azure OpenAI API when configured as the guide's introduction describes #45

@hellangleZ

Description


After I added my conf.yaml, running the project fails with the following error:

uv run main.py
/aml/deer-flow/src/llms/llm.py:48: UserWarning: WARNING! api_base is not default parameter.
api_base was transferred to model_kwargs.
Please confirm that api_base is what you intended.
basic_llm = get_llm_by_type("basic")
/aml/deer-flow/src/llms/llm.py:48: UserWarning: WARNING! api_version is not default parameter.
api_version was transferred to model_kwargs.
Please confirm that api_version is what you intended.
basic_llm = get_llm_by_type("basic")
/aml/deer-flow/src/llms/llm.py:48: UserWarning: WARNING! deployment_name is not default parameter.
deployment_name was transferred to model_kwargs.
Please confirm that deployment_name is what you intended.
basic_llm = get_llm_by_type("basic")
Enter your query: hello
2025-05-11 01:45:52,560 - src.workflow - INFO - Starting async workflow with user input: hello
================================ Human Message =================================

hello
2025-05-11 01:45:52,563 - src.graph.nodes - INFO - Coordinator talking.
Traceback (most recent call last):
File "/aml/deer-flow/main.py", line 146, in
ask(
File "/aml/deer-flow/main.py", line 33, in ask
asyncio.run(
File "/home/root123/miniconda3/envs/deerflow/lib/python3.12/asyncio/runners.py", line 194, in run
return runner.run(main)
^^^^^^^^^^^^^^^^
File "/home/root123/miniconda3/envs/deerflow/lib/python3.12/asyncio/runners.py", line 118, in run
return self._loop.run_until_complete(task)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/root123/miniconda3/envs/deerflow/lib/python3.12/asyncio/base_events.py", line 664, in run_until_complete
return future.result()
^^^^^^^^^^^^^^^
File "/aml/deer-flow/src/workflow.py", line 78, in run_agent_workflow_async
async for s in graph.astream(
File "/aml/deer-flow/.venv/lib/python3.12/site-packages/langgraph/pregel/__init__.py", line 2305, in astream
async for _ in runner.atick(
File "/aml/deer-flow/.venv/lib/python3.12/site-packages/langgraph/pregel/runner.py", line 444, in atick
await arun_with_retry(
File "/aml/deer-flow/.venv/lib/python3.12/site-packages/langgraph/pregel/retry.py", line 128, in arun_with_retry
return await task.proc.ainvoke(task.input, config)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/aml/deer-flow/.venv/lib/python3.12/site-packages/langgraph/utils/runnable.py", line 583, in ainvoke
input = await step.ainvoke(input, config, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/aml/deer-flow/.venv/lib/python3.12/site-packages/langgraph/utils/runnable.py", line 371, in ainvoke
ret = await asyncio.create_task(coro, context=context)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/aml/deer-flow/.venv/lib/python3.12/site-packages/langchain_core/runnables/config.py", line 588, in run_in_executor
return await asyncio.get_running_loop().run_in_executor(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/root123/miniconda3/envs/deerflow/lib/python3.12/concurrent/futures/thread.py", line 58, in run
result = self.fn(*self.args, **self.kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/aml/deer-flow/.venv/lib/python3.12/site-packages/langchain_core/runnables/config.py", line 579, in wrapper
return func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^
File "/aml/deer-flow/src/graph/nodes.py", line 207, in coordinator_node
.invoke(messages)
^^^^^^^^^^^^^^^^
File "/aml/deer-flow/.venv/lib/python3.12/site-packages/langchain_core/runnables/base.py", line 5365, in invoke
return self.bound.invoke(
^^^^^^^^^^^^^^^^^^
File "/aml/deer-flow/.venv/lib/python3.12/site-packages/langchain_core/language_models/chat_models.py", line 307, in invoke
self.generate_prompt(
File "/aml/deer-flow/.venv/lib/python3.12/site-packages/langchain_core/language_models/chat_models.py", line 843, in generate_prompt
return self.generate(prompt_messages, stop=stop, callbacks=callbacks, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/aml/deer-flow/.venv/lib/python3.12/site-packages/langchain_core/language_models/chat_models.py", line 683, in generate
self._generate_with_cache(
File "/aml/deer-flow/.venv/lib/python3.12/site-packages/langchain_core/language_models/chat_models.py", line 908, in _generate_with_cache
result = self._generate(
^^^^^^^^^^^^^^^
File "/aml/deer-flow/.venv/lib/python3.12/site-packages/langchain_openai/chat_models/base.py", line 823, in _generate
response = self.client.create(**payload)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/aml/deer-flow/.venv/lib/python3.12/site-packages/openai/_utils/_utils.py", line 279, in wrapper
return func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^
TypeError: Completions.create() got an unexpected keyword argument 'api_base'
During task with name 'coordinator' and id '8d53a07e-01f2-4ef0-66de-29761a6f9beb'
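The warnings above point at the cause: config keys the model wrapper does not recognize (api_base, api_version, deployment_name) get moved into model_kwargs and are then forwarded verbatim into the OpenAI client's create() call, which rejects them. A minimal stdlib-only sketch of that failure mode (the function below is a stand-in for the real SDK call, not the actual openai API):

```python
# Stand-in for openai's Completions.create(), which accepts only known keywords.
def completions_create(*, model, messages):
    return {"model": model, "messages": messages}

# Per the warnings, unrecognized config keys were shuffled into model_kwargs.
model_kwargs = {"api_base": "https://example.openai.azure.com"}

try:
    # The wrapper forwards model_kwargs verbatim into the create() call ...
    completions_create(model="gpt-4o", messages=[], **model_kwargs)
except TypeError as e:
    # ... which fails with the same kind of error as the traceback above.
    print(e)
```

So the fix is not to pass these keys through as extra model parameters, but to use a client/configuration that understands Azure-specific fields natively.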

My conf.yaml:

cat conf.yaml

BASIC_MODEL:
  model: "azure/gpt-4o"  # specify the exact model version
  api_key: $AZURE_API_KEY
  api_base: $AZURE_API_BASE
  api_version: $AZURE_API_VERSION
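For comparison, Azure-style OpenAI setups typically name the endpoint, deployment, and API version explicitly rather than passing api_base as a generic parameter. A sketch of what that could look like — the key names here are assumptions, and deer-flow's config loader may expect different ones:

```yaml
BASIC_MODEL:
  model: "azure/gpt-4o"
  api_key: $AZURE_API_KEY
  azure_endpoint: $AZURE_API_BASE   # instead of api_base
  api_version: $AZURE_API_VERSION
  azure_deployment: "gpt-4o"        # the Azure deployment name
```

Whether these keys are honored depends on how the project constructs its LLM client, so this is only a direction to check, not a confirmed fix.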

(screenshot attached) The configuration is set up exactly as the guide instructs.

Could you please help check this?

Metadata

Assignees: no one assigned
Labels: documentation (Improvements or additions to documentation)
Projects: none
Milestone: none