I ran into an issue when trying to use the OpenRouter API.
My machine is on my company's LAN, but I export a proxy so outbound requests can get through.
Before trying OpenRouter, I could successfully call models like qwen3-32b through Ollama. However, the open-source models don't seem to handle function calling well, so I switched to OpenRouter to call a Gemini model, as you recommended.
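For context, here is a minimal sketch of how I understand the setup should work. It only checks that the proxy variables are visible to Python and builds an OpenRouter-style request; the model id is just an example, not necessarily the one I'm using:

```python
import os

# A common pitfall: the proxy is exported in one shell but the script
# runs in another environment that never sees it. Print what Python sees.
proxies = {k: os.environ.get(k) for k in
           ("HTTP_PROXY", "HTTPS_PROXY", "http_proxy", "https_proxy")}
print("proxy env vars:", proxies)

# OpenRouter exposes an OpenAI-compatible API at this base URL.
OPENROUTER_BASE = "https://openrouter.ai/api/v1"

# Minimal chat-completion payload; the model id below is an example.
payload = {
    "model": "google/gemini-2.0-flash-001",  # example Gemini id on OpenRouter
    "messages": [{"role": "user", "content": "Hello"}],
}
print("endpoint:", f"{OPENROUTER_BASE}/chat/completions")
```

To actually send it, I POST the payload to `{OPENROUTER_BASE}/chat/completions` with an `Authorization: Bearer <api_key>` header; as far as I know, libraries like `requests` pick up `HTTPS_PROXY` from the environment automatically.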
Could you help me identify the problem?
Thanks a lot!