Is your feature request related to a problem? Please describe.
Currently, only the two options below (`server_url` and `model`) are available for local self-hosted LLMs. Ollama can be proxied via gateways like LiteLLM, which support API keys and other auth methods, but there is currently no way to supply credentials in the config:
```
'default': {
    'ollama': {
        'server_url': '',
        'model': '',
    },
},
```
Describe the solution you'd like
It would be great if auth options, or even arbitrary HTTP header options, were added to the self-hosted LLM configs.
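As a rough sketch of what this could look like (the `api_key` and `headers` fields here are hypothetical additions, not existing options):

```
'default': {
    'ollama': {
        'server_url': '',
        'model': '',
        # hypothetical: API key forwarded to the gateway (e.g. LiteLLM)
        'api_key': '',
        # hypothetical: arbitrary HTTP headers sent with each request
        'headers': {
            'Authorization': 'Bearer <token>',
        },
    },
},
```

Either field alone would be enough to authenticate against a LiteLLM proxy sitting in front of Ollama.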
Describe alternatives you've considered
Jerry-rigging whitelisting of sources (e.g. IP-based allow-lists) instead of proper authentication.