
Add auth options in LLM config for local models #3496

@metalshanked

Description

Is your feature request related to a problem? Please describe.
Currently, only the two options below are available for local self-hosted LLMs.
Ollama can be proxied through other gateways such as LiteLLM, which support API keys and other auth methods.

    'default': {
        'ollama': {
            'server_url': '',
            'model': '',
        },
    },

Describe the solution you'd like
It would be great if auth options, or even arbitrary HTTP header options, were added to the self-hosted LLM configs.
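
As a rough sketch of what this could look like: the config above might gain hypothetical `api_key` and `headers` keys (neither exists in the current schema; the names are illustrative only), which the client would fold into each request:

```python
# Hypothetical sketch only: 'api_key' and 'headers' are proposed options,
# not part of the existing config schema.
import urllib.request

LLM_CONFIG = {
    'default': {
        'ollama': {
            'server_url': 'http://localhost:4000',  # e.g. a LiteLLM proxy
            'model': 'llama3',
            'api_key': 'sk-example',                # proposed auth option
            'headers': {'X-Org': 'my-team'},        # proposed header option
        },
    },
}

def build_request(cfg: dict, path: str) -> urllib.request.Request:
    """Build a request carrying the configured auth headers."""
    headers = dict(cfg.get('headers', {}))
    if cfg.get('api_key'):
        # Bearer auth is what LiteLLM-style proxies typically expect
        headers['Authorization'] = f"Bearer {cfg['api_key']}"
    return urllib.request.Request(cfg['server_url'] + path, headers=headers)

req = build_request(LLM_CONFIG['default']['ollama'], '/v1/chat/completions')
```

A generic `headers` dict would also cover non-Bearer schemes (custom tokens, mTLS-adjacent headers, etc.) without the config having to enumerate every auth style.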

Describe alternatives you've considered
Jerry-rigging whitelisting of sources.
