
[Feature] Manually interrupt while outputting answers #1005

@valentimarco

Description


Discussed in #748

Originally posted by NeverOccurs March 13, 2024
Hi, I really love your work. Just a quick question: when I use local Ollama models, they sometimes produce gibberish and never stop. I have to restart the container, which is quite annoying. How can I manually interrupt a generation in progress when I don't want it to continue? Many thanks in advance!
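The requested behavior can be sketched as a cooperative stop flag checked inside the token-streaming loop: a separate "stop" endpoint sets a flag, and the consumer breaks out of the stream as soon as it sees it. This is only an illustrative sketch, not Cheshire Cat's actual API; the function names and the fake model below are made up for the example.

```python
import threading

def stream_answer(token_source, stop_event):
    """Consume tokens until the source ends or the stop flag is set.

    token_source: any iterable of text chunks (e.g. a streaming LLM
    response); stop_event: a threading.Event that another thread or a
    hypothetical "stop generation" endpoint can set to abort.
    """
    chunks = []
    for token in token_source:
        if stop_event.is_set():
            # Abandon the stream; closing the generator would also
            # drop the underlying HTTP connection in a real client.
            break
        chunks.append(token)
    return "".join(chunks)

def fake_model():
    """Simulate a runaway model that never stops emitting tokens."""
    while True:
        yield "gibberish "

def set_flag_after(source, n, event):
    """Test helper: set the stop flag once n tokens have been emitted."""
    for i, tok in enumerate(source):
        if i == n:
            event.set()
        yield tok

stop = threading.Event()
text = stream_answer(set_flag_after(fake_model(), 3, stop), stop)
# The loop exits after three tokens instead of hanging forever.
```

In a real deployment the flag would live per-conversation (e.g. keyed by websocket session), so that a "stop" message from the client only cancels its own generation.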

Metadata


Assignees

No one assigned

    Labels

    - LLM — Related to language model / embedder
    - V2
    - agent — Related to cat agent (reasoner / prompt engine)
    - endpoints — Related to http / ws endpoints
