Any luck integrating with Ollama or LM Studio? #1303
Unanswered
xxgeoffreyxx asked this question in Q&A
I spent the better part of this weekend trying to get Stagehand working with a local LLM, which, based on a few gists, I assumed was something that worked or was at least loosely supported.
I'm still new to this, but it looks like Stagehand v3 has moved from the /v1/chat/completions endpoint to the newer /v1/responses endpoint. For the life of me I couldn't get either Ollama or LM Studio models to work with the newer endpoint, though I'm still very green and might be missing something.
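To make concrete what I mean by the two endpoints, here is a minimal sketch using the openai npm package pointed at Ollama's OpenAI-compatible server (LM Studio is the same idea on http://localhost:1234/v1). The model name and port are just placeholders from my local setup, nothing Stagehand-specific:

```ts
import OpenAI from "openai";

const client = new OpenAI({
  baseURL: "http://localhost:11434/v1", // Ollama's OpenAI-compatible endpoint
  apiKey: "ollama",                     // any non-empty string works for a local server
});

async function main() {
  // The older route: /v1/chat/completions, which Ollama and LM Studio implement.
  const chat = await client.chat.completions.create({
    model: "llama3.1", // placeholder local model
    messages: [{ role: "user", content: "Say hello." }],
  });
  console.log("chat.completions:", chat.choices[0].message.content);

  // The newer Responses API (/v1/responses) that Stagehand v3 appears to call.
  try {
    const resp = await client.responses.create({
      model: "llama3.1",
      input: "Say hello.",
    });
    console.log("responses:", resp.output_text);
  } catch (err) {
    console.error("responses endpoint failed:", err);
  }
}

main();
```

(As far as I can tell, Ollama and LM Studio currently only expose the chat-completions route, so the second call fails locally, which seems to match what I'm seeing from Stagehand v3.)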
If this change was intentional, I'd appreciate it if anyone could jot down some notes on how to get it working. I really love the potential of Stagehand and the massive improvements in v3.
Cheers.