diff --git a/weave/guides/integrations/local_models.mdx b/weave/guides/integrations/local_models.mdx
index 8029ae60dd..efba0d5141 100644
--- a/weave/guides/integrations/local_models.mdx
+++ b/weave/guides/integrations/local_models.mdx
@@ -25,7 +25,7 @@ In the case of local models, the `api_key` can be any string but it should be ov
 Here's a list of apps that allows you to download and run models from Hugging Face on your computer, that support OpenAI SDK compatibility.
 1. Nomic [GPT4All](https://www.nomic.ai/gpt4all) - support via Local Server in settings ([FAQ](https://docs.gpt4all.io/gpt4all_help/faq.html))
-1. [LMStudio](https://lmstudio.ai/) - Local Server OpenAI SDK support [docs](https://lmstudio.ai/docs/local-server)
+1. [LMStudio](https://lmstudio.ai/) - Local Server OpenAI SDK support [docs](https://lmstudio.ai/docs/developer/core/server)
 1. [Ollama](https://ollama.com/) - [Experimental Support](https://github.com/ollama/ollama/blob/main/docs/openai.mdx) for OpenAI SDK
 1. llama.cpp via [llama-cpp-python](https://llama-cpp-python.readthedocs.io/en/latest/server/) python package
 1. [llamafile](https://github.com/Mozilla-Ocho/llamafile#other-example-llamafiles) - `http://localhost:8080/v1` automatically supports OpenAI SDK on Llamafile run
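
The pattern shared by every app in the patched list is an OpenAI-compatible HTTP server on localhost, so the same `/v1/chat/completions` request works against any of them. A minimal stdlib-only sketch of building such a request, assuming the llamafile-style base URL from the list (the model name and bearer token are placeholders; local servers accept any `api_key` string, but the header still has to be present for SDK-shaped clients):

```python
import json
from urllib import request

# Assumed base URL: llamafile serves http://localhost:8080/v1 by default
# (per the list above); LMStudio, Ollama, etc. use their own ports.
BASE_URL = "http://localhost:8080/v1"


def chat_completion_request(model: str, prompt: str) -> request.Request:
    """Build an OpenAI-compatible /chat/completions POST for a local server."""
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode("utf-8")
    return request.Request(
        f"{BASE_URL}/chat/completions",
        data=body,
        headers={
            "Content-Type": "application/json",
            # Any string works for local servers; the header just has to exist.
            "Authorization": "Bearer not-a-real-key",
        },
        method="POST",
    )


req = chat_completion_request("local-model", "Hello!")
# urllib.request.urlopen(req) would send it once a local server is running.
```

In practice you would point the official `openai` Python client at the same endpoint via its `base_url` argument instead of hand-building requests; the sketch above only makes the wire format explicit.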