diff --git a/.circleci/config.yml b/.circleci/config.yml
index 6dd1177f79c9..a518628afb99 100644
--- a/.circleci/config.yml
+++ b/.circleci/config.yml
@@ -1785,7 +1785,7 @@ jobs:
       - audio_coverage
   installing_litellm_on_python:
     docker:
-      - image: circleci/python:3.8
+      - image: cimg/python:3.11
         auth:
           username: ${DOCKERHUB_USERNAME}
           password: ${DOCKERHUB_PASSWORD}
@@ -3389,7 +3389,9 @@ jobs:
 
             nvm use 20
             cd ui/litellm-dashboard
-            npm ci || npm install
+            # Remove node_modules and package-lock to ensure clean install (fixes optional deps issue)
+            rm -rf node_modules package-lock.json
+            npm install
 
             # CI run, with both LCOV (Codecov) and HTML (artifact you can click)
             CI=true npm run test -- --run --coverage \
diff --git a/.github/workflows/test-litellm.yml b/.github/workflows/test-litellm.yml
index 1d9bd201fa87..c7de07aec624 100644
--- a/.github/workflows/test-litellm.yml
+++ b/.github/workflows/test-litellm.yml
@@ -37,7 +37,7 @@ jobs:
       - name: Setup litellm-enterprise as local package
         run: |
           cd enterprise
-          python -m pip install -e .
+          poetry run pip install -e .
           cd ..
       - name: Run tests
         run: |
diff --git a/AGENTS.md b/AGENTS.md
index d72b00f7e14f..2c778dc0d71c 100644
--- a/AGENTS.md
+++ b/AGENTS.md
@@ -98,6 +98,25 @@ LiteLLM supports MCP for agent workflows:
 
 Use `poetry run python script.py` to run Python scripts in the project environment (for non-test files).
 
+## GITHUB TEMPLATES
+
+When opening issues or pull requests, follow these templates:
+
+### Bug Reports (`.github/ISSUE_TEMPLATE/bug_report.yml`)
+- Describe what happened vs. expected behavior
+- Include relevant log output
+- Specify LiteLLM version
+- Indicate if you're part of an ML Ops team (helps with prioritization)
+
+### Feature Requests (`.github/ISSUE_TEMPLATE/feature_request.yml`)
+- Clearly describe the feature
+- Explain motivation and use case with concrete examples
+
+### Pull Requests (`.github/pull_request_template.md`)
+- Add at least 1 test in `tests/litellm/`
+- Ensure `make test-unit` passes
+
+
 ## TESTING CONSIDERATIONS
 
 1. **Provider Tests**: Test against real provider APIs when possible
diff --git a/CLAUDE.md b/CLAUDE.md
index 159843233948..23a0e97eaeec 100644
--- a/CLAUDE.md
+++ b/CLAUDE.md
@@ -28,6 +28,22 @@ This file provides guidance to Claude Code (claude.ai/code) when working with co
 
 ### Running Scripts
 - `poetry run python script.py` - Run Python scripts (use for non-test files)
 
+### GitHub Issue & PR Templates
+When contributing to the project, use the appropriate templates:
+
+**Bug Reports** (`.github/ISSUE_TEMPLATE/bug_report.yml`):
+- Describe what happened vs. what you expected
+- Include relevant log output
+- Specify your LiteLLM version
+
+**Feature Requests** (`.github/ISSUE_TEMPLATE/feature_request.yml`):
+- Describe the feature clearly
+- Explain the motivation and use case
+
+**Pull Requests** (`.github/pull_request_template.md`):
+- Add at least 1 test in `tests/litellm/`
+- Ensure `make test-unit` passes
+
 ## Architecture Overview
 
 LiteLLM is a unified interface for 100+ LLM providers with two main components:
diff --git a/Dockerfile b/Dockerfile
index d9ea0d9a4711..f75706805e08 100644
--- a/Dockerfile
+++ b/Dockerfile
@@ -48,7 +48,7 @@ FROM $LITELLM_RUNTIME_IMAGE AS runtime
 USER root
 
 # Install runtime dependencies
-RUN apk add --no-cache openssl tzdata
+RUN apk add --no-cache openssl tzdata nodejs npm
 
 # Upgrade pip to fix CVE-2025-8869
 RUN pip install --upgrade pip>=24.3.1
diff --git a/GEMINI.md b/GEMINI.md
index efcee04d4c3b..a9d40c910b20 100644
--- a/GEMINI.md
+++ b/GEMINI.md
@@ -25,6 +25,25 @@ This file provides guidance to Gemini when working with code in this repository.
 - `poetry run pytest tests/path/to/test_file.py -v` - Run specific test file
 - `poetry run pytest tests/path/to/test_file.py::test_function -v` - Run specific test
 
+### Running Scripts
+- `poetry run python script.py` - Run Python scripts (use for non-test files)
+
+### GitHub Issue & PR Templates
+When contributing to the project, use the appropriate templates:
+
+**Bug Reports** (`.github/ISSUE_TEMPLATE/bug_report.yml`):
+- Describe what happened vs. what you expected
+- Include relevant log output
+- Specify your LiteLLM version
+
+**Feature Requests** (`.github/ISSUE_TEMPLATE/feature_request.yml`):
+- Describe the feature clearly
+- Explain the motivation and use case
+
+**Pull Requests** (`.github/pull_request_template.md`):
+- Add at least 1 test in `tests/litellm/`
+- Ensure `make test-unit` passes
+
 ## Architecture Overview
 
 LiteLLM is a unified interface for 100+ LLM providers with two main components:
diff --git a/README.md b/README.md
index b29c86a1125f..20483ce70c0f 100644
--- a/README.md
+++ b/README.md
@@ -11,7 +11,7 @@
 Call all LLM APIs using the OpenAI format [Bedrock, Huggingface, VertexAI, TogetherAI, Azure, OpenAI, Groq etc.]