Berri AI
The fastest way to take your LLM app to production
Repositories
- litellm Public
  Call any LLM API with cost tracking, guardrails, logging and load balancing. 1.8k+ models, 80+ providers, 50+ endpoints (unified + native format). Available as a Python SDK or Proxy Server (AI Gateway); see the SDK sketch after this list.
- Automated_Perf_Tests Public
- serxng-deployment Public
- terraform-provider-litellm Public (forked from ncecere/terraform-provider-litellm)
  litellm terraform provider
- simple_proxy_openai Public
- locust-load-tester Public
- cloudzero-litellm-etl Public (forked from Cloudzero/cloudzero-litellm-toolkit)
  LiteLLM data analysis, transformation and transmission to CloudZero utility
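As a quick illustration of the unified interface the litellm entry describes, here is a minimal sketch using the Python SDK. It assumes litellm is installed (`pip install litellm`) and that a provider key such as OPENAI_API_KEY is set in the environment; the model names shown are only examples, not a recommendation.

```python
# Minimal sketch of litellm's unified completion interface (Python SDK).
# Assumes `pip install litellm` and a provider key (e.g. OPENAI_API_KEY)
# available in the environment; the model name below is just an example.
from litellm import completion, completion_cost

# The same call shape works across providers by swapping the `model` string,
# e.g. "gpt-4o", "anthropic/claude-3-5-sonnet-20240620", "gemini/gemini-1.5-pro".
response = completion(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Say hello in one sentence."}],
)

# Responses follow the OpenAI chat-completions format.
print(response.choices[0].message.content)

# Cost tracking: estimate the spend for this call from litellm's pricing map.
print(f"cost: ${completion_cost(completion_response=response):.6f}")
```

The Proxy Server (AI Gateway) exposes the same OpenAI-compatible endpoints over HTTP, so existing OpenAI clients can be pointed at it without code changes.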