LiteLLM
LiteLLM: LLM Gateway for managing and accessing 100+ LLMs in OpenAI format.
LiteLLM is an LLM Gateway (OpenAI Proxy) that manages authentication, load balancing, and spend tracking across 100+ LLMs while maintaining the OpenAI format. It simplifies calling LLM APIs from providers such as OpenAI, Azure, Cohere, Anthropic, Replicate, and Google, returning consistent outputs and exceptions for every LLM API, with logging and error tracking for all models. Features include cost tracking, a Batches API, guardrails, model access controls, budgets, LLM observability, rate limiting, prompt management, S3 logging, and pass-through endpoints.
Use LiteLLM by calling LLM APIs in the OpenAI chat format - completion(model, messages) - and receive consistent outputs and exceptions regardless of provider. You can deploy LiteLLM open source or try LiteLLM Enterprise for more features.
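As a minimal sketch, the request shape LiteLLM standardizes on looks like this. The model name is a placeholder for illustration; with the litellm package installed, the same payload is passed unchanged to `litellm.completion` whichever provider the model string points at:

```python
# OpenAI-style chat payload that LiteLLM accepts for every provider.
# With litellm installed, sending it is:
#     from litellm import completion
#     response = completion(**payload)
payload = {
    "model": "gpt-3.5-turbo",  # placeholder; swap in another provider's model name
    "messages": [
        {"role": "user", "content": "Hello, how are you?"}
    ],
}

# Responses come back in the OpenAI shape regardless of provider,
# e.g. response.choices[0].message.content for the reply text.
```

Because every provider is addressed through this one call shape, switching models is a one-string change rather than a client-library rewrite.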
Choose this if you want a lightweight LLM gateway that’s easy to integrate and use. It’s well suited to developers needing a simple yet effective AI backend.
Free
For giving LLM access to a large number of developers and projects. Includes Enterprise Support, Custom SLAs, JWT Auth, SSO, and Audit Logs.