LiteLLM Chat Completions API

Provides an OpenAI-compatible /chat/completions endpoint that routes requests to 100+ LLM providers with unified request and response formatting, streaming support, cost tracking, and load balancing.
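Because the endpoint is OpenAI-compatible, any client that can POST a standard chat-completions body can talk to a LiteLLM proxy. The sketch below builds such a request body; the model name, messages, and proxy address are placeholders for illustration, not values taken from this listing.

```python
import json

# Minimal sketch of an OpenAI-compatible /chat/completions request body.
# "gpt-4o" and the message contents are placeholder values; LiteLLM routes
# the "model" field to whichever of its 100+ supported providers matches.
payload = {
    "model": "gpt-4o",
    "messages": [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Summarize LiteLLM in one sentence."},
    ],
    "stream": False,  # set True to receive server-sent-event chunks instead
}

# A LiteLLM proxy would accept this body at POST /chat/completions,
# e.g. http://localhost:4000/chat/completions (placeholder address).
print(json.dumps(payload, indent=2))
```

Because the request and response shapes match OpenAI's, existing OpenAI SDKs can usually be pointed at the proxy by changing only the base URL and API key.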

API entry from apis.yml

aid: litellm:chat-completions-api
name: LiteLLM Chat Completions API
description: Provides an OpenAI-compatible /chat/completions endpoint that routes requests to 100+ LLM
  providers with unified request and response formatting, streaming support, cost tracking, and load balancing.
humanURL: https://docs.litellm.ai/docs/completion
tags:
- AI
- Chat
- Completions
- LLM
properties:
- type: Documentation
  url: https://docs.litellm.ai/docs/completion
- type: GettingStarted
  url: https://docs.litellm.ai/docs/proxy/quick_start