API reference

GET /v1/models

Returns the union of your workspace's route aliases and the direct model ids cached from OpenAI and Anthropic at server startup. The response matches the list shape expected by the OpenAI SDK.

Endpoint

GET https://api.usellm.io/v1/models

Authentication

Same as POST /v1/chat/completions: a bearer token of the form ul_live_*.

Response

```json
{
  "object": "list",
  "data": [
    {
      "id": "smart",
      "object": "model",
      "created": 1715600000,
      "owned_by": "route:anthropic"
    },
    {
      "id": "gpt-4o-mini",
      "object": "model",
      "created": 1715600000,
      "owned_by": "openai"
    }
  ]
}
```

Entries with owned_by starting with route: are workspace aliases. The remainder are direct model ids from the startup provider catalog. The gateway routes direct model ids based on prefix (gpt-* / o* / claude-* / text-embedding-*).
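The prefix rule above can be sketched as a small lookup function. This is an illustrative approximation, not the gateway's actual routing code; the function name and provider labels are assumptions.

```python
def route_for_model(model_id: str) -> str:
    """Map a direct model id to an upstream provider by prefix,
    following the gpt-* / o* / claude-* / text-embedding-* rule."""
    if model_id.startswith("claude-"):
        return "anthropic"
    if model_id.startswith(("gpt-", "o", "text-embedding-")):
        return "openai"
    raise ValueError(f"no provider route for model id {model_id!r}")
```

Note that the `claude-` check runs first so the broad `o` prefix (for models like `o3`) cannot shadow it.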

SDK behavior

Most OpenAI SDKs call this endpoint to populate autocompletion or to validate model strings before sending a request. Because the gateway returns the union of your aliases and the startup-cached provider catalog, IDE autocomplete will surface route names like smart alongside the underlying provider models.
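If you want to separate aliases from direct provider models client-side, the owned_by convention makes that a one-liner per group. A minimal sketch, assuming the response has already been parsed to a list of dicts (the helper name is illustrative):

```python
def split_catalog(models: list[dict]) -> tuple[list[str], list[str]]:
    """Partition /v1/models entries into workspace aliases and
    direct provider model ids, using the owned_by "route:" prefix."""
    aliases = [m["id"] for m in models if m["owned_by"].startswith("route:")]
    direct = [m["id"] for m in models if not m["owned_by"].startswith("route:")]
    return aliases, direct
```

Applied to the sample response above, this yields `["smart"]` as aliases and `["gpt-4o-mini"]` as direct models.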

cURL

```bash
curl https://api.usellm.io/v1/models \
  -H "Authorization: Bearer ul_live_XXXXXXXXXXXXXXXXXXXXXXXX"
```

Related

/models in the dashboard shows the same catalog, enriched with pricing, context windows, and capability tags from the LiteLLM-backed model registry.