List available Ollama models

Fetches the list of models available from the Ollama instance. The URL may be provided with or without the /v1 suffix (e.g., http://localhost:11434 or http://localhost:11434/v1); the /v1 suffix is automatically removed for model discovery. For actual LLM inference, use the URL with the /v1 suffix to reach the OpenAI-compatible endpoints.
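The URL normalization described above can be sketched as a small helper. This is an illustrative function (the name `strip_v1` is hypothetical, not part of the API), assuming only that a trailing `/v1`, with or without a trailing slash, should be removed before model discovery:

```python
def strip_v1(base_url: str) -> str:
    """Normalize an Ollama base URL for model discovery by removing
    a trailing /v1 (OpenAI-compatible) suffix, if present."""
    url = base_url.rstrip("/")
    if url.endswith("/v1"):
        url = url[: -len("/v1")]
    return url
```

With this, `http://localhost:11434/v1` and `http://localhost:11434` both normalize to `http://localhost:11434` for discovery, while inference clients keep using the `/v1` form.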

Request

This endpoint expects an object.
base_url (string, required, >=1 character)

Base URL of the Ollama instance. May include /v1 suffix which will be automatically removed for connection testing. For inference, use the URL with /v1 suffix for OpenAI-compatible endpoints.

api_key (string, optional)
Optional API key for authenticated Ollama instances. If provided, it is sent as a Bearer token in the Authorization header.
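Putting the two request fields together, a discovery request against the Ollama instance can be sketched as below. The helper name `build_tags_request` is hypothetical; `/api/tags` is Ollama's native model-listing endpoint, and the Bearer header follows the `api_key` behavior described above:

```python
from typing import Optional

def build_tags_request(base_url: str, api_key: Optional[str] = None) -> tuple[str, dict]:
    """Build the URL and headers for listing models on an Ollama instance."""
    # Strip a trailing slash and the /v1 (OpenAI-compatible) suffix
    url = base_url.rstrip("/")
    if url.endswith("/v1"):
        url = url[: -len("/v1")]
    headers = {}
    if api_key:
        # Optional API key is sent as a Bearer token
        headers["Authorization"] = f"Bearer {api_key}"
    # /api/tags is Ollama's native model-listing endpoint
    return f"{url}/api/tags", headers
```

The returned URL and headers can then be passed to any HTTP client (e.g., `requests.get(url, headers=headers)`).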

Response

Models retrieved successfully
name (string, >=1 character)
Model name

size (long or null)
Model size in bytes

digest (string or null)
Model digest/hash

modified_at (string or null, format: "date-time")
Model modification date
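A minimal sketch of consuming one model entry from the response, assuming each entry is a JSON object with exactly the fields listed above (the `OllamaModel` dataclass and `parse_model` helper are illustrative, not part of the API):

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class OllamaModel:
    name: str                            # model name (required)
    size: Optional[int] = None           # model size in bytes, nullable
    digest: Optional[str] = None         # model digest/hash, nullable
    modified_at: Optional[str] = None    # date-time string, nullable

def parse_model(entry: dict) -> OllamaModel:
    """Parse one model entry; all fields except name are nullable."""
    return OllamaModel(
        name=entry["name"],
        size=entry.get("size"),
        digest=entry.get("digest"),
        modified_at=entry.get("modified_at"),
    )
```

Missing nullable fields simply come back as `None`, so callers only need to guard on `name` being present.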

Errors