List available Ollama models
Fetches the list of models available from the Ollama instance. The base URL may be provided with or without the /v1 suffix (e.g., http://localhost:11434 or http://localhost:11434/v1); the /v1 suffix is automatically removed for model discovery. For actual LLM inference, use the URL with the /v1 suffix, which exposes the OpenAI-compatible endpoints.
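A minimal sketch of the URL handling described above (the helper name is illustrative; Ollama's native model-listing endpoint is /api/tags):

```python
def discovery_url(base_url: str) -> str:
    """Strip a trailing /v1 suffix so the URL points at Ollama's native API."""
    base = base_url.rstrip("/")
    if base.endswith("/v1"):
        base = base[: -len("/v1")]
    # Ollama's native model-listing endpoint lives outside the /v1 prefix
    return base + "/api/tags"

print(discovery_url("http://localhost:11434/v1"))  # http://localhost:11434/api/tags
print(discovery_url("http://localhost:11434"))     # http://localhost:11434/api/tags
```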
Request
This endpoint expects an object.
base_url
Base URL of the Ollama instance. May include the /v1 suffix, which is automatically removed for connection testing. For inference, use the URL with the /v1 suffix, which exposes the OpenAI-compatible endpoints.
api_key
Optional API key for authenticated Ollama instances. If provided, it is sent as a Bearer token in the Authorization header.
Response
Models retrieved successfully
name
Model name
size
Model size in bytes
digest
Model digest/hash
modified_at
Model modification date
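Since the size field is reported in raw bytes, a small display helper (illustrative, not part of the API) can make it readable when listing models:

```python
def human_size(size_bytes: int) -> str:
    """Render a model's byte size in human-readable units."""
    units = ["B", "KB", "MB", "GB", "TB"]
    value = float(size_bytes)
    for unit in units:
        # Stop once the value fits in the current unit (or we run out of units)
        if value < 1024 or unit == units[-1]:
            return f"{value:.1f} {unit}"
        value /= 1024

print(human_size(1536))  # 1.5 KB
print(human_size(500))   # 500.0 B
```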