Models

Configuring an LLM provider and model.

codeflow uses the AI SDK and Models.dev to support 75+ LLM providers, and it also supports running local models.


Providers

Most popular providers are preloaded by default. If you’ve added the credentials for a provider through codeflow auth login, they’ll be available when you start codeflow.

Learn more about providers.


Select a model

Once you’ve configured your provider, you can select the model you want by typing:

/models

There are a lot of models out there, with new models coming out every week.

However, only a few of them are good at both generating code and tool calling.

Here are the ones we recommend with codeflow:

  • Claude Sonnet 4
  • Claude Opus 4
  • Kimi K2
  • Qwen3 Coder
  • GPT 4.1
  • Gemini 2.5 Pro

Set a default

To set one of these as the default model, you can set the model key in your codeflow config.

codeflow.json
{
  "$schema": "https://codeflow.ai/config.json",
  "model": "lmstudio/google/gemma-3n-e4b"
}

Here, the full model ID takes the format provider_id/model_id.

If you’ve configured a custom provider, the provider_id is the key from the provider section of your config, and the model_id is a key from that provider’s models.
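For example, a custom provider entry and a matching default model might look like the following sketch. The keys $schema, provider, models, and model are taken from the examples on this page; the IDs my-provider and my-model are placeholders you’d replace with your own.

codeflow.json
{
  "$schema": "https://codeflow.ai/config.json",
  "provider": {
    "my-provider": {
      "models": {
        "my-model": {}
      }
    }
  },
  "model": "my-provider/my-model"
}

Here the default model ID my-provider/my-model is built from the provider key and the model key in the config above.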


Configure models

You can globally configure a model’s options through the config.

codeflow.jsonc
{
  "$schema": "https://codeflow.ai/config.json",
  "provider": {
    "openai": {
      "models": {
        "gpt-5": {
          "options": {
            "reasoningEffort": "high",
            "textVerbosity": "low",
            "reasoningSummary": "auto",
            "include": ["reasoning.encrypted_content"],
          },
        },
      },
    },
    "anthropic": {
      "models": {
        "claude-sonnet-4-20250514": {
          "options": {
            "thinking": {
              "type": "enabled",
              "budgetTokens": 16000,
            },
          },
        },
      },
    },
  },
}

Here we’re configuring global settings for two models: gpt-5 when accessed via the openai provider, and claude-sonnet-4-20250514 when accessed via the anthropic provider.

You can also configure these options for any agents that you are using; the agent config overrides these global options. Learn more.


Loading models

When codeflow starts up, it checks for models in the following priority order:

  1. The --model or -m command line flag. The format is the same as in the config file: provider_id/model_id.
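    For example, to start codeflow with a specific model for a single session (using one of the model IDs from this page; substitute your own):

    codeflow --model anthropic/claude-sonnet-4-20250514

    The shorthand -m works the same way.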

  2. The model list in the codeflow config.

    codeflow.json
    {
      "$schema": "https://codeflow.ai/config.json",
      "model": "anthropic/claude-sonnet-4-20250514"
    }

    The format here is the same: provider_id/model_id.

  3. The last used model.

  4. The first model using an internal priority.