Set llm.profiles.<name>.provider to mistral to use Mistral as the model provider.

Config example

{
  "llm": {
    "profiles": {
      "default": {
        "provider": "mistral",
        "model": "mistral-large-latest",
        "prompt": "Translate from {{source}} to {{target}}:\n\n{{input}}"
      }
    }
  }
}

Required environment variable

export MISTRAL_API_KEY="your-mistral-api-key"

Optional environment variable

MISTRAL_BASE_URL overrides the API endpoint; it defaults to https://api.mistral.ai/v1.
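As a quick illustration of the fallback, assuming the provider uses standard environment-variable resolution (this sketch is not the tool's actual code):

```shell
# Resolve the base URL the way a provider typically would:
# use MISTRAL_BASE_URL if set, otherwise fall back to the default.
unset MISTRAL_BASE_URL
echo "${MISTRAL_BASE_URL:-https://api.mistral.ai/v1}"
```

With MISTRAL_BASE_URL unset, this prints the default endpoint; exporting the variable before running swaps in your own URL, for example when routing through a proxy.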