Configure Ollama as an LLM provider for translation generation.
Provider name: `ollama`

Configuration key: `llm.profiles.<name>.provider`
Example profile configuration:

```json
{
  "llm": {
    "profiles": {
      "default": {
        "provider": "ollama",
        "model": "qwen2.5:7b",
        "prompt": "Translate from {{source}} to {{target}}:\n\n{{input}}"
      }
    }
  }
}
```
Point the provider at a local Ollama server through environment variables (the `/v1` path targets Ollama's OpenAI-compatible endpoint; the API key is a placeholder, since a local Ollama server does not check it):

```shell
export OLLAMA_BASE_URL="http://127.0.0.1:11434/v1"
export OLLAMA_API_KEY="ollama"
```
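As a rough sketch of how the pieces above fit together, the snippet below renders the profile's prompt template and builds a chat-completion request body in the shape Ollama's OpenAI-compatible endpoint accepts. The placeholder names (`{{source}}`, `{{target}}`, `{{input}}`) come from the configuration example; the rendering logic itself is an illustrative assumption, not the tool's actual implementation.

```python
import os
import re

# Profile values taken from the example configuration above.
profile = {
    "provider": "ollama",
    "model": "qwen2.5:7b",
    "prompt": "Translate from {{source}} to {{target}}:\n\n{{input}}",
}

def render_prompt(template: str, variables: dict) -> str:
    """Replace each {{name}} placeholder with its value (illustrative sketch)."""
    return re.sub(r"\{\{(\w+)\}\}", lambda m: variables[m.group(1)], template)

prompt = render_prompt(profile["prompt"], {
    "source": "English",
    "target": "German",
    "input": "Hello, world!",
})

# Request body in the OpenAI chat-completions shape that the /v1
# endpoint behind OLLAMA_BASE_URL accepts; sending it is omitted here.
body = {
    "model": profile["model"],
    "messages": [{"role": "user", "content": prompt}],
}

base_url = os.environ.get("OLLAMA_BASE_URL", "http://127.0.0.1:11434/v1")
print(prompt)
```

A request POSTed to `{base_url}/chat/completions` with this body would return the translation in the standard chat-completion response format.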