Supported LLM Providers

Goose is compatible with a wide range of LLM providers, allowing you to choose and integrate your preferred model.

Available Providers

| Provider | Description | Parameters |
|---|---|---|
| Anthropic | Offers Claude, an advanced AI model for natural language tasks. | ANTHROPIC_API_KEY |
| Databricks | Unified data analytics and AI platform for building and deploying models. | DATABRICKS_HOST, DATABRICKS_TOKEN |
| Gemini | Advanced LLMs by Google with multimodal capabilities (text, images). | GOOGLE_API_KEY |
| Groq | High-performance inference hardware and tools for LLMs. | GROQ_API_KEY |
| Ollama | Local model runner supporting Qwen, Llama, DeepSeek, and other open-source models. Because this provider runs locally, you must first download and run a model. | N/A |
| OpenAI | Provides gpt-4o, o1, and other advanced language models. | OPENAI_API_KEY |
| OpenRouter | API gateway for unified access to various models, with features like rate-limiting management. | OPENROUTER_API_KEY |
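
An alternative to typing credentials into the prompts below is to export the parameters from the table as environment variables before running Goose. A minimal sketch with placeholder values (precedence between environment variables and saved configuration can vary by Goose version):

# Placeholder values -- substitute your real credentials.
export ANTHROPIC_API_KEY="sk-ant-..."    # hosted providers take a single API key
export DATABRICKS_HOST="https://<your-workspace>.cloud.databricks.com"
export DATABRICKS_TOKEN="dapi-..."       # Databricks needs both a host and a token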
Model Recommendation

Goose currently works best with Anthropic's Claude 3.5 Sonnet and OpenAI's o1 model.

Configure Provider

To configure your chosen provider or see available options, run goose configure in the CLI or visit the Provider Settings page in the Goose Desktop.

  1. Run the following command:
goose configure
  2. Select Configure Providers from the menu and press Enter.
┌   goose-configure 

◆ What would you like to configure?
│ ● Configure Providers (Change provider or update credentials)
│ ○ Toggle Extensions
│ ○ Add Extension

  3. Choose a model provider and press Enter.
┌   goose-configure 

◇ What would you like to configure?
│ Configure Providers

◆ Which model provider should we use?
│ ● Anthropic (Claude and other models from Anthropic)
│ ○ Databricks
│ ○ Google Gemini
│ ○ Groq
│ ○ Ollama
│ ○ OpenAI
│ ○ OpenRouter

  4. Enter your API key (and any other configuration details) when prompted.
┌   goose-configure 

◇ What would you like to configure?
│ Configure Providers

◇ Which model provider should we use?
│ Anthropic

◆ Provider Anthropic requires ANTHROPIC_API_KEY, please enter a value
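
After you enter the key, Goose saves the configuration so future sessions use it automatically. As a sanity check, you can inspect what was saved. A sketch assuming a default Linux/macOS install (on many setups non-secret settings land in a YAML file while the API key itself goes to your OS keychain; exact paths vary by version and platform):

cat ~/.config/goose/config.yaml    # shows provider and model, not the secret key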


Local LLMs (Ollama)

Ollama runs LLMs locally, so it requires a bit more setup before you can use it with Goose.

  1. Download Ollama.
  2. Run any model that supports tool calling:

Limited support for models without tool calling

Goose relies extensively on tool calling, so models without it (e.g., DeepSeek-r1) can only do chat completion. If you use a model without tool calling, you must disable all Goose extensions. As an alternative, you can use a custom DeepSeek-r1 model we've made specifically for Goose. A sketch for probing a model's tool-calling support appears at the end of this section.

Example:

ollama run qwen2.5
  3. In a separate terminal window, configure Goose:
goose configure
  4. Choose Configure Providers.
┌   goose-configure 

◆ What would you like to configure?
│ ● Configure Providers (Change provider or update credentials)
│ ○ Toggle Extensions
│ ○ Add Extension

  5. Choose Ollama as the model provider.
┌   goose-configure 

◇ What would you like to configure?
│ Configure Providers

◆ Which model provider should we use?
│ ○ Anthropic
│ ○ Databricks
│ ○ Google Gemini
│ ○ Groq
│ ● Ollama (Local open source models)
│ ○ OpenAI
│ ○ OpenRouter

  6. Enter the model you have running:
┌   goose-configure 

◇ What would you like to configure?
│ Configure Providers

◇ Which model provider should we use?
│ Ollama

◇ Provider Ollama requires OLLAMA_HOST, please enter a value
│ http://localhost:11434

◇ Enter a model from that provider:
│ qwen2.5

◇ Welcome! You're all set to explore and utilize my capabilities. Let's get started on solving your problems together!

└ Configuration saved successfully
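
If Goose can't reach your model, verify the Ollama server directly before re-running goose configure. A minimal sketch using Ollama's HTTP API (the /api/tags and /api/chat endpoints are part of Ollama's documented API; qwen2.5 and the get_time tool are illustrative, matching the example above):

# List the models your local Ollama server has pulled:
curl http://localhost:11434/api/tags

# Probe tool-calling support; models without it return an error
# rather than a tool call:
curl http://localhost:11434/api/chat -d '{
  "model": "qwen2.5",
  "messages": [{"role": "user", "content": "What time is it in UTC?"}],
  "tools": [{
    "type": "function",
    "function": {
      "name": "get_time",
      "description": "Get the current UTC time",
      "parameters": {"type": "object", "properties": {}}
    }
  }],
  "stream": false
}'

If the probe reports that the model does not support tools, switch to a tool-calling model (such as qwen2.5) or disable all Goose extensions as noted above.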