Observability with Langfuse
This tutorial covers how to integrate Goose with Langfuse to monitor your Goose requests and understand how the agent is performing.
What is Langfuse
Langfuse is an open-source LLM engineering platform that enables teams to collaboratively monitor, evaluate, and debug their LLM applications.
Set up Langfuse
Sign up for Langfuse Cloud or self-host Langfuse (for example, with Docker Compose) to get your Langfuse API keys.
Configure Goose to Connect to Langfuse
Set the environment variables so that Goose (written in Rust) can connect to the Langfuse server.
export LANGFUSE_INIT_PROJECT_PUBLIC_KEY=pk-lf-...
export LANGFUSE_INIT_PROJECT_SECRET_KEY=sk-lf-...
export LANGFUSE_URL=https://cloud.langfuse.com # EU data region 🇪🇺
# https://us.cloud.langfuse.com if you're using the US region 🇺🇸
# http://localhost:3000 if you're self-hosting
Run Goose with Langfuse Integration
Now, you can run Goose and monitor your AI requests and actions through Langfuse.
With Goose running and the environment variables set, Langfuse will start capturing traces of your Goose activities.
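As a minimal sketch, assuming the environment variables above are exported in the same shell, starting an interactive session is enough to begin producing traces (the session name here is an arbitrary example):

```shell
# Start an interactive Goose session. With the Langfuse variables set,
# requests made during the session appear as traces in your Langfuse project.
goose session --name langfuse-demo
```

Once the session is running, open your Langfuse project dashboard to inspect the captured traces.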
Example trace (public) in Langfuse