Cognee Extension
This tutorial covers how to add the Cognee MCP Server as a Goose extension to enable knowledge graph memory capabilities, connecting to over 30 data sources for enhanced context and retrieval.
Command
uv --directory /path/to/cognee-mcp run python src/server.py
Environment Variables
LLM_API_KEY: <YOUR_OPENAI_API_KEY>
EMBEDDING_API_KEY: <YOUR_OPENAI_API_KEY>
Configuration
Note that you'll need uv installed on your system, as the command above runs the Cognee server through uv.
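If you don't have uv yet, the standalone installer from the uv documentation is the quickest route (other installation methods are available):
# Install uv (see the uv docs for alternatives such as pipx or Homebrew)
curl -LsSf https://astral.sh/uv/install.sh | sh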
- Goose CLI
- First, install Cognee:
# Clone and install Cognee
git clone https://github.com/topoteretes/cognee
cd cognee/cognee-mcp
uv sync --dev --all-extras --reinstall
# On Linux, install additional dependencies
sudo apt install -y libpq-dev python3-dev
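Optionally, you can sanity-check the server before wiring it into Goose by running it once with the same command Goose will use. Replace /path/to/cognee-mcp with the path to your checkout and stop the process with Ctrl+C once it starts:
# Quick manual check that the MCP server starts (stop with Ctrl+C)
export LLM_API_KEY=<YOUR_OPENAI_API_KEY>
export EMBEDDING_API_KEY=<YOUR_OPENAI_API_KEY>
uv --directory /path/to/cognee-mcp run python src/server.py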
- Run the configure command:
goose configure
- Choose to add a Command-line Extension
┌ goose-configure
│
◇ What would you like to configure?
│ Add Extension (Connect to a new extension)
│
◆ What type of extension would you like to add?
│ ○ Built-in Extension
│ ● Command-line Extension (Run a local command or script)
│ ○ Remote Extension (SSE)
│ ○ Remote Extension (Streaming HTTP)
└
- Give your extension a name
┌ goose-configure
│
◇ What would you like to configure?
│ Add Extension (Connect to a new extension)
│
◇ What type of extension would you like to add?
│ Command-line Extension
│
◆ What would you like to call this extension?
│ Cognee
└
- Enter the command
┌ goose-configure
│
◇ What would you like to configure?
│ Add Extension (Connect to a new extension)
│
◇ What type of extension would you like to add?
│ Command-line Extension
│
◇ What would you like to call this extension?
│ Cognee
│
◆ What command should be run?
│ uv --directory /path/to/cognee-mcp run python src/server.py
└
- Enter the number of seconds Goose should wait for actions to complete before timing out. The default is 300 seconds.
┌ goose-configure
│
◇ What would you like to configure?
│ Add Extension (Connect to a new extension)
│
◇ What type of extension would you like to add?
│ Command-line Extension
│
◇ What would you like to call this extension?
│ Cognee
│
◇ What command should be run?
│ uv --directory /path/to/cognee-mcp run python src/server.py
│
◆ Please set the timeout for this tool (in secs):
│ 300
│
└
- Choose to add a description. If you select "Yes" here, you will be prompted to enter a description for the extension.
┌ goose-configure
│
◇ What would you like to configure?
│ Add Extension (Connect to a new extension)
│
◇ What type of extension would you like to add?
│ Command-line Extension
│
◇ What would you like to call this extension?
│ Cognee
│
◇ What command should be run?
│ uv --directory /path/to/cognee-mcp run python src/server.py
│
◇ Please set the timeout for this tool (in secs):
│ 300
│
◇ Would you like to add a description?
│ No
│
└
- Add the required environment variables:
You'll need an OpenAI API key for both the LLM and the embedding model; you can create one from the API keys page in your OpenAI account.
┌ goose-configure
│
◇ What would you like to configure?
│ Add Extension (Connect to a new extension)
│
◇ What type of extension would you like to add?
│ Command-line Extension
│
◇ What would you like to call this extension?
│ Cognee
│
◇ What command should be run?
│ uv --directory /path/to/cognee-mcp run python src/server.py
│
◇ Please set the timeout for this tool (in secs):
│ 300
│
◇ Would you like to add a description?
│ No
│
◆ Would you like to add environment variables?
│ Yes
│
◇ Environment variable name:
│ LLM_API_KEY
│
◇ Environment variable value:
│ ▪▪▪▪▪▪▪▪▪▪▪▪▪▪▪▪▪▪▪▪▪▪▪▪▪▪▪▪▪▪▪▪▪▪▪▪▪▪▪
│
◇ Add another environment variable?
│ Yes
│
◇ Environment variable name:
│ EMBEDDING_API_KEY
│
◇ Environment variable value:
│ ▪▪▪▪▪▪▪▪▪▪▪▪▪▪▪▪▪▪▪▪▪▪▪▪▪▪▪▪▪▪▪▪▪▪▪▪▪▪▪
│
◇ Add another environment variable?
│ No
└ Added Cognee extension
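Once you finish the prompts, Goose saves the extension to its config file (typically ~/.config/goose/config.yaml). The exact field names and secret handling can differ between Goose versions, but the saved entry looks roughly like the sketch below, which is handy if you later need to update the server path by hand:
extensions:
  cognee:
    name: Cognee
    type: stdio
    cmd: uv
    args:
      - --directory
      - /path/to/cognee-mcp
      - run
      - python
      - src/server.py
    envs:
      # Depending on your Goose version, secret values may be stored in the
      # system keyring rather than written here in plain text.
      LLM_API_KEY: <YOUR_OPENAI_API_KEY>
      EMBEDDING_API_KEY: <YOUR_OPENAI_API_KEY>
    timeout: 300
    enabled: true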
Example Usage
Cognee provides knowledge graph memory capabilities for Goose, allowing it to remember and connect information across conversations and documents.
OpenAI's GPT-4o was used for this task.
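To try this yourself, start a Goose session once the extension is configured and paste the prompt below:
goose session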
Goose Prompt
Goose, please cognify this information: "I prefer Python for data analysis and use pandas extensively. My current project involves analyzing customer behavior data." Then search for information about my programming preferences.
Goose Output
Tool Calls
─── cognee_cognify | abcdefgh ──────────────────────────
data: I prefer Python for data analysis and use pandas extensively. My current project involves analyzing customer behavior data.
─── cognee_search | abcdefgh ──────────────────────────
search_query: programming preferences Python data analysis
search_type: GRAPH_COMPLETION
I'll help you store that information in the knowledge graph and then search for your programming preferences.
First, let me cognify the information about your preferences:
Cognify Status: Background process launched due to MCP timeout limitations. The information about your Python and pandas preferences for data analysis has been added to the knowledge graph.
Now let me search for information about your programming preferences:
Search Results: Based on the knowledge graph, I found information about your programming preferences:
- Language Preference: You prefer Python for data analysis
- Key Libraries: You use pandas extensively for data manipulation
- Current Focus: Your current project involves analyzing customer behavior data
- Domain: Data science and analytics workflows
- Technical Stack: Python + pandas for data analysis tasks
The knowledge graph has successfully stored and can now retrieve your programming preferences and project context for future conversations.