Smart Context Management

Large Language Models (LLMs) can only process a limited amount of conversation history at once. Goose provides smart context management features to help you keep a session productive even as it approaches those limits. Here are the key concepts:

  • Context Length: The amount of conversation history the LLM can consider
  • Context Limit: The maximum number of tokens the model can process
  • Context Management: How Goose handles conversations approaching these limits
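To make these terms concrete, the sketch below estimates how much of a model's token budget a conversation has used. The 128,000-token limit and the roughly four-characters-per-token heuristic are illustrative assumptions, not Goose's actual model configuration or tokenizer.

```python
# Rough sketch of checking conversation size against a context limit.
# Assumptions: a 128k-token limit and ~4 characters per token; both are
# illustrative, not Goose's actual configuration.

CONTEXT_LIMIT = 128_000  # hypothetical model token limit


def estimate_tokens(text: str) -> int:
    """Very rough estimate: about 4 characters per token."""
    return max(1, len(text) // 4)


def context_usage(messages: list[str]) -> float:
    """Fraction of the context limit used by the conversation so far."""
    used = sum(estimate_tokens(m) for m in messages)
    return used / CONTEXT_LIMIT


history = [
    "Explain smart context management in Goose.",
    "Sure. Context length is how much history the model can consider...",
]
print(f"Context used: {context_usage(history):.2%}")
```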

Smart Context Management Features

When a conversation reaches the context limit, Goose offers different ways to handle it:

| Feature | Description | Best For | Impact |
| --- | --- | --- | --- |
| Summarization | Condenses conversation while preserving key points | Long, complex conversations | Maintains most context |
| Truncation | Removes oldest messages to make room | Simple, linear conversations | Loses old context |
| Clear | Starts fresh while keeping session active | New direction in conversation | Loses all context |
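As a rough illustration of why these strategies differ in impact, here is a minimal truncation sketch: it drops the oldest messages until the conversation fits a token budget again, which is exactly how old context gets lost. Summarization would instead ask the model to condense the history into a shorter message. This is a sketch of the general idea, not Goose's internal implementation, and it reuses the rough four-characters-per-token assumption from the previous example.

```python
# Minimal sketch of the truncation strategy: drop the oldest messages until
# the conversation fits a token budget again. Illustrative only; not how
# Goose is implemented internally.


def truncate(messages: list[str], token_limit: int) -> list[str]:
    """Remove the oldest messages until the estimated total fits the limit."""
    kept = list(messages)
    while len(kept) > 1 and sum(len(m) // 4 for m in kept) > token_limit:
        kept.pop(0)  # oldest context is discarded first
    return kept


history = ["old design discussion " * 40, "recent bug report " * 40, "latest question"]
print(truncate(history, token_limit=300))
```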

Using Smart Context Management

When you reach the context limit in Goose Desktop:

  1. You'll see a notification that the context limit has been reached
  2. You'll need to start a new session to continue your conversation

Tip: You can access previous context by:

  • Referencing information from your previous sessions
  • Using the Memory extension to store key information and recall it in later sessions