
3 Prompts to Test for Agent Readiness

· 3 min read
Angie Jones
Head of Developer Relations

Goose is LLM-agnostic, meaning you can plug in the model of your choice. However, not every LLM is suited to working with agents. Some may be great at answering things, but not at actually doing things. If you're considering which model to use with an agent, these 3 prompts can quickly give you a sense of the model's capabilities.

How I Manage Localhost Port Conflicts With an AI Agent

· 3 min read
Rizel Scarlett
Staff Developer Advocate

Localhost Ports Hoarding

I'm perpetually drowning in open tabs. Yes, I do need Bluesky, ChatGPT, Claude, Goose, Cursor, Discord, Slack, Netflix, and Google Docs all open at the same time. I've learned that tab management isn't my only vice.

"Hi, my name is Rizel, and I'm a localhost ports hoarder. 👋🏿"

It always starts innocently with me running one project that uses localhost:3000. Then a teammate pings me to review their PR. Their project also runs on localhost:3000. Fortunately, modern frameworks are smart enough to automatically bump to ports 3001 or 3002. Before I know it, 3003, 3004, and 3005 are running too, and my laptop sounds like it’s about to take off as it crawls under the weight of too many processes.

Traditional Ports Management

Typically, freeing up ports involves running the following commands:

# Step 1: list all open ports
lsof -i -P | grep LISTEN

# Step 2: find a specific port
lsof -i :3000

# Step 3: kill the process, using the PID reported in step 2
kill -9 98026
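
If the goal is just a quick cleanup, steps 2 and 3 can be collapsed into a single line. This is a generic shell idiom rather than something from the workflow above; it relies on lsof's -t flag, which prints bare PIDs suitable for piping:

# Steps 2 and 3 combined: kill whatever is listening on port 3000
lsof -t -i :3000 | xargs kill -9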

Because I don't do this often, I end up having to search for the commands each time, which derails whatever I was working on.

Using Goose as My Port Manager

To avoid that interruption, I started offloading the task to Goose, an open source AI agent.

Goose uses its built-in Developer MCP server to handle coding-related tasks on my machine. Here's what the interaction looks like:

Prompt to Goose:

Can you tell me which ports I have running?

Goose Response:

Development Servers:
- Node.js servers on ports 3000, 3001, 3002
- Port 18030

Database:
- PostgreSQL on port 5432

Development Tools:
- Ollama on port 11434
- Multiple Goose instances (ports 49377, 49899, 63594, 64365)

Applications:
- Okta Verify (port 8769)
- inSync (ports 7010, 49831, 49830)
- Pieces (ports 39300, 62638)
- Rapportd macOS service (port 60151)

Prompt to Goose:

Kill all the Node.js servers running on localhost.

Goose Response:

The command executed successfully, freeing up ports 3000, 3001, and 3002.
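
For one-off requests like this, the same thing also works outside of a chat session. As a rough sketch (the exact flag may vary between Goose CLI versions), a single instruction can be passed straight to the CLI:

# Run a one-off instruction non-interactively (flag may differ by Goose version)
goose run -t "Kill all the Node.js servers running on localhost"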

Why Use AI for Simple Tasks

Freeing up ports is something I know how to do. But I use AI tools like Goose for small tasks that create friction and slow me down. I'm not trying to cram more work into the day. My goal is to work efficiently, make a positive impact, and still have time for my personal life.

Goose Gets a Driver's License!

· 6 min read
W Ian Douglas
Staff Developer Advocate

I taught Goose how to drive (a rover)

Goose has no hands, no eyes, and no spatial awareness, but it can drive a rover!

I came across a demo video from Deemkeen, where he used Goose to control a Makeblock mBot2 rover with natural language commands like "drive forward/backward," "beep," and "turn left/right," all powered by a Java-based MCP server and MQTT.

Inspired and excited to take it further, I taught the rover to spin, blink colorful lights, and help me take over the world!
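
Under the hood, the MCP server translates the agent's tool calls into short movement commands published over MQTT, which the rover listens for. Here's a hedged sketch of that last hop using the standard mosquitto client; the broker address and topic name are made up for illustration:

# Publish a movement command to the rover's (illustrative) topic
mosquitto_pub -h 192.168.1.50 -t mbot2/commands -m "forward"

# The same channel can carry other commands, e.g. a beep
mosquitto_pub -h 192.168.1.50 -t mbot2/commands -m "beep"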

Goose and Qwen3 for Local Execution

· 3 min read
Michael Neale
Principal Engineer

A couple of weeks back, Qwen 3 launched with a raft of capabilities and sizes. The model showed promise: even in very compact form, such as 8B parameters with 4-bit quantization, it was able to do tool calling successfully with Goose. Even multi-turn tool calling.

I haven't seen tool calling work with such a scaled-down model before, so this is really impressive and bodes well not just for this model but for future open-weight models, large and small. I would expect the larger Qwen3 models to work quite well on a range of tasks, but even this small one I found useful.
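
For anyone wanting to reproduce a similar local setup, one hedged approach is to serve a compact Qwen3 build through Ollama and point Goose at it. The model tag and environment variables below are assumptions and may not match the exact configuration used here:

# Pull a small Qwen3 build and serve it locally with Ollama
ollama pull qwen3:8b

# Point Goose at the local model (these settings can also be chosen via `goose configure`)
GOOSE_PROVIDER=ollama GOOSE_MODEL=qwen3:8b goose session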

4 Things You Need to Know Before Using Goose

· 5 min read
Ebony Louis
Developer Advocate

So you’ve heard about Goose. Maybe you saw a livestream, someone on your team mentioned it, or you just stumbled into our corner of the internet while trying to automate your dev setup. Either way—love that for you.

Goose is a local, open source AI agent that can automate tasks, interact with your codebase, and connect to a growing ecosystem of tools. But before you hit install, here are four things you should know to get the most out of it.

How One Contribution Can Spark Many Wins

· 3 min read
Tania Chakraborty
Senior Technical Community Manager

The only way to discover how much of an impact your contributions can make is to submit them and hope for the best. Sometimes, what feels like "just a small fix" can end up reshaping an open source project or inspiring a brand new feature. Here's how one of our top contributors turned a small build fix into important improvements for the Goose experience.

MCP Is Rewriting the Rules of API Integration

· 10 min read
W Ian Douglas
Staff Developer Advocate

As developers, we're always looking for ways to build more efficient, scalable, and intelligent applications. For years, RESTful APIs have been our go-to for connecting services. Here are some ways you can integrate AI agents and MCP into your existing API infrastructure to make it smarter, more efficient, and easier to maintain.

MCP in the Enterprise: Real World Adoption at Block

· 6 min read
Angie Jones
Head of Developer Relations

At Block, we've been exploring how to make AI agents genuinely useful in a business setting. Not just for demos or prototypes, but for real, everyday work. As one of the early collaborators on the Model Context Protocol (MCP), we partnered with Anthropic to help shape and define the open standard that bridges AI agents with real-world tools and data.

MCP lets AI agents interact with APIs, tools, and data systems through a common interface. It eliminates guesswork by exposing deterministic tool definitions, so the agent doesn't have to figure out how to call an API. Instead, it focuses on what we actually want... results!

While others are still experimenting, we've rolled this out company-wide at Block, and with real impact.