Goose Complete Guide: 2026 Open Source AI Coding Assistant
What is Goose?
Goose is an AI coding assistant open sourced by Block, designed as a free, customizable alternative to commercial tools like Claude Code and Cursor. Unlike those products, Goose is fully open source and supports multiple LLM backends, including locally run models.
Core Features
- Multi-model support: Compatible with Anthropic, OpenAI, Google, Ollama, and more
- Local-first: Supports locally running open source models, protecting code privacy
- Completely free: Apache 2.0 license, no subscription fees
- Terminal native: Command-line interface, seamless integration into development workflows
- Extensible: Supports custom plugins and tools
Why Choose Goose?
Compared to Commercial Tools
| Feature | Goose | Claude Code | Cursor |
|---|---|---|---|
| Price | Free | $20/month | $20/month |
| Open Source | ✅ | ❌ | ❌ |
| Local Models | ✅ | ❌ | Limited |
| Privacy Protection | ✅ | ❌ | ❌ |
| Custom Extensions | ✅ | Limited | Limited |
Use Cases
- Budget-conscious developers: No subscription fees
- Privacy-sensitive projects: Use local models, code stays local
- Enterprise self-deployment: Customizable and deployable internally
- Learning and research: Open source code available for study
Quick Start
Installation Requirements
- Python 3.10+
- pip or uv package manager
- At least 8GB RAM (16GB+ recommended for local models)
Installation Steps
# Method 1: Using pip
pip install goose-ai
# Method 2: Using uv (recommended, faster)
uv pip install goose-ai
# Method 3: From source
git clone https://github.com/block/goose.git
cd goose
pip install -e .
Configuring Models
Goose supports multiple model backends. Here are common configurations:
# Using Anthropic Claude (API Key required)
goose configure --provider anthropic --model claude-sonnet-4-20250514
# Using OpenAI
goose configure --provider openai --model gpt-4.1
# Using local Ollama model (free)
goose configure --provider ollama --model llama3.1:8b
# Using Google Gemini
goose configure --provider google --model gemini-2.5-pro
Environment Variables
# ~/.bashrc or ~/.zshrc
export ANTHROPIC_API_KEY="your-api-key"
export OPENAI_API_KEY="your-api-key"
export GOOGLE_API_KEY="your-api-key"
# Ollama requires no API key; just ensure the service is running
ollama serve
Core Features
1. Code Generation and Completion
Goose can generate complete functional modules:
# Create new file
goose create file src/utils.py --prompt "Create a Python utility module with string processing, date formatting, and JSON parsing"
# Edit existing file
goose edit src/main.py --prompt "Add error handling and logging"
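To make the first prompt above concrete, here is a sketch of the kind of utility module such a request might produce. The function names and behavior are illustrative, not actual Goose output:

```python
# Hypothetical utils module covering the three areas in the prompt:
# string processing, date formatting, and JSON parsing.
import json
from datetime import datetime

def normalize_whitespace(text: str) -> str:
    """Collapse runs of whitespace into single spaces and trim the ends."""
    return " ".join(text.split())

def format_date(dt: datetime, fmt: str = "%Y-%m-%d") -> str:
    """Format a datetime using the given strftime pattern."""
    return dt.strftime(fmt)

def parse_json(raw: str, default=None):
    """Parse a JSON string, returning `default` instead of raising on bad input."""
    try:
        return json.loads(raw)
    except (json.JSONDecodeError, TypeError):
        return default

if __name__ == "__main__":
    print(normalize_whitespace("  hello   world "))   # hello world
    print(format_date(datetime(2026, 1, 2)))          # 2026-01-02
    print(parse_json("not json", default={}))         # {}
```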
2. Code Review
# Review code changes
goose review --diff "$(git diff HEAD~1)"
# Find potential issues
goose analyze --security src/
3. Test Generation
# Generate tests for existing code
goose test src/calculator.py --framework pytest
# Generate complete test suite
goose test --all src/
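For a sense of the output, here is a sketch of the pytest file such a command might emit for a hypothetical calculator module. The `add` and `divide` functions stand in for the module under test:

```python
# Hypothetical generated test file; add/divide are stand-ins for
# functions imported from calculator.py in a real project.

def add(a, b):
    return a + b

def divide(a, b):
    if b == 0:
        raise ZeroDivisionError("division by zero")
    return a / b

def test_add_integers():
    assert add(2, 3) == 5

def test_add_negative():
    assert add(-1, 1) == 0

def test_divide():
    assert divide(10, 4) == 2.5

def test_divide_by_zero_raises():
    # written without pytest.raises so the sketch stays dependency-free
    try:
        divide(1, 0)
    except ZeroDivisionError:
        pass
    else:
        raise AssertionError("expected ZeroDivisionError")
```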
4. Documentation Generation
# Generate function documentation
goose doc src/api.py --format google
# Generate project README
goose doc --readme
Practical Examples
Example 1: Create REST API
# Have Goose create a complete FastAPI project
goose create project my-api --template fastapi
# Add user authentication
goose add auth --provider oauth2
# Generate database models
goose generate model User --fields "name:str,email:str,created_at:datetime"
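The `generate model` step describes a User model with three fields. A real FastAPI project would likely use Pydantic; the stdlib-only sketch below shows the same shape with a dataclass, following the `name:str,email:str,created_at:datetime` spec from the command:

```python
# Hypothetical User model matching the field spec above,
# written with stdlib dataclasses to stay dependency-free.
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class User:
    name: str
    email: str
    created_at: datetime = field(default_factory=datetime.utcnow)

user = User(name="Ada", email="ada@example.com")
print(user.name, user.email)
```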
Example 2: Refactor Existing Code
# Refactor functions for performance
goose refactor src/slow_function.py --goal "Optimize time complexity from O(n²) to O(n)"
# Convert to async code
goose convert src/sync_code.py --to async
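A sync-to-async conversion like the one above typically turns sequential blocking calls into concurrently awaited coroutines. The before/after sketch below is illustrative; the function names are hypothetical:

```python
# Illustrative before/after of a sync-to-async conversion.
import asyncio

def fetch_sync(url: str) -> str:
    # placeholder for a blocking I/O call
    return f"data from {url}"

def fetch_all_sync(urls):
    # sequential: each call finishes before the next starts
    return [fetch_sync(u) for u in urls]

async def fetch_async(url: str) -> str:
    await asyncio.sleep(0)  # placeholder for non-blocking I/O
    return f"data from {url}"

async def fetch_all_async(urls):
    # concurrent: all coroutines are awaited together
    return await asyncio.gather(*(fetch_async(u) for u in urls))

urls = ["https://a.example", "https://b.example"]
print(fetch_all_sync(urls))
print(asyncio.run(fetch_all_async(urls)))
```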
Example 3: Debugging Help
# Analyze error logs
goose debug --error "Traceback: IndexError: list index out of range" --context src/parser.py
# Interactive debug session
goose debug --interactive src/main.py
Advanced Configuration
Custom Plugins
Goose supports creating custom tool plugins:
# plugins/custom_tool.py
from goose.plugins import Plugin

class CustomTool(Plugin):
    name = "custom_tool"

    def execute(self, query: str) -> str:
        # Custom logic
        return f"Processing result: {query}"
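To show how a host application might invoke such a plugin, here is a self-contained sketch. The `Plugin` base class below is a stand-in for the assumed goose.plugins API, and the registry is hypothetical:

```python
# Self-contained sketch: a minimal Plugin base, the custom tool,
# and a name-to-instance registry as a host app might keep.
class Plugin:
    name: str = ""

    def execute(self, query: str) -> str:
        raise NotImplementedError

class CustomTool(Plugin):
    name = "custom_tool"

    def execute(self, query: str) -> str:
        return f"Processing result: {query}"

# Look up tools by name, then dispatch a query to one of them
registry = {tool.name: tool for tool in [CustomTool()]}
print(registry["custom_tool"].execute("hello"))  # Processing result: hello
```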
Workflow Automation
# .goose/workflow.yaml
name: Daily Code Review
trigger: daily 9:00 AM
steps:
- analyze: src/
- review: --pending-changes
- report: --output slack
Model Switching Strategy
# Configure model routing
goose route --simple-tasks ollama/llama3.1:8b
goose route --complex-tasks anthropic/claude-sonnet-4
goose route --code-review openai/gpt-4.1
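The routing idea above boils down to a task-type-to-model lookup with a cheap default. This minimal sketch mirrors the three routes from the commands; the dispatch logic is hypothetical, not Goose internals:

```python
# Toy model router: pick a backend by task type, falling back to
# the inexpensive local model for anything unrecognized.
ROUTES = {
    "simple": "ollama/llama3.1:8b",
    "complex": "anthropic/claude-sonnet-4",
    "code-review": "openai/gpt-4.1",
}

def pick_model(task_type: str) -> str:
    return ROUTES.get(task_type, ROUTES["simple"])

print(pick_model("code-review"))  # openai/gpt-4.1
print(pick_model("unknown"))      # ollama/llama3.1:8b
```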
Performance Optimization
Local Model Optimization
# Use quantized model to reduce memory usage
ollama pull llama3.1:8b-instruct-q4_K_M
# Configure GPU acceleration
export OLLAMA_NUM_GPU=1
export OLLAMA_MAX_VRAM=8GB
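Quantization is why an 8B model fits comfortably in 8GB: weight memory is roughly parameters times bits per weight. The back-of-the-envelope calculation below ignores KV cache and runtime overhead, so treat the numbers as approximations:

```python
# Rough weight-memory estimate: params * bits / 8 bytes.
def weight_memory_gb(params_billion: float, bits_per_weight: float) -> float:
    bytes_total = params_billion * 1e9 * bits_per_weight / 8
    return bytes_total / 1e9

print(round(weight_memory_gb(8, 16), 1))   # fp16: 16.0 GB
print(round(weight_memory_gb(8, 4.5), 1))  # q4_K_M (~4.5 bits/weight): 4.5 GB
```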
Caching Strategy
# Enable response caching
goose config set cache.enabled true
goose config set cache.ttl 3600
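The two settings above describe a response cache whose entries expire after a TTL (3600 seconds here). This toy sketch shows the mechanism; it is not Goose's actual cache implementation:

```python
# Minimal TTL cache: entries expire ttl_seconds after being set.
import time

class TTLCache:
    def __init__(self, ttl_seconds: float):
        self.ttl = ttl_seconds
        self._store = {}  # key -> (expiry_timestamp, value)

    def set(self, key, value):
        self._store[key] = (time.monotonic() + self.ttl, value)

    def get(self, key, default=None):
        entry = self._store.get(key)
        if entry is None:
            return default
        expires, value = entry
        if time.monotonic() >= expires:
            del self._store[key]  # evict the stale entry
            return default
        return value

cache = TTLCache(ttl_seconds=3600)
cache.set("prompt-hash", "cached completion")
print(cache.get("prompt-hash"))  # cached completion
```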
# Prefetch commonly used models
goose prefetch llama3.1:8b claude-sonnet-4
FAQ
Q: What is the difference between Goose and Claude Code?
A: Goose is open source and free, supporting local models; Claude Code is a commercial product that only supports Anthropic models. Goose is better suited for budget-conscious or privacy-protecting scenarios.
Q: How good are local models?
A: For simple tasks, Llama 3.1 8B performs well; for complex tasks, it's recommended to use cloud models or larger local models (70B+).
Q: How to protect code privacy?
A: Use local Ollama models so code is processed entirely on your machine. Running goose config set privacy.mode local disables all external calls.
Q: Which IDE integrations are supported?
A: Currently primarily terminal-based. VS Code and JetBrains plugins are in development, follow the official GitHub for updates.
Summary
Goose provides developers with a powerful open source AI coding assistant option. Its multi-model support, local running capability, and completely free nature make it a strong alternative to Claude Code and Cursor.
Recommended Getting Started Steps:
- Install quickly with pip install goose-ai
- Configure Ollama and local models for the free experience
- Add cloud model API keys as needed
- Gradually integrate Goose into daily development workflows
For developers who value privacy, have limited budgets, or love the open source ecosystem, Goose is worth trying.
FAQ
Q: How does Goose compare to GitHub Copilot?
A: Goose is open source and works with any AI model you choose, including local ones; Copilot is a proprietary service built around models selected by GitHub. Goose gives you more control over cost and privacy.
Q: Can Goose run entirely offline?
A: Yes. Connected to a local backend such as Ollama or LM Studio, Goose runs completely offline, with no code sent to external servers.