OpenCode is a Go-based terminal AI assistant that provides interactive coding help via a polished TUI built with Bubble Tea. It supports multiple AI providers, including OpenAI, Anthropic Claude, Google Gemini, AWS Bedrock, Groq, Azure OpenAI, and OpenRouter, with configurable endpoints and API keys.
Features include session management with SQLite storage, tool integration for command execution and code modification, LSP support for language intelligence, file change tracking, Vim-like editing, and auto-compact to prevent context overflow. Cross-platform builds are available for macOS, Windows, and Linux.
This repository is no longer maintained and has been archived for provenance.
The project has continued under the name Crush, developed by the original author and the Charm team.
Please follow Crush for ongoing development.
⚠️ Early Development Notice: This project is in early development and is not yet ready for production use. Features may change, break, or be incomplete. Use at your own risk.
A powerful terminal-based AI assistant for developers, providing intelligent coding assistance directly in your terminal.
OpenCode is a Go-based CLI application that brings AI assistance to your terminal. It provides a TUI (Terminal User Interface) for interacting with various AI models to help with coding tasks, debugging, and more.
For a quick video overview, check out *OpenCode + Gemini 2.5 Pro: BYE Claude Code! I'm SWITCHING To the FASTEST AI Coder!*
```bash
# Install the latest version
curl -fsSL https://raw.githubusercontent.com/opencode-ai/opencode/refs/heads/main/install | bash

# Install a specific version
curl -fsSL https://raw.githubusercontent.com/opencode-ai/opencode/refs/heads/main/install | VERSION=0.1.0 bash

# Homebrew
brew install opencode-ai/tap/opencode

# Arch Linux (AUR), using yay
yay -S opencode-ai-bin

# Arch Linux (AUR), using paru
paru -S opencode-ai-bin

# From source, using Go
go install github.com/opencode-ai/opencode@latest
```
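
Once installed, you can launch OpenCode from a project directory. A minimal sketch of typical invocations, assuming the `-d` and `-p` flags documented upstream are present in your build (verify with `opencode --help`):

```bash
# Start the interactive TUI in the current project directory
opencode

# Enable debug logging (assumed flag; check --help for your version)
opencode -d

# Run a single non-interactive prompt and print the result (assumed flag)
opencode -p "Explain the use of context in Go"
```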
OpenCode looks for configuration in the following locations:
- `$HOME/.opencode.json`
- `$XDG_CONFIG_HOME/opencode/.opencode.json`
- `./.opencode.json` (local directory)

OpenCode includes an auto-compact feature that automatically summarizes your conversation when it approaches the model's context window limit, letting long sessions continue without overflowing the context. This feature is enabled by default.
You can enable or disable this feature in your configuration file:
```json
{
  "autoCompact": true // default is true
}
```
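
As a concrete example of the lookup order above, a project-local `./.opencode.json` can carry per-repository settings (assuming, as is typical, that the local file takes precedence over the home-directory one). A minimal sketch that turns auto-compact off for a single project:

```bash
# Create a project-local config that disables auto-compact for this repository only
cat > ./.opencode.json <<'EOF'
{
  "autoCompact": false
}
EOF
```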
You can configure OpenCode using environment variables:
| Environment Variable | Purpose |
|---|---|
| `ANTHROPIC_API_KEY` | For Claude models |
| `OPENAI_API_KEY` | For OpenAI models |
| `GEMINI_API_KEY` | For Google Gemini models |
| `GITHUB_TOKEN` | For GitHub Copilot models (see Using GitHub Copilot) |
| `VERTEXAI_PROJECT` | For Google Cloud VertexAI (Gemini) |
| `VERTEXAI_LOCATION` | For Google Cloud VertexAI (Gemini) |
| `GROQ_API_KEY` | For Groq models |
| `AWS_ACCESS_KEY_ID` | For AWS Bedrock (Claude) |
| `AWS_SECRET_ACCESS_KEY` | For AWS Bedrock (Claude) |
| `AWS_REGION` | For AWS Bedrock (Claude) |
| `AZURE_OPENAI_ENDPOINT` | For Azure OpenAI models |
| `AZURE_OPENAI_API_KEY` | For Azure OpenAI models (optional when using Entra ID) |
| `AZURE_OPENAI_API_VERSION` | For Azure OpenAI models |
| `LOCAL_ENDPOINT` | For self-hosted models |
| `SHELL` | Default shell to use (if not specified in config) |
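
For example, exporting a provider key before launching is enough for that provider to become available; the values below are placeholders:

```bash
# Supply an Anthropic API key and an explicit default shell for tool execution,
# then start OpenCode (key value is a placeholder, not a real credential)
export ANTHROPIC_API_KEY="sk-ant-..."
export SHELL="/bin/zsh"
opencode
```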