Dive is an open-source MCP Host Desktop Application that provides universal LLM integration, supporting ChatGPT, Anthropic, Ollama, and any OpenAI-compatible model with function calling capabilities. It offers a dual architecture (a modern Tauri version and a traditional Electron version) for cross-platform deployment on Windows, macOS, and Linux, and supports the Model Context Protocol (MCP) for seamless AI agent integration over both stdio and Server-Sent Events (SSE) transports.
Key features include:
- OAPHub.ai cloud integration for one-click managed MCP servers (zero configuration)
- Multi-language support (24+ languages)
- Advanced API management via `model_settings.json`, with multiple API keys and model switching
- Granular enable/disable controls for individual MCP tools
- Custom system prompts
- Comprehensive keyboard shortcuts and auto-saved chat drafts
- Automatic updates
- OAuth support for SSE MCP authentication
- Real-time token usage tracking and an enhanced UI/UX
Dive AI Agent
![GitHub stars](https://img.shields.io/github/stars/OpenAgentPlatform/Dive?style=social) ![GitHub forks](https://img.shields.io/github/forks/OpenAgentPlatform/Dive?style=social) ![GitHub watchers](https://img.shields.io/github/watchers/OpenAgentPlatform/Dive?style=social) ![GitHub repo size](https://img.shields.io/github/repo-size/OpenAgentPlatform/Dive) ![GitHub language count](https://img.shields.io/github/languages/count/OpenAgentPlatform/Dive) ![GitHub top language](https://img.shields.io/github/languages/top/OpenAgentPlatform/Dive) ![GitHub last commit](https://img.shields.io/github/last-commit/OpenAgentPlatform/Dive?color=red) ![Discord](https://img.shields.io/badge/Discord-Dive-blue?logo=discord&logoColor=white) ![Twitter Follow](https://img.shields.io/twitter/follow/Dive_ai_agent?style=social)
Dive is an open-source MCP Host Desktop Application that seamlessly integrates with any LLM supporting function calling capabilities. ✨
*Dive Demo*
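To clarify "function calling": compatible models return structured tool invocations that the host application executes on their behalf. The sketch below shows a minimal OpenAI-style tool definition as a point of reference; the `get_weather` tool and its schema are purely illustrative and are not part of Dive.

```json
{
  "type": "function",
  "function": {
    "name": "get_weather",
    "description": "Illustrative example only: return the current weather for a given city",
    "parameters": {
      "type": "object",
      "properties": {
        "city": { "type": "string", "description": "City name" }
      },
      "required": ["city"]
    }
  }
}
```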
Advanced API management is handled through `model_settings.json`, which supports multiple API keys and switching between models. ⚠️ Note: OAuth authentication for SSE MCP is currently unstable and may require frequent re-authorization.
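Dive's actual `model_settings.json` schema is not reproduced here, so the following is only a hypothetical sketch of how multiple API keys and model switching could be laid out; the `activeProvider` and `configs` field names and all values are invented placeholders. Treat the file that Dive itself generates as the source of truth.

```json
{
  "activeProvider": "openai",
  "configs": [
    { "provider": "openai", "model": "gpt-4o-mini", "apiKey": "YOUR-OPENAI-KEY" },
    { "provider": "anthropic", "model": "claude-3-5-sonnet", "apiKey": "YOUR-ANTHROPIC-KEY" },
    { "provider": "ollama", "model": "llama3", "baseURL": "http://localhost:11434" }
  ]
}
```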
| Platform | Electron | Tauri |
|---|---|---|
| Windows | ✅ | ✅ |
| macOS | ✅ | 🚧 |
| Linux | ✅ | ✅ |
Migration Note: Existing local MCP/LLM configurations remain fully supported. OAP integration is additive and does not affect current workflows.
Get the latest version of Dive: [![Download](https://img.shields.io/badge/Download-Latest%20Release-blue.svg)](https://github.com/OpenAgentPlatform/Dive/releases/latest)
Choose between two architectures:
- Tauri (modern)
- Electron (traditional)
Linux notes:
- You may need to launch with the `--no-sandbox` parameter.
- Run `chmod +x` on the AppImage to make it executable.
- Arch Linux users can install from the AUR: `paru -S dive-ai`.

For more detailed instructions, please see MCP Servers Setup.
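For orientation, the sketch below shows what MCP server entries commonly look like in hosts that use the `mcpServers` JSON layout, with one stdio server and one SSE server. This assumes Dive follows that common layout; the `echo-server` and `remote-tools` names, the script path, and the URL are placeholders, so see MCP Servers Setup for the authoritative file location and fields.

```json
{
  "mcpServers": {
    "echo-server": {
      "command": "node",
      "args": ["/path/to/echo-server/index.js"],
      "env": { "LOG_LEVEL": "info" }
    },
    "remote-tools": {
      "transport": "sse",
      "url": "https://example.com/mcp/sse"
    }
  }
}
```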
The easiest way to get started: access enterprise-grade MCP tools instantly through the OAPHub.ai cloud integration, with one-click setup and no local configuration.

Benefits:
- One-click access to managed MCP servers, with zero configuration
- Existing local MCP/LLM configurations keep working unchanged
See BUILD.md for more details.
We welcome contributions from the community! Here's how you can help:
1. Fork the repository and clone your fork: `git clone https://github.com/YOUR_USERNAME/Dive.git`
2. Install dependencies: `npm install`
3. Start the development build: `npm run dev` (Electron) or `cargo tauri dev` (Tauri)

Dive is open-source software licensed under the MIT License.