TUUI Download

Desktop MCP client built with Vue3/Vuetify that accelerates AI adoption through the Model Context Protocol (MCP) and orchestrates cross-vendor LLM APIs (ChatGPT, Claude, Qwen, OpenRouter, DeepInfra). Zero accounts, full control, open source, download-and-run for Windows/Linux/macOS, with TypeScript and multilingual support and remote MCP via mcp-remote. Apache 2.0 license.

⭐ 1,111 stars on GitHub
Latest Release: v1.3.5

About Software

TUUI is a desktop MCP client designed as a tool unitary utility integration, accelerating AI adoption through the Model Context Protocol (MCP) and enabling cross-vendor LLM API orchestration. Zero accounts, full control, open source, download and run. The repository is essentially an LLM chat desktop application based on MCP, and it represents a bold experiment in creating a complete project using AI: many components were directly converted or generated from a prototype project through AI. Given that, the project employs strict syntax checks and naming conventions, and further development requires the provided linting tools to check and auto-fix syntax issues. Features include accelerated AI tool integration via MCP, cross-vendor LLM API orchestration through dynamic configuration, automated application testing support, TypeScript support, multilingual support, a basic layout manager, global state management through a Pinia store, and quick support through the GitHub community and official documentation.

Getting started options: explore the project at TUUI.com, download builds from GitHub Releases, follow the Getting Started guide (English/δΈ­ζ–‡) for developer setup, or ask the AI at TUUI@DeepWiki. Core requirements: an LLM backend (ChatGPT, Claude, Qwen, or self-hosted) that supports tool/function calling; Node.js for NPX/NODE-based servers (JavaScript/TypeScript tools); Python plus the UV library for UV/UVX-based servers; Docker for Docker-based servers. macOS/Linux systems may need to modify the default MCP configuration (CLI paths/permissions). The LLM configuration template accepts a JSON object (single chatbot) or a JSON array (multiple chatbots) with the fields name, apiKey, url, urlList, path, model, modelList, maxTokensValue, and mcp. Additional configuration files: llm.json (LLM endpoints), mcp.json (MCP servers), startup.json (startup-screen news), popup.json (popup prompts). Built-release configs can be modified in resources/assets/config/ or cleared from Tray Menu > Clear Storage. Remote MCP servers are supported via mcp-remote on Cloudflare. Common MCP server issues: ensure the relevant commands (uv/uvx, npx) run on your system; resolve spawn ENOENT errors with absolute paths; connection timeouts are usually caused by slow pip/npm downloads, so install packages manually first. Apache 2.0 license.

Use Cases:

  • Desktop MCP client designed as tool unitary utility integration accelerating AI adoption through Model Context Protocol with cross-vendor LLM API orchestration
  • Local AI playground with zero accounts, full control, open source, download-and-run supporting Windows/Linux/macOS with Vue3/Vuetify interface
  • LLM chat desktop application with dynamic cross-vendor API configuration (ChatGPT, Claude, Qwen, OpenRouter, DeepInfra), automated testing, TypeScript/multilingual support
  • MCP server management via NPX/NODE (Node.js), UV/UVX (Python + UV library), Docker-based servers, with remote MCP server support via mcp-remote
  • Core requirements: LLM backend with tool/function calling, Node.js for JavaScript/TypeScript tools, Python + UV for UVX servers, DockerHub for Docker servers

Downloads

v1.3.5 December 07, 2025
tuui__1.3.5_linux_amd64.deb
tuui__1.3.5_mac_arm64.dmg
tuui__1.3.5_mac_universal.dmg
tuui__1.3.5_mac_x64.dmg
tuui__1.3.5_win_x64.exe
tuui__1.3.5_win_x64_Portable.exe
v1.3.4 December 03, 2025
tuui__1.3.4_linux_amd64.deb
tuui__1.3.4_mac_arm64.dmg
tuui__1.3.4_mac_universal.dmg
tuui__1.3.4_mac_x64.dmg
tuui__1.3.4_win_x64.exe
tuui__1.3.4_win_x64_Portable.exe
v1.3.3 November 27, 2025
tuui__1.3.3_linux_amd64.deb
tuui__1.3.3_mac_arm64.dmg
tuui__1.3.3_mac_universal.dmg
tuui__1.3.3_mac_x64.dmg
tuui__1.3.3_win_x64.exe
tuui__1.3.3_win_x64_Portable.exe
v1.3.2 November 18, 2025
tuui__1.3.2_linux_amd64.deb
tuui__1.3.2_mac_arm64.dmg
tuui__1.3.2_mac_universal.dmg
tuui__1.3.2_mac_x64.dmg
tuui__1.3.2_win_x64.exe
tuui__1.3.2_win_x64_Portable.exe
v1.3.1 October 29, 2025
tuui__1.3.1_linux_amd64.deb
tuui__1.3.1_mac_arm64.dmg
tuui__1.3.1_mac_universal.dmg
tuui__1.3.1_mac_x64.dmg
tuui__1.3.1_win_x64.exe
tuui__1.3.1_win_x64_Portable.exe
v1.3.0 October 04, 2025
tuui__1.3.0_linux_amd64.deb
tuui__1.3.0_mac_arm64.dmg
tuui__1.3.0_mac_universal.dmg
tuui__1.3.0_mac_x64.dmg
tuui__1.3.0_win_x64.exe
tuui__1.3.0_win_x64_Portable.exe
v1.2.3 October 03, 2025
tuui__1.2.3_linux_amd64.deb
tuui__1.2.3_mac_arm64.dmg
tuui__1.2.3_mac_universal.dmg
tuui__1.2.3_mac_x64.dmg
tuui__1.2.3_win_x64.exe
tuui__1.2.3_win_x64_Portable.exe
v1.2.2 October 02, 2025
tuui__1.2.2_linux_amd64.deb
tuui__1.2.2_mac_arm64.dmg
tuui__1.2.2_mac_universal.dmg
tuui__1.2.2_mac_x64.dmg
tuui__1.2.2_win_x64.exe
tuui__1.2.2_win_x64_Portable.exe
v1.2.1 October 01, 2025
tuui__1.2.1_linux_amd64.deb
tuui__1.2.1_mac_arm64.dmg
tuui__1.2.1_mac_universal.dmg
tuui__1.2.1_mac_x64.dmg
tuui__1.2.1_win_x64.exe
tuui__1.2.1_win_x64_Portable.exe
v1.2.0 September 30, 2025
tuui__1.2.0_linux_amd64.deb
tuui__1.2.0_mac_arm64.dmg
tuui__1.2.0_mac_universal.dmg
tuui__1.2.0_mac_x64.dmg
tuui__1.2.0_win_x64.exe
tuui__1.2.0_win_x64_Portable.exe
v1.1.4 September 27, 2025
tuui__1.1.4_linux_amd64.deb
tuui__1.1.4_mac_arm64.dmg
tuui__1.1.4_mac_universal.dmg
tuui__1.1.4_mac_x64.dmg
tuui__1.1.4_win_x64.exe
tuui__1.1.4_win_x64_Portable.exe
v1.1.3 September 26, 2025
tuui__1.1.3_linux_amd64.deb
tuui__1.1.3_mac_arm64.dmg
tuui__1.1.3_mac_universal.dmg
tuui__1.1.3_mac_x64.dmg
tuui__1.1.3_win_x64.exe
tuui__1.1.3_win_x64_Portable.exe
v1.1.2 September 26, 2025
tuui__1.1.2_linux_amd64.deb
tuui__1.1.2_mac_arm64.dmg
tuui__1.1.2_mac_universal.dmg
tuui__1.1.2_mac_x64.dmg
tuui__1.1.2_win_x64.exe
tuui__1.1.2_win_x64_Portable.exe
v1.1.1 September 24, 2025
tuui__1.1.1_linux_amd64.deb
tuui__1.1.1_mac_arm64.dmg
tuui__1.1.1_mac_universal.dmg
tuui__1.1.1_mac_x64.dmg
tuui__1.1.1_win_x64.exe
tuui__1.1.1_win_x64_Portable.exe
v1.1.0 September 19, 2025
tuui__1.1.0_linux_amd64.deb
tuui__1.1.0_mac_arm64.dmg
tuui__1.1.0_mac_universal.dmg
tuui__1.1.0_mac_x64.dmg
tuui__1.1.0_win_x64.exe
tuui__1.1.0_win_x64_Portable.exe
v1.0.4 September 14, 2025
tuui__1.0.4_linux_amd64.deb
tuui__1.0.4_mac_arm64.dmg
tuui__1.0.4_mac_universal.dmg
tuui__1.0.4_mac_x64.dmg
tuui__1.0.4_win_x64.exe
tuui__1.0.4_win_x64_Portable.exe
v1.0.3 September 13, 2025
tuui__1.0.3_linux_amd64.deb
tuui__1.0.3_mac_arm64.dmg
tuui__1.0.3_mac_universal.dmg
tuui__1.0.3_mac_x64.dmg
tuui__1.0.3_win_x64.exe
tuui__1.0.3_win_x64_Portable.exe
v1.0.2 September 12, 2025
tuui__1.0.2_linux_amd64.deb
tuui__1.0.2_mac_arm64.dmg
tuui__1.0.2_mac_universal.dmg
tuui__1.0.2_mac_x64.dmg
tuui__1.0.2_win_x64.exe
tuui__1.0.2_win_x64_Portable.exe
v1.0.1 September 11, 2025
tuui__1.0.1_linux_amd64.deb
tuui__1.0.1_mac_arm64.dmg
tuui__1.0.1_mac_universal.dmg
tuui__1.0.1_mac_x64.dmg
tuui__1.0.1_win_x64.exe
tuui__1.0.1_win_x64_Portable.exe
v1.0.0 September 09, 2025
tuui__1.0.0_linux_amd64.deb
tuui__1.0.0_mac_arm64.dmg
tuui__1.0.0_mac_universal.dmg
tuui__1.0.0_mac_x64.dmg
tuui__1.0.0_win_x64.exe
tuui__1.0.0_win_x64_Portable.exe
v1.0.0-beta September 08, 2025
tuui__1.0.0-beta_linux_amd64.deb
tuui__1.0.0-beta_mac_arm64.dmg
tuui__1.0.0-beta_mac_universal.dmg
tuui__1.0.0-beta_mac_x64.dmg
tuui__1.0.0-beta_win_x64.exe
tuui__1.0.0-beta_win_x64_Portable.exe
v1.0.0-alpha September 06, 2025
tuui__1.0.0-alpha_linux_amd64.deb
tuui__1.0.0-alpha_mac_arm64.dmg
tuui__1.0.0-alpha_mac_universal.dmg
tuui__1.0.0-alpha_mac_x64.dmg
tuui__1.0.0-alpha_win_x64.exe
tuui__1.0.0-alpha_win_x64_Portable.exe
v0.9.1 September 05, 2025
tuui__0.9.1_linux_amd64.deb
tuui__0.9.1_mac_arm64.dmg
tuui__0.9.1_mac_universal.dmg
tuui__0.9.1_mac_x64.dmg
tuui__0.9.1_win_x64.exe
tuui__0.9.1_win_x64_Portable.exe
v0.9.0 September 04, 2025
tuui__0.9.0_linux_amd64.deb
tuui__0.9.0_mac_arm64.dmg
tuui__0.9.0_mac_universal.dmg
tuui__0.9.0_mac_x64.dmg
tuui__0.9.0_win_x64.exe
tuui__0.9.0_win_x64_Portable.exe
v0.8.9 August 29, 2025
tuui__0.8.9_linux_amd64.deb
tuui__0.8.9_mac_arm64.dmg
tuui__0.8.9_mac_universal.dmg
tuui__0.8.9_mac_x64.dmg
tuui__0.8.9_win_x64.exe
tuui__0.8.9_win_x64_Portable.exe
v0.8.8 August 28, 2025
tuui__0.8.8_linux_amd64.deb
tuui__0.8.8_mac_arm64.dmg
tuui__0.8.8_mac_universal.dmg
tuui__0.8.8_mac_x64.dmg
tuui__0.8.8_win_x64.exe
tuui__0.8.8_win_x64_Portable.exe
v0.8.7 August 23, 2025
tuui__0.8.7_linux_amd64.deb
tuui__0.8.7_mac_arm64.dmg
tuui__0.8.7_mac_universal.dmg
tuui__0.8.7_mac_x64.dmg
tuui__0.8.7_win_x64.exe
tuui__0.8.7_win_x64_Portable.exe
v0.8.6 August 21, 2025
tuui__0.8.6_linux_amd64.deb
tuui__0.8.6_mac_arm64.dmg
tuui__0.8.6_mac_universal.dmg
tuui__0.8.6_mac_x64.dmg
tuui__0.8.6_win_x64.exe
tuui__0.8.6_win_x64_Portable.exe
v0.8.5 August 20, 2025
tuui__0.8.5_linux_amd64.deb
tuui__0.8.5_mac_arm64.dmg
tuui__0.8.5_mac_universal.dmg
tuui__0.8.5_mac_x64.dmg
tuui__0.8.5_win_x64.exe
tuui__0.8.5_win_x64_Portable.exe
v0.8.4 August 15, 2025
tuui__0.8.4_linux_amd64.deb
tuui__0.8.4_mac_arm64.dmg
tuui__0.8.4_mac_universal.dmg
tuui__0.8.4_mac_x64.dmg
tuui__0.8.4_win_x64.exe
tuui__0.8.4_win_x64_Portable.exe

Package Info

Last Updated
Dec 07, 2025
Latest Version
v1.3.5
License
Apache-2.0
Total Versions
30

README

TUUI - Local AI Playground with MCP

TUUI is a desktop MCP client designed as a tool unitary utility integration, accelerating AI adoption through the Model Context Protocol (MCP) and enabling cross-vendor LLM API orchestration.

Zero accounts Β· Full control Β· Open source Β· Download and run

Windows Β· Linux Β· macOS

πŸ“œ Introduction


This repository is essentially an LLM chat desktop application based on MCP. It also represents a bold experiment in creating a complete project using AI. Many components within the project have been directly converted or generated from the prototype project through AI.

Given the considerations regarding the quality and safety of AI-generated content, this project employs strict syntax checks and naming conventions. Therefore, for any further development, please ensure that you use the linting tools I've set up to check and automatically fix syntax issues.

✨ Features

  • ✨ Accelerate AI tool integration via MCP
  • ✨ Orchestrate cross-vendor LLM APIs through dynamic configuring
  • ✨ Automated application testing Support
  • ✨ TypeScript support
  • ✨ Multilingual support
  • ✨ Basic layout manager
  • ✨ Global state management through the Pinia store
  • ✨ Quick support through the GitHub community and official documentation

πŸš€ Getting Started

You can quickly get started with the project through a variety of options tailored to your role and needs:

  • To explore the project, visit the wiki page: TUUI.com (https://www.tuui.com)

  • To download and use the application directly, go to the releases page: Releases (https://github.com/AI-QL/tuui/releases/latest)

  • For developer setup, refer to the installation guide: Getting Started (English) | εΏ«ι€Ÿε…₯ι—¨ (δΈ­ζ–‡)

  • To ask the AI directly about the project, visit: TUUI@DeepWiki (https://deepwiki.com/AI-QL/tuui)

βš™οΈ Core Requirements

To use MCP-related features, ensure the following preconditions are met for your environment:

  • Set up an LLM backend (e.g., ChatGPT, Claude, Qwen or self-hosted) that supports tool/function calling.

  • For NPX/NODE-based servers: Install Node.js to execute JavaScript/TypeScript tools.

  • For UV/UVX-based servers: Install Python and the UV library.

  • For Docker-based servers: Install DockerHub.

  • For macOS/Linux systems: Modify the default MCP configuration (e.g., adjust CLI paths or permissions).

    Refer to the MCP Server Issue documentation for guidance

For guidance on configuring the LLM, refer to the template below (e.g., Qwen):

{
  "name": "Qwen",
  "apiKey": "",
  "url": "https://dashscope.aliyuncs.com/compatible-mode",
  "path": "/v1/chat/completions",
  "model": "qwen-turbo",
  "modelList": ["qwen-turbo", "qwen-plus", "qwen-max"],
  "maxTokensValue": "",
  "mcp": true
}

The configuration accepts either a JSON object (for a single chatbot) or a JSON array (for multiple chatbots):

[
  {
    "name": "Openrouter && Proxy",
    "apiKey": "",
    "url": "https://api3.aiql.com",
    "urlList": ["https://api3.aiql.com", "https://openrouter.ai/api"],
    "path": "/v1/chat/completions",
    "model": "openai/gpt-4.1-mini",
    "modelList": [
      "openai/gpt-4.1-mini",
      "openai/gpt-4.1",
      "anthropic/claude-sonnet-4",
      "google/gemini-2.5-pro-preview"
    ],
    "maxTokensValue": "",
    "mcp": true
  },
  {
    "name": "DeepInfra",
    "apiKey": "",
    "url": "https://api.deepinfra.com",
    "path": "/v1/openai/chat/completions",
    "model": "Qwen/Qwen3-32B",
    "modelList": [
      "Qwen/Qwen3-32B",
      "Qwen/Qwen3-235B-A22B",
      "meta-llama/Meta-Llama-3.1-70B-Instruct"
    ],
    "mcp": true
  }
]

πŸ“• Additional Configuration

Configuration | Description | Location | Note
LLM Endpoints | Default LLM chatbot configs | llm.json | Full config types can be found in llm.d.ts
MCP Servers | Default MCP server configs | mcp.json | For configuration syntax, see MCP Servers (https://github.com/modelcontextprotocol/servers?tab=readme-ov-file#using-an-mcp-client)
Startup Screen | Default news on the startup screen | startup.json |
Popup Screen | Default prompts on the startup screen | popup.json |

For the decomposable package, you can also modify the default configuration of the built release:

For example, src/main/assets/config/llm.json will be located in resources/assets/config/llm.json

Once you modify or import the configurations, they will be stored in your localStorage by default.

Alternatively, you can clear all configurations from the Tray Menu by selecting Clear Storage.

🌐 Remote MCP server

You can utilize Cloudflare's recommended mcp-remote (https://github.com/geelen/mcp-remote) to implement the full suite of remote MCP server functionalities (including Auth). For example, simply add the following to your mcp.json file:

{
  "mcpServers": {
    "cloudflare": {
      "command": "npx",
      "args": ["-y", "mcp-remote", "https://YOURDOMAIN.com/sse"]
    }
  }
}

In this example, I have provided a test remote server: https://YOURDOMAIN.com on Cloudflare (https://blog.cloudflare.com/remote-model-context-protocol-servers-mcp/). This server will always approve your authentication requests.

If you encounter issues such as the common HTTP 400 error, you can resolve them by clearing your browser cache on the authentication page and then attempting verification again. (Try to keep OAuth auto-redirect enabled, to prevent callback delays that might cause failures.)

🚸 MCP Server Issue

General

When launching the MCP server, if you encounter any issues, first ensure that the corresponding command can run on your current system β€” for example, uv/uvx, npx, etc.
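As a quick sanity check, a small shell loop along these lines can confirm that each launcher resolves on the PATH the app will inherit (the command names below are the usual ones; adjust them to match your mcp.json):

```shell
# Check that the launchers used by configured MCP servers resolve on PATH.
# The command names here are the common ones; adjust for your setup.
for cmd in node npx uv uvx docker; do
  if command -v "$cmd" >/dev/null 2>&1; then
    echo "$cmd: $(command -v "$cmd")"
  else
    echo "$cmd: NOT FOUND"
  fi
done
```

Any line reporting NOT FOUND points at a runtime to install (or a PATH difference between your shell and the desktop app).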

ENOENT Spawn Errors

When launching the MCP server, if you encounter spawn errors like ENOENT, try running the corresponding MCP server locally and invoking it using an absolute path.
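As an illustration, an absolute-path entry in mcp.json might look like the following sketch; the server package and both paths are placeholders, and you can find the real launcher path with `command -v npx` (macOS/Linux) or `where npx` (Windows):

```json
{
  "mcpServers": {
    "filesystem": {
      "command": "/usr/local/bin/npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/path/to/allowed/dir"]
    }
  }
}
```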

If the command works but MCP initialization still returns spawn errors, this may be a known issue:

  • Windows: The MCP SDK includes a workaround specifically for Windows systems, as documented in ISSUE 101 (https://github.com/modelcontextprotocol/typescript-sdk/issues/101).

    Details: ISSUE 40 - MCP servers fail to connect with npx on Windows (https://github.com/modelcontextprotocol/servers/issues/40) (fixed)

  • mscOS: The issue remains unresolved on other platforms, specifically macOS. Although several workarounds are available, this ticket consolidates the most effective ones and highlights the simplest method: How to configure MCP on macOS (https://github.com/AI-QL/tuui/issues/2).

    Details: ISSUE 64 - MCP Servers Don't Work with NVM (https://github.com/modelcontextprotocol/servers/issues/64) (still open)

MCP Connection Timeout

If initialization takes too long and triggers the 90-second timeout protection, it may be because the uv/uvx/npx runtime libraries are being installed or updated for the first time.

When your connection to the respective pip or npm repository is slow, installation can take a long time.

In such cases, first complete the installation manually with pip or npm in the relevant directory, and then start the MCP server again.
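One way to approach this, sketched below for the two common launchers, is to print the warm-up commands for whichever runtimes are installed and run them once in a terminal before relaunching TUUI. The package names are illustrative placeholders; substitute the servers from your own mcp.json:

```shell
# Print manual warm-up commands for the runtimes present on this machine.
# Running the printed commands once caches the packages, so the next MCP
# initialization does not spend its timeout budget on downloads.
# Package names are illustrative placeholders.
for runtime in npx uvx; do
  if command -v "$runtime" >/dev/null 2>&1; then
    case "$runtime" in
      npx) echo "npx -y @modelcontextprotocol/server-filesystem --help" ;;
      uvx) echo "uvx mcp-server-fetch --help" ;;
    esac
  else
    echo "# $runtime not found on PATH"
  fi
done
```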

Related Software

Data Peek

Fast SQL client for PostgreSQL, MySQL and SQL Server with AI assistant that converts natural language to queries. Features Monaco editor, ERD diagrams, query plans and inline editing. Built with Electron and React.

⭐ 1,323 Β· developer-tools, electron

Tuboshu

Convert websites into desktop apps with Electron. Features multi-account support, global hotkey switching, custom JavaScript injection and portable packaging for Windows, macOS and Linux.

⭐ 1,291

Pluely

Open-source AI meeting assistant built with Tauri at 10MB. Features real-time transcription with OpenAI Whisper, GPT-4, Claude, Gemini and Grok support, translucent overlay, and undetectable in video calls.

⭐ 1,274 Β· ai-assistant, claude, cluely-alternative

Fluent M3U8

Cross-platform M3U8/MPD video downloader built with PySide6 and QFluentWidgets, featuring multi-threaded downloads, task management, a fluent-design GUI, FFmpeg and N_m3u8DL-RE integration, a Python 3.11 conda environment, and deployment support for Windows/macOS/Linux. GPL-3.0 license.

⭐ 1,267 Β· fluent, m3u8, m3u8-downloader

Xiaozhi Android Client

Flutter AI voice assistant for Android and iOS with real-time conversation, Live2D characters, echo cancellation, multi-service support for Xiaozhi, Dify and OpenAI, and image messaging.

⭐ 1,252 Β· ai, chat, chatgpt

Github Stars Manager

GitHub starred repository manager with AI-powered auto-sync, semantic search, automatic categorization, release tracking, one-click downloads, smart asset filters, bilingual wiki integration, and cross-platform Electron client for Windows/macOS/Linux with 100% local data storage and MIT license.

⭐ 1,224