An AI-Powered Knowledge Management Desktop Application
License: MIT · Electron 33.4.11 · React 18.3.1 · TypeScript 5.4.2
Features • Architecture • Getting Started • Documentation • Contributing
Klee is a modern desktop application that combines AI-powered chat, knowledge base management, and note-taking capabilities. It offers both Cloud Mode for seamless synchronization and Private Mode for complete offline functionality.
Frontend: React 18, TypeScript, TanStack Router/Query, Radix UI + shadcn/ui components, TailwindCSS, and the Tiptap rich text editor
Backend: Hono RPC server with Drizzle ORM (PostgreSQL in Cloud Mode, SQLite in Private Mode) and the AI SDK (Vercel) with OpenAI and Ollama providers
Infrastructure: Supabase (auth, sync, file storage), LanceDB for local vector search, electron-ollama for embedded local AI, AWS Elastic Beanstalk for backend deployment
klee/
├── client/ # Electron + React app
│ ├── src/
│ │ ├── main/ # Electron main process
│ │ │ ├── ipc/ # IPC handlers
│ │ │ └── local/ # Private mode services
│ │ └── renderer/ # React app
│ │ ├── components/
│ │ ├── hooks/ # TanStack Query hooks
│ │ ├── routes/ # TanStack Router routes
│ │ └── lib/ # Utilities and clients
├── server/ # Hono API server
│ ├── src/
│ │ ├── routes/ # API routes
│ │ └── db/ # Database schemas and queries
├── docs/ # Documentation
└── specs/ # Feature specifications
Clone the repository
git clone https://github.com/signerlabs/Klee.git
cd klee
Configure Tiptap Pro (Required)
Klee uses Tiptap Pro for advanced editor features. You'll need a Tiptap Pro account:
# Copy the .npmrc template
cp .npmrc.example .npmrc
# Edit .npmrc and replace YOUR_TIPTAP_PRO_TOKEN_HERE with your actual token
# Get your token from https://cloud.tiptap.dev/
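After copying, the file should look roughly like the sketch below. The registry URL follows Tiptap's documented Pro setup; confirm the exact lines against your own .npmrc.example:

```
@tiptap-pro:registry=https://registry.tiptap.dev/
//registry.tiptap.dev/:_authToken=YOUR_TIPTAP_PRO_TOKEN_HERE
```

Keep .npmrc out of version control once the real token is in place.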
Install dependencies
npm install
Set up environment variables
Copy .env.example files and configure:
cp .env.example .env
cp server/.env.example server/.env
cp client/.env.example client/.env
See Environment Configuration for details.
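The copy steps above can be made idempotent with a small helper, so re-running setup never clobbers values you have already configured. A sketch, assuming a POSIX shell; copy_env is a hypothetical name, not a script in this repo:

```shell
# copy_env SRC DST: copy an env template only if DST doesn't exist yet,
# so re-running setup never overwrites configured values.
copy_env() {
  if [ -e "$2" ]; then
    echo "skip $2 (already exists)"
  else
    cp "$1" "$2" && echo "created $2"
  fi
}

# Usage (paths from this README):
# copy_env .env.example .env
# copy_env server/.env.example server/.env
# copy_env client/.env.example client/.env
```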
Set up Ollama for Private Mode (Optional)
Private Mode requires Ollama binaries and models. You have two options:
Option A: Use System Ollama (Recommended for Development)
# Install Ollama on your system
brew install ollama # macOS
# or download from https://ollama.ai/
# Start Ollama service
ollama serve
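Before launching Klee in Private Mode, you may want to confirm the Ollama API is actually up. A sketch, assuming Ollama's default port 11434 and that curl is installed; wait_for_ollama is a hypothetical helper, not part of this repo:

```shell
# wait_for_ollama [URL] [TRIES]: poll Ollama's /api/tags endpoint until it
# responds, printing "ollama-up" or "ollama-down" after TRIES attempts.
wait_for_ollama() {
  url="${1:-http://localhost:11434}"
  tries="${2:-5}"
  i=0
  while [ "$i" -lt "$tries" ]; do
    if curl -sf --max-time 2 "$url/api/tags" > /dev/null 2>&1; then
      echo "ollama-up"
      return 0
    fi
    i=$((i + 1))
    sleep 1
  done
  echo "ollama-down"
  return 1
}
```

Typical use: `wait_for_ollama && npm run dev`.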
Option B: Use Embedded Ollama (For Distribution)
For bundled distributions, copy the offline Ollama resources:
# The structure should be:
# client/resources/ollama/
# ├── binaries/v0.9.0/darwin/arm64/ollama
# └── models/nomic-embed-text/...
# You can obtain these from:
# 1. Download from https://github.com/ollama/ollama/releases
# 2. Export models: ollama export nomic-embed-text
# 3. Follow client/resources/ollama/README.md for structure
Note: The embedded Ollama binaries (~56MB) are not included in the repository. See client/resources/ollama/README.md for detailed setup instructions.
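Before packaging, a quick sanity check over that layout can save a broken build. A sketch; the v0.9.0/darwin/arm64 path segments are the examples from this README, so adjust them for your target platform:

```shell
# check_ollama_resources [ROOT]: verify the embedded binary and the embedding
# model directory exist at the paths shown above.
check_ollama_resources() {
  root="${1:-client/resources/ollama}"
  bin="$root/binaries/v0.9.0/darwin/arm64/ollama"
  models="$root/models/nomic-embed-text"
  if [ -x "$bin" ] && [ -d "$models" ]; then
    echo "resources-ok"
  else
    echo "resources-missing"
  fi
}
```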
Start the development server
npm run dev
This will start both the client and the server in development mode (http://localhost:3000).

.env (for macOS builds)

# Apple Developer credentials (only needed for signed builds)
APPLE_ID=your_apple_id@example.com
APPLE_APP_SPECIFIC_PASSWORD=your_app_specific_password
APPLE_TEAM_ID=YOUR_TEAM_ID
CODESIGN_IDENTITY="Developer ID Application: Your Company Name (TEAMID)"
server/.env

# OpenAI Configuration
OPENAI_API_KEY=your_openai_api_key
# Database (Cloud Mode)
DATABASE_URL=postgresql://user:pass@localhost:5432/klee
# Supabase Configuration
VITE_SUPABASE_URL=https://your-project.supabase.co
VITE_SUPABASE_ANON_KEY=your_supabase_anon_key
SUPABASE_SERVICE_ROLE_KEY=your_service_role_key
client/.env

# Supabase Configuration
VITE_SUPABASE_URL=https://your-project.supabase.co
VITE_SUPABASE_ANON_KEY=your_supabase_anon_key
Start PostgreSQL
npm run db:up
Run migrations
npm run db:push
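If db:push fails, the usual culprit is Postgres not listening where DATABASE_URL points. A small helper can pull the host and port out of the URL for a quick connectivity check; a sketch assuming the postgresql://user:pass@host:port/db shape shown above, with db_host_port a hypothetical name:

```shell
# db_host_port URL: print "host port" from a postgresql:// connection URL,
# e.g. for feeding into nc or pg_isready before running migrations.
db_host_port() {
  echo "$1" | sed -E 's#^[a-z]+://([^@]*@)?([^/:]+):([0-9]+)/.*#\2 \3#'
}

# Usage: db_host_port "$DATABASE_URL"
```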
Configure Supabase
Add your Supabase credentials to the .env files and register klee://auth/callback as the OAuth redirect URL in your Supabase project.

# Development
npm run dev # Start both client and server in dev mode
npm run client:dev # Start Electron app only
npm run server:dev # Start API server only
# Building
npm run build # Build for production
npm run client:build # Build Electron app
npm run server:build # Build API server
npm run build:mac # Build signed macOS .dmg
# Database
npm run db:up # Start PostgreSQL with Docker
npm run db:push # Push schema changes
npm run db:generate # Generate migrations
npm run db:migrate # Run migrations
# Deployment
npm run server:deploy # Deploy backend to AWS EB