
Deep-Co


A Chat Client for LLMs, written in Compose Multiplatform. It supports API providers such as OpenRouter, Anthropic, Grok, OpenAI, DeepSeek, Coze, Dify, and Google Gemini. You can also configure any OpenAI-compatible API, or run local models via LM Studio/Ollama.
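Any endpoint that speaks the OpenAI chat-completions wire format can be plugged in. As a sketch only (the URL and model name below are placeholders for your own setup; Ollama's OpenAI-compatible endpoint listens on port 11434 by default):

```shell
# Build a chat-completions request body; "llama3" is a placeholder model name.
cat > /tmp/chat_request.json <<'EOF'
{
  "model": "llama3",
  "messages": [
    {"role": "user", "content": "Hello!"}
  ],
  "stream": false
}
EOF

# Send it to a local Ollama server's OpenAI-compatible endpoint
# (uncomment once the server is running):
# curl -s http://localhost:11434/v1/chat/completions \
#      -H "Content-Type: application/json" \
#      -d @/tmp/chat_request.json
cat /tmp/chat_request.json
```

The same request shape works against any of the hosted providers above once you swap in the provider's base URL, model name, and an Authorization header.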

Release

v1.0.6

Features

  • Desktop platform support (Windows/macOS/Linux)
  • Mobile platform support (Android/iOS)
  • Chat (stream & complete) / chat history
  • Chat message export / chat translation server
  • Prompt management / user-defined prompts
  • SillyTavern character adaptation (PNG & JSON)
  • DeepSeek LLM / Grok LLM / Google Gemini LLM
  • Claude LLM / OpenAI LLM / Ollama LLM
  • Online API polling
  • MCP support
  • MCP server market
  • RAG
  • TTS (Edge API)
  • i18n (Chinese/English) / app color theme / app dark & light theme
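For the MCP features, servers are typically launched over stdio from a command plus arguments. As an illustration only (this mirrors the JSON layout popularized by Claude Desktop; DeepCo configures MCP servers through its own UI, which may differ), an entry for the reference filesystem server might look like:

```json
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/path/to/allowed/dir"]
    }
  }
}
```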

Screenshots

  • Chat With LLMs
  • Config Your LLMs API Key
  • Prompt Management
  • Chat With Tavern Character
  • User Management
  • Config MCP Servers
  • Settings

Model Context Protocol (MCP) Environment

macOS

brew install uv
brew install node

Windows

winget install --id=astral-sh.uv -e
winget install OpenJS.NodeJS.LTS
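Either way, you can confirm the prerequisites are reachable from your shell before enabling MCP servers (a macOS/Linux sketch; on Windows run the equivalent checks in PowerShell):

```shell
# Report the installed version of each MCP tool prerequisite,
# or print a hint when one is missing from PATH.
for tool in uv node npx; do
  if command -v "$tool" >/dev/null 2>&1; then
    echo "$tool: $("$tool" --version 2>/dev/null | head -n1)"
  else
    echo "$tool: not found -- install it before enabling MCP servers"
  fi
done
```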

Build

Run the desktop app via Gradle

./gradlew :desktopApp:run

Build the desktop distribution

./gradlew :desktopApp:packageDistributionForCurrentOS
# outputs are written to desktopApp/build/compose/binaries

Run the Android app via Gradle

./gradlew :androidApp:installDebug

Build the Android distribution

./gradlew clean :androidApp:assembleRelease
# outputs are written to androidApp/build/outputs/apk/release

Thanks