Deep-Co
A chat client for LLMs, written in Compose Multiplatform. It supports API providers such as OpenRouter, Anthropic, Grok, OpenAI, DeepSeek, Coze, Dify, and Google Gemini. You can also configure any OpenAI-compatible API, or run local models via LM Studio/Ollama.
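For any OpenAI-compatible provider, the client only needs a base URL, an API key, and the standard chat-completions payload. A minimal sketch of such a request body (the model ID here is a placeholder, not a Deep-Co default):

```json
{
  "model": "your-model-id",
  "messages": [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Hello!"}
  ],
  "stream": true
}
```

This is sent as a POST to `<base-url>/v1/chat/completions` with an `Authorization: Bearer <API_KEY>` header. Local runtimes expose the same shape: Ollama on `http://localhost:11434/v1` and LM Studio on `http://localhost:1234/v1` by default.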
Features
- Desktop Platform Support (Windows/macOS/Linux)
- Mobile Platform Support (Android/iOS)
- Chat (Stream & Complete) / Chat History
- Chat Message Export / Chat Translation Server
- Prompt Management / User-Defined Prompts
- SillyTavern Character Adaptation (PNG & JSON)
- DeepSeek / Grok / Google Gemini LLMs
- Claude / OpenAI / Ollama LLMs
- Online API Polling
- MCP Support
- MCP Server Market
- RAG
- TTS (Edge API)
- i18n (Chinese/English) / App Color Theme / App Dark & Light Theme
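"Chat (Stream)" means responses arrive incrementally as server-sent events rather than as one completed message. For reference, this is roughly what streamed chunks look like on an OpenAI-compatible endpoint (fields abbreviated; exact IDs vary by provider):

```
data: {"object":"chat.completion.chunk","choices":[{"index":0,"delta":{"content":"Hel"},"finish_reason":null}]}

data: {"object":"chat.completion.chunk","choices":[{"index":0,"delta":{"content":"lo"},"finish_reason":null}]}

data: [DONE]
```

A streaming client concatenates the `delta.content` fragments until the `[DONE]` sentinel arrives; "Complete" mode instead waits for the full non-streamed response.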
Chat With LLMs
Configure Your LLM API Keys
Prompt Management
Chat With Tavern Character
User Management
Configure MCP Servers
Settings
Model Context Protocol (MCP) ENV
macOS
brew install uv
brew install node
Windows
winget install --id=astral-sh.uv -e
winget install OpenJS.NodeJS.LTS
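uv and Node.js are required because stdio MCP servers are typically launched through `uvx` (Python packages) or `npx` (npm packages). As an illustration, a typical MCP server definition looks like the following; the server name and package are generic examples, and Deep-Co's own configuration UI/format may differ:

```json
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/path/to/allowed/dir"]
    }
  }
}
```

With `npx -y` (or `uvx`), the server package is fetched and started on demand, which is why only the runtimes themselves need to be installed up front.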
Build
Run desktop via Gradle
./gradlew :desktopApp:run
Build the desktop distribution
./gradlew :desktopApp:packageDistributionForCurrentOS
# outputs are written to desktopApp/build/compose/binaries
Run Android via Gradle
./gradlew :androidApp:installDebug
Build the Android distribution
./gradlew clean :androidApp:assembleRelease
# outputs are written to androidApp/build/outputs/apk/release
Thanks