kuon
KUON: A large language model voice assistant under development, currently focused on ease of use and quick onboarding. It supports selective conversation memory and Model Context Protocol (MCP) services.
Kuon
Kuon (久远): a large language model voice assistant under development. The previous code had grown too bloated, so this repository is a rewrite that puts the focus on ease of use, aiming to make it genuinely practical.
Development Overview
To keep the code lean, local models are no longer used directly; even a locally hosted model should be connected to this program through an API. The LLM side only talks to the OpenAI interface, and other providers can be adapted through oneapi or a similar gateway. Voice input has been removed for now because it was rarely used; whether to bring it back will be decided later. MCP is used to extend the assistant's capabilities.
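Because only the OpenAI interface is spoken, any compatible backend can be plugged in by pointing the client at a different base URL. The snippet below is only an illustration of that idea using the official openai Python client; the model name and prompt are placeholders, not values taken from this repository.

import os
from openai import OpenAI

# Point the client at any OpenAI-compatible endpoint (e.g. a oneapi gateway).
client = OpenAI(
    api_key=os.environ["OPENAI_API_KEY"],
    base_url=os.environ.get("OPENAI_BASE_URL"),  # None falls back to the default endpoint
)

reply = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model name
    messages=[{"role": "user", "content": "Hello, Kuon!"}],
)
print(reply.choices[0].message.content)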
Current features:
- Memory storage for the large language model
- Interaction via text input, with replies as text and speech
- MCP support
Planned next:
- Improve TTS
- Improve memory storage so that stored memories are more valuable
- GUI interaction
Installation and Usage
Environment Setup
- Create and activate a conda environment (optional)
conda create -n kuon python=3.10
conda activate kuon
- Clone the repository
git clone https://github.com/yourusername/kuon.git
cd kuon
- Install dependencies
pip install -r requirements.txt
Configuration
1. API key configuration
Chat API key (required):
# Windows (PowerShell)
$env:OPENAI_API_KEY = "your OpenAI API key"
$env:OPENAI_BASE_URL = "API base URL"
# Linux/macOS
export OPENAI_API_KEY="your OpenAI API key"
export OPENAI_BASE_URL="API base URL"
- TTS key (optional; only Aliyun TTS is supported at the moment, and text-only interaction works without it):
# Windows (PowerShell)
$env:ALIYUN_ACCESS_KEY_ID = ""
# Linux/macOS
export ALIYUN_ACCESS_KEY_ID=""
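Before launching, you can confirm the variables above are visible to Python with a quick check like the one below; this pre-flight snippet is illustrative and not part of kuon itself.

import os

# Illustrative check; kuon may report missing keys differently.
if not os.environ.get("OPENAI_API_KEY"):
    raise SystemExit("OPENAI_API_KEY is not set (required for chat).")
if not os.environ.get("OPENAI_BASE_URL"):
    print("OPENAI_BASE_URL is not set; the default OpenAI endpoint will be used.")
if not os.environ.get("ALIYUN_ACCESS_KEY_ID"):
    print("ALIYUN_ACCESS_KEY_ID is not set; TTS is unavailable, text-only chat still works.")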
2. Configuration file
The config.yaml file in the repository root:
tts:
  enabled: true                                   # whether TTS is enabled
  engine: "aliyun"                                # TTS engine; currently only "aliyun" is supported
mcp:
  enabled: true                                   # whether MCP tools are enabled by default
  config_path: "mcp_server/temp_mcp_server.json"  # path to the MCP server configuration file
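For reference, this is roughly how the two sections above could be read with PyYAML; kuon's actual loading code may differ.

import yaml  # pip install pyyaml

with open("config.yaml", encoding="utf-8") as f:
    cfg = yaml.safe_load(f)

# Read the settings documented above.
if cfg.get("tts", {}).get("enabled"):
    print("TTS engine:", cfg["tts"].get("engine"))
if cfg.get("mcp", {}).get("enabled"):
    print("MCP server config:", cfg["mcp"].get("config_path"))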
3. MCP configuration (optional)
To use MCP features, see the mcp_server/temp_mcp_server.json configuration file:
{
  "mcpServers": {
    "general": {
      "type": "stdio",
      "command": "command to execute",
      "args": ["command arguments"],
      "env": {
        "OPENWEATHERMAP_API_KEY": "extra environment variable"
      }
    },
    "mcp-hotnews-server": {
      "type": "sse",
      "url": "https://mcp.modelscope.cn/sse/"
    }
  }
}
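The file declares one entry per MCP server, keyed by name, with a type of either "stdio" (spawn a local process) or "sse" (connect to a remote endpoint). The snippet below simply walks that structure to show how the two types differ; it is a sketch, not kuon's MCP client.

import json

with open("mcp_server/temp_mcp_server.json", encoding="utf-8") as f:
    servers = json.load(f)["mcpServers"]

for name, spec in servers.items():
    if spec["type"] == "stdio":
        # Local server: started as a subprocess, optionally with extra env vars.
        print(f"{name}: run {spec['command']} {spec.get('args', [])}, env={spec.get('env', {})}")
    elif spec["type"] == "sse":
        # Remote server: reached over Server-Sent Events at the given URL.
        print(f"{name}: connect to {spec['url']}")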
Launch
Run the main program:
python kuon.py
Once the program has started, type text directly to chat with the AI. Enter "exit" or "quit" to leave the program.
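The interaction loop is as simple as it sounds; an illustrative skeleton (not kuon.py's actual code, which also wires in TTS and MCP) looks like this:

def repl(ask):
    """Read text, stop on exit/quit, otherwise print the assistant's reply."""
    while True:
        text = input("you> ").strip()
        if text.lower() in ("exit", "quit"):
            break
        print("kuon>", ask(text))

if __name__ == "__main__":
    repl(lambda text: f"(echo) {text}")  # swap in a real chat call here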
Example
The screenshot below shows an actual interaction with the Kuon assistant:

Other
Conversation memory is currently stored directly in the chat_engines/memory.json file and can be edited or pruned as needed, especially when it has picked up some odd entries.
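If you want to look before you delete, a small inspection snippet like the one below will do; the internal structure of memory.json is not documented here, so it is simply pretty-printed as-is.

import json
from pathlib import Path

path = Path("chat_engines/memory.json")
if path.exists():
    memory = json.loads(path.read_text(encoding="utf-8"))
    print(json.dumps(memory, ensure_ascii=False, indent=2))  # review, then edit the file by hand
else:
    print("No memory has been stored yet.")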