# Botanika
A local LLM + tooling (with MCP support) client. All data is stored locally. Bring your own API keys.
## Client Features
| Client | TTS | STT | Open source | MCP Support | Desktop App | Web App |
|---|---|---|---|---|---|---|
| Botanika | ✅ | ✅ | ✅ | ✅ | ✅ | ❌ |
| ChatGPT | ✅ | ✅ | ❌ | ❌ | ❌ | ✅ |
| Copilot | ✅ | ✅ | ❌ | ❌ | ❌ | ✅ |
| Claude | ❌ | ❌ | ❌ | ✅ | ✅ | ✅ |
| T3.Chat | ❌ | ❌ | ❌ | ❌ | ❌ | ✅ |
## Native integrations
If you want to use any of these integrations, add them on the "Settings" page.
| Integration name | MCP Server URL |
|---|---|
| Google Search | http://localhost:48678/mcp/sse/google/search |
| Spotify | http://localhost:48678/mcp/sse/spotify |
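If you want to talk to one of these endpoints from your own code instead, the sketch below connects to the Google Search server and lists its tools. It assumes the official `@modelcontextprotocol/sdk` TypeScript package, which is not part of this repo; the URL is taken from the table above.

```ts
// Minimal sketch: connect to a local Botanika MCP server over SSE and
// list the tools it exposes. Assumes @modelcontextprotocol/sdk is installed.
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { SSEClientTransport } from "@modelcontextprotocol/sdk/client/sse.js";

const transport = new SSEClientTransport(
  new URL("http://localhost:48678/mcp/sse/google/search")
);
const client = new Client({ name: "example-client", version: "0.0.1" });

await client.connect(transport); // performs the MCP handshake
const { tools } = await client.listTools();
console.log(tools.map((tool) => tool.name));
await client.close();
```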
## Supported LLM providers

| Provider | Notes | API key link | Environment variable |
|---|---|---|---|
| OpenAI | | OpenAI | OPENAI_API_KEY |
| Groq | | Groq | GROQ_API_KEY |
| OpenRouter | | OpenRouter | OPENROUTER_API_KEY |
| Azure | | | AZURE_RESOURCE_NAME, AZURE_API_KEY |
| Ollama | Might not work | | OLLAMA_URL |
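For reference, a `.env` file covering these providers could look like the sketch below. The variable names come from the table above; all values are placeholders, and the Ollama URL shown is that runtime's default local address, which is an assumption on my part.

```env
OPENAI_API_KEY=your-openai-key
GROQ_API_KEY=your-groq-key
OPENROUTER_API_KEY=your-openrouter-key
AZURE_RESOURCE_NAME=your-azure-resource
AZURE_API_KEY=your-azure-key
# Ollama's default local address (assumption; adjust if Botanika expects another form)
OLLAMA_URL=http://localhost:11434
```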
## Transcription

If you don't want to use OpenAI, you can use Whisper locally. This requires a bit of setup: install pnpm, then run the following command and wait until the model is downloaded:

```
pnpm whisper-tnode download --model large-v1
```
## Run

You can set your environment variables in the `.env` file or through the "Settings" page.

```
npm run setup
npm install
npm run dev
```
## LLM provider
An LLM provider is used to generate most responses.
| Provider name | ENV variable | API key link |
|---|---|---|
| OpenAI | OPENAI_API_KEY | OpenAI |
| Groq | GROQ_API_KEY | Groq |
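The AZURE_RESOURCE_NAME / AZURE_API_KEY pair above matches the Vercel AI SDK's default Azure configuration, so Botanika plausibly builds on that SDK; this is an assumption, not something the README confirms. Under that assumption, picking a model from these environment variables might look like the following sketch:

```ts
// Sketch: choose a model based on which API key is present.
// Assumes the Vercel AI SDK packages ("ai", "@ai-sdk/openai", "@ai-sdk/groq"),
// whose default providers read OPENAI_API_KEY / GROQ_API_KEY from the environment.
import { generateText, type LanguageModel } from "ai";
import { openai } from "@ai-sdk/openai";
import { groq } from "@ai-sdk/groq";

function pickModel(): LanguageModel {
  if (process.env.OPENAI_API_KEY) return openai("gpt-4o-mini");
  if (process.env.GROQ_API_KEY) return groq("llama-3.3-70b-versatile");
  throw new Error("Set OPENAI_API_KEY or GROQ_API_KEY first.");
}

const { text } = await generateText({
  model: pickModel(),
  prompt: "Say hello from Botanika.",
});
console.log(text);
```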