# Botanika
A local LLM + tooling (with MCP support) client. All data is stored locally. Bring your own API keys.
## Client Features
| Client | TTS | STT | Open source | MCP Support | Desktop App | Web App |
|---|---|---|---|---|---|---|
| Botanika | ✅ | ✅ | ✅ | ✅ | ✅ | ❌ |
| ChatGPT | ✅ | ✅ | ❌ | ❌ | ❌ | ✅ |
| Copilot | ✅ | ✅ | ❌ | ❌ | ❌ | ✅ |
| Claude | ❌ | ❌ | ❌ | ✅ | ✅ | ✅ |
| T3.Chat | ❌ | ❌ | ❌ | ❌ | ❌ | ✅ |
## Native integrations
If you want to use any of these integrations, add them on the "Settings" page.
| Integration name | MCP Server URL |
|---|---|
| Google Search | http://localhost:48678/mcp/sse/google/search |
| Spotify | http://localhost:48678/mcp/sse/spotify |
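The URLs suggest these endpoints speak standard MCP over SSE, though the README doesn't spell that out. Assuming they do, here's a minimal sketch of probing one directly with the official MCP TypeScript SDK (`@modelcontextprotocol/sdk`); the client name and version are arbitrary:

```ts
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { SSEClientTransport } from "@modelcontextprotocol/sdk/client/sse.js";

// Connect to the built-in Google Search MCP server over SSE.
const transport = new SSEClientTransport(
  new URL("http://localhost:48678/mcp/sse/google/search")
);
const client = new Client({ name: "mcp-probe", version: "0.0.1" });

await client.connect(transport);

// List the tools the server exposes, then disconnect.
const { tools } = await client.listTools();
console.log(tools.map((t) => t.name));
await client.close();
```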
## Supported LLM providers
| Provider | Notes | API key link | Environment variable |
|---|---|---|---|
| OpenAI | | OpenAI | OPENAI_API_KEY |
| Groq | | Groq | GROQ_API_KEY |
| OpenRouter | | OpenRouter | OPENROUTER_API_KEY |
| Azure | | | AZURE_RESOURCE_NAME, AZURE_API_KEY |
| Ollama | Might not work | | OLLAMA_URL |
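The variables above can go in the `.env` file described in the Run section below. A sketch with placeholder values — the key formats and the Ollama URL are assumptions, not from this README:

```sh
# Fill in only the providers you actually use; all values are placeholders.
OPENAI_API_KEY=sk-...
GROQ_API_KEY=gsk_...
OPENROUTER_API_KEY=sk-or-...

# Azure needs both a resource name and a key.
AZURE_RESOURCE_NAME=my-resource
AZURE_API_KEY=...

# Ollama runs locally; 11434 is its usual default port (assumption).
OLLAMA_URL=http://localhost:11434
```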
## Transcription

If you don't want to use OpenAI, you can run Whisper locally. This requires a bit of setup: install pnpm, then run the following command and wait until the model has finished downloading:

```sh
pnpm whisper-tnode download --model large-v1
```
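The README doesn't cover installing pnpm itself; if you don't already have it, one common route (an assumption on my part — any official install method works) is via npm:

```sh
# Assumes Node.js and npm are already installed.
npm install -g pnpm
pnpm --version   # sanity check
```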
## Run

You can set your environment variables in the `.env` file or through the "Settings" page.

```sh
npm run setup
npm install
npm run dev
```
### LLM provider
An LLM provider is used to generate most responses.
| Provider name | ENV variable | API key link |
|---|---|---|
| OpenAI | OPENAI_API_KEY | OpenAI |
| Groq | GROQ_API_KEY | Groq |