Gemini-MCP-CLI
Gemini-MCP-CLI is a simple command-line chat application powered by Google's Gemini model that functions as an MCP host. It leverages tools from multiple MCP servers and supports dynamic tool discovery and runtime server management, so users can add new MCP servers while the application is running.
MCP Gemini Chat CLI 💬
A simple command-line interface (CLI) chat application powered by Google's Gemini model that acts as an MCP (Model Context Protocol) host.
✨ Features
- 🤖 Gemini Powered: Uses the Gemini API (gemini-2.0-flash-001 by default) for conversational responses.
- 🔌 MCP Host & Client: Connects to and leverages tools from multiple MCP servers.
- 🔎 Dynamic Tool Discovery: Automatically discovers available tools from connected servers (sketched after this list).
- ⚡ Runtime Server Management: Add new MCP servers dynamically while the chat app is running using the add_server command.
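For context, dynamic tool discovery boils down to asking each connected server what it offers. A minimal sketch with the mcp Python SDK (the discover_tools helper is illustrative, not code from this repo):

```python
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client


async def discover_tools(server_script: str) -> None:
    """Connect to one MCP server over stdio and print the tools it exposes."""
    params = StdioServerParameters(command="python", args=[server_script])
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            result = await session.list_tools()
            for tool in result.tools:
                print(f"- {tool.name}: {tool.description}")


# e.g. asyncio.run(discover_tools("mcp_server_calc.py"))
```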
🛠️ Prerequisites
- 🐍 Python 3.10+
- 📦 uv (or pip) for package management.
- 🔑 Google Gemini API Key.
🚀 Quick Start
1. Clone & Enter Directory:
   git clone <your-repo-url>  # Replace <your-repo-url> with the actual repo URL
   cd mcp-gemini-chat
2. Create Virtual Environment & Activate:
   uv venv
   # On Linux/macOS
   source .venv/bin/activate
   # On Windows
   # .venv\Scripts\activate
3. Install Dependencies: (Installs mcp, google-genai, python-dotenv, httpx)
   uv pip install -r requirements.txt
4. Configure API Key: ⚠️ Replace YOUR_GEMINI_API_KEY with your actual key!
   echo "GEMINI_API_KEY=YOUR_GEMINI_API_KEY" > .env
5. Run the App: (Optionally specify MCP server scripts to connect on startup)
   # Example assuming the app script is in a 'src' directory
   python src/mcp_chat_app.py [path/to/server1.py] [path/to/server2.py]
   # Example without servers at startup
   # python src/mcp_chat_app.py
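If step 4 is in doubt, a few standalone lines with python-dotenv and the google-genai SDK (not part of this repo) will confirm the key is picked up and the default model responds:

```python
import os

from dotenv import load_dotenv
from google import genai

load_dotenv()  # reads GEMINI_API_KEY from the .env file created above
client = genai.Client(api_key=os.environ["GEMINI_API_KEY"])

# One trivial request proves the key works with the app's default model.
reply = client.models.generate_content(model="gemini-2.0-flash-001", contents="ping")
print(reply.text)
```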
🌊 Flow of Calls
Basic Chat Flow (No Tools)
👤 User Input → 🤖 MCP Gemini Chat App → ✨ Gemini API → 🖥️ Display to User
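In code, this path is just a read-eval-print loop around the Gemini chat API. A minimal sketch with the google-genai SDK (illustrative only; the repo's own loop also handles add_server and tool calls):

```python
import os

from dotenv import load_dotenv
from google import genai

load_dotenv()
client = genai.Client(api_key=os.environ["GEMINI_API_KEY"])

# A chat object keeps the conversation history between turns.
chat = client.chats.create(model="gemini-2.0-flash-001")

while True:
    user_input = input("You: ")                   # 👤 User Input
    if user_input.strip().lower() == "quit":
        break
    response = chat.send_message(user_input)      # ✨ Gemini API
    print(f"Gemini: {response.text}")             # 🖥️ Display to User
```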
Tool Usage Flow (e.g., Calculator, Weather)
- 👤 User Input → 🤖 MCP Gemini Chat App → ✨ Gemini API
- ✨ Gemini decides to use a tool → 🔌 MCP Client → ⚙️ MCP Server (e.g., Calculator Server)
- ⚙️ Server executes the tool → 📄 Returns result → 🔌 MCP Client
- 📄 Result → ✨ Gemini API → 🤔 Synthesizes final response
- 🗣️ Final response → 🖥️ Display to User
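A condensed sketch of steps 2–4 above, assuming an already-connected MCP ClientSession (session) and the servers' tools already converted into Gemini function declarations (gemini_tools); every name here is illustrative rather than taken from the repo:

```python
from google.genai import types


async def answer_with_tools(client, session, user_input, gemini_tools):
    """One round trip: ask Gemini, run any requested MCP tool, then synthesize."""
    config = types.GenerateContentConfig(tools=gemini_tools)
    contents = [types.Content(role="user", parts=[types.Part.from_text(text=user_input)])]

    response = client.models.generate_content(
        model="gemini-2.0-flash-001", contents=contents, config=config
    )
    part = response.candidates[0].content.parts[0]

    if part.function_call:  # ✨ Gemini decided to use a tool
        call = part.function_call
        result = await session.call_tool(call.name, dict(call.args))  # ⚙️ MCP server runs it

        # 📄 Feed the tool result back so Gemini can synthesize the final response.
        contents.append(response.candidates[0].content)
        contents.append(types.Content(
            role="user",
            parts=[types.Part.from_function_response(
                name=call.name, response={"result": result.content}
            )],
        ))
        response = client.models.generate_content(
            model="gemini-2.0-flash-001", contents=contents, config=config
        )

    return response.text  # 🗣️ Final response
```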
▶️ Usage Guide
Running the Application
Start the application script, optionally providing paths to MCP server scripts you want to connect to immediately.
# Start without any initial servers
python src/mcp_chat_app.py
# Start and connect to specific servers
python src/mcp_chat_app.py mcp_server_weather.py mcp_server_calc.py
Adding MCP Servers
You can add more tools by connecting to more MCP servers:
1. At Startup:
Provide the paths to your server scripts as command-line arguments when starting mcp_chat_app.py:
python src/mcp_chat_app.py path/to/mcp_server_weather.py path/to/mcp_server_calc.py
2. At Runtime:
Use the add_server command within the chat interface:
You: add_server path/to/my_other_server.py
The application will attempt to connect and integrate the tools from the new server.
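The repo's exact implementation isn't shown on this page, but the usual pattern with the mcp SDK is to keep each new server's stdio session open on an AsyncExitStack so its tools remain available for later turns. A sketch with illustrative names:

```python
from contextlib import AsyncExitStack

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

exit_stack = AsyncExitStack()              # keeps every server connection alive
sessions: dict[str, ClientSession] = {}    # one session per connected server script


async def add_server(script_path: str) -> list[str]:
    """Launch an MCP server script, keep its session open, and return its tool names."""
    params = StdioServerParameters(command="python", args=[script_path])
    read, write = await exit_stack.enter_async_context(stdio_client(params))
    session = await exit_stack.enter_async_context(ClientSession(read, write))
    await session.initialize()

    tool_names = [tool.name for tool in (await session.list_tools()).tools]
    sessions[script_path] = session
    return tool_names


# In the chat loop: "add_server path/to/my_other_server.py" -> await add_server(path)
```

The same connection path can serve the startup case by looping over the script paths passed on the command line.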
Example Session
$ python src/mcp_chat_app.py mcp_server_calc.py
# ... Initialization logs ...
MCP Gemini Chat App
Enter your message, 'add_server <path>' to add a server, or 'quit' to exit.
You: hi
Gemini: Hello! How can I help you today?
You: what is 5 plus 12?
# ... Tool call logs ...
Gemini: 5 plus 12 is 17.0.
You: add_server mcp_server_weather.py
# ... Connection and tool discovery logs ...
Gemini: Successfully added server 'mcp_server_weather.py' with tools: ['get_alerts', 'get_forecast']
You: any weather alerts in London?
# ... Tool call logs ...
Gemini: # ... (Response about weather alerts in London) ...
You: quit
# ... Cleanup logs ...
$