chatbot-mcp-server
This project builds a chatbot server, developed primarily in Jupyter notebooks. It automates interactions with users and processes data via an API, with response generation powered by natural language processing.
Chatbot MCP Server
Overview
This repository implements an MCP-compatible research assistant chatbot system. It consists of:
- An MCP server that exposes research tools for searching and extracting arXiv paper information.
- An MCP client chatbot that interacts with the server and uses an LLM (Anthropic Claude) to process user queries and call tools.
- Jupyter notebooks that provide educational lessons, demos, and development workflows.
About This Project
This project follows the DeepLearning.AI course on understanding MCP (Model Context Protocol) and building a chatbot server as a demo. For more details and learning resources, see:
MCP: Build Rich-Context AI Apps with Anthropic (DeepLearning.AI Short Course)
Main Components
research_server.py:
The MCP server, exposing two tools:
- search_papers(topic, max_results): Search arXiv and save paper metadata.
- extract_info(paper_id): Retrieve metadata for a specific paper.
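The save-and-lookup behavior behind these two tools can be sketched in plain Python. This is a hypothetical sketch: the papers/ JSON layout, function names, and metadata fields are assumptions, not the repository's actual implementation; the real server would wrap this logic in MCP tool definitions and query arXiv via the arxiv client.

```python
# Hypothetical sketch of the storage logic behind the two tools.
# The papers/<topic>/papers_info.json layout is an assumption.
import json
from pathlib import Path

PAPER_DIR = Path("papers")

def save_papers(topic: str, papers: list[dict]) -> list[str]:
    """Persist paper metadata under papers/<topic>/papers_info.json and
    return the stored IDs (roughly what search_papers might do after
    querying arXiv)."""
    topic_dir = PAPER_DIR / topic.lower().replace(" ", "_")
    topic_dir.mkdir(parents=True, exist_ok=True)
    info_file = topic_dir / "papers_info.json"
    # Merge with any metadata previously saved for this topic.
    existing = json.loads(info_file.read_text()) if info_file.exists() else {}
    for paper in papers:
        existing[paper["id"]] = {
            "title": paper["title"],
            "summary": paper["summary"],
        }
    info_file.write_text(json.dumps(existing, indent=2))
    return list(existing)

def extract_info(paper_id: str) -> str:
    """Scan every saved topic for the given paper ID (roughly what the
    extract_info tool might do) and return its metadata as JSON."""
    for info_file in PAPER_DIR.glob("*/papers_info.json"):
        papers = json.loads(info_file.read_text())
        if paper_id in papers:
            return json.dumps(papers[paper_id], indent=2)
    return f"No saved information for paper {paper_id}."
```

Keeping the storage logic separate from the MCP wiring like this makes the tools easy to test without a running server.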
mcp_chatbot.py:
The MCP client chatbot, which:
- Connects to the server via stdio.
- Uses Anthropic Claude for LLM-based chat.
- Handles tool-calling and interactive chat loop.
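The tool-calling half of the chat loop can be sketched independently of the Anthropic SDK. This is a hypothetical sketch: the tool names, dispatch table, and message dictionaries are illustrative assumptions; in the real client the tool-call requests come from Claude's tool_use response blocks, and the results are sent back in the follow-up message.

```python
# Hypothetical dispatch loop: map tool names (as requested by the LLM)
# to local callables and package the results for the next model turn.

def search_papers(topic: str, max_results: int = 5) -> list[str]:
    # Stand-in for the real tool, which would call the MCP server.
    return [f"paper-{i}-on-{topic}" for i in range(max_results)]

TOOLS = {"search_papers": search_papers}

def dispatch_tool_calls(tool_calls: list[dict]) -> list[dict]:
    """Run each requested tool and wrap its output in the tool_result
    shape the chat loop would send back to the model."""
    results = []
    for call in tool_calls:
        func = TOOLS.get(call["name"])
        if func is None:
            content = f"Unknown tool: {call['name']}"
        else:
            content = str(func(**call["input"]))
        results.append({
            "type": "tool_result",
            "tool_use_id": call["id"],
            "content": content,
        })
    return results
```

The loop then repeats: results go back to the model, which either calls more tools or produces a final text answer for the user.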
Jupyter Notebooks:
- Provide step-by-step lessons on building the server, client, and integrating with MCP.
- Include code examples, explanations, and environment setup instructions.
Dependencies
- Python 3.13+
- anthropic (LLM API)
- arxiv (arXiv API)
- mcp (MCP protocol client/server)
- nest-asyncio, python-dotenv
- (See pyproject.toml for details)
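Based on the dependency list above, the relevant section of pyproject.toml likely looks something like the following sketch (the exact entries and any version pins are assumptions, not taken from the repository):

```toml
[project]
name = "chatbot-mcp-server"
requires-python = ">=3.13"
dependencies = [
    "anthropic",
    "arxiv",
    "mcp",
    "nest-asyncio",
    "python-dotenv",
]
```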
Installation
Clone the repository:

git clone <repo-url>
cd chatbot-mcp-server

Set up a virtual environment:

python3 -m venv .venv
source .venv/bin/activate

Install dependencies:

pip install -r requirements.txt
# or, if using uv:
uv pip install -r requirements.txt

Set up environment variables:
- Create a .env file with your Anthropic API key and any other required variables.
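For reference, a minimal .env might look like this (the value is a placeholder; ANTHROPIC_API_KEY is the variable the anthropic Python SDK reads by default):

```shell
# .env
ANTHROPIC_API_KEY=your-api-key-here
```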
Usage
Running the MCP Server
uv run research_server.py
- This starts the server and exposes the research tools via the MCP protocol.
Running the Chatbot Client
python mcp_chatbot.py
- This launches the interactive chatbot, which connects to the server and allows you to query research papers.
Example Workflow
- Start the server (uv run research_server.py).
- In another terminal, start the chatbot (python mcp_chatbot.py).
- Type a research query (e.g., "Find recent papers on quantum computing").
- The chatbot will use the LLM to process your query, call the appropriate tool, and return results.
Notebooks and Demos
- chatbot-mcp-client.ipynb: Lesson on building the MCP client chatbot.
- chatbot-mcp-server-demo.ipynb: Lesson on building the MCP server.
- chatbot-demo.ipynb: Standalone chatbot example.
- chatbot-mcp-server.ipynb: Full example integrating server and client.
These notebooks are recommended for step-by-step learning and experimentation.
Project Structure
chatbot-mcp-server/
├── mcp_chatbot.py
├── research_server.py
├── main.py
├── papers/
├── .venv/ or venv/
├── pyproject.toml
├── uv.lock
├── .python-version
├── README.md
├── *.ipynb (notebooks)
└── .gitignore
Contributing
Contributions are welcome! Please open issues or submit pull requests for improvements, bug fixes, or new features.