# MCP CLI LangChain Demo
This project demonstrates how to use the `langchain-mcp-adapters` library to connect a LangChain agent with multiple Model Context Protocol (MCP) servers, allowing the agent to leverage tools provided by those servers.
## Features
- Loads MCP server configurations from a YAML file (`mcp_servers_config.yaml`).
- Connects to multiple MCP servers concurrently (examples include local Python scripts for Math and Weather, and the official SQLite reference server).
- Integrates MCP tools seamlessly into a LangChain agent using `langchain-mcp-adapters` (see the sketch after this list).
- Includes a simple demonstration script (`main.py`) to showcase loading configurations and fetching tools.
- Provides an interactive chat interface (`chat_interface.py`) powered by a local Ollama model (`llama3.3` by default) that can utilize the configured MCP tools.
- The chat interface includes helpful custom commands:
  - `/list_servers`: List the names of servers defined in the configuration file.
  - `/list_tools`: List the names of all available tools and the server providing them.
  - `/list_tools_details`: List detailed information (name, description) for all tools, grouped by server.
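For orientation, here is a minimal sketch of how MCP tools can be wired into a LangChain agent with `langchain-mcp-adapters` and LangGraph's prebuilt ReAct agent. The inline connection settings are hypothetical (the real ones live in `mcp_servers_config.yaml`), and the client API has changed across adapter versions (older releases used `async with MultiServerMCPClient(...)` instead of `await client.get_tools()`), so `main.py` may differ:

```python
import asyncio

from langchain_mcp_adapters.client import MultiServerMCPClient
from langchain_ollama import ChatOllama
from langgraph.prebuilt import create_react_agent


async def main() -> None:
    # Hypothetical inline connection settings; in this project the real
    # values come from mcp_servers_config.yaml.
    client = MultiServerMCPClient(
        {
            "math": {
                "command": "python",
                "args": ["math_server.py"],  # path to a local script server
                "transport": "stdio",
            },
        }
    )
    tools = await client.get_tools()  # collect tools from all configured servers
    agent = create_react_agent(ChatOllama(model="llama3.3"), tools)
    result = await agent.ainvoke(
        {"messages": [{"role": "user", "content": "What is (3 + 5) * 12?"}]}
    )
    print(result["messages"][-1].content)


asyncio.run(main())
```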
## Setup
- Clone this repository:
  ```bash
  git clone <your-repository-url>
  cd mcp_cli_langchain
  ```
- Python Environment: Requires Python 3.x. Create and activate a virtual environment:
  ```bash
  python -m venv venv
  source venv/bin/activate  # On Windows use `venv\Scripts\activate`
  ```
- Install Dependencies: Install the required Python packages:
  ```bash
  pip install -r requirements.txt
  ```
  (Note: `requirements.txt` includes `python-dotenv`, which is needed to load the `.env` file.)
- Install `langchain-mcp-adapters` (Editable): This project uses a potentially modified local version of the adapter library. Install it in editable mode:
  ```bash
  # Ensure you are in the project root directory (mcp_cli_langchain)
  cd langchain-mcp-adapters
  pip install -e .
  cd ..
  ```
- Clone MCP Reference Servers: The configuration uses the official SQLite reference server. Clone the repository into the project root:
  ```bash
  git clone https://github.com/modelcontextprotocol/servers.git
  ```
- Install `uv`: The SQLite server configuration uses `uv` to run. Install `uv` by following the instructions at https://github.com/astral-sh/uv.
- Set up Ollama:
  - Ensure Ollama is installed and the service is running.
  - Pull the required model (defaults to `llama3.3` in `chat_interface.py`):
    ```bash
    ollama pull llama3.3
    ```
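The `math` and `weather` entries in the default configuration point at local Python script servers. As a rough illustration of what such a server can look like, here is a sketch using the `FastMCP` helper from the official `mcp` Python SDK; the actual scripts shipped with this repo may differ:

```python
# math_server.py (illustrative only; the repo's actual script may differ)
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("Math")


@mcp.tool()
def add(a: int, b: int) -> int:
    """Add two numbers."""
    return a + b


@mcp.tool()
def multiply(a: int, b: int) -> int:
    """Multiply two numbers."""
    return a * b


if __name__ == "__main__":
    # Communicate with the MCP client over stdin/stdout.
    mcp.run(transport="stdio")
```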
## Configuration
- MCP Servers: MCP server connections are defined in `mcp_servers_config.yaml`.
  - You can add, remove, or modify server entries in this file.
  - Pay attention to paths (e.g., for local script servers or the cloned `servers` directory), ensuring they are correct relative to the project root (`mcp_cli_langchain`).
  - The default configuration includes `math`, `weather`, and `sqlite` servers (see the loading sketch below).
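Below is a minimal sketch of reading this configuration and listing the defined servers, mirroring what the `/list_servers` command does. The YAML shape shown in the comment is an assumption, so check `mcp_servers_config.yaml` for the real structure:

```python
import yaml  # provided by the PyYAML package

# Assumed (hypothetical) shape of mcp_servers_config.yaml:
#
#   math:
#     command: python
#     args: [./math_server.py]
#     transport: stdio
#
with open("mcp_servers_config.yaml") as f:
    config = yaml.safe_load(f)

# Print the configured server names, as /list_servers does in the chat UI.
for name in config:
    print(name)
```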
- Ollama: Ollama settings for the chat interface are configured via the `.env` file in the project root (see the loading sketch after this list).
  - `OLLAMA_MODEL`: Specifies the Ollama model to use (e.g., `llama3.3`).
  - `OLLAMA_BASE_URL`: The base URL for your running Ollama instance (e.g., `http://localhost:11434`).
  - `OLLAMA_TEMPERATURE`: Controls the creativity/randomness of the model's output (e.g., `0.8`).
  - Create a `.env` file if it doesn't exist, based on the example:
    ```bash
    # .env example
    OLLAMA_MODEL=llama3.3
    OLLAMA_BASE_URL=http://localhost:11434
    OLLAMA_TEMPERATURE=0.8
    ```
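As a rough sketch of how these values might be consumed, the snippet below loads them with `python-dotenv` and builds the model via the `langchain-ollama` package; the actual wiring in `chat_interface.py` may differ:

```python
import os

from dotenv import load_dotenv
from langchain_ollama import ChatOllama

load_dotenv()  # read the OLLAMA_* variables from the .env file

# Fall back to the documented defaults if a variable is unset.
llm = ChatOllama(
    model=os.getenv("OLLAMA_MODEL", "llama3.3"),
    base_url=os.getenv("OLLAMA_BASE_URL", "http://localhost:11434"),
    temperature=float(os.getenv("OLLAMA_TEMPERATURE", "0.8")),
)

print(llm.invoke("Say hello in one word.").content)
```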
## Running the Project
Ensure your virtual environment is activated