MCP CLI LangChain Demo

This project demonstrates how to use the langchain-mcp-adapters library to connect a LangChain agent with multiple Model Context Protocol (MCP) servers, allowing the agent to leverage tools provided by these servers.
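
As a rough illustration of the core mechanism, the sketch below connects to two stdio servers and prints the names of the tools they expose. This is a minimal sketch assuming the current upstream langchain-mcp-adapters API (older releases used the client as an async context manager, and the vendored copy in this repository may differ); the script paths are hypothetical stand-ins for the local servers described below.

    import asyncio

    from langchain_mcp_adapters.client import MultiServerMCPClient

    async def main() -> None:
        # One entry per server, mirroring the entries in mcp_servers_config.yaml.
        client = MultiServerMCPClient(
            {
                "math": {
                    "command": "python",
                    "args": ["math_server.py"],  # hypothetical script path
                    "transport": "stdio",
                },
                "weather": {
                    "command": "python",
                    "args": ["weather_server.py"],  # hypothetical script path
                    "transport": "stdio",
                },
            }
        )
        # Fetch every MCP tool from the connected servers as LangChain tools.
        tools = await client.get_tools()
        print([tool.name for tool in tools])

    asyncio.run(main())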

Features
  • Loads MCP server configurations from a YAML file (mcp_servers_config.yaml).
  • Connects to multiple MCP servers concurrently (examples include local Python scripts for Math and Weather, and the official SQLite reference server).
  • Integrates MCP tools seamlessly into a LangChain agent using langchain-mcp-adapters.
  • Includes a simple demonstration script (main.py) to showcase loading configurations and fetching tools.
  • Provides an interactive chat interface (chat_interface.py) powered by a local Ollama model (llama3.3 by default) that can utilize the configured MCP tools; a minimal sketch of this wiring appears after this list.
  • The chat interface includes helpful custom commands:
    • /list_servers: List the names of servers defined in the configuration file.
    • /list_tools: List the names of all available tools and the server providing them.
    • /list_tools_details: List detailed information (name, description) for all tools, grouped by server.
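
The chat interface essentially binds these MCP tools to a local Ollama model through a prebuilt agent. Below is a minimal sketch of that wiring, assuming the langchain-ollama and langgraph packages and the current langchain-mcp-adapters API; chat_interface.py may structure this differently, and the script path shown is hypothetical.

    import asyncio
    import os

    from dotenv import load_dotenv
    from langchain_mcp_adapters.client import MultiServerMCPClient
    from langchain_ollama import ChatOllama
    from langgraph.prebuilt import create_react_agent

    async def main() -> None:
        load_dotenv()  # read the OLLAMA_* settings from .env (see Configuration)

        model = ChatOllama(
            model=os.getenv("OLLAMA_MODEL", "llama3.3"),
            base_url=os.getenv("OLLAMA_BASE_URL", "http://localhost:11434"),
            temperature=float(os.getenv("OLLAMA_TEMPERATURE", "0.8")),
        )

        # Same connection shape as the earlier sketch; one server shown here.
        client = MultiServerMCPClient(
            {
                "math": {
                    "command": "python",
                    "args": ["math_server.py"],  # hypothetical script path
                    "transport": "stdio",
                },
            }
        )
        tools = await client.get_tools()

        # A ReAct-style agent that decides when to call the MCP tools.
        agent = create_react_agent(model, tools)
        result = await agent.ainvoke({"messages": [("user", "What is 3 + 5?")]})
        print(result["messages"][-1].content)

    asyncio.run(main())
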
Setup
  1. Clone this repository:
    git clone <your-repository-url>
    cd mcp_cli_langchain
    
  2. Python Environment: Requires Python 3.x. Create and activate a virtual environment:
    python -m venv venv
    source venv/bin/activate  # On Windows use `venv\Scripts\activate`
    
  3. Install Dependencies: Install required Python packages:
    pip install -r requirements.txt
    
    (Note: requirements.txt includes python-dotenv which is needed to load the .env file.)
  4. Install langchain-mcp-adapters (Editable): This project uses a potentially modified local version of the adapter library. Install it in editable mode:
    # Ensure you are in the project root directory (mcp_cli_langchain)
    cd langchain-mcp-adapters
    pip install -e .
    cd ..
    
  5. Clone MCP Reference Servers: The configuration uses the official SQLite reference server. Clone the repository into the project root:
    git clone https://github.com/modelcontextprotocol/servers.git
    
  6. Install uv: The SQLite server entry in the configuration is launched via uv. Install uv by following the instructions at https://github.com/astral-sh/uv.
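    For example, uv is published on PyPI, so one documented option is to install it into the active virtual environment:
    pip install uv
    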
  7. Setup Ollama:
    • Ensure Ollama is installed and the service is running.
    • Pull the required model (defaults to llama3.3 in chat_interface.py):
      ollama pull llama3.3
      
Configuration
  • MCP Servers: MCP server connections are defined in mcp_servers_config.yaml.
    • You can add, remove, or modify server entries in this file.
    • Pay attention to paths (e.g., for local script servers or the cloned servers directory), ensuring they are correct relative to the project root (mcp_cli_langchain).
    • The default configuration includes math, weather, and sqlite servers; an illustrative sketch of the file's shape appears at the end of this section.
  • Ollama: Ollama settings for the chat interface are configured via the .env file in the project root.
    • OLLAMA_MODEL: Specifies the Ollama model to use (e.g., llama3.3).
    • OLLAMA_BASE_URL: The base URL for your running Ollama instance (e.g., http://localhost:11434).
    • OLLAMA_TEMPERATURE: Controls the creativity/randomness of the model's output (e.g., 0.8).
    • Create a .env file if it doesn't exist, based on the example:
    # .env example
    OLLAMA_MODEL=llama3.3
    OLLAMA_BASE_URL=http://localhost:11434
    OLLAMA_TEMPERATURE=0.8
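
For reference, a server entry in mcp_servers_config.yaml has roughly the shape sketched below. This is illustrative only: the key names and paths here are assumptions, so check the file shipped with the repository for the exact schema.

    # mcp_servers_config.yaml (illustrative shape only)
    math:
      command: python
      args:
        - math_server.py          # hypothetical local script path
      transport: stdio
    sqlite:
      command: uv
      args:
        - --directory
        - servers/src/sqlite      # inside the cloned reference-servers repo
        - run
        - mcp-server-sqlite
        - --db-path
        - ./test.db               # hypothetical database path
      transport: stdio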
    
Running the Project

Ensure your virtual environment is activated before running either script.
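
Then run the demo script or the interactive chat. Assuming the entry points match the filenames referenced above:

    # One-shot demo: load the configuration and list the fetched tools
    python main.py

    # Interactive chat (Ollama model + MCP tools, with the /list_* commands)
    python chat_interface.py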
