MCP LangGraph Demo
A production-ready project demonstrating the integration of Model Context Protocol (MCP) with LangGraph for building scalable, extensible AI agent systems. This platform showcases microservices architecture, containerized deployment, and modular MCP server integration.
🏗 Architecture Overview
Core Components
Component | Technology | Port | Purpose |
---|---|---|---|
Web UI | Streamlit | 8501 | User interface for AI interactions |
API Gateway | FastAPI | 8000 | Request routing and authentication |
LangGraph Agent | FastAPI + LangGraph | 8001 | AI orchestration and tool execution |
Weather MCP Server | FastMCP | 8002 | Weather data provider |
Math MCP Server | FastMCP | 8003 | Mathematical operations |
🚀 Quick Start
Prerequisites
- Python 3.13+
- UV Package Manager (recommended)
- Docker & Docker Compose
- Git
Environment Setup
Clone the repository:
git clone <repository-url>
cd mcp-langgraph-demo
Install UV (if not already installed):
# Windows
powershell -c "irm https://astral.sh/uv/install.ps1 | iex"

# macOS/Linux
curl -LsSf https://astral.sh/uv/install.sh | sh

# Using pip
pip install uv
Create and activate virtual environment:
uv venv

# Windows
.venv\Scripts\activate

# macOS/Linux
source .venv/bin/activate
Install dependencies:
uv pip install -r requirements.txt
🔧 Configuration
Environment Variables
The project uses environment-specific configuration files located in infrastructure/docker/:
.env.dev (Development Configuration)
# API Configuration
INTERNAL_API_KEY=supersecretkey
API_GATEWAY_URL=http://api-gateway:8000
LANGGRAPH_AGENT_URL=http://langgraph-agent:8001
BACKEND_URL=http://api-gateway:8000
# External Service APIs
GROQ_API_KEY=your_groq_api_key_here
WEATHER_API_KEY=your_openweather_api_key_here
# MCP Tool Configuration
MCP_TOOL_CONFIG={"weather":{"url":"http://weather-server:8002/mcp","transport":"streamable_http"}}
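MCP_TOOL_CONFIG is a JSON object mapping server names to their connection details. A minimal sketch of how a service could parse it at startup (the load_mcp_config helper is illustrative, not a function in this repository):

# Hypothetical helper for reading MCP_TOOL_CONFIG; not part of the repository code.
import json
import os

def load_mcp_config() -> dict:
    """Return the MCP server map, e.g. {"weather": {"url": "...", "transport": "streamable_http"}}."""
    raw = os.environ.get("MCP_TOOL_CONFIG", "{}")
    config = json.loads(raw)
    # Fail fast if an entry is missing the keys the agent expects.
    for name, server in config.items():
        if "url" not in server or "transport" not in server:
            raise ValueError(f"MCP server '{name}' must define 'url' and 'transport'")
    return config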
Required API Keys
- Groq API Key: Get from Groq Console
- OpenWeather API Key: Get from OpenWeatherMap
🐳 Docker Deployment
Development Environment
Start all services with Docker Compose:
cd infrastructure/docker
docker-compose --env-file .env.dev -f docker-compose.dev.yml up --build
Access the application:
- Web UI: http://localhost:8501
- API Gateway: http://localhost:8000
- LangGraph Agent: http://localhost:8001
- Weather MCP Server: http://localhost:8002
Service Health Checks
# Check all services
curl http://localhost:8000/health # API Gateway
curl http://localhost:8001/health # LangGraph Agent
curl http://localhost:8002/health # Weather MCP Server
💻 Local Development
Running Services Individually
1. API Gateway
$env:PYTHONPATH = ".\services\api-gateway"
python -m uvicorn src.main:app --host 0.0.0.0 --port 8000 --reload
2. LangGraph Agent
$env:PYTHONPATH = ".\services\langgraph-agent"
python -m uvicorn src.main:app --host 0.0.0.0 --port 8001 --reload
3. Weather MCP Server
$env:PYTHONPATH = ".\services\mcp-servers\weather-server"
python -m uvicorn src.main:app --host 0.0.0.0 --port 8002 --reload
4. Streamlit UI
cd services/ui
streamlit run src/app.py --server.port 8501
🔌 MCP Server Architecture
Model Context Protocol (MCP)
MCP enables secure, standardized connections between AI models and external tools/data sources. This project demonstrates:
- Tool Integration: Weather data, mathematical operations
- Security: Authentication and request validation
- Scalability: Microservices architecture
- Extensibility: Plugin-based MCP server system
Existing MCP Servers
Weather Server (services/mcp-servers/weather-server/)
- Functionality: Real-time weather data retrieval
- API: OpenWeatherMap integration
- Tools: get_weather(city: str) -> str
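A hedged sketch of how a tool like get_weather might be implemented with FastMCP and OpenWeatherMap's current-weather endpoint; the actual weather-server code may structure its client, settings, and error handling differently:

# Illustrative only; the repository's weather-server may differ.
import os

import httpx
from mcp.server.fastmcp import FastMCP

mcp = FastMCP(name="weather-server")

@mcp.tool()
async def get_weather(city: str) -> str:
    """Return a one-line summary of the current weather in the given city."""
    params = {
        "q": city,
        "appid": os.environ["WEATHER_API_KEY"],
        "units": "metric",
    }
    async with httpx.AsyncClient() as client:
        response = await client.get("https://api.openweathermap.org/data/2.5/weather", params=params)
        response.raise_for_status()
        data = response.json()
    description = data["weather"][0]["description"]
    temperature = data["main"]["temp"]
    return f"{city}: {description}, {temperature}°C"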
Math Server (services/mcp-servers/math-server/)
- Functionality: Mathematical calculations
- Tools: Basic arithmetic and advanced operations
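For contrast, math tools need no external API and can be plain synchronous functions; this is an illustrative sketch rather than the repository's actual math-server code:

# Illustrative only; the actual math-server may expose different tools.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP(name="math-server")

@mcp.tool()
def add(a: float, b: float) -> float:
    """Add two numbers."""
    return a + b

@mcp.tool()
def power(base: float, exponent: float) -> float:
    """Raise base to the given power."""
    return base ** exponent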
Adding New MCP Servers
Create server structure:
mkdir -p services/mcp-servers/your-server/src
cd services/mcp-servers/your-server
Create basic files:
# src/main.py
from mcp.server.fastmcp import FastMCP

from config import settings
from logger import setup_logger

logger = setup_logger("your-server")
mcp = FastMCP(name="your-server")

@mcp.tool()
async def your_tool(input_param: str) -> str:
    """Your tool description."""
    # Implementation here; placeholder result shown for completeness
    result = f"Processed: {input_param}"
    return result

if __name__ == "__main__":
    import uvicorn
    uvicorn.run(mcp.app, host="0.0.0.0", port=8003)
Update configuration:
# Add to .env.dev
MCP_TOOL_CONFIG={"weather":{"url":"http://weather-server:8002/mcp","transport":"streamable_http"},"your-server":{"url":"http://your-server:8003/mcp","transport":"streamable_http"}}
Add to Docker Compose:
your-server:
  build:
    context: ../../services/mcp-servers/your-server
    dockerfile: Dockerfile
  ports:
    - "8003:8003"
  environment:
    - ENV=dev
    - INTERNAL_API_KEY=${INTERNAL_API_KEY}
  restart: on-failure
🤖 LangGraph Integration
Agent Architecture
The LangGraph agent orchestrates tool execution with:
- State Management: Conversation context and history
- Tool Routing: Dynamic selection of appropriate MCP tools
- Error Handling: Graceful fallback mechanisms
- Streaming: Real-time response generation
Agent Flow
# Simplified agent workflow
def create_agent():
tools = load_mcp_tools() # Load from MCP servers
agent = create_graph(
tools=tools,
model=ChatGroq(),
memory=memory_store
)
return agent
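One plausible way to flesh out load_mcp_tools and create_graph is with the langchain-mcp-adapters client and LangGraph's prebuilt ReAct agent; this sketch assumes recent versions of langchain-mcp-adapters, langchain-groq, and langgraph, and is not necessarily the repository's exact implementation:

# Sketch assuming langchain-mcp-adapters, langchain-groq, and langgraph.prebuilt.
import asyncio
import json
import os

from langchain_groq import ChatGroq
from langchain_mcp_adapters.client import MultiServerMCPClient
from langgraph.prebuilt import create_react_agent

async def build_and_run() -> None:
    # The server map comes from the MCP_TOOL_CONFIG environment variable shown above.
    servers = json.loads(os.environ["MCP_TOOL_CONFIG"])
    client = MultiServerMCPClient(servers)
    tools = await client.get_tools()

    # Model name is an example; any Groq-hosted chat model should work.
    agent = create_react_agent(ChatGroq(model="llama-3.1-8b-instant"), tools)
    result = await agent.ainvoke(
        {"messages": [("user", "What is the weather in London?")]}
    )
    print(result["messages"][-1].content)

if __name__ == "__main__":
    asyncio.run(build_and_run())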
📊 RAG Integration Roadmap
Planned RAG Capabilities
Document Ingestion MCP Server
@mcp.tool()
async def ingest_document(file_path: str, metadata: dict) -> str:
    """Ingest documents into vector store."""
Vector Search MCP Server
@mcp.tool()
async def semantic_search(query: str, top_k: int = 5) -> List[dict]:
    """Perform semantic search across documents."""
Knowledge Base Integration
- Vector Stores: Pinecone, Weaviate, ChromaDB
- Embeddings: OpenAI, Cohere, local models
- Document Processing: PDF, Word, Web scraping
Implementation Plan
- Phase 1: Document ingestion MCP server
- Phase 2: Vector search and retrieval
- Phase 3: Hybrid search (semantic + keyword)
- Phase 4: Multi-modal RAG (text, images, code)
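As a taste of Phases 1 and 2, here is a hedged sketch of an ingestion/search pair built on ChromaDB's in-process client; the tool signatures mirror the stubs above, but the server name and storage choices are illustrative and not yet part of the repository:

# Illustrative RAG sketch assuming the chromadb package; not part of the repository yet.
from typing import List

import chromadb
from mcp.server.fastmcp import FastMCP

mcp = FastMCP(name="rag-server")
collection = chromadb.Client().get_or_create_collection("documents")

@mcp.tool()
async def ingest_document(file_path: str, metadata: dict) -> str:
    """Ingest a plain-text document into the vector store."""
    with open(file_path, encoding="utf-8") as handle:
        text = handle.read()
    collection.add(ids=[file_path], documents=[text], metadatas=[metadata])
    return f"Ingested {file_path}"

@mcp.tool()
async def semantic_search(query: str, top_k: int = 5) -> List[dict]:
    """Return the top_k documents most similar to the query."""
    results = collection.query(query_texts=[query], n_results=top_k)
    return [
        {"id": doc_id, "text": text}
        for doc_id, text in zip(results["ids"][0], results["documents"][0])
    ]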
🛠 Development Guidelines
Code Structure
services/
├── api-gateway/ # Request routing and auth
│ ├── src/
│ │ ├── routes/ # API endpoints
│ │ ├── middleware/ # Auth, CORS, logging
│ │ └── config.py # Configuration management
├── langgraph-agent/ # AI orchestration
│ ├── src/
│ │ ├── agents/ # LangGraph agent definitions
│ │ ├── tools/ # MCP tool integrations
│ │ └── routes/ # FastAPI endpoints
├── mcp-servers/ # Tool implementations
│ ├── weather-server/ # Weather data provider
│ └── math-server/ # Mathematical operations
└── ui/ # Frontend interface
└── src/
└── app.py # Streamlit application
🔐 Security
Authentication
- Internal API Keys: Service-to-service communication
- External API Keys: Third-party service access
- CORS Configuration: Frontend security
Best Practices
- Environment-specific configurations
- Secret management via environment variables
- Request validation and sanitization
- Rate limiting and throttling
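A minimal sketch of how the internal key check could be enforced at the gateway with a FastAPI dependency; the X-INTERNAL-KEY header matches the example call below, but the handler body is illustrative:

# Illustrative FastAPI dependency for the X-INTERNAL-KEY header check.
import os

from fastapi import Depends, FastAPI, Header, HTTPException

app = FastAPI()

async def verify_internal_key(x_internal_key: str = Header(...)) -> None:
    """Reject requests whose X-INTERNAL-KEY header does not match the configured key."""
    if x_internal_key != os.environ.get("INTERNAL_API_KEY"):
        raise HTTPException(status_code=401, detail="Invalid internal API key")

@app.post("/ask", dependencies=[Depends(verify_internal_key)])
async def ask(payload: dict) -> dict:
    # The real gateway forwards the query to the LangGraph agent; this is a stub.
    return {"received": payload.get("query")}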
📖 API Documentation
Interactive Documentation
- API Gateway: http://localhost:8000/docs
- LangGraph Agent: http://localhost:8001/docs
- Weather MCP: http://localhost:8002/docs
Example API Calls
# Query the agent
curl -X POST "http://localhost:8000/ask" \
-H "Content-Type: application/json" \
-H "X-INTERNAL-KEY: supersecretkey" \
-d '{"query": "What is the weather in London?"}'
📝 License
This project is licensed under the MIT License - see the LICENSE file for details.
🆘 Support & Troubleshooting
Common Issues
- Import Errors: Ensure PYTHONPATH is set correctly
- Port Conflicts: Check if ports 8000-8502 are available
- API Key Issues: Verify environment variables are set
- Docker Build Failures: Check Dockerfile syntax and dependencies
Getting Help
- Issues: GitHub Issues tracker
- Discussions: GitHub Discussions
- Documentation: Inline code documentation
Built with ❤️ using MCP, LangGraph, and FastAPI