MCP Customer Service Assistant: Building AI Integrations with the Model Context Protocol
This project contains working examples for building AI integrations using the Model Context Protocol (MCP).
Overview
Learn how to build standardized AI integrations that work across multiple platforms using MCP.
- Build MCP servers with FastMCP framework
- Create resources, tools, and prompts for AI models
- Integrate with Claude Desktop, OpenAI, the OpenAI Agents SDK, Anthropic, LangChain, DSPy, and LiteLLM
- Implement async operations for high performance (see the sketch after this list)
- Use Pydantic for data validation and type safety
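For instance, FastMCP accepts plain `async def` coroutines as tool handlers, so slow I/O doesn't block the server. A minimal sketch, assuming the standard FastMCP decorator API; the order lookup is a hypothetical stand-in, not code from this repo:

```python
import asyncio
from fastmcp import FastMCP

mcp = FastMCP("demo")

@mcp.tool()
async def check_order_status(order_id: str) -> str:
    """Async tool: the await point lets the server handle other requests meanwhile."""
    await asyncio.sleep(0.1)  # stands in for an awaited database or API call
    return f"Order {order_id}: shipped"

if __name__ == "__main__":
    mcp.run()
```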
Prerequisites
- Python 3.12.9 (managed via pyenv)
- Poetry for dependency management
- Go Task for build automation
- API key for OpenAI or Anthropic (Claude) OR Ollama installed locally
Setup
1. Clone this repository.
2. Copy .env.example to .env and configure your LLM provider:
   cp .env.example .env
3. Edit .env to select your provider and model:
   - For OpenAI: set LLM_PROVIDER=openai and add your API key
   - For Claude: set LLM_PROVIDER=anthropic and add your API key
   - For Ollama: set LLM_PROVIDER=ollama (install Ollama and pull the gemma3:27b model first)
4. Run the setup task:
   task setup
Supported LLM Providers
OpenAI
- Model: gpt-4.1-2025-04-14
- Requires: OpenAI API key
Anthropic (Claude)
- Model: claude-sonnet-4-20250514
- Requires: Anthropic API key
Ollama (Local)
- Model: gemma3:27b
- Requires: Ollama installed and gemma3:27b model pulled
- Install: brew install ollama (macOS) or see ollama.ai
- Pull model: ollama pull gemma3:27b
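To confirm Ollama is ready before running the examples, you can query its local HTTP API (the default port is 11434); /api/tags lists the models you have pulled:

```bash
ollama serve &                         # start the server if it isn't already running
curl http://localhost:11434/api/tags   # gemma3:27b should appear in the returned list
```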
Project Structure
.
├── src/
│ ├── __init__.py
│ ├── config.py # LLM configuration
│ ├── main.py # MCP server implementation
│ ├── openai_integration.py # OpenAI MCP integration
│ ├── openai_agents_integration.py # OpenAI Assistant MCP integration
│ ├── anthropic_integration.py # Anthropic MCP integration
│ ├── langchain_integration.py # LangChain MCP integration
│ ├── dspy_integration.py # DSPy MCP integration
│ └── litellm_integration.py # LiteLLM MCP integration
├── tests/
│ └── test_mcp_server.py # Unit tests
├── .env.example # Environment template
├── Taskfile.yml # Task automation
├── server_config.json # MCP server configuration
└── pyproject.toml # Poetry configuration
Key Concepts Demonstrated
- MCP Architecture: Three-layer system with hosts, clients, and servers
- Resources: Standardized data access through custom URI schemes
- Tools: AI-executable functions for performing actions
- Prompts: Structured templates for consistent AI behavior
- FastMCP Framework: Simplified MCP server development with a FastAPI-style decorator API (sketched below)
- Multi-Platform Integration: Connect once, use everywhere
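To make the resources, tools, and prompts concrete, here is a minimal FastMCP server sketch in the spirit of this project. The customer:// URI scheme, the in-memory data, and the handler names are illustrative assumptions, not the repository's actual implementation:

```python
from fastmcp import FastMCP
from pydantic import BaseModel

mcp = FastMCP("customer-service")

# Hypothetical in-memory store standing in for a real backend
CUSTOMERS = {"42": {"name": "Ada", "tier": "gold"}}

class Ticket(BaseModel):
    customer_id: str
    issue: str

@mcp.resource("customer://{customer_id}")
def get_customer(customer_id: str) -> dict:
    """Resource: standardized data access through a custom URI scheme."""
    return CUSTOMERS.get(customer_id, {})

@mcp.tool()
def open_ticket(ticket: Ticket) -> str:
    """Tool: an AI-executable action, with arguments validated by Pydantic."""
    return f"Ticket opened for customer {ticket.customer_id}: {ticket.issue}"

@mcp.prompt()
def triage(issue: str) -> str:
    """Prompt: a structured template for consistent AI behavior."""
    return f"Classify the severity of this customer issue: {issue}"

if __name__ == "__main__":
    mcp.run()  # defaults to stdio transport
```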
Running Examples
Run the MCP server:
task run
Or run individual integration examples:
task run-openai # OpenAI integration
task run-openai-agents # OpenAI Assistant integration
task run-anthropic # Anthropic integration
task run-langchain # LangChain integration
task run-dspy # DSPy integration
task run-litellm # LiteLLM integration
Direct Python execution:
poetry run python src/main.py
poetry run python src/openai_integration.py
poetry run python src/openai_agents_integration.py
poetry run python src/anthropic_integration.py
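Each integration script is a different MCP client talking to the same server. As a rough sketch of what that looks like with the official mcp Python SDK (the code in this repo's integrations may differ), a client can spawn the server over stdio and discover its tools:

```python
import asyncio
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def main() -> None:
    # Spawn the MCP server as a subprocess, speaking JSON-RPC over stdio
    params = StdioServerParameters(command="poetry", args=["run", "python", "src/main.py"])
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()          # MCP handshake
            tools = await session.list_tools()  # discover the server's tools
            print([tool.name for tool in tools.tools])

asyncio.run(main())
```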
Available Tasks
- task setup - Set up Python environment and install dependencies
- task run - Run the MCP server
- task test - Run unit tests
- task format - Format code with Black and Ruff
- task clean - Clean up generated files
- task build - Build the package for distribution
- task install-global - Install the package globally for use with uvx
- task install-claude - Install and show Claude Desktop configuration
Installation for Claude Desktop
Quick Setup
1. Clone and set up the project:
   git clone <repository-url>
   cd mcp_article1
   task setup  # or: poetry install
2. Get the Claude Desktop configuration:
   task install-claude
3. Add the configuration to Claude Desktop. The configuration uses a shell script wrapper:
   {
     "mcpServers": {
       "customer-service": {
         "command": "/path/to/mcp_article1/run-mcp-server.sh"
       }
     }
   }
4. Restart Claude Desktop to load the MCP server.
Why the Shell Script?
The run-mcp-server.sh script (sketched after this list) ensures:
- The correct working directory is set
- The virtual environment is activated
- All dependencies are available
- The server runs in the proper context
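The repository's exact script isn't reproduced here, but a minimal wrapper satisfying those four points might look like this (paths assumed):

```bash
#!/usr/bin/env bash
# Hypothetical sketch of run-mcp-server.sh, not the repo's exact script
set -e
cd "$(dirname "$0")"          # set the correct working directory (the project root)
source .venv/bin/activate     # activate the in-project virtual environment
exec python src/main.py       # run the server with all dependencies available
```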
Troubleshooting Claude Desktop Integration
If the server doesn't appear in Claude Desktop:
- Check the logs: Look for errors in Claude Desktop's developer console
- Test the script manually: ./run-mcp-server.sh
- Verify the path: Make sure the command path in the config is absolute
- Check permissions: Ensure the script is executable (chmod +x run-mcp-server.sh)
Virtual Environment Setup Instructions
Prerequisites
1. Install pyenv (if not already installed):
   # macOS
   brew install pyenv
   # Linux
   curl https://pyenv.run | bash
2. Add pyenv to your shell:
   # Add to ~/.zshrc or ~/.bashrc
   echo 'export PYENV_ROOT="$HOME/.pyenv"' >> ~/.zshrc
   echo 'command -v pyenv >/dev/null || export PATH="$PYENV_ROOT/bin:$PATH"' >> ~/.zshrc
   echo 'eval "$(pyenv init -)"' >> ~/.zshrc
   # Reload shell
   source ~/.zshrc
Setup Steps
1. Install Python 3.12.9:
   pyenv install 3.12.9
2. Navigate to your project directory:
   cd /path/to/mcp-customer-service
3. Set the local Python version:
   pyenv local 3.12.9
4. Install Poetry (if not installed):
   curl -sSL https://install.python-poetry.org | python3 -
5. Install project dependencies:
   poetry install
6. Activate the virtual environment:
   poetry config virtualenvs.in-project true
   source .venv/bin/activate
Alternative: If you have Go Task installed
Simply run:
brew install go-task
task setup
Configure your LLM provider
1. Copy the example env file:
   cp .env.example .env
2. Edit .env and set your provider:
   # For OpenAI
   LLM_PROVIDER=openai
   OPENAI_API_KEY=your-key-here
   OPENAI_MODEL=gpt-4.1-2025-04-14

   # For Anthropic/Claude
   LLM_PROVIDER=anthropic
   ANTHROPIC_API_KEY=your-key-here
   ANTHROPIC_MODEL=claude-sonnet-4-20250514

   # For Ollama (local)
   LLM_PROVIDER=ollama
   OLLAMA_MODEL=gemma3:27b
   # Make sure Ollama is running: ollama serve
   # Pull the model: ollama pull gemma3:27b
Verify setup
# Check Python version
python --version # Should show 3.12.9
# Test imports
python -c "import fastmcp; print('MCP tools installed successfully')"
Run the example
poetry run python src/main.py
Note: main.py runs the MCP server, while the integration examples demonstrate different client implementations.
Example Output
The examples demonstrate:
- Creating an MCP server with customer service resources and tools
- Integrating with multiple AI platforms using the same server
- Handling async operations for better performance
- Using Pydantic for data validation (see the example after this list)
- Implementing structured prompts for consistent AI responses
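As one small illustration of the Pydantic point, malformed tool arguments fail fast before any handler logic runs. The model below is hypothetical, not one of the repo's actual schemas:

```python
from pydantic import BaseModel, ValidationError

class Ticket(BaseModel):
    customer_id: str
    issue: str

try:
    Ticket(customer_id=42, issue=None)  # wrong types are rejected up front
except ValidationError as err:
    print(err)  # reports both invalid fields; FastMCP surfaces similar errors to the caller
```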
Troubleshooting
- Ollama connection error: Make sure Ollama is running (ollama serve)
- API key errors: Check that your .env file has the correct keys
- Model not found: For Ollama, ensure you've pulled the model (ollama pull gemma3:27b)
- MCP server not starting: Check the logs for port conflicts or missing dependencies