MCP Playground 🛠️🌩️

A Streamlit-based playground that lets you chat with large language models and seamlessly plug in external Model Context Protocol (MCP) tools. Spin up multiple FastMCP servers (Weather & Currency) alongside a Streamlit client, all orchestrated with Docker Compose. The client is provider-agnostic (OpenAI • Amazon Bedrock • Anthropic • Google Gemini) thanks to LangChain + LangGraph.

📖 Learn More

Want a deep dive into how it all works? Check out the detailed walkthrough in this Medium article:
https://medium.com/@elkhan.alizada/your-own-ai-agent-playground-build-it-with-streamlit-langgraph-and-docker-4caeb6fe0ac4


🖥️🔌 Main Interface – Connected View

(Screenshot: main chat interface with MCP servers connected)


๐Ÿ—๏ธ Architecture

(Architecture diagram)


✨ Key Features

| Feature | Description |
| --- | --- |
| 🔌 Multi-Server MCP | Register any number of MCP servers; the agent auto-detects available tools & routes calls. |
| 🖥️ Streamlit Chat UI | Rich chat experience with history, sidebar controls and live tool-execution output. |
| 🧩 Provider-Agnostic | One LangChain interface for OpenAI, Bedrock, Anthropic, Google, Groq. Switch on the fly. |
| 🤖 ReAct Agent | LangGraph's create_react_agent enables dynamic tool selection and reasoning (see the sketch below). |
| 🐳 Docker-First | Separate Dockerfiles for the client & each server, plus a single docker-compose.yaml. |
| 📦 Extensible | Drop in new MCP servers or providers without touching UI code. |
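
For a concrete picture of how the multi-server and ReAct pieces fit together, here is a minimal sketch that wires the two bundled MCP servers into a LangGraph ReAct agent. It assumes the langchain-mcp-adapters package and SSE endpoints on the default ports; the actual client code in client/app.py may be organized quite differently.

import asyncio

from langchain.chat_models import init_chat_model
from langchain_mcp_adapters.client import MultiServerMCPClient
from langgraph.prebuilt import create_react_agent

async def main() -> None:
    # Register both MCP servers; adjust URLs/transport to match your setup (SSE is assumed here).
    client = MultiServerMCPClient({
        "weather": {"url": "http://localhost:8000/sse", "transport": "sse"},
        "currency": {"url": "http://localhost:8001/sse", "transport": "sse"},
    })
    tools = await client.get_tools()                            # auto-discovered MCP tools

    model = init_chat_model("gpt-4o", model_provider="openai")  # any supported provider works here
    agent = create_react_agent(model, tools)                    # ReAct loop: reason -> call tool -> observe

    result = await agent.ainvoke(
        {"messages": [("user", "What is the weather in Baku right now?")]}
    )
    print(result["messages"][-1].content)

asyncio.run(main())

Because init_chat_model abstracts the provider, swapping OpenAI for Bedrock, Anthropic, Google or Groq changes only the model arguments, not the agent wiring.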

📂 Project Layout
mcp-playground/
├─ docker-compose.yaml          # One-command orchestration
├─ client/                      # Streamlit UI
│  ├─ app.py                    # Main entry-point
│  ├─ config.py                 # Typed settings & defaults
│  ├─ servers_config.json       # MCP endpoint catalogue
│  ├─ ui_components/            # Streamlit widgets
│  └─ ...
└─ servers/
   ├─ server1/                  # Weather Service MCP
   │  └─ main.py
   └─ server2/                  # Currency Exchange MCP
      └─ main.py

🚀 Quick Start
1 · Prerequisites
  • Docker ≥ 24 & Docker Compose
  • At least one LLM provider key (e.g. OPENAI_API_KEY) or AWS creds for Bedrock.
2 · Clone & Run
git clone https://github.com/your-org/mcp-playground.git
cd mcp-playground
docker compose up --build

| Service | URL | Default Port |
| --- | --- | --- |
| Streamlit Client | http://localhost:8501 | 8501 |
| Weather MCP | http://localhost:8000 | 8000 |
| Currency MCP | http://localhost:8001 | 8001 |

⚙️ Configuration

All runtime settings are centralized in client/config.py and environment variables.

| Variable | Purpose |
| --- | --- |
| MODEL_ID | Provider selector (OpenAI, Bedrock, Anthropic, Google, Groq). |
| TEMPERATURE | Sampling temperature (sidebar slider). |
| MAX_TOKENS | Token limit (sidebar). |

MODEL_OPTIONS = {
    'OpenAI': 'gpt-4o',
    'Anthropic': 'claude-3-5-sonnet-20240620',
    'Google': 'gemini-2.0-flash-001',
    'Bedrock': 'us.anthropic.claude-3-7-sonnet-20250219-v1:0',
    'Groq': 'meta-llama/llama-4-scout-17b-16e-instruct'
}
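
As a rough illustration of how these settings come together, the sidebar selections can be mapped onto a LangChain chat model with init_chat_model. The PROVIDER_IDS mapping and build_model helper below are assumptions for illustration only; client/config.py may resolve providers differently.

from langchain.chat_models import init_chat_model

# Illustrative only: map the MODEL_OPTIONS keys above to init_chat_model
# provider identifiers (hypothetical mapping; the real config may differ).
PROVIDER_IDS = {
    'OpenAI': 'openai',
    'Anthropic': 'anthropic',
    'Google': 'google_genai',
    'Bedrock': 'bedrock_converse',
    'Groq': 'groq',
}

def build_model(provider: str, temperature: float, max_tokens: int):
    """Build a chat model from the sidebar selections (provider, temperature, max tokens)."""
    return init_chat_model(
        MODEL_OPTIONS[provider],
        model_provider=PROVIDER_IDS[provider],
        temperature=temperature,
        max_tokens=max_tokens,
    )

llm = build_model('OpenAI', temperature=0.2, max_tokens=1024)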

MCP endpoints live in servers_config.json – edit it to add or remove servers without code changes.


💬 Using the Playground
  1. Select Provider · Pick your LLM in the sidebar and paste the corresponding credentials.
  2. Connect MCP Servers · Toggle connections; available tools appear in the MCP Tools list.
  3. Chat · Type a question.
    • If connected, the ReAct agent decides whether to call an MCP tool (e.g. get_current_weather).
    • Otherwise it falls back to plain LLM chat.
  4. Inspect Tool Calls · Tool invocations are streamed back as YAML blocks with inputs & outputs (see the sketch after this list).

Try: "What will the weather be in Baku tomorrow and how much is 100 USD in AZN?"
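
Step 4 maps directly onto the message objects the LangGraph agent returns: a tool call appears as tool_calls on an AIMessage, and the MCP server's reply comes back as a ToolMessage. Here is a minimal sketch of inspecting them outside the UI, reusing the hypothetical agent response from the earlier sketch:

from langchain_core.messages import AIMessage, ToolMessage

def show_tool_activity(result: dict) -> None:
    """Print tool calls and results from an agent response (e.g. `result` from the earlier sketch)."""
    for msg in result["messages"]:
        if isinstance(msg, AIMessage) and msg.tool_calls:
            for call in msg.tool_calls:                # the model decided to call a tool
                print(f"tool call: {call['name']} args={call['args']}")
        elif isinstance(msg, ToolMessage):             # the MCP server's reply
            print(f"tool result ({msg.name}): {msg.content}")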


🛠️ Included MCP Servers
Weather Service :8000
mcp = FastMCP("Weather Service", host="0.0.0.0", port=8000)

@mcp.tool()
async def get_current_weather(location: str) -> dict: ...

@mcp.tool()
async def get_forecast(location: str, days: int = 3) -> dict: ...
Currency Exchange :8001
mcp = FastMCP("Currency Exchange", host="0.0.0.0", port=8001)

@mcp.tool()
async def get_currency_rates(date: str | None = None) -> dict: ...

@mcp.tool()
async def convert_currency(amount: float, from_currency: str, to_currency: str, date: str | None = None) -> dict: ...
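
The snippets above omit the server entry points. With FastMCP, each main.py typically ends by calling mcp.run(); the transport shown below is an assumption and should match whatever the client's servers_config.json expects.

# Hypothetical entry point for either server; "sse" is an assumed transport.
if __name__ == "__main__":
    mcp.run(transport="sse")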

🙏 Acknowledgements