mcp-ollama-streamlit-agent
🧠 MCP + Ollama Streamlit Chatbot
This repository contains a fully functional multi-agent chatbot powered by the Model Context Protocol (MCP), Ollama with the qwen3:1.7b model, and a Streamlit-based frontend. The chatbot supports tool calling and integrates domain-specific utilities such as weather APIs, math evaluation, and CSV dataset analysis.
📁 Project Structure
.
├── app.py # Streamlit frontend interface
├── client.py # Async MCP client with Ollama integration and tool handling
├── server.py # MCP-compatible server with weather, math, and dataset tools
├── data/
│ └── dataset.csv # Sample CSV file for dataset analysis
├── .env # Environment variables (MCP_SSE_URL, etc.)
🧰 Features & Tools
The assistant supports the following built-in tools via MCP server:
- 🌤️ Weather Tools – Fetch alerts and forecasts using the National Weather Service API
- ➗ Math Evaluator – Safe evaluation of arithmetic expressions
- 📊 Dataset Inspector – Summary statistics, shape, and NLP-powered queries on a local dataset.csv file
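As an illustration of what the Math Evaluator's "safe evaluation" could look like, here is one common stdlib approach: walk the expression's AST and only allow whitelisted arithmetic nodes, so arbitrary code never reaches eval(). This is a hypothetical sketch, not necessarily the repo's actual implementation.

```python
# Hypothetical safe arithmetic evaluator: only whitelisted AST node types
# are executed, so function calls, attribute access, etc. are rejected.
import ast
import operator

_OPS = {
    ast.Add: operator.add,
    ast.Sub: operator.sub,
    ast.Mult: operator.mul,
    ast.Div: operator.truediv,
    ast.Pow: operator.pow,
    ast.USub: operator.neg,
}

def safe_eval(expression: str) -> float:
    """Evaluate an arithmetic expression using only whitelisted operators."""
    def _eval(node):
        if isinstance(node, ast.Expression):
            return _eval(node.body)
        if isinstance(node, ast.Constant) and isinstance(node.value, (int, float)):
            return node.value
        if isinstance(node, ast.BinOp) and type(node.op) in _OPS:
            return _OPS[type(node.op)](_eval(node.left), _eval(node.right))
        if isinstance(node, ast.UnaryOp) and type(node.op) in _OPS:
            return _OPS[type(node.op)](_eval(node.operand))
        raise ValueError("Unsupported expression")
    return _eval(ast.parse(expression, mode="eval"))

print(safe_eval("2 + 3 * 4"))   # 14
print(safe_eval("81 ** 0.5"))   # 9.0
```

Anything outside the whitelist (e.g. `__import__('os')`) raises ValueError instead of executing.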
🔄 Architecture (Mermaid)
flowchart TD
subgraph Streamlit_UI
A1[User Prompt]
A2[Display Chat History]
A3[Streamlit App - app_py]
end
subgraph MCP_Client
B1[Connect to SSE Server]
B2[Process Query]
B3[Call Ollama API]
B4[Handle Tool Calls]
end
subgraph MCP_Server
C1[Weather Alerts Tool]
C2[Forecast Tool]
C3[Math Evaluation Tool]
C4[Dataset Analysis Tool]
C5[Dataset Query Tool]
end
subgraph Ollama_Model
D1[qwen3 1_7b Model]
end
A1 --> A3
A3 --> B2
A2 --> A3
B2 --> B3
B3 --> D1
D1 --> B4
B4 -->|Tool Call| C1
B4 -->|Tool Call| C2
B4 -->|Tool Call| C3
B4 -->|Tool Call| C4
B4 -->|Tool Call| C5
C1 --> B2
C2 --> B2
C3 --> B2
C4 --> B2
C5 --> B2
🔧 Requirements
Ensure you have the following installed:
- Python 3.9+
- Ollama with the qwen3:1.7b model available locally
- MCP library (see installation below)
- Streamlit
- Uvicorn (ASGI server)
- A .env file with the MCP server URL defined: MCP_SSE_URL=http://localhost:8080/sse
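Loading that .env entry is typically a one-liner with python-dotenv's `load_dotenv()`; the stdlib-only sketch below shows what it amounts to, with an illustrative helper name.

```python
# Minimal sketch of .env loading (python-dotenv does this and more).
# The helper name load_env is illustrative, not from the repo.
import os

def load_env(path: str = ".env") -> None:
    """Read KEY=VALUE lines from a .env file into os.environ."""
    if not os.path.exists(path):
        return  # tolerate a missing file
    with open(path) as fh:
        for line in fh:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            os.environ.setdefault(key.strip(), value.strip())

load_env()
sse_url = os.environ.get("MCP_SSE_URL", "http://localhost:8080/sse")
```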
Python Packages
You can install all required packages via:
pip install -r requirements.txt
If you don't have a requirements.txt, use:
pip install streamlit uvicorn httpx python-dotenv pandas scikit-learn mcp
▶️ How to Run
1. Launch the MCP Server (Tool Provider)
Run the server to expose SSE-compatible endpoints:
python server.py --host 0.0.0.0 --port 8080
This will start a FastAPI-compatible MCP server exposing tools on:
http://localhost:8080/sse
2. Start the Streamlit Frontend
In another terminal:
streamlit run app.py
This will open the chat interface in your browser at:
http://localhost:8501
📦 Ollama Model Setup
Install and run Ollama:
ollama pull qwen3:1.7b
ollama run qwen3:1.7b
Ensure the model is loaded and responding at:
http://localhost:11434/api/chat
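A quick way to verify the endpoint is responding is to POST a minimal request to it. The payload shape (model / messages / stream) follows Ollama's /api/chat API; the request itself is wrapped in a guard so this sketch degrades gracefully when Ollama isn't running.

```python
# Connectivity check against Ollama's /api/chat endpoint (stdlib only).
import json
import urllib.request

def build_chat_payload(prompt: str, model: str = "qwen3:1.7b") -> dict:
    """Assemble a non-streaming /api/chat request body."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,
    }

if __name__ == "__main__":
    req = urllib.request.Request(
        "http://localhost:11434/api/chat",
        data=json.dumps(build_chat_payload("Say hello.")).encode(),
        headers={"Content-Type": "application/json"},
    )
    try:
        with urllib.request.urlopen(req, timeout=30) as resp:
            print(json.load(resp)["message"]["content"])
    except OSError as exc:
        print(f"Ollama not reachable: {exc}")
```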
🧪 Supported Use Cases
Here are some queries you can try:
Weather
- What’s the weather in San Francisco?
- Are there any alerts in NY?
Math
- What is 2 + 3 * 4?
- Calculate the square root of 81
Dataset Analysis
- What are the columns in the dataset?
- How many records are in the file?
- Do any descriptions mention "cloud" or "AI"?
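The dataset queries above boil down to a few operations on data/dataset.csv. The stdlib sketch below shows one way to compute them (the column name "description" is an assumption about the sample file; the actual tool presumably uses pandas).

```python
# Illustrative dataset inspection: columns, record count, keyword search.
import csv

def inspect(path: str, keyword: str = "", column: str = "description"):
    """Return (columns, row_count, matching_rows) for a CSV file."""
    with open(path, newline="") as fh:
        reader = csv.DictReader(fh)
        rows = list(reader)
    columns = reader.fieldnames or []
    matches = []
    if keyword and column in columns:
        matches = [r for r in rows if keyword.lower() in (r[column] or "").lower()]
    return columns, len(rows), matches
```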
🧼 Resetting State
You can reset the chat history using the sidebar button in the Streamlit UI.
📁 Notes
- Make sure data/dataset.csv exists if you plan to use the dataset tools.
- Ensure MCP_SSE_URL in .env matches your server setup.
- This system uses asyncio.run_coroutine_threadsafe() to allow asynchronous tool execution within Streamlit's synchronous model.
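The bridging pattern behind that last note can be sketched as follows: a dedicated asyncio event loop runs in a background thread, and synchronous code (such as a Streamlit callback) submits coroutines to it and blocks on the returned future. The `call_tool` coroutine is a stand-in for an async MCP tool invocation, not the repo's actual function.

```python
# Run an asyncio loop in a background thread and submit coroutines to it
# from synchronous code via asyncio.run_coroutine_threadsafe().
import asyncio
import threading

loop = asyncio.new_event_loop()
threading.Thread(target=loop.run_forever, daemon=True).start()

async def call_tool(name: str) -> str:
    """Stand-in for an async MCP tool invocation."""
    await asyncio.sleep(0.01)
    return f"result from {name}"

# From a synchronous context (e.g. a Streamlit button handler):
future = asyncio.run_coroutine_threadsafe(call_tool("get_forecast"), loop)
print(future.result(timeout=5))  # blocks until the coroutine completes
```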