# Ollama MCP Bridge
A flexible bridge that enables Ollama to interact with MCP (Model Context Protocol) servers, allowing you to query and manage databases using natural language.
## Features
- Multiple Bridge Types: Choose between FastMCP (subprocess-based) and the full MCP protocol
- Natural Language Database Interaction: Query your MariaDB database conversationally
- Tool Detection: Automatically detects when Ollama needs to use database tools
- Async Support: Built with asyncio for efficient concurrent operations
- Extensible: Easy to add support for other MCP servers
## Installation

### Basic Installation

```
pip install mariadb-mcp-ollama-bridge
```

### From Source

```
git clone https://github.com/mfrederico/mariadb-mcp-ollama-bridge.git
cd mariadb-mcp-ollama-bridge
pip install -e .
```

### Optional Dependencies

For full MCP protocol support:

```
pip install mariadb-mcp-ollama-bridge[mcp]
```

For FastMCP support:

```
pip install mariadb-mcp-ollama-bridge[fastmcp]
```
## Prerequisites

Ollama installed and running:

```
curl -fsSL https://ollama.com/install.sh | sh
ollama serve
```

An Ollama model installed:

```
ollama pull llama3.2
```

The MariaDB MCP Server set up with a proper `.env` configuration:

```
DB_HOST=localhost
DB_USER=your_user
DB_PASSWORD=your_password
DB_NAME=your_database
```
## Quick Start

- Copy or link the `mcp-mariadb/src` directory
- Make sure your mcp-mariadb server is running

### Interactive Chat (FastMCP)

The easiest way to get started:

```
python -m examples.fastmcp_chat
```

### Interactive Chat (Full MCP)

For full MCP protocol support:

```
python -m examples.mcp_chat --model llama3.2
```
### Programmatic Usage

```python
import asyncio

from ollama_mcp_bridge import FastMCPBridge


async def main():
    # The bridge launches the MariaDB MCP server as a subprocess
    bridge = FastMCPBridge()
    messages = [{"role": "user", "content": "List all databases"}]
    # Ollama answers the prompt; any detected tool calls are executed
    # against the MCP server before the final response is returned
    response = await bridge.chat_with_tools("llama3.2", messages)
    print(response)


asyncio.run(main())
```
## Available Tools

The bridge supports all MariaDB MCP tools:

| Tool | Description | Parameters |
|---|---|---|
| `list_databases` | List all databases | None |
| `list_tables` | List tables in a database | `database_name` |
| `get_table_schema` | Get table structure | `database_name`, `table_name` |
| `execute_sql` | Run a SQL query | `sql_query`, `database_name` |
| `create_database` | Create a new database | `database_name` |
| `create_vector_store` | Create an embedding store | `database_name`, `vector_store_name` |
| `search_vector_store` | Semantic search | `database_name`, `vector_store_name`, `user_query` |
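To illustrate how the tool signatures line up, the sketch below validates a tool call against the parameter lists in the table. The `validate_tool_call` helper is hypothetical, not part of the bridge's API; only the tool and parameter names come from the table above.

```python
# Hypothetical helper (not part of the bridge's API): checks a tool call
# against the parameter lists from the table above.
TOOL_PARAMS = {
    "list_databases": [],
    "list_tables": ["database_name"],
    "get_table_schema": ["database_name", "table_name"],
    "execute_sql": ["sql_query", "database_name"],
    "create_database": ["database_name"],
    "create_vector_store": ["database_name", "vector_store_name"],
    "search_vector_store": ["database_name", "vector_store_name", "user_query"],
}


def validate_tool_call(name: str, arguments: dict) -> dict:
    """Raise if the tool is unknown or a required parameter is missing."""
    if name not in TOOL_PARAMS:
        raise ValueError(f"unknown tool: {name}")
    missing = [p for p in TOOL_PARAMS[name] if p not in arguments]
    if missing:
        raise ValueError(f"{name} is missing parameters: {missing}")
    return {"tool": name, "arguments": arguments}


call = validate_tool_call(
    "get_table_schema", {"database_name": "test", "table_name": "users"}
)
print(call["tool"])  # get_table_schema
```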
## Example Queries
Ask natural questions like:
- "Show me all databases"
- "List tables in the test database"
- "What's the schema of the users table?"
- "Create a new database called demo"
- "Search for documents about machine learning"
## Architecture

```
┌──────────┐      ┌──────────────┐      ┌──────────────┐
│  Ollama  │─────▶│  MCP Bridge  │─────▶│  MCP Server  │
│   LLM    │◀─────│              │◀─────│  (MariaDB)   │
└──────────┘      └──────────────┘      └──────────────┘
```

- User Query → Ollama processes natural language
- Tool Detection → Bridge identifies database operations
- MCP Execution → Commands sent to MCP server
- Results → Formatted and returned to user
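The four steps above can be sketched as a single loop. `call_model` and `execute_tool` are stand-ins for the Ollama call and the MCP execution, not the bridge's real functions, and the reply shape is simplified for illustration.

```python
# Illustrative sketch of the request flow; call_model and execute_tool
# are stand-ins for the Ollama API call and the MCP server, respectively.
def run_query(user_query, call_model, execute_tool):
    reply = call_model(user_query)          # 1. Ollama processes the query
    tool_call = reply.get("tool_call")      # 2. bridge detects a tool request
    if tool_call:
        result = execute_tool(              # 3. command sent to the MCP server
            tool_call["name"], tool_call["arguments"]
        )
        return f"Result: {result}"          # 4. formatted and returned
    return reply["content"]


# Stubbed example run
fake_model = lambda q: {"tool_call": {"name": "list_databases", "arguments": {}}}
fake_tool = lambda name, args: ["test", "demo"]
print(run_query("Show me all databases", fake_model, fake_tool))
# Result: ['test', 'demo']
```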
## Bridge Types

### FastMCP Bridge

- Runs the MCP server as a subprocess
- No additional dependencies
- Ideal for simple use cases

### Full MCP Bridge

- Uses the MCP client protocol
- Supports streaming and advanced features
- Requires the `mcp` library
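Since only the full bridge needs the optional `mcp` library, one way to choose between the two at runtime is to probe for it. This selection helper is a sketch, not something the package ships:

```python
import importlib.util


# Sketch: prefer the full MCP bridge when the optional `mcp` library is
# installed, otherwise fall back to the subprocess-based FastMCP bridge.
def pick_bridge_type() -> str:
    if importlib.util.find_spec("mcp") is not None:
        return "full-mcp"
    return "fastmcp"


print(pick_bridge_type())
```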
## Development

### Setup Development Environment

```
# Clone the repository
git clone https://github.com/mfrederico/mariadb-mcp-ollama-bridge.git
cd mariadb-mcp-ollama-bridge

# Install development dependencies
pip install -r requirements-dev.txt

# Install in editable mode
pip install -e .
```

### Running Tests

```
# Run all tests
pytest

# Run with coverage
pytest --cov=ollama_mcp_bridge

# Run a specific test file
pytest tests/test_fastmcp_bridge.py
```

### Code Quality

```
# Format code
black ollama_mcp_bridge tests

# Lint code
ruff check ollama_mcp_bridge tests

# Type checking
mypy ollama_mcp_bridge
```
## Configuration

### Environment Variables

Create a `.env` file in your project root:

```
# Database Configuration
DB_HOST=localhost
DB_PORT=3306
DB_USER=your_user
DB_PASSWORD=your_password
DB_NAME=your_database

# Ollama Configuration (optional)
OLLAMA_HOST=http://localhost:11434
OLLAMA_MODEL=llama3.2
```
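For reference, the variables above can be read with plain `os.environ` lookups. This loader is illustrative (the bridge may read its configuration differently); the defaults mirror the optional Ollama values shown above.

```python
import os


# Illustrative loader for the variables above; the OLLAMA_* entries fall
# back to the defaults shown because they are optional.
def load_config(env=None):
    env = os.environ if env is None else env
    return {
        "db_host": env.get("DB_HOST", "localhost"),
        "db_port": int(env.get("DB_PORT", "3306")),
        "db_user": env.get("DB_USER"),
        "db_password": env.get("DB_PASSWORD"),
        "db_name": env.get("DB_NAME"),
        "ollama_host": env.get("OLLAMA_HOST", "http://localhost:11434"),
        "ollama_model": env.get("OLLAMA_MODEL", "llama3.2"),
    }


print(load_config({})["ollama_model"])  # llama3.2
```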
### Custom Bridge Configuration

```python
from ollama_mcp_bridge import FastMCPBridge

# Custom Ollama host
bridge = FastMCPBridge(ollama_host="http://remote-ollama:11434")

# Custom MCP command
bridge = FastMCPBridge(mcp_command=["python", "path/to/server.py"])
```
## Troubleshooting

### "Connection refused"

- Check that Ollama is running: `curl http://localhost:11434/api/tags`
- Verify the MariaDB credentials in `.env`

### "Tool not found"

- Ensure you are in the correct directory with the MCP server
- Check that `src/server.py` exists

### "Model not found"

- Pull the model: `ollama pull llama3.2`
- List available models: `ollama list`
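The "Connection refused" check can also be done from Python instead of curl. This probe is not part of the bridge; it only confirms that the Ollama API answers at the given host.

```python
import urllib.request


# Sketch: True only if the Ollama API responds at the given host.
def ollama_reachable(host="http://localhost:11434", timeout=2.0):
    try:
        with urllib.request.urlopen(f"{host}/api/tags", timeout=timeout) as resp:
            return resp.status == 200
    except OSError:  # covers URLError, ConnectionRefusedError, timeouts
        return False
```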
## Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

- Fork the repository
- Create your feature branch (`git checkout -b feature/AmazingFeature`)
- Commit your changes (`git commit -m 'Add some AmazingFeature'`)
- Push to the branch (`git push origin feature/AmazingFeature`)
- Open a Pull Request
## License
This project is licensed under the MIT License - see the LICENSE file for details.
## Acknowledgments
- Ollama for the local LLM runtime
- Anthropic MCP for the Model Context Protocol
- MariaDB for the database server
## Support
- Issues: GitHub Issues
- Discussions: GitHub Discussions
- Documentation: Wiki