mcp-server-suite
mcp-server-suite is a comprehensive collection of MCP (Model Context Protocol) servers designed to be the "HuggingFace for automation". Our vision is to create a community-driven platform where developers can both use pre-built servers and contribute their own, making automation and AI-powered workflows accessible to everyone.
Features
- Model Context Protocol core for flexible context management
- Modular architecture for adding new tools and services
- Growing collection of ready-to-use MCP servers
- Easy integration with APIs and external systems
- Environment variable support via `.env` files
- Community-driven expansion and customization
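The `.env` support listed above is typically provided by a loader such as python-dotenv. As a rough illustration of the idea, here is a minimal stdlib-only sketch (the `load_env` helper is hypothetical, not part of the suite):

```python
import os

def load_env(path=".env"):
    """Parse simple KEY=VALUE lines from a .env file into os.environ.

    Comment lines ('#') and blank lines are skipped, and variables
    already present in the environment are not overwritten.
    """
    try:
        with open(path) as fh:
            lines = fh.readlines()
    except FileNotFoundError:
        return {}
    loaded = {}
    for line in lines:
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue
        key, _, value = line.partition("=")
        key, value = key.strip(), value.strip()
        loaded[key] = value
        os.environ.setdefault(key, value)
    return loaded
```

In practice the servers would call such a loader at startup so keys like `TAVILY_API_KEY` are available via `os.environ`.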
Available Servers
Core Servers
- Web Search Server: Integrate web search capabilities using the Tavily API
- File Explorer Server: Safe file system operations with progress reporting
- PostgreSQL Server: Both sync and async database operations
- More coming soon!
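Each of these servers exposes named tools behind the Model Context Protocol. The modular pattern can be sketched with a plain tool registry (names and internals here are hypothetical, not the suite's actual code):

```python
# Minimal registry illustrating the modular tool pattern:
# each server module registers named tools that a host can invoke.

TOOLS = {}

def tool(name):
    """Decorator that registers a callable under a tool name."""
    def register(fn):
        TOOLS[name] = fn
        return fn
    return register

@tool("web_search")
def web_search(query: str) -> str:
    # A real server would call the Tavily API here.
    return f"results for: {query}"

@tool("list_files")
def list_files(path: str) -> list:
    # A real server would enforce ALLOWED_BASE_PATH here.
    return [f"{path}/example.txt"]

def invoke(name, **kwargs):
    """Dispatch a tool call the way an MCP host would."""
    if name not in TOOLS:
        raise KeyError(f"unknown tool: {name}")
    return TOOLS[name](**kwargs)
```

Adding a new server then amounts to registering more tools, which is what makes the collection easy to extend.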
Getting Started
Prerequisites
- Python 3.13 or higher
- UV - Fast Python package installer and resolver
Installation
Clone the repository:

```bash
git clone https://github.com/Dadiya-Harsh/mcp-server-suite.git
cd mcp-server-suite
```

Install UV if you haven't already:

```bash
pip install uv
```

Create and activate a virtual environment:

```bash
uv venv
# On Windows:
.venv\Scripts\activate
# On Unix/macOS:
source .venv/bin/activate
```

Install dependencies using UV:

```bash
uv pip install -e .
```

Set up your environment variables in a `.env` file:

```env
# Required for web search server
TAVILY_API_KEY=your_api_key_here

# Required for PostgreSQL server
DATABASE_URI=postgresql://user:password@localhost:5432

# Required for file explorer server
ALLOWED_BASE_PATH=E:\Your\Safe\Path
```
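The `ALLOWED_BASE_PATH` variable implies that the file explorer confines operations to one directory. A containment check of that kind could look like this (a sketch with a hypothetical `resolve_safe` helper, not the server's actual code):

```python
from pathlib import Path

def resolve_safe(base: str, requested: str) -> Path:
    """Resolve `requested` against `base`, refusing paths that escape
    the allowed directory (e.g. via '..' components)."""
    base_path = Path(base).resolve()
    target = (base_path / requested).resolve()
    # Path.is_relative_to exists since Python 3.9; the suite
    # requires 3.13, so relying on it is safe.
    if not target.is_relative_to(base_path):
        raise PermissionError(f"{requested!r} escapes {base}")
    return target
```

Every file operation would go through such a resolver before touching the filesystem, which is the usual way to keep a tool-calling model sandboxed.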
Running the Servers
Each server can be run independently:
```bash
# Web Search Server
python web_search/web_search_sse.py

# File Explorer Server
python file_explore/basic_file_server.py

# PostgreSQL Server (async)
python database/postgresql/async_postgresql_server.py
```
Using the Platform
As a User
Install the package:

```bash
uv pip install mcp-server-suite
```

Import and use any server:

```python
from mcp_server_suite.web_search import web_search_server
from mcp_server_suite.database import async_postgresql_server
```
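Since each server lives in its own module, choosing which one to run reduces to a name-to-script lookup. A hypothetical launcher sketch, using the script paths listed under "Running the Servers" above:

```python
import sys

# Script paths as listed in the repository layout.
SERVERS = {
    "web-search": "web_search/web_search_sse.py",
    "file-explorer": "file_explore/basic_file_server.py",
    "postgresql-async": "database/postgresql/async_postgresql_server.py",
}

def launch_command(name: str) -> list:
    """Build the command line that would start a named server."""
    try:
        script = SERVERS[name]
    except KeyError:
        known = ", ".join(sorted(SERVERS))
        raise SystemExit(f"unknown server {name!r}; choose one of: {known}")
    return [sys.executable, script]
```

A real launcher would pass the result to `subprocess.run`; this sketch only builds the command so the mapping is easy to inspect.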
📌 Planned MCP Servers
For AI/ML Engineers
- Dataset Explorer MCP: Browse/filter datasets from HuggingFace, Kaggle, etc.
- Model Evaluation MCP: Compare model performance across tasks or datasets.
- Experiment Tracker MCP: Visualize experiment logs and metrics from MLflow/W&B.
- Training Job Manager MCP: Start, monitor, and manage training jobs locally or on the cloud.
- Paper Summary MCP: Summarize and interact with AI papers via arXiv or Semantic Scholar.
For Software Engineers
- Codebase Q&A MCP: Ask natural language questions about source code.
- Issue Debugger MCP: Summarize GitHub/GitLab issues and provide solutions.
- CI/CD Monitor MCP: Check pipeline status and deployment history.
- API Inspector MCP: Query and understand OpenAPI/Swagger docs via natural prompts.
For Non-Technical Users
- Health Assistant MCP: Get simple, symptom-based answers and medication guidance.
- Finance Tracker MCP: Analyze and summarize personal spending from CSV/PDF exports.
- Smart Task Planner MCP: Convert vague goals into structured, scheduled learning/task plans.
- FAQ Assistant MCP: Answer questions based on uploaded files or organization docs.
Vision & Roadmap
We're building the "HuggingFace for automation" with these goals:
Immediate Goals
- Expand server collection with most-requested functionalities
- Build a central registry for discovering and sharing servers
- Create comprehensive documentation and examples
- Implement authentication and security features
Community Features (Coming Soon)
- Server discovery and search
- Easy server deployment and sharing
- Version control and dependency management
- Usage analytics and monitoring
- Community ratings and reviews
Server Categories
Data & Analytics
- Database connectors (MySQL, MongoDB, etc.)
- Analytics engines
- Data transformation tools
AI/ML Tools
- Model deployment servers
- Training job managers
- Dataset handlers
- Experiment trackers
DevOps & Infrastructure
- CI/CD integrators
- Cloud service managers
- Monitoring tools
- Log analyzers
Application Integration
- Email/SMS servers
- Payment processors
- Authentication services
- File storage connectors
Contributing
Join us in building the future of automation! Contributions are welcome:
- Fork the repository
- Create your feature branch
- Add your server or improvements
- Submit a pull request
See our Contributing Guidelines for more details.
License
This project is licensed under the MIT License. See the LICENSE file for details.