AgentBridge-MCP

A modular, extensible Model Context Protocol (MCP) server for secure and auditable AI agent task routing.


🔧 What is This?

This repo contains a foundational local MCP server built with Python + FastAPI that:

  • Accepts structured JSON requests for tasks
  • Routes them to secure modular task handlers (e.g. email, CSV parsing, summarization)
  • Can be extended for any AI agent or tool

🚀 What You Get Out of the Box

✅ FastAPI MCP server
✅ Modular tasks/ folder (e.g. echo, get_time)
✅ Simple POST /mcp handler
✅ Dispatch logic with a task map (sketched below)
✅ Ready for local testing & extension
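
A minimal, single-file sketch of how these pieces fit together. In the repo the logic is split across app/main.py, app/router.py, and app/tasks/, and the request fields used here (task, payload) are illustrative assumptions rather than the project's actual schema:

```python
# Illustrative sketch of the POST /mcp handler + task-map dispatch.
# The repo splits this across app/main.py, app/router.py, and app/tasks/;
# field names ("task", "payload") are assumptions for illustration.
from datetime import datetime, timezone

from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI(title="AgentBridge-MCP")


class MCPRequest(BaseModel):
    task: str           # task key, e.g. "echo" or "get_time"
    payload: dict = {}  # task-specific arguments


# --- task handlers (app/tasks/*.py in the repo) ----------------------------
def echo(payload: dict) -> dict:
    return {"echo": payload}


def get_time(payload: dict) -> dict:
    return {"utc_time": datetime.now(timezone.utc).isoformat()}


# --- dispatch logic (app/router.py in the repo) -----------------------------
TASK_MAP = {
    "echo": echo,
    "get_time": get_time,
}


@app.post("/mcp")
def handle_mcp(request: MCPRequest) -> dict:
    handler = TASK_MAP.get(request.task)
    if handler is None:
        return {"status": "error", "detail": f"unknown task '{request.task}'"}
    return {"status": "ok", "result": handler(request.payload)}
```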


💡 Why Use It?

This MCP server acts as the command center for AI agents.

It separates "what needs to be done" from "how it's done", enabling:

  • 📦 Task modularity
  • 🔐 A2SPA-compliant secure execution
  • ⚡ Agent-agnostic logic (GPT, Claude, Ollama, etc.)

πŸ“ Folder Structure

app/
├── main.py                 # FastAPI entry point
├── router.py               # Dispatch logic
└── tasks/
    ├── echo.py
    ├── get_time.py
    └── summarize_text.py

tests/
└── test_mcp.py

run.sh                      # Dev runner
requirements.txt            # Dependencies
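
For reference, tests/test_mcp.py could exercise the endpoint with FastAPI's TestClient along these lines; this is a sketch that reuses the task/payload request shape assumed above, not the repo's actual test:

```python
# Illustrative tests/test_mcp.py using FastAPI's TestClient.
from fastapi.testclient import TestClient

from app.main import app  # the repo's FastAPI app

client = TestClient(app)


def test_echo_roundtrip():
    # Assumes the request body uses "task" + "payload" keys, as sketched earlier.
    response = client.post("/mcp", json={"task": "echo", "payload": {"msg": "hi"}})
    assert response.status_code == 200
```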


📊 Workflow Diagram

        ┌───────────────┐
        │  HTTP Client  │  (e.g. curl, frontend)
        └───────┬───────┘
                │  POST /mcp
        ┌───────▼───────┐
        │    main.py    │  (FastAPI app)
        └───────┬───────┘
                │
        ┌───────▼───────┐
        │   router.py   │  → Parses task key
        └───────┬───────┘
                │
        ┌───────▼───────────────┐
        │  app/tasks/{task}.py  │  → Executes logic (e.g. datetime, echo, LLM)
        └───────────┬───────────┘
                    │
            ┌───────▼────────┐
            │ Response JSON  │
            └────────────────┘
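
With the server running locally (see "How to Run" below), a client call might look like the following; the port is uvicorn's default and the request shape follows the assumptions sketched earlier:

```python
# Illustrative client call against a locally running server (uvicorn's default port 8000).
import requests

response = requests.post(
    "http://127.0.0.1:8000/mcp",
    json={"task": "get_time", "payload": {}},  # assumed request shape
    timeout=10,
)
response.raise_for_status()
print(response.json())  # e.g. {"status": "ok", "result": {"utc_time": "..."}}
```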

📈 Coming Soon

This core server will soon power a Modular Investor Outreach AI Agent, with:

  • CSV-based email flows
  • Validation + personalization via MCP
  • PKI-audited A2SPA security
  • GPT/Ollama integration

🧠 Inspired By

  • Anthropic's Claude MCP architecture
  • A2SPA Protocol (Agent-to-Agent Secure Protocol Architecture)

---
🛠️ How to Run the Email Agent System

🔧 1. Install dependencies

pip install -r requirements.txt

📡 2. Start the MCP server (backend task router)

uvicorn app.main:app --reload

🧠 3. Optional: Start Ollama if it is not already running

ollama run llama3

🖥️ 4. Launch the Streamlit UI
streamlit run streamlit_app.py
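
The repo's streamlit_app.py is not reproduced here, but a UI covering the workflow described below would look roughly like this; widget choices and labels are assumptions:

```python
# Rough sketch of a Streamlit front end for the outreach flow (not the repo's actual UI).
import streamlit as st

st.title("Investor Outreach Agent")

csv_file = st.file_uploader("Contacts CSV (columns: name,email)", type="csv")
subject = st.text_input("Email subject")
body = st.text_area("Shared email body")
attachments = st.file_uploader("Attachments (pitch decks, PDFs, ...)", accept_multiple_files=True)

if st.button("Send outreach") and csv_file is not None:
    # The real app would personalize each row via Ollama and post it to the MCP server.
    st.write("Dispatching emails via the MCP server...")
```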

💡 What This System Does

  • Upload a .csv of names and emails (columns: name,email)
  • Set a subject and shared body for the outreach message
  • Attach any files (pitch decks, PDFs, etc.)
  • Ollama personalizes each email with a greeting and sign-off (see the sketch below)
  • MCP server dispatches email sending via Gmail SMTP
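
A sketch of the Ollama personalization step mentioned above, calling Ollama's local REST API; the prompt and wiring are illustrative and may differ from the repo's implementation:

```python
# Illustrative call to a locally running Ollama instance (default port 11434)
# to personalize the shared body for one recipient.
import requests


def personalize_email(name: str, body: str, model: str = "llama3") -> str:
    prompt = (
        f"Rewrite this outreach email for {name}. Add a short personalized "
        f"greeting and sign-off, but keep the core message unchanged:\n\n{body}"
    )
    response = requests.post(
        "http://localhost:11434/api/generate",
        json={"model": model, "prompt": prompt, "stream": False},
        timeout=120,
    )
    response.raise_for_status()
    return response.json()["response"]
```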
---
πŸ“ File Requirements

Make sure .env is configured with:

EMAIL_USER=your@gmail.com
EMAIL_PASS=your_app_password   # from Gmail App Passwords
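
For context, a Gmail SMTP send using these credentials typically looks like the following generic smtplib sketch; it is not the repo's exact sending code:

```python
# Generic Gmail SMTP send using the .env credentials above (not the repo's exact code).
import os
import smtplib
from email.message import EmailMessage


def send_email(to_addr: str, subject: str, body: str) -> None:
    msg = EmailMessage()
    msg["From"] = os.environ["EMAIL_USER"]
    msg["To"] = to_addr
    msg["Subject"] = subject
    msg.set_content(body)

    # Gmail requires an App Password when two-factor auth is enabled.
    with smtplib.SMTP_SSL("smtp.gmail.com", 465) as smtp:
        smtp.login(os.environ["EMAIL_USER"], os.environ["EMAIL_PASS"])
        smtp.send_message(msg)
```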

