MCP Observability
A project that explores the Model Context Protocol architecture for integrating different sources of observability data (alerts, incidents, events, logs and metrics) to power LLMs for observability.
Objective
The goal of this repo is to implement an end-to-end LLM workflow that can rationalise data from different observability datasources to achieve the following functions:
- Summarize alerts and incidents.
- Perform RCA by correlating incidents, alerts, metrics, and logs.
- Recommend actionable solutions post-investigation.
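As an illustration of the RCA step, a toy correlation pass might match alerts to an incident by service name and time window. This is only a sketch; the record shapes and field names here are assumptions, not the repo's actual data model:

```python
from datetime import datetime, timedelta

# Hypothetical record shapes; the repo's /datasources/ files may differ.
incident = {"id": "INC-1", "service": "payments",
            "started_at": datetime(2024, 5, 1, 12, 0)}

alerts = [
    {"id": "A-1", "service": "payments", "severity": "critical",
     "fired_at": datetime(2024, 5, 1, 11, 58)},
    {"id": "A-2", "service": "search", "severity": "warning",
     "fired_at": datetime(2024, 5, 1, 11, 59)},
    {"id": "A-3", "service": "payments", "severity": "warning",
     "fired_at": datetime(2024, 5, 1, 9, 0)},
]

def correlate(incident, alerts, window=timedelta(minutes=15)):
    """Return alerts on the same service that fired near the incident start."""
    return [
        a for a in alerts
        if a["service"] == incident["service"]
        and abs(a["fired_at"] - incident["started_at"]) <= window
    ]

related = correlate(incident, alerts)
print([a["id"] for a in related])  # only A-1 is in-service and in-window
```

In practice the LLM would reason over the correlated records rather than a hard-coded window, but a deterministic pre-filter like this keeps the prompt small.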
Components of this repo
- `server.py` — MCP Server (mock interface to observability data)
- `main.py` — Custom MCP Client + AI Agent + Terminal UI
- `/datasources/` — Mock Observability Data (logs, alerts, incidents, metrics, events)
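The mock datasources under `/datasources/` can be thought of as plain records served by the server's tools. A minimal sketch of loading and filtering them, with an assumed file name and in-memory fallback (the repo's actual layout and field names may differ):

```python
import json
from pathlib import Path

# In-memory sample used as a fallback; field names are assumptions.
SAMPLE_ALERTS = [
    {"id": "A-1", "service": "payments", "severity": "critical",
     "message": "p99 latency above 2s"},
    {"id": "A-2", "service": "search", "severity": "warning",
     "message": "elevated 5xx rate"},
]

def load_alerts(path="datasources/alerts.json"):
    """Load mock alerts from disk, falling back to the in-memory sample."""
    p = Path(path)
    if p.exists():
        return json.loads(p.read_text())
    return SAMPLE_ALERTS

def critical_alerts(alerts):
    """Filter to critical severity, as a server-side tool might do."""
    return [a for a in alerts if a["severity"] == "critical"]

print(critical_alerts(load_alerts()))
```

A function like `critical_alerts` is the kind of thing `server.py` would expose as an MCP tool, so the LLM fetches pre-filtered data instead of raw files.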
Usage
- Clone the repo.

```shell
git clone https://github.com/chensxb97/mcpObs.git
cd mcpObs
```
- Set up a Python virtual environment and install dependencies.

```shell
python3 -m venv venv
source venv/bin/activate
pip3 install -r requirements.txt
```
- Connect the MCP server to an MCP host (GitHub Copilot, Claude, Cursor, etc.) or run a custom MCP client + LLM programmatically.
GitHub Copilot Integration
- Open the Ask Copilot chat using `Command + Shift + I` and select `Agent` mode.

- Click on the Tools icon. Type and select `Add More Tools`, then select `Add MCP Server`.



- Select `Command (stdio)`, which is the mode of transport defined in `server.py`.

- Provide the run command for your MCP server. In this case, I define the path of the Python executable in the virtual environment and the MCP server file, `server.py`:

```shell
/<path to server file>/venv/bin/<python executable> server.py
```
- On submitting the run command, an MCP server definition* will be generated in GitHub Copilot's `settings.json`. You should see something similar to this:

```json
"mcp": {
    "servers": {
        "my-mcp-server-XXXXX": {
            "type": "stdio",
            "command": "<path to server file>/venv/bin/<python executable>",
            "args": [
                "<path to server file>/server.py"
            ]
        }
    }
}
```
*For newer versions of VS Code, MCP server definitions are managed in `.vscode/mcp.json` instead of `settings.json`.
- You can now start prompting GitHub Copilot to test the MCP server! When Copilot identifies that the tools are relevant to your prompt, it will ask for permission to run them.

- After collecting all relevant data, Copilot combines it with its own response to generate the enhanced output below. Congratulations, you have now successfully set up your own local MCP server for observability!

Custom MCP Client Integration
Alternatively, if you wish to build a custom MCP client + LLM that programmatically connects to the MCP server, an example implementation is provided in `main.py`.
What you need
You will need a Mistral API key; the free tier is sufficient for this POC. No billing info is required unless used in production.
This project follows the instructions outlined in Mistral's docs for the MCP Client setup.
- Set up `.env` with your `MISTRAL_API_KEY`. This file is ignored via `.gitignore`; never commit it.

```shell
touch .env
```

Then add the following line to `.env`:

```
MISTRAL_API_KEY=<YOUR_API_KEY>
```
- Run `main.py`. This will instantiate an LLM agent with an MCP client that establishes a connection with the local MCP server, `server.py`.

```shell
python3 main.py
```
You will be prompted to enter a command (e.g. “summarize recent alerts” or “what caused the latest incident?”). The agent will:
- Decide, based on its knowledge of the MCP server's tools, which tool is most relevant to the input prompt.
- Call the chosen tools to fetch the relevant observability data.
- Pass the returned data to a chat completions API call, which summarises it as an observability expert.
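The steps above can be sketched as a small loop. Here the tool registry, the keyword-based tool picker, and the `summarize` stub all stand in for the real MCP client and Mistral API; every name in this block is hypothetical:

```python
# Stub "tools", standing in for tools discovered from the MCP server.
def get_recent_alerts():
    return [{"id": "A-1", "severity": "critical", "message": "p99 latency high"}]

def get_recent_incidents():
    return [{"id": "INC-1", "summary": "checkout outage"}]

# Tool name -> (callable, trigger keywords).
TOOLS = {
    "get_recent_alerts": (get_recent_alerts, ["alert", "alerts"]),
    "get_recent_incidents": (get_recent_incidents, ["incident", "incidents"]),
}

def pick_tool(prompt):
    """Naive keyword match; the real agent lets the LLM choose via tool-calling."""
    words = prompt.lower()
    for name, (fn, keywords) in TOOLS.items():
        if any(k in words for k in keywords):
            return name, fn
    return None, None

def summarize(prompt, data):
    """Stub for the chat completions call that summarises the fetched data."""
    return f"As an observability expert: {len(data)} record(s) relevant to '{prompt}'."

def agent(prompt):
    name, fn = pick_tool(prompt)
    if fn is None:
        return "No relevant tool found."
    return summarize(prompt, fn())

print(agent("summarize recent alerts"))
```

In the real implementation, tool selection is delegated to the model via Mistral's function-calling interface rather than keyword matching, but the fetch-then-summarise flow is the same.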
Screenshots
Request critical alerts for an application

Asking for recent error logs for an application

Request for an investigation
