auto-MCP-client
auto-MCP-client is an automation tool developed in Go that enables efficient workflows through integration with APIs. It focuses on automating data retrieval and processing, with a design that is easy for developers to use, and its documentation makes getting started straightforward.
Inspired by https://github.com/mark3labs/mcphost.
Project Goal: Implement an "MCP Engine" library in Go. This library allows Go applications to interface with Large Language Models (LLMs) and dynamically utilize tools exposed by external MCP Servers (processes or services adhering to the Model Context Protocol).
MCP Introduction: The Model Context Protocol (MCP) aims to standardize how applications provide context (like tools and data sources) to Large Language Models (LLMs). It acts like a common interface, enabling LLMs to discover and utilize capabilities provided by different external systems. Read more at https://github.com/modelcontextprotocol/modelcontextprotocol.
Configuration: MCP Servers are configured via a JSON file (e.g., `mcp_servers.json`). The file specifies a map of server names to their configuration details. The library supports servers communicating via stdio (`process`) or `sse`:

- `command`: The executable to run (for the `process` type).
- `args`: Arguments for the command (for the `process` type).
- `url`: The endpoint URL (for the `sse` type).
- `transport`: Communication method (`"process"` or `"sse"`). Can often be inferred.
- `workingDir`: (Optional) Working directory for the server process (`process` type).
- `headers`: (Optional) Headers for the connection (`sse` type).
See `example/mcp_servers.json` for a sample (create this file based on your needs):

```json
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "./fs_data"],
      "transport": "process",
      "workingDir": "."
    },
    "kubernetes": {
      "command": "npx",
      "args": ["-y", "kubernetes-mcp-server@latest"],
      "transport": "process"
    }
    // Example SSE server config (if you have one running)
    // "sse_example": {
    //   "url": "http://localhost:8080/sse",
    //   "transport": "sse",
    //   "headers": {
    //     "Authorization": "Bearer your_token"
    //   }
    // }
  }
}
```
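Note that the `//` comments in the sample are illustrative and must be removed before parsing with a strict JSON parser. As a rough sketch of how this schema maps onto Go types, the structs below show one plausible shape; they are assumptions for illustration, not the library's actual types (which `mcp.LoadConfigFromFile` returns):

```go
package main

import (
	"encoding/json"
	"fmt"
	"os"
)

// Hypothetical Go shapes mirroring the JSON schema above.
// The library's real config types may differ.
type ServerConfig struct {
	Command    string            `json:"command,omitempty"`    // executable ("process" transport)
	Args       []string          `json:"args,omitempty"`       // arguments for the command
	URL        string            `json:"url,omitempty"`        // endpoint URL ("sse" transport)
	Transport  string            `json:"transport,omitempty"`  // "process" or "sse"; often inferable
	WorkingDir string            `json:"workingDir,omitempty"` // optional, "process" type only
	Headers    map[string]string `json:"headers,omitempty"`    // optional, "sse" type only
}

type Config struct {
	MCPServers map[string]ServerConfig `json:"mcpServers"`
}

func main() {
	data, err := os.ReadFile("mcp_servers.json")
	if err != nil {
		panic(err)
	}
	var cfg Config
	if err := json.Unmarshal(data, &cfg); err != nil {
		panic(err)
	}
	for name, s := range cfg.MCPServers {
		fmt.Printf("%s: transport=%q command=%q url=%q\n", name, s.Transport, s.Command, s.URL)
	}
}
```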
Usage (LLM Interaction with MCP Tools):

- Import the library: `import "github.com/Chen-speculation/MCPKit/mcp"` (using the correct module path from your `go.mod`).
- Create an MCP server configuration file (e.g., `mcp_servers.json`).
- Load the MCP configuration: `mcpServers, err := mcp.LoadConfigFromFile("mcp_servers.json")`.
- Prepare the engine configuration, including LLM details (API key, model) and the loaded MCP servers: `engineConfig := mcp.EngineConfig{...}`.
- Initialize the engine: `engine, err := mcp.NewEngine(engineConfig)`. This starts the MCP clients.
- Ensure a clean shutdown: `defer engine.Close()`.
- Start a chat stream: `go engine.ChatStream(ctx, sessionID, prompt, history, outputChan)`.
- Process events (`mcp.ChatEvent`) from `outputChan`: handle text chunks, tool-call requests from the LLM, tool results, the session ID, and errors. A sketch tying these steps together follows the note below.
Note: The current implementation is focused on OpenAI-compatible LLMs for the chat and tool-calling logic.
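Tying the steps together, a minimal end-to-end sketch might look like the following. The `EngineConfig` fields (`OpenAIKey`, `Model`, `MCPServers`), the model name, passing `nil` for the history argument, and the assumption that the engine closes `outputChan` when the stream ends are all illustrative guesses; consult the library's actual definitions.

```go
package main

import (
	"context"
	"fmt"
	"log"
	"os"

	"github.com/Chen-speculation/MCPKit/mcp"
)

func main() {
	// Load MCP server definitions from the JSON file described above.
	mcpServers, err := mcp.LoadConfigFromFile("mcp_servers.json")
	if err != nil {
		log.Fatal(err)
	}

	// NOTE: these field names are assumptions for illustration;
	// check the real mcp.EngineConfig definition.
	engineConfig := mcp.EngineConfig{
		OpenAIKey:  os.Getenv("OPENAI_API_KEY"),
		Model:      "gpt-4o",
		MCPServers: mcpServers,
	}

	// Initialize the engine; this starts the MCP clients.
	engine, err := mcp.NewEngine(engineConfig)
	if err != nil {
		log.Fatal(err)
	}
	defer engine.Close() // clean shutdown

	// Start a chat stream (nil history assumed for a fresh session).
	outputChan := make(chan mcp.ChatEvent)
	go engine.ChatStream(context.Background(), "session-1",
		"List the files in the root directory.", nil, outputChan)

	// Drain events, assuming the engine closes the channel when done.
	// Real code would switch on the event kind (text chunk, tool_call,
	// tool_result, error) instead of printing the whole event.
	for ev := range outputChan {
		fmt.Printf("%+v\n", ev)
	}
}
```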
See `example/example.go` for a runnable demonstration. Run it with flags, e.g.:

```sh
go run example/example.go \
  -openai-key="YOUR_OPENAI_API_KEY" \
  -mcp-config="./example/mcp_servers.json" \
  -prompt="List the files in the root directory using the filesystem tool, then tell me about kubernetes pods."
```
How it Works: The `mcp.Engine` reads the MCP server configuration and creates corresponding MCP clients (using `github.com/mark3labs/mcp-go/client`). When `ChatStream` is called, the engine (currently the `openaiProvider`):

- Fetches the available tools from all connected MCP clients using `ListTools`.
- Formats the user prompt, history, and available tools for the configured LLM (OpenAI API).
- Sends the request to the LLM.
- Processes the LLM's response stream:
  - Yields text chunks via the output channel.
  - If the LLM requests a tool call (using OpenAI's function-calling format), identifies the target MCP client based on the tool name prefix (e.g., `filesystem__list_files`; see the routing sketch after this list).
  - Sends a `tool_call` event.
  - Executes the tool call via the corresponding MCP client using `CallTool`.
  - Sends a `tool_result` event.
  - Sends the tool result back to the LLM to continue the conversation.
  - Repeats the process if further tool calls are needed, until the LLM finishes.
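The prefix-based routing described above can be sketched as follows. This is a standalone illustration of the idea, not the library's actual code; the `__` separator is inferred from the example tool name:

```go
package main

import (
	"fmt"
	"strings"
)

// splitToolName separates a prefixed tool name like "filesystem__list_files"
// into the MCP server name ("filesystem") and the tool name ("list_files").
func splitToolName(full string) (server, tool string, ok bool) {
	server, tool, ok = strings.Cut(full, "__")
	return
}

func main() {
	server, tool, ok := splitToolName("filesystem__list_files")
	if !ok {
		fmt.Println("no server prefix found")
		return
	}
	// The engine would look up the MCP client registered under "server"
	// and invoke CallTool with "tool" and the LLM-provided arguments.
	fmt.Printf("route to client %q, tool %q\n", server, tool)
}
```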