Anet MCP Server
A Rust implementation of a Model Context Protocol (MCP) server that enables communication between clients and AI models over a standardized protocol.
This project provides a scalable and asynchronous framework for building AI services using Rust, Tokio, and NATS. It is designed for developers building AI agent systems, LLM-based tools, or custom JSON-RPC 2.0 service layers. The architecture supports real-time message passing, making it ideal for microservices, AI orchestration, and tool-based model interaction.
Features
- JSON-RPC 2.0 compatible API
- NATS transport layer for message passing
- Extensible tool system
- Support for prompts and resources
- Asynchronous request handling with Tokio
Requirements
- Rust 1.70+
- NATS server running locally or accessible via network
Installation
Add the following to your Cargo.toml:
[dependencies]
anet_mcp_server = "0.1.0"
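The usage examples below also pull in Tokio, async-trait, serde_json, and anyhow. If you are reproducing them outside this repository, a dependency section along these lines should work (the version numbers here are illustrative, not pinned by this project):

```toml
[dependencies]
anet_mcp_server = "0.1.0"
tokio = { version = "1", features = ["full"] }
async-trait = "0.1"
serde_json = "1"
anyhow = "1"
```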
Getting Started
Running the Example Server
The repository includes a basic example server that demonstrates core functionality:
# Start a NATS server in another terminal or ensure one is already running
# Example:
nats-server
# Run the example server
cargo run --example basic_server
Testing the Server
You can test the server using the included test client:
cargo run --example test_client
This will send various requests to the server and print the responses.
Usage
Creating a Server
use anet_mcp_server::{
    ServerBuilder, ServerCapabilities,
    transport::nats::NatsTransport,
};
use serde_json::json;

#[tokio::main]
async fn main() -> anyhow::Result<()> {
    let transport = NatsTransport::new("nats://localhost:4222", "mcp.requests").await?;

    let server = ServerBuilder::new()
        .transport(transport)
        .name("my-mcp-server")
        .version("0.1.0")
        .capabilities(ServerCapabilities {
            tools: Some(json!({})),
            prompts: Some(json!({})),
            resources: Some(json!({})),
            notification_options: None,
            experimental_capabilities: None,
        })
        .build()?;

    server.run().await
}
Implementing a Custom Tool
use anet_mcp_server::{Content, Tool};
use async_trait::async_trait;
use serde_json::{json, Value};

struct MyTool;

#[async_trait]
impl Tool for MyTool {
    fn name(&self) -> String {
        "my_tool".to_string()
    }

    fn description(&self) -> String {
        "A custom tool".to_string()
    }

    fn input_schema(&self) -> Value {
        json!({
            "type": "object",
            "properties": {
                "input": { "type": "string" }
            }
        })
    }

    async fn call(&self, _input: Option<Value>) -> anyhow::Result<Vec<Content>> {
        Ok(vec![Content::Text {
            text: "Tool response".to_string(),
        }])
    }
}
API Reference
The server implements the following JSON-RPC methods:
- initialize – Initialize the connection and get server information
- listTools – Get a list of available tools
- callTool – Call a specific tool with arguments
- listResources – Get a list of available resources
- readResource – Read a specific resource
- listPrompts – Get a list of available prompts
- getPrompt – Get a specific prompt with arguments
Architecture
The server follows a modular design:
- server – Core server logic and request handling
- transport – Message transport layer (currently NATS)
- tools – Tool interfaces and implementations
- types – Common data structures
License
MIT License