ESP32 MCP Server
A Model Context Protocol (MCP) implementation for ESP32 microcontrollers, providing a WebSocket-based interface for resource discovery and monitoring.
Status: Not compiling; initial commit is as-is from the model.
Created with Claude 3.5 Sonnet on the commit date (with minor obvious fixes such as automatic formatting).

Features
- MCP protocol implementation (v0.1.0)
- WebSocket server for real-time updates
- Resource discovery and monitoring
- WiFi configuration via web interface
- Thread-safe request handling
- Comprehensive test suite
- AsyncWebServer integration
- LittleFS support for configuration storage
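As a concrete example of the configuration-storage feature, persisting and reloading a JSON blob with LittleFS on an ESP32 typically looks like the sketch below; the /config.json path and helper names are illustrative assumptions, not the project's actual schema.

#include <Arduino.h>
#include <LittleFS.h>

// Hypothetical config path; the project's actual file name and format may differ.
static const char *kConfigPath = "/config.json";

bool saveConfig(const String &json) {
  File f = LittleFS.open(kConfigPath, "w");   // overwrite any previous config
  if (!f) return false;
  size_t written = f.print(json);
  f.close();
  return written == json.length();
}

String loadConfig() {
  if (!LittleFS.exists(kConfigPath)) return String();  // nothing stored yet
  File f = LittleFS.open(kConfigPath, "r");
  String json = f.readString();
  f.close();
  return json;
}

void setup() {
  Serial.begin(115200);
  if (!LittleFS.begin(true)) {   // true = format the partition if mounting fails
    Serial.println("LittleFS mount failed");
    return;
  }
  Serial.println(loadConfig());
}

void loop() {}
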
Prerequisites
Hardware
- ESP32-S3-DevKitC-1 board
- USB cable for programming
Software
- PlatformIO Core (CLI) or PlatformIO IDE
- Python 3.7 or higher
- Git
Architecture
flowchart TD
    Start[Start Application] --> Setup[Setup]
    Setup -->|Initialize Filesystem| InitFS[Initialize LittleFS]
    Setup -->|Start Network| StartNetwork[Initialize Network Manager]
    Setup -->|Create Tasks| CreateTasks[Create and Assign Tasks]

    subgraph Network
        StartNetwork --> APCheck[Check AP or Connect Mode]
        APCheck -->|Credentials Exist| Connect[Connect to WiFi]
        APCheck -->|No Credentials| StartAP[Start Access Point]
        Connect --> NetworkReady[Network Ready]
        StartAP --> NetworkReady
    end

    subgraph MCP_Server
        MCP[Start MCP Server] --> HandleClient[Handle Client Connections]
        HandleClient --> HandleRequest[Handle Requests]
        HandleRequest -->|WebSocket Events| WebSocket[Handle WebSocket]
        HandleRequest -->|HTTP Endpoints| HTTP[Process HTTP Requests]
    end

    subgraph Metrics
        StartMetrics[Start Metrics System] --> InitMetrics[Initialize System Metrics]
        InitMetrics --> CollectMetrics[Collect Metrics Periodically]
        CollectMetrics --> SaveMetrics[Save Metrics to Filesystem]
    end

    subgraph Logger
        StartLogger[Start uLogger] --> LogMetrics[Log Metrics Data]
        LogMetrics --> CompactLogs[Compact Logs if Necessary]
        CompactLogs -->|Rotates Logs| LogRotation
    end

    CreateTasks -->|Network Task| NetworkTask[Run Network Task on Core]
    CreateTasks -->|MCP Task| MCPTask[Run MCP Server Task on Core]
    NetworkTask --> Network
    MCPTask --> MCP_Server
    MCP_Server --> Metrics
    Metrics --> Logger
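The "Create and Assign Tasks" step in the diagram corresponds to FreeRTOS tasks pinned to the ESP32's two cores. The sketch below is a minimal illustration; the task names, stack sizes, priorities, and core assignments are assumptions rather than the project's actual values.

#include <Arduino.h>

// Illustrative task bodies; the real firmware would drive the network manager
// and MCP server objects shown in the diagram.
void networkTask(void *param) {
  for (;;) {
    // service WiFi / access-point state
    vTaskDelay(pdMS_TO_TICKS(100));
  }
}

void mcpServerTask(void *param) {
  for (;;) {
    // service MCP client connections and queued requests
    vTaskDelay(pdMS_TO_TICKS(10));
  }
}

void setup() {
  // Stack sizes, priorities, and core assignments below are assumptions.
  xTaskCreatePinnedToCore(networkTask,   "network", 4096, nullptr, 1, nullptr, 0);  // core 0
  xTaskCreatePinnedToCore(mcpServerTask, "mcp",     8192, nullptr, 1, nullptr, 1);  // core 1
}

void loop() {
  vTaskDelay(pdMS_TO_TICKS(1000));  // work happens in the pinned tasks
}
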
Installation
- Clone the repository:
git clone https://github.com/yourusername/esp32-mcp-server.git
cd esp32-mcp-server
- Install dependencies:
pio pkg install
- Build and upload the filesystem:
pio run -t uploadfs
- Build and upload the firmware:
pio run -t upload
Usage
Initial Setup
- Power on the ESP32. It will create an access point named "ESP32_XXXXXX"
- Connect to the access point
- Navigate to http://192.168.4.1
- Configure your WiFi credentials
- The device will connect to your network
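Internally, this first-boot behaviour corresponds to the AP-or-connect decision in the architecture diagram. The sketch below is illustrative only; the SSID scheme, timeout, and credential handling are assumptions rather than the project's actual code.

#include <WiFi.h>

// Illustrative AP-or-connect logic; SSID scheme and timeout are assumptions.
void startNetwork(const String &ssid, const String &password) {
  if (ssid.length() > 0) {
    WiFi.mode(WIFI_STA);
    WiFi.begin(ssid.c_str(), password.c_str());
    for (int i = 0; i < 20 && WiFi.status() != WL_CONNECTED; ++i) {
      delay(500);  // wait up to ~10 s for the stored network
    }
  }
  if (WiFi.status() != WL_CONNECTED) {
    // Fall back to an access point named after the chip, e.g. "ESP32_A1B2C3".
    char apName[16];
    snprintf(apName, sizeof(apName), "ESP32_%06X",
             (unsigned)(ESP.getEfuseMac() >> 24));
    WiFi.mode(WIFI_AP);
    WiFi.softAP(apName);  // the configuration portal is then reachable at 192.168.4.1
  }
}
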
MCP Connection
Connect to the MCP server using WebSocket on port 9000:
const ws = new WebSocket('ws://YOUR_ESP32_IP:9000');

ws.onopen = () => {
  // Initialize connection
  ws.send(JSON.stringify({
    jsonrpc: "2.0",
    method: "initialize",
    id: 1
  }));

  // List available resources
  ws.send(JSON.stringify({
    jsonrpc: "2.0",
    method: "resources/list",
    id: 2
  }));
};
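On the device side, the AsyncWebServer integration listed in the features suggests the endpoint is an ESPAsyncWebServer AsyncWebSocket. The following is a minimal sketch of such a JSON-RPC handler, assuming ArduinoJson v6 for parsing and illustrative response fields; it is not the project's actual implementation.

#include <ESPAsyncWebServer.h>
#include <ArduinoJson.h>

AsyncWebServer server(9000);  // MCP WebSocket port used above
AsyncWebSocket ws("/");       // clients connect to ws://<ip>:9000/

void onWsEvent(AsyncWebSocket *socket, AsyncWebSocketClient *client,
               AwsEventType type, void *arg, uint8_t *data, size_t len) {
  // A robust handler would also check AwsFrameInfo for fragmented frames.
  if (type != WS_EVT_DATA) return;

  StaticJsonDocument<512> req;
  if (deserializeJson(req, reinterpret_cast<const char *>(data), len)) return;

  StaticJsonDocument<512> res;
  res["jsonrpc"] = "2.0";
  res["id"] = req["id"];

  const char *method = req["method"] | "";
  if (strcmp(method, "initialize") == 0) {
    res["result"]["protocolVersion"] = "0.1.0";    // illustrative field name
  } else if (strcmp(method, "resources/list") == 0) {
    res["result"].createNestedArray("resources");  // empty placeholder list
  } else {
    res["error"]["code"] = -32601;                 // JSON-RPC "method not found"
    res["error"]["message"] = "Method not found";
  }

  String out;
  serializeJson(res, out);
  client->text(out);
}

void setupMcpServer() {
  ws.onEvent(onWsEvent);
  server.addHandler(&ws);
  server.begin();
}
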
Testing
Run the test suite:
# Run all tests
pio test -e native
# Run specific test
pio test -e native -f test_request_queue
# Run with coverage
pio test -e native --coverage
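Tests in the native environment are plain Unity executables in typical PlatformIO projects. The sketch below shows that general shape; the RequestQueue struct is a hypothetical stand-in rather than the project's real class.

#include <unity.h>

#include <queue>
#include <string>

// Hypothetical stand-in for the class under test; not the project's real API.
struct RequestQueue {
  std::queue<std::string> items;
  void push(const std::string &request) { items.push(request); }
  bool empty() const { return items.empty(); }
};

void setUp() {}     // Unity hook, runs before each test
void tearDown() {}  // Unity hook, runs after each test

void test_queue_starts_empty() {
  RequestQueue q;
  TEST_ASSERT_TRUE(q.empty());
}

void test_push_makes_queue_non_empty() {
  RequestQueue q;
  q.push("{\"jsonrpc\":\"2.0\",\"method\":\"initialize\",\"id\":1}");
  TEST_ASSERT_FALSE(q.empty());
}

int main() {
  UNITY_BEGIN();
  RUN_TEST(test_queue_starts_empty);
  RUN_TEST(test_push_makes_queue_non_empty);
  return UNITY_END();
}
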
Contributing
- Fork the repository
- Create your feature branch (git checkout -b feature/amazing-feature)
- Commit your changes (git commit -m 'Add amazing feature')
- Push to the branch (git push origin feature/amazing-feature)
- Open a Pull Request
License
This project is licensed under the MIT License - see the LICENSE file for details.