blazing-art-mcp

blazing-art-mcp is a Model Context Protocol (MCP) memory server written in Rust. It stores structured entities and events in an Adaptive Radix Tree (ART) and serves them to MCP-compatible LLM hosts with microsecond-level lookup latency, making it a compact, fast structured-memory backend for AI applications.

Blazing ART MCP Server

Rust · MCP · Docker · Kubernetes

⚡ Blazing-fast Adaptive Radix Tree (ART)-powered MCP server delivering microsecond-latency structured memory access for Large Language Models. Built in Rust, it avoids the V8/Node.js runtime overhead of JavaScript-based MCP servers and keeps performance predictable.

This server implements the Model Context Protocol so any MCP-compatible LLM can query structured memory over JSON-RPC.

🚀 Performance Characteristics
Dataset Size | Lookup P95 | Prefix Scan (100 matches) | Memory Usage
100k keys    | 8 µs       | 35 µs                     | 12 MB
1M keys      | 11 µs      | 60 µs                     | 85 MB

Note: These latencies are dominated by JSON serialization rather than ART traversal; the tree lookup itself accounts for only a small fraction of the total, which is a measure of how efficient the core tree operations are.

πŸ—οΈ Architecture
β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”    MCP Protocol     β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”
β”‚   LLM Host      │◄──────────────────►│  Memory Server   β”‚
β”‚ (Claude/etc.)   β”‚   JSON-RPC 2.0     β”‚   (Rust + ART)   β”‚
β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜                     β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜
                                               β”‚
                                               β–Ό
                                        β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”
                                        β”‚ Adaptive     β”‚
                                        β”‚ Radix Tree   β”‚
                                        β”‚ In-Memory    β”‚
                                        β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜
Key Technologies
  • 🦀 Rust: Memory-safe, zero-cost abstractions, predictable performance
  • 🌳 Adaptive Radix Tree: O(k) operations, 8-52 bytes per key, cache-friendly (see the node-layout sketch after this list)
  • 🔌 Model Context Protocol: Standardized LLM integration via JSON-RPC 2.0
  • 🐳 Docker: Statically linked, distroless containers (<10 MB)
  • ☸️ Kubernetes: Production-ready with autoscaling, monitoring, security
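To make the O(k) and per-key-overhead claims concrete, here is a simplified sketch of the adaptive node layout an ART uses. It is illustrative only and does not mirror this crate's actual types: a node is promoted to the next size (4, 16, 48, then 256 child slots) only when it runs out of room, which keeps memory per key small while a lookup touches at most one node per key byte.

// Illustrative only: simplified ART node layout, not this crate's real types.
#[allow(dead_code)]
enum ArtNode<V> {
    Leaf    { key: Box<[u8]>, value: V },
    Node4   { keys: [u8; 4],  children: Vec<ArtNode<V>> },              // linear scan over <= 4 keys
    Node16  { keys: [u8; 16], children: Vec<ArtNode<V>> },              // binary (or SIMD) search
    Node48  { child_index: Box<[u8; 256]>, children: Vec<ArtNode<V>> }, // key byte -> slot indirection
    Node256 { children: Vec<Option<ArtNode<V>>> },                      // direct index by key byte
}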
🛠 Prerequisites

Install Rust 1.76+ using rustup if you don't already have the toolchain:

curl https://sh.rustup.rs -sSf | sh -s -- -y
source $HOME/.cargo/env

Then you can build and test the project with cargo build and cargo test.

🎯 Quick Start
Local Development
# Clone and build
git clone https://github.com/JohnJBoren/blazing-art-mcp.git
cd blazing-art-mcp
cargo build --release

# Run with sample data
./target/release/blazing_art_mcp \
  --entities examples/entities.json \
  --events examples/events.json

# WebSocket mode for remote access
./target/release/blazing_art_mcp \
  --ws 0.0.0.0:4000 \
  --entities examples/entities.json
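
To use the STDIO binary from an MCP host such as Claude Desktop, register it in the host's MCP server configuration. The snippet below is a sketch of the common mcpServers config shape; the exact file name and location depend on the host, and the paths shown are illustrative:

{
  "mcpServers": {
    "blazing-art-mcp": {
      "command": "/path/to/blazing-art-mcp/target/release/blazing_art_mcp",
      "args": ["--entities", "examples/entities.json", "--events", "examples/events.json"]
    }
  }
}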
Docker Deployment
# Build optimized container
docker build -t blazing-art-mcp:latest .

# Run with STDIO (sidecar mode)
docker run -i --rm blazing-art-mcp:latest

# Run with WebSocket (service mode)
docker run -p 4000:4000 -p 3000:3000 blazing-art-mcp:latest \
  --ws 0.0.0.0:4000 --health-port 3000
Kubernetes Production
# Deploy to Kubernetes
kubectl apply -f k8s/deployment.yaml

# Check status
kubectl get pods -n mcp-memory
kubectl logs -f deployment/mcp-memory -n mcp-memory

# Health check
kubectl port-forward svc/mcp-memory-service 3000:3000 -n mcp-memory
curl http://localhost:3000/health/ready
🔧 Configuration
Command Line Options
blazing_art_mcp [OPTIONS]

Options:
  --entities <FILE>      JSON file with entity data to preload
  --events <FILE>        JSON file with event data to preload  
  --ws <ADDRESS>         WebSocket address (e.g., 0.0.0.0:4000)
  --event-limit <NUM>    Max events returned by prefix search [default: 64]
  --health-port <PORT>   Health check port [default: 3000]
  --telemetry           Enable OpenTelemetry tracing
  --health-check        Run health check and exit (for containers)
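
For example, a WebSocket deployment that preloads both datasets, enables tracing, and raises the prefix-search cap could be started like this (file paths are illustrative):

./target/release/blazing_art_mcp \
  --entities data/entities.json \
  --events data/events.json \
  --ws 0.0.0.0:4000 \
  --health-port 3000 \
  --event-limit 128 \
  --telemetry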
Environment Variables
# Logging
RUST_LOG=info                                    # Log level
OTEL_EXPORTER_OTLP_ENDPOINT=http://jaeger:4317  # Telemetry endpoint

# Performance tuning (set automatically)
MIMALLOC_LARGE_OS_PAGES=1                        # Use huge pages if available
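
A typical invocation combining these variables with the --telemetry flag (the collector endpoint is illustrative):

# Verbose logs plus OTLP export to a local collector
RUST_LOG=debug \
OTEL_EXPORTER_OTLP_ENDPOINT=http://localhost:4317 \
./target/release/blazing_art_mcp --telemetry --entities examples/entities.json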
📊 MCP Protocol Interface

The server exposes two primary tools via MCP:

1. Entity Lookup
{
  "tool": "lookupEntity",
  "arguments": {
    "name": "Albert Einstein"
  }
}

Response:

{
  "name": "Albert Einstein",
  "summary": "Theoretical physicist, Nobel Prize 1921...",
  "born": "1879",
  "tags": ["physicist", "relativity", "nobel"]
}
2. Event Search
{
  "tool": "findEvents", 
  "arguments": {
    "prefix": "2023-11"
  }
}

Response:

[
  {
    "id": "2023-11-01:meeting",
    "timestamp": "2023-11-01T10:00:00Z",
    "description": "Team standup meeting",
    "category": "work"
  }
]
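
On the wire, these tool calls are carried inside ordinary MCP JSON-RPC 2.0 messages. As a rough sketch (the exact envelope is defined by the MCP specification rather than this README), the findEvents call above corresponds to a request like:

{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "findEvents",
    "arguments": { "prefix": "2023-11" }
  }
}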
🏭 Production Features
Security Hardening
  • ✅ Non-root containers with distroless base images
  • ✅ Read-only filesystems and dropped capabilities
  • ✅ SBOM generation for supply chain security
  • ✅ Vulnerability scanning with Trivy
  • ✅ Network policies for micro-segmentation
Observability
  • ✅ Health checks (/health/live, /health/ready)
  • ✅ Prometheus metrics (/metrics)
  • ✅ Structured logging with JSON output
  • ✅ OpenTelemetry tracing for distributed systems
  • ✅ Graceful shutdown with statistics logging
High Availability
  • ✅ Horizontal Pod Autoscaling (2-10 replicas)
  • ✅ Pod Disruption Budgets for rolling updates
  • ✅ Anti-affinity rules for zone distribution
  • ✅ Resource limits and quality of service
🔬 Performance Optimizations
Memory Allocator
// 2-6x performance improvement with custom allocator
#[global_allocator]
static GLOBAL: mimalloc::MiMalloc = mimalloc::MiMalloc;
Zero-Copy Serialization
// 10-50x faster than standard JSON with rkyv
use rkyv::{Archive, Serialize, Deserialize};
Cache-Aligned Data Structures
// Optimize for CPU cache lines
#[repr(align(64))]
struct AlignedMemory { ... }
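
As a concrete illustration of the same idea (not the crate's actual type), a counter padded to its own 64-byte cache line looks like this:

// Illustrative: pin a hot value to its own cache line to avoid false sharing.
#[repr(align(64))]
#[allow(dead_code)]
struct AlignedCounter {
    hits: u64, // the remaining bytes of the 64-byte line act as padding
}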
📈 Benchmarking
# Run performance benchmarks
cargo bench

# Memory profiling
cargo run --release -- --entities large_dataset.json &
ps aux | grep blazing_art_mcp  # Check RSS memory

# Load testing with WebSocket
wrk -t12 -c400 -d30s http://localhost:4000/
πŸ› οΈ Development
Building from Source
# Development build
cargo build

# Optimized release build
cargo build --release

# Statically linked musl build (used for the distroless container)
cargo build --release --target x86_64-unknown-linux-musl
Testing
# Unit tests
cargo test

# Integration tests
cargo test --test integration

# Clippy linting
cargo clippy -- -D warnings

# Security audit
cargo audit
Container Development
# Development container with hot reload
docker build --target development -t mcp-memory:dev .
docker run -v $(pwd):/app mcp-memory:dev

# Security scanning
docker build --target security-scan -t mcp-memory:scan .
docker run --rm mcp-memory:scan cat /tmp/trivy-report.sarif
🔄 Data Management
Loading Data

Entities (entities.json):

[
  {
    "name": "Claude Shannon",
    "summary": "Father of information theory...",
    "born": "1916",
    "tags": ["mathematician", "information-theory"]
  }
]

Events (events.json):

[
  {
    "id": "2024-01-15:discovery",
    "timestamp": "2024-01-15T14:30:00Z", 
    "description": "Major breakthrough in quantum computing",
    "category": "science"
  }
]
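
A minimal sketch of how a snapshot in the entity format above could be deserialized at startup, assuming serde and serde_json (the server's actual loader may differ):

// Hypothetical loader for the entities.json format shown above.
use serde::Deserialize;

#[derive(Debug, Deserialize)]
struct Entity {
    name: String,
    summary: String,
    born: String,
    tags: Vec<String>,
}

fn load_entities(path: &str) -> Result<Vec<Entity>, Box<dyn std::error::Error>> {
    let json = std::fs::read_to_string(path)?;                 // read the mounted snapshot
    let entities: Vec<Entity> = serde_json::from_str(&json)?;  // parse the JSON array
    Ok(entities)                                               // ready to insert into the ART
}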
Persistence Strategies
  1. Snapshot Loading: Mount JSON files for initial data load
  2. Runtime Updates: Use MCP tools for dynamic mutations
  3. Graceful Persistence: Flush to disk on shutdown signals
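
For the graceful-persistence step, a common Rust pattern is to wait for a shutdown signal and then write the current state back to disk. The sketch below assumes a tokio runtime and serde_json; it is not this server's actual shutdown path:

// Hypothetical shutdown hook: flush in-memory state to a JSON snapshot on Ctrl-C.
use serde::Serialize;

async fn flush_on_shutdown<T: Serialize>(state: &T, path: &str) -> Result<(), Box<dyn std::error::Error>> {
    tokio::signal::ctrl_c().await?;                   // wait for Ctrl-C (SIGINT)
    let json = serde_json::to_string_pretty(state)?;  // serialize the current tree contents
    std::fs::write(path, json)?;                      // persist the snapshot
    Ok(())
}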
πŸ› Troubleshooting
Common Issues

Container fails health check:

# Check health endpoint directly
docker exec -it <container> /blazing_art_mcp --health-check

# Verify port binding
docker ps | grep mcp-memory

High memory usage:

# Check ART statistics
curl http://localhost:3000/metrics

# Verify data size vs memory usage ratio

Performance degradation:

# Enable debug logging
RUST_LOG=debug ./blazing_art_mcp

# Check for JSON serialization bottlenecks in traces
📝 License

MIT License - see LICENSE for details.

🤝 Contributing
  1. Fork the repository
  2. Create a feature branch: git checkout -b feature/amazing-feature
  3. Commit changes: git commit -m 'Add amazing feature'
  4. Push to branch: git push origin feature/amazing-feature
  5. Open a Pull Request
🙏 Acknowledgments

Built with ❤️ for the future of AI-powered applications