# DotNet2025 – Evergine MCP Demo

This repository contains the source code for the demos presented by Jorge Cantón and Manuel Rodrigo Cabello at the DotNet 2025 event, held in Madrid on Thursday, June 19th, and organized by Plain Concepts.

The demo showcases the implementation of a Model Context Protocol (MCP) Server using the Evergine engine, as well as a console-based MCP Client powered by Large Language Models (LLMs). It demonstrates how to build both the server and client sides of MCP applications in C#, and how to enable interaction between them.


## 🎥 Watch the Conference Recording

Watch on YouTube


## 📁 Projects Included

### 🧠 MCP Server (Evergine-based)

MCP-Server screenshot

This project launches a 3D scene using Evergine and exposes a set of tools via an MCP Server:

- 🧱 Create basic 3D primitives in the scene (cube, sphere, cylinder, etc.).
- 🧹 Delete all previously added primitives from the scene.
- 🤖 Generate a 3D model using Generative AI via TripoAI, with PBR materials applied.
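
With the official MCP C# SDK (the `ModelContextProtocol` NuGet package, currently in preview), tools like these are typically declared as attributed static methods that the server discovers and exposes automatically. A minimal, hypothetical sketch of that pattern follows; the tool names, signatures, and return values are illustrative, not this repository's actual Evergine code:

```csharp
// Hypothetical sketch using the ModelContextProtocol C# SDK (preview).
// The real demo would manipulate an Evergine scene inside these methods.
using System.ComponentModel;
using ModelContextProtocol.Server;

[McpServerToolType]
public static class SceneTools
{
    [McpServerTool, Description("Creates a basic 3D primitive in the scene.")]
    public static string CreatePrimitive(
        [Description("Primitive type: cube, sphere, cylinder, etc.")] string shape)
    {
        // Placeholder: the actual implementation would add an entity
        // to the running Evergine scene.
        return $"Created a {shape}.";
    }

    [McpServerTool, Description("Removes all previously added primitives from the scene.")]
    public static string ClearScene()
    {
        // Placeholder for the real scene-clearing logic.
        return "Scene cleared.";
    }
}
```

The `Description` attributes matter: the MCP client forwards them to the LLM, which uses them to decide when and how to call each tool.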

The server reads its configuration from an App.config file. To enable TripoAI integration, you must provide a valid API key.

```xml
<?xml version="1.0" encoding="utf-8" ?>
<configuration>
  <appSettings>
    <!-- TripoAI settings -->
    <add key="tripo-key" value="[tsk_*****]" />
  </appSettings>
</configuration>
```

*Note: Replace `[tsk_*****]` with your actual TripoAI API key.*

### 💬 MCP Client (Console-based)

MCP-Client screenshot

This is a console application that connects to the MCP Server and allows interaction through an LLM-powered chat interface.

It supports two backends:

- Azure OpenAI (recommended for better comprehension and reasoning)
- Ollama (for running local models; configured for `llama3.2:3b`)
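
Assuming the client also uses the `ModelContextProtocol` C# SDK, launching the server process over stdio and discovering its tools might look roughly like this sketch (the executable path and transport name are placeholders, not this repository's actual layout):

```csharp
// Hypothetical sketch with the ModelContextProtocol C# SDK client API.
using ModelContextProtocol.Client;

var transport = new StdioClientTransport(new StdioClientTransportOptions
{
    Name = "EvergineMCPServer",
    // Placeholder path; the demo reads the real path from App.config.
    Command = "path/to/EvergineMCPServer.Windows.exe",
});

await using var client = await McpClientFactory.CreateAsync(transport);

// Discover the tools the Evergine server exposes,
// then hand them to the LLM backend for function calling.
foreach (var tool in await client.ListToolsAsync())
    Console.WriteLine($"{tool.Name}: {tool.Description}");
```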

The client reads its configuration from a separate App.config file where you can specify:

- The path to the MCP Server executable (relative or absolute)
- Your Azure OpenAI API key, endpoint URL, and model deployment name

```xml
<?xml version="1.0" encoding="utf-8" ?>
<configuration>
  <appSettings>
    <!-- MCP Server executable path -->
    <add key="mcp-server" value="../../../../MCP-Server/EvergineMCPServer.Windows/bin/Debug/net8.0-windows/EvergineMCPServer.Windows.exe" />

    <!-- Azure OpenAI settings -->
    <add key="azure-openAi-apiKey" value="[openai-apikey]" />
    <add key="azure-openAi-apiUrl" value="[openai-apiurl]" />
    <add key="model-deployment-name" value="[openai-model-name]" />
  </appSettings>
</configuration>
```
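
On .NET 8, `System.Configuration.ConfigurationManager` (available as a NuGet package of the same name) is the usual way to read `appSettings` keys like these; presumably the client does something along these lines:

```csharp
// Sketch: reading the App.config keys shown above.
// Requires the System.Configuration.ConfigurationManager NuGet package on .NET 8.
using System.Configuration;

string serverPath = ConfigurationManager.AppSettings["mcp-server"]
    ?? throw new InvalidOperationException("Missing 'mcp-server' setting.");
string apiKey = ConfigurationManager.AppSettings["azure-openAi-apiKey"]
    ?? throw new InvalidOperationException("Missing Azure OpenAI API key.");
```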

During the DotNet2025 event, we used Azure OpenAI due to its superior understanding and reliability, which we recommend for advanced or production scenarios.


## 🚀 Technologies Used

- 💻 Evergine – Real-time graphics engine
- 🔌 Model Context Protocol (MCP)
- 🧠 Azure OpenAI & Ollama for LLM integration
- 🤖 TripoAI for 3D model generation with PBR materials
- 🟦 Fully implemented in C# using .NET 8

## 📚 Purpose

This project serves as a practical reference for:

- Building an MCP Server that exposes real-time 3D tools
- Implementing an MCP Client that interacts with the server through LLMs
- Establishing communication between server and client via MCP
- Integrating Generative AI into a 3D application workflow

## 📝 License

This project is intended for demonstration and educational purposes.
Please refer to individual tool and service licenses (e.g., Azure OpenAI, TripoAI) for their respective terms of use.

## Author Information

Jorge Cantón, Plain Concepts (Spain)
