# Aspire.MCP.Sample

Sample MCP server and MCP client with .NET Aspire.


## Installation

### Prerequisites

- .NET SDK 9.0 or later
- Visual Studio 2022 or Visual Studio Code
- An LLM or SLM that supports function calling:
  - [Azure AI Foundry](https://ai.azure.com) to run models in the cloud, e.g. gpt-4o-mini
  - [GitHub Models](https://github.com/marketplace?type=models) to run models in the cloud, e.g. gpt-4o-mini
  - [Ollama](https://ollama.com/) to run models locally. Suggested: phi4-mini, llama3.2, or Qwq
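If you prefer the local option, pulling one of the suggested models with the Ollama CLI looks like this (a sketch assuming Ollama is already installed; the model name is one of the suggestions above):

```shell
# Pull a suggested small model that supports function calling
ollama pull llama3.2

# Confirm the model is available locally
ollama list
```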
### Steps

1. Clone the repository:

2. Navigate to the Aspire project directory:

   ```bash
   cd .\src\McpSample.AppHost\
   ```

3. Run the project:

   ```bash
   dotnet run
   ```
4. In the Aspire Dashboard, navigate to the Blazor Chat client project.

   ![Aspire Dashboard](./images/20AspireDashboard.png)

5. In the Chat Settings page, define the model to use. You can choose models in Azure AI Foundry (suggested: gpt-4o-mini), GitHub Models, or run them locally with Ollama (suggested: llama3.2).

   ![Chat Settings](./images/25ChatSettings.png)

6. Now you can chat with the model. Every time one of the MCP server's functions is called, the Tool Result section is displayed in the chat.
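For context on where those Tool Results come from: with the official ModelContextProtocol C# SDK, a server-side tool is typically a method annotated for discovery. The sketch below is illustrative only — the class name, method, and description are hypothetical, not the sample's actual code:

```csharp
using System.ComponentModel;
using ModelContextProtocol.Server;

// Marks the class as a container of MCP tools so the server can discover them.
[McpServerToolType]
public static class EchoTools
{
    // Exposed to MCP clients as a callable function; the model decides when to invoke it.
    [McpServerTool, Description("Echoes the provided message back to the caller.")]
    public static string Echo(string message) => $"Echo from MCP server: {message}";
}
```

When the chat client's model issues a function call, the server runs the matching tool and the Blazor chat surfaces the return value in the Tool Result section.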
## Author Information

El Bruno

AI and .NET Advocate @Microsoft, former Microsoft AI MVP (14 years!), lazy runner, lazy podcaster, avg coder

Microsoft, Toronto, Canada
