## Installation

**Difficulty:** Intermediate
**Estimated time:** 10-20 minutes

### Prerequisites
- .NET SDK 9.0 or later (a quick verification sketch follows this list)
- Visual Studio 2022 or Visual Studio Code
- An LLM or SLM that supports function calling, via one of:
  - [Azure AI Foundry](https://ai.azure.com) to run models in the cloud, e.g. gpt-4o-mini
  - [GitHub Models](https://github.com/marketplace?type=models) to run models in the cloud, e.g. gpt-4o-mini
  - [Ollama](https://ollama.com/) for running local models. Suggested: phi4-mini, llama3.2, or qwq
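
A minimal sketch for verifying the prerequisites from a terminal; `ollama` is only needed if you plan to run models locally, and `llama3.2` is one of the suggested models above:

```bash
# Confirm the .NET SDK is version 9.0 or later
dotnet --version

# Optional: pull a suggested local model if using Ollama
ollama pull llama3.2
```

### Steps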
1. Clone the repository:
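   A minimal sketch; `<repository-url>` is a placeholder for this sample's repository URL:

   ```bash
   # Clone the sample repository (substitute the actual URL)
   git clone <repository-url>
   ```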
2. Navigate to the Aspire project directory:

   ```bash
   cd .\src\McpSample.AppHost\
   ```
3. Run the project:

   ```bash
   dotnet run
   ```
4. In the Aspire Dashboard, navigate to the Blazor Chat client project.

5. In the Chat Settings page, define the model to be used. You can choose models from Azure AI Foundry (suggested: gpt-4o-mini), GitHub Models, or run them locally with Ollama (suggested: llama3.2).

6. Now you can chat with the model. Every time one of the MCP server's functions is called, a Tool Result section is displayed in the chat.