# Aspire.MCP.Sample

A sample MCP server and MCP client built with .NET Aspire.



## Installation

Prerequisites:

- .NET SDK 9.0 or later
- Visual Studio 2022 or Visual Studio Code
- An LLM or SLM that supports function calling:
  - [Azure AI Foundry](https://ai.azure.com) to run models in the cloud, e.g. gpt-4o-mini
  - [GitHub Models](https://github.com/marketplace?type=models) to run models in the cloud, e.g. gpt-4o-mini
  - [Ollama](https://ollama.com/) to run models locally; suggested: phi4-mini, llama3.2 or QwQ
1. Clone the repository.

2. Navigate to the Aspire project directory:

   ```bash
   cd .\src\McpSample.AppHost\
   ```

3. Run the project:

   ```bash
   dotnet run
   ```
4. In the Aspire Dashboard, navigate to the Blazor Chat client project.

   ![Aspire Dashboard](./images/20AspireDashboard.png)
5. In the Chat Settings page, define the model to be used. You can choose models in Azure AI Foundry (suggested: gpt-4o-mini), GitHub Models, or run them locally with Ollama (suggested: llama3.2).

   ![Chat Settings](./images/25ChatSettings.png)
6. Now you can chat with the model. Every time one of the MCP server's functions is called, the Tool Result section is displayed in the chat.
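The tool calls shown in the chat come from functions the MCP server exposes to the model. As an illustrative sketch only (not the sample's actual code), a minimal MCP server tool in C# using the ModelContextProtocol SDK could look like the following; the `EchoTool` name and stdio transport are assumptions for brevity, while the sample itself is hosted through Aspire:

```csharp
using System.ComponentModel;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Hosting;
using ModelContextProtocol.Server;

var builder = Host.CreateApplicationBuilder(args);

// Register an MCP server and discover [McpServerTool] methods in this assembly.
// A stdio transport is used here for simplicity; the Aspire sample wires the
// server into its own hosting model instead.
builder.Services
    .AddMcpServer()
    .WithStdioServerTransport()
    .WithToolsFromAssembly();

await builder.Build().RunAsync();

// A hypothetical tool the model can invoke; the Description attribute is what
// the LLM sees when deciding whether to call the function.
[McpServerToolType]
public static class EchoTool
{
    [McpServerTool, Description("Echoes the message back to the client.")]
    public static string Echo(string message) => $"Echo: {message}";
}
```

When the model decides to call `Echo`, the client forwards the call to the MCP server and renders the returned string in the Tool Result section of the chat.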
## About the Author

El Bruno

AI and .NET Advocate @Microsoft, former Microsoft AI MVP (14 years!), lazy runner, lazy podcaster, avg coder

Microsoft, Toronto, Canada
