## Installation
### Prerequisites

- .NET SDK 9.0 or later
- Visual Studio 2022 or Visual Studio Code
- An LLM or SLM that supports function calling, for example:
  - [Azure AI Foundry](https://ai.azure.com) to run models in the cloud (e.g. gpt-4o-mini)
  - [GitHub Models](https://github.com/marketplace?type=models) to run models in the cloud (e.g. gpt-4o-mini)
  - [Ollama](https://ollama.com/) to run models locally (suggested: phi4-mini, llama3.2, or qwq)
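If you plan to use a local model, the suggested ones can be downloaded ahead of time. A minimal setup sketch, assuming Ollama is already installed and on your `PATH`:

```shell
# Pull one of the suggested local models (llama3.2 used here as an example)
ollama pull llama3.2

# Confirm the model was downloaded and is available to serve
ollama list
```

These commands are environment-dependent: they require Ollama to be installed and will download several gigabytes on first run.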
1. Clone the repository:
2. Navigate to the Aspire project directory:

   ```bash
   cd .\src\McpSample.AppHost\
   ```
3. Run the project:

   ```bash
   dotnet run
   ```
4. In the Aspire Dashboard, navigate to the Blazor Chat client project.

5. In the Chat Settings page, define the model to be used. You can choose models in Azure AI Foundry (suggested: gpt-4o-mini), GitHub Models, or run models locally with Ollama (suggested: llama3.2).

6. Now you can chat with the model. Every time one of the MCP server's functions is called, a **Tool Result** section is displayed in the chat.
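For context on what "functions of the MCP server" means here: in the official MCP C# SDK (the `ModelContextProtocol` NuGet package), a tool is typically a method exposed through attributes. The sketch below is illustrative only — the class, method, and tool names are hypothetical and not taken from this sample's source:

```csharp
using System.ComponentModel;
using ModelContextProtocol.Server;

// Hypothetical tool class; the actual tools in McpSample may differ.
[McpServerToolType]
public static class DemoTools
{
    // The model can invoke this method via function calling; its return
    // value is what surfaces in the chat's Tool Result section.
    [McpServerTool, Description("Adds two integers and returns the sum.")]
    public static int Add(int a, int b) => a + b;
}
```

When the model decides to call a tool like this, the server executes the method and the client renders the returned value in the Tool Result section.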