local-ai-mcp-chainlit
Connect your Local AI models to ANY MCP using this Chainlit example repo
Chainlit MCP Integration
In this example repo you will learn how to use any MCP together with Chainlit. It is highly recommended to first watch the accompanying tutorial video.
Development Environment
Ensure Python 3.x is installed
It's recommended to create a virtual environment:
```shell
# Create virtual environment
python -m venv venv

# Activate virtual environment
# On Windows:
venv\Scripts\activate
# On macOS/Linux:
source venv/bin/activate
```
Install dependencies:
```shell
pip install -r requirements.txt
```
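The authoritative dependency list is the `requirements.txt` shipped with the repo. As a rough orientation only, a minimal setup for this kind of app would need at least Chainlit plus a client for the model server; the package names below are assumptions, not the repo's actual pinned list:

```text
# Hypothetical minimal dependencies (check the repo's requirements.txt)
chainlit
openai
```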
Run Chainlit:
```shell
chainlit run app.py -w
```
Start LM Studio's dev server with a model of your choice that supports tool calls (https://lmstudio.ai/docs/app/api/tools)
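LM Studio's dev server exposes an OpenAI-compatible API (by default at `http://localhost:1234/v1`), and tool support uses the standard function-calling schema. The sketch below shows what a tool definition and the dispatch of a model-produced tool call look like; the `get_weather` tool and the request body are hypothetical illustrations, not code from this repo:

```python
import json

# Tool definition in the OpenAI function-calling schema that tool-capable
# models in LM Studio understand. "get_weather" is a hypothetical example.
TOOLS = [
    {
        "type": "function",
        "function": {
            "name": "get_weather",
            "description": "Return the current weather for a city.",
            "parameters": {
                "type": "object",
                "properties": {"city": {"type": "string"}},
                "required": ["city"],
            },
        },
    }
]

def get_weather(city: str) -> str:
    """Hypothetical local implementation backing the tool definition."""
    return f"Sunny in {city}"

def dispatch_tool_call(tool_call: dict) -> str:
    """Route a model-produced tool call to the matching Python function."""
    name = tool_call["function"]["name"]
    args = json.loads(tool_call["function"]["arguments"])
    if name == "get_weather":
        return get_weather(**args)
    raise ValueError(f"Unknown tool: {name}")

# A request body you would POST to http://localhost:1234/v1/chat/completions.
# "local-model" is a placeholder; LM Studio serves whichever model is loaded.
request_body = {
    "model": "local-model",
    "messages": [{"role": "user", "content": "What's the weather in Tokyo?"}],
    "tools": TOOLS,
}

# When the model decides to use the tool, its response carries a tool call
# shaped like this, which the app then executes locally:
example_tool_call = {
    "function": {"name": "get_weather", "arguments": '{"city": "Tokyo"}'}
}
print(dispatch_tool_call(example_tool_call))  # → Sunny in Tokyo
```

Chainlit's MCP integration follows the same loop: the model picks a tool, the app runs it, and the result is fed back as a message.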
Connect an MCP server and try it out in the Chainlit UI
Extend the chat app however you like with the Chainlit SDK (https://docs.chainlit.io/get-started/overview)