local-ai-mcp-chainlit

Connect your Local AI models to ANY MCP using this Chainlit example repo

Chainlit MCP Integration

In this example repo you will learn how to use any MCP together with Chainlit. It is highly recommended to first watch the accompanying tutorial video.

Development Environment
  1. Ensure Python 3.x is installed

  2. It's recommended to create a virtual environment:

    # Create virtual environment
    python -m venv venv
    
    # Activate virtual environment
    # On Windows:
    venv\Scripts\activate
    # On macOS/Linux:
    source venv/bin/activate
    
  3. Install dependencies:

    pip install -r requirements.txt
    
  4. Run Chainlit (the -w flag reloads the app when files change):

    chainlit run app.py -w
  1. Start LM Studio's dev server with a model of your choice that supports tool calls (https://lmstudio.ai/docs/app/api/tools); a minimal client sketch follows after this list

  2. Connect an MCP server and try it out in the Chainlit UI

  3. Extend the chat app however you like with the Chainlit SDK (https://docs.chainlit.io/get-started/overview); see the handler sketch below
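
For step 1, here is a minimal sketch (not the repo's actual code) of how a Python client can talk to LM Studio's local OpenAI-compatible server and pass a tool definition. The base URL, placeholder API key, model name, and the get_time tool are assumptions for illustration; in this app the real tool list would come from the connected MCP server.

    # Sketch: call a local LM Studio model with a tool definition.
    from openai import OpenAI

    client = OpenAI(
        base_url="http://localhost:1234/v1",  # LM Studio's default local server address
        api_key="lm-studio",                  # any non-empty string works for a local server
    )

    # Hypothetical tool schema; in the real app the tools come from the MCP server.
    tools = [{
        "type": "function",
        "function": {
            "name": "get_time",
            "description": "Return the current time",
            "parameters": {"type": "object", "properties": {}},
        },
    }]

    response = client.chat.completions.create(
        model="your-local-model",  # replace with the model loaded in LM Studio
        messages=[{"role": "user", "content": "What time is it?"}],
        tools=tools,
    )

    # If the model decided to call a tool, the request shows up here.
    print(response.choices[0].message.tool_calls)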
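
For step 3, a minimal sketch of a Chainlit handler you could build on. The @cl.on_message decorator and cl.Message class come from the Chainlit SDK; the echo logic is just a placeholder for your own model or MCP tool calls.

    # Sketch: a minimal Chainlit message handler (run with `chainlit run app.py -w`).
    import chainlit as cl

    @cl.on_message
    async def on_message(message: cl.Message):
        # Replace this echo with a call to your local model and MCP tools.
        await cl.Message(content=f"You said: {message.content}").send()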
