local-ai-mcp-chainlit

Connect your Local AI models to ANY MCP using this Chainlit example repo

Chainlit MCP Integration

In this example repo you will learn how to use any MCP server together with Chainlit. It is highly recommended to first watch the accompanying tutorial video.

Development Environment
  1. Ensure Python 3.x is installed

  2. It's recommended to create a virtual environment:

    # Create virtual environment
    python -m venv venv
    
    # Activate virtual environment
    # On Windows:
    venv\Scripts\activate
    # On macOS/Linux:
    source venv/bin/activate
    
  3. Install dependencies:

    pip install -r requirements.txt
    
  4. Run Chainlit (the -w flag reloads the app whenever a file changes):

    chainlit run app.py -w

Usage
  1. Start LM Studio's dev server with a model of your choice that supports tool calls (https://lmstudio.ai/docs/app/api/tools)
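
  If you want to confirm that the loaded model actually answers tool-call requests before opening the Chainlit UI, a small standalone check (not part of this repo) against LM Studio's OpenAI-compatible API could look like the sketch below; it assumes the openai Python package is installed and LM Studio's default base URL of http://localhost:1234/v1, and the model name and the get_weather tool are placeholders:

    # Standalone sanity check: does the local model emit tool calls?
    from openai import OpenAI

    # LM Studio serves an OpenAI-compatible API, by default on port 1234;
    # the api_key value is ignored by LM Studio but required by the client.
    client = OpenAI(base_url="http://localhost:1234/v1", api_key="lm-studio")

    # "get_weather" is a made-up tool, only here to see whether the model
    # produces a tool call instead of a plain text answer.
    tools = [{
        "type": "function",
        "function": {
            "name": "get_weather",
            "description": "Get the current weather for a city",
            "parameters": {
                "type": "object",
                "properties": {"city": {"type": "string"}},
                "required": ["city"],
            },
        },
    }]

    response = client.chat.completions.create(
        model="your-local-model",  # replace with the model loaded in LM Studio
        messages=[{"role": "user", "content": "What is the weather in Berlin?"}],
        tools=tools,
    )
    print(response.choices[0].message.tool_calls)
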

  2. Connect an MCP server and try it out in the Chainlit UI
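
  The MCP server itself is added through the Chainlit UI, but if you want to sanity-check a server outside the app first, a rough sketch using the official MCP Python SDK might look like the following; the mcp package, the npx-launched filesystem server, and the /tmp path are assumptions for illustration, not requirements of this repo:

    # Standalone check of an MCP server, independent of Chainlit:
    # start the server over stdio, list its tools, and print their names.
    import asyncio
    from mcp import ClientSession, StdioServerParameters
    from mcp.client.stdio import stdio_client

    # Example server only; any stdio-based MCP server can be used here.
    server = StdioServerParameters(
        command="npx",
        args=["-y", "@modelcontextprotocol/server-filesystem", "/tmp"],
    )

    async def main():
        async with stdio_client(server) as (read, write):
            async with ClientSession(read, write) as session:
                await session.initialize()
                tools = await session.list_tools()
                print([tool.name for tool in tools.tools])

    asyncio.run(main())
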

  3. Extend the chat app however you like with the Chainlit SDK (https://docs.chainlit.io/get-started/overview)
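
  As a starting point for your own extensions, a stripped-down Chainlit handler that just forwards each message to the LM Studio endpoint could look like the sketch below; this is not the app.py from this repo, and the model name is again a placeholder:

    # minimal_app.py - a bare-bones sketch, not this repo's app.py
    import chainlit as cl
    from openai import AsyncOpenAI

    # Same local endpoint as above; LM Studio ignores the API key.
    client = AsyncOpenAI(base_url="http://localhost:1234/v1", api_key="lm-studio")

    @cl.on_message
    async def on_message(message: cl.Message):
        # Forward the user's message to the local model and send back the reply.
        response = await client.chat.completions.create(
            model="your-local-model",  # replace with the model loaded in LM Studio
            messages=[{"role": "user", "content": message.content}],
        )
        await cl.Message(content=response.choices[0].message.content).send()

  Run it the same way (chainlit run minimal_app.py -w) and build up tool handling, chat history, or custom UI elements from there.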
