LLaMa-MCP-Streamlit

An AI assistant built with Streamlit, NVIDIA NIM (LLaMa 3.3:70B) or Ollama, and the Model Context Protocol (MCP).


Installation

Difficulty: Intermediate
Estimated time: 10-20 minutes


Prerequisites

- Python: 3.8 or higher
- Poetry: latest version
- Docker: latest version (only if running via Docker)
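
Before installing, it can help to confirm these tools are actually on your PATH; the following is a convenience sketch (not part of the project) that checks each one and verifies the Python 3.8 minimum:

```shell
# Check that the tools listed above are available.
for tool in python3 poetry docker; do
  if command -v "$tool" >/dev/null 2>&1; then
    echo "$tool: found"
  else
    echo "$tool: MISSING"
  fi
done

# Confirm the Python version meets the 3.8 minimum.
python3 -c 'import sys; assert sys.version_info >= (3, 8), "Python 3.8+ required"' \
  && echo "python version: OK"
```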

Installation Steps

1. Clone Repository

```bash
git clone https://github.com/Nikunj2003/LLaMa-MCP-Streamlit.git
cd LLaMa-MCP-Streamlit
```

2. Install Dependencies

```bash
poetry install
```

3. Set Environment Variables

The .env file in the project root selects the model backend. Edit it to set your API endpoint and key.

For NVIDIA NIM:

```
API_ENDPOINT=https://integrate.api.nvidia.com/v1
API_KEY=your_api_key_here
```

For a local Ollama instance:

```
API_ENDPOINT=http://localhost:11434/v1/
API_KEY=ollama
```
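
As a worked example, the Ollama variant can be written out from the shell; the keys and values below are taken from this README, and `.env` is the file name the app reads:

```shell
# Write a .env selecting the local Ollama backend.
cat > .env <<'EOF'
API_ENDPOINT=http://localhost:11434/v1/
API_KEY=ollama
EOF

# Sanity-check the result.
grep -q '^API_ENDPOINT=' .env && grep -q '^API_KEY=' .env && echo ".env written"
```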

4. Run the App

```bash
poetry run streamlit run llama_mcp_streamlit/main.py
```

Troubleshooting

Issue: The app does not start.
Solution: Confirm that `poetry install` completed without errors and that the .env file defines API_ENDPOINT and API_KEY.
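
If the app starts but cannot reach the model, a quick connectivity probe against the configured endpoint can narrow things down. This is a hedged troubleshooting sketch: it assumes an OpenAI-compatible `/models` route, which both NVIDIA NIM and Ollama's `/v1` API expose:

```shell
# Probe the backend's OpenAI-compatible /models route.
# Defaults to the Ollama endpoint from the .env example above.
ENDPOINT="${API_ENDPOINT:-http://localhost:11434/v1/}"
if curl -fsS "${ENDPOINT%/}/models" >/dev/null 2>&1; then
  echo "endpoint reachable"
else
  echo "endpoint unreachable: is the backend (e.g. ollama serve) running?"
fi
```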

Additional Resources