workshop-assistant

A Model Context Protocol (MCP) server that bridges VSCode/Claude with local Ollama models. Enables AI-powered development workflows using locally-hosted LLMs through a simple, extensible interface. Features automatic Ollama discovery, dual transport modes (stdio/TCP), and seamless IDE integration.
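The repository's implementation is not shown here, so as a rough sketch of what the "automatic Ollama discovery" feature might do, the snippet below probes Ollama's documented default endpoint (`http://localhost:11434/api/tags`) and lists the locally installed models. The function name `discover_ollama` and the fallback-to-`None` behavior are assumptions for illustration, not the project's actual API.

```python
import json
import urllib.error
import urllib.request

# Ollama serves its HTTP API on port 11434 by default.
OLLAMA_DEFAULT_URL = "http://localhost:11434"

def discover_ollama(base_url: str = OLLAMA_DEFAULT_URL, timeout: float = 2.0):
    """Probe a local Ollama instance; return its model names, or None if unreachable.

    Hypothetical helper -- the real server may discover Ollama differently.
    """
    try:
        # /api/tags is Ollama's endpoint for listing locally available models.
        with urllib.request.urlopen(f"{base_url}/api/tags", timeout=timeout) as resp:
            data = json.load(resp)
        return [m["name"] for m in data.get("models", [])]
    except (urllib.error.URLError, OSError, ValueError):
        # No Ollama running (or malformed response): signal discovery failure.
        return None

if __name__ == "__main__":
    models = discover_ollama()
    if models is None:
        print("Ollama not reachable on", OLLAMA_DEFAULT_URL)
    else:
        print("Discovered models:", models)
```

On a machine with Ollama running, this returns the installed model names; otherwise it returns `None`, which a server could use to fall back or report a clear startup error.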

GitHub Stars: 0
User Rating: Not Rated
Favorites: 0
Views: 21
Forks: 0
Issues: 0

Technical Information

Programming Languages

Python (Primary Language)

System Requirements

No specific requirements are documented

Maintenance Status

Active

GitHub Topics

API, File System, Automation, AI/LLM, Container, Monitoring