A secure local sandbox to run LLM-generated code using Apple containers
GitHub Stars: 511
User Rating: Not Rated
Favorites: 0
Views: 39
Forks: 16
Issues: 5
No specific requirements are documented
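To make the description above concrete, here is a minimal TypeScript sketch of how a caller might hand LLM-generated code to such a sandbox by shelling out to Apple's `container` CLI. This is an illustration only, not this project's implementation: the image name, the CLI flags, and the `runInSandbox` helper are assumptions modeled on Docker-style usage.

```typescript
import { spawn } from "node:child_process";

function runInSandbox(code: string): Promise<string> {
  return new Promise((resolve, reject) => {
    // Assumed invocation: `container run -i --rm python:3.12 python3 -` reads the
    // program from stdin in an ephemeral container; adjust to the real image/entrypoint.
    const proc = spawn("container", ["run", "-i", "--rm", "python:3.12", "python3", "-"]);
    let stdout = "";
    let stderr = "";
    proc.stdout.on("data", (chunk) => (stdout += chunk));
    proc.stderr.on("data", (chunk) => (stderr += chunk));
    proc.on("error", reject);
    proc.on("close", (exitCode) =>
      exitCode === 0 ? resolve(stdout) : reject(new Error(stderr || `exit code ${exitCode}`))
    );
    proc.stdin.write(code); // the untrusted snippet never touches the host shell
    proc.stdin.end();
  });
}

// Example: run a trivial generated snippet and print its output.
runInSandbox('print("hello from the sandbox")').then(console.log).catch(console.error);
```

Piping the program over stdin keeps the generated code out of the host shell entirely, and the container is discarded after each run.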
MCP LLM is a server that provides access to LLMs through the LlamaIndexTS library. It exposes capabilities such as code generation, documentation generation, and question answering to support developer workflows.
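As a rough illustration of how a client could talk to a server like this, the TypeScript sketch below connects over stdio using the official MCP SDK and invokes a single tool. The launch command, the `ask_question` tool name, and its argument shape are assumptions; the real tool names should be read from `listTools()` at runtime.

```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

async function main() {
  // Assumed launch command for the MCP LLM server process.
  const transport = new StdioClientTransport({ command: "node", args: ["dist/index.js"] });
  const client = new Client({ name: "demo-client", version: "1.0.0" }, { capabilities: {} });
  await client.connect(transport);

  // Discover what the server actually advertises (code generation, documentation
  // generation, question answering, per the description above).
  const { tools } = await client.listTools();
  console.log(tools.map((tool) => tool.name));

  // Hypothetical tool name and argument shape, for illustration only.
  const result = await client.callTool({
    name: "ask_question",
    arguments: { question: "How do I paginate results in this API?" },
  });
  console.log(result);

  await client.close();
}

main().catch(console.error);
```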
A model-driven approach to building AI agents in just a few lines of code.