Claude-LMStudio Bridge
An MCP server that bridges Claude with local LLMs running in LM Studio.
Overview
This tool allows Claude to interact with your local LLMs running in LM Studio, providing:
- Access to list all available models in LM Studio
- The ability to generate text using your local LLMs
- Support for chat completions through your local models
- A health check tool to verify connectivity with LM Studio
Prerequisites
- Claude Desktop with MCP support
- LM Studio installed and running locally with API server enabled
- Python 3.8+ installed
Quick Start (Recommended)
For macOS/Linux:
- Clone the repository
git clone https://github.com/infinitimeless/claude-lmstudio-bridge.git
cd claude-lmstudio-bridge
- Run the setup script
chmod +x setup.sh
./setup.sh
- Follow the setup script's instructions to configure Claude Desktop
For Windows:
- Clone the repository
git clone https://github.com/infinitimeless/claude-lmstudio-bridge.git
cd claude-lmstudio-bridge
- Run the setup script
setup.bat
- Follow the setup script's instructions to configure Claude Desktop
Manual Setup
If you prefer to set things up manually:
- Create a virtual environment (optional but recommended)
python -m venv venv
source venv/bin/activate # On Windows: venv\Scripts\activate
- Install the required packages
pip install -r requirements.txt
- Configure Claude Desktop:
- Open Claude Desktop preferences
- Navigate to the 'MCP Servers' section
- Add a new MCP server with the following configuration:
- Name: lmstudio-bridge
- Command: /bin/bash (on macOS/Linux) or cmd.exe (on Windows)
- Arguments:
- macOS/Linux: /path/to/claude-lmstudio-bridge/run_server.sh
- Windows: /c C:\path\to\claude-lmstudio-bridge\run_server.bat
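Depending on your Claude Desktop version, MCP servers can also be registered by editing claude_desktop_config.json directly. A hypothetical entry matching the settings above (the path is a placeholder; adjust it to where you cloned the repository):

```json
{
  "mcpServers": {
    "lmstudio-bridge": {
      "command": "/bin/bash",
      "args": ["/path/to/claude-lmstudio-bridge/run_server.sh"]
    }
  }
}
```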
Usage with Claude
After setting up the bridge, you can use the following commands in Claude:
- Check the connection to LM Studio:
Can you check if my LM Studio server is running?
- List available models:
List the available models in my local LM Studio
- Generate text with a local model:
Generate a short poem about spring using my local LLM
- Send a chat completion:
Ask my local LLM: "What are the main features of transformers in machine learning?"
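Under the hood, LM Studio exposes an OpenAI-compatible API, so a chat request like the one above reduces to a JSON payload POSTed to /v1/chat/completions. A minimal sketch of such a request (the endpoint follows LM Studio's OpenAI-compatible convention; the bridge's actual request code may differ):

```python
import json
import urllib.request


def build_chat_payload(model: str, user_message: str, temperature: float = 0.7) -> dict:
    """Build an OpenAI-style chat completion payload."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": user_message}],
        "temperature": temperature,
    }


def send_chat(host: str = "127.0.0.1", port: int = 1234, **kwargs) -> dict:
    """POST a chat completion request to LM Studio's local API server."""
    url = f"http://{host}:{port}/v1/chat/completions"
    data = json.dumps(build_chat_payload(**kwargs)).encode()
    req = urllib.request.Request(
        url, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)


if __name__ == "__main__":
    reply = send_chat(model="local-model", user_message="What are transformers?")
    print(reply["choices"][0]["message"]["content"])
```

The response follows the OpenAI schema, so the generated text lives in `choices[0].message.content`.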
Troubleshooting
Diagnosing LM Studio Connection Issues
Use the included debugging tool to check your LM Studio connection:
python debug_lmstudio.py
For more detailed tests:
python debug_lmstudio.py --test-chat --verbose
Common Issues
"Cannot connect to LM Studio API"
- Make sure LM Studio is running
- Verify the API server is enabled in LM Studio (Settings > API Server)
- Check that the port (default: 1234) matches what's in your .env file
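As a quick sanity check independent of the bridge, you can query LM Studio's OpenAI-compatible model listing endpoint directly. A minimal sketch assuming the default host and port (adjust to match your .env):

```python
import json
import urllib.request


def models_url(host: str = "127.0.0.1", port: int = 1234) -> str:
    """URL of LM Studio's OpenAI-compatible model listing endpoint."""
    return f"http://{host}:{port}/v1/models"


def list_models(host: str = "127.0.0.1", port: int = 1234) -> list:
    """Return the ids of models currently available in LM Studio."""
    with urllib.request.urlopen(models_url(host, port), timeout=5) as resp:
        data = json.load(resp)
    return [m["id"] for m in data.get("data", [])]


if __name__ == "__main__":
    try:
        print("Available models:", list_models())
    except OSError as e:
        print("Cannot reach LM Studio API:", e)
```

If this fails while LM Studio is running, the API server is likely disabled or listening on a different port.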
"No models are loaded"
- Open LM Studio and load a model
- Verify the model is running successfully
"MCP package not found"
- Try reinstalling:
pip install "mcp[cli]" httpx python-dotenv
- Make sure you're using Python 3.8 or later
"Claude can't find the bridge"
- Check Claude Desktop configuration
- Make sure the path to run_server.sh or run_server.bat is correct and absolute
- Verify the server script is executable:
chmod +x run_server.sh (on macOS/Linux)
Advanced Configuration
You can customize the bridge behavior by creating a .env file with these settings:
LMSTUDIO_HOST=127.0.0.1
LMSTUDIO_PORT=1234
DEBUG=false
Set DEBUG=true to enable verbose logging for troubleshooting.
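The configuration loading likely resembles the following sketch (the exact variable handling is an assumption; python-dotenv, listed in the requirements, would populate os.environ from the .env file):

```python
import os

# In the real bridge, python-dotenv's load_dotenv() would read .env into
# os.environ first; accepting a mapping here keeps the logic testable.
def load_config(env: dict) -> dict:
    """Resolve bridge settings, falling back to the documented defaults."""
    return {
        "host": env.get("LMSTUDIO_HOST", "127.0.0.1"),
        "port": int(env.get("LMSTUDIO_PORT", "1234")),
        "debug": env.get("DEBUG", "false").lower() == "true",
    }


if __name__ == "__main__":
    print(load_config(dict(os.environ)))
```

Because every setting has a default, the .env file is only needed when you deviate from LM Studio's standard local setup.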
License
MIT