lightrag-mcp
The LightRAG MCP Server acts as a bridge between the LightRAG API and MCP-compatible clients. It provides information retrieval, document management, knowledge graph operations, and API status monitoring, making it straightforward to integrate LightRAG with various AI tools.
Installation
Difficulty: Intermediate
Estimated Time: 10-20 minutes
Requirements: Python 3.11+; a running LightRAG API server
Prerequisites
Python: 3.11 or higher
Running LightRAG API server: Must be operational
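The Python version floor can be checked up front with a short snippet (an illustrative helper, not part of the package):

```python
import sys

def python_meets_floor(required=(3, 11)):
    """Return True when the running interpreter satisfies the version floor."""
    return sys.version_info[:2] >= required

if not python_meets_floor():
    print("Python 3.11+ is required for lightrag-mcp")
```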
Installation Steps
1. Create a Virtual Environment
bash
uv venv --python 3.11
2. Install the Package in Development Mode
bash
uv pip install -e .
Troubleshooting
Common Issues
Issue: Server won't start
Solution: Check the Python version and reinstall dependencies.
Issue: LightRAG API is not running
Solution: Ensure the LightRAG API server is correctly started.
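The "LightRAG API is not running" case can be diagnosed before starting the MCP server with a small connectivity sketch (a hypothetical helper, assuming the API's default address of localhost:9621):

```python
import socket

def api_is_reachable(host="localhost", port=9621, timeout=2.0):
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if not api_is_reachable():
    print("LightRAG API is not reachable on localhost:9621 -- start it first")
```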
Configuration
MCP Client Setup
Add the following configuration to your MCP client configuration file (e.g., mcp-config.json):
json
{
  "mcpServers": {
    "lightrag-mcp": {
      "command": "uvx",
      "args": [
        "lightrag_mcp",
        "--host",
        "localhost",
        "--port",
        "9621",
        "--api-key",
        "your_api_key"
      ]
    }
  }
}
Advanced Configuration
Store the API key outside the configuration file where possible (for example, in an environment variable) rather than committing it in plain text.
Modify the server host and port as needed.
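Many MCP clients support an "env" block in the server entry, which keeps secrets out of the args array. Whether lightrag-mcp reads a variable such as LIGHTRAG_API_KEY is an assumption here; verify the supported variable name in the project README before relying on this pattern:

```json
{
  "mcpServers": {
    "lightrag-mcp": {
      "command": "uvx",
      "args": ["lightrag_mcp", "--host", "localhost", "--port", "9621"],
      "env": {
        "LIGHTRAG_API_KEY": "your_api_key"
      }
    }
  }
}
```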
Examples
Basic Usage
Starting LightRAG API
bash
uv run LightRAG/lightrag/api/lightrag_server.py \
  --host localhost --port 9621 \
  --working-dir ./rag_storage --input-dir ./input \
  --llm-binding openai --embedding-binding openai \
  --log-level DEBUG
Starting MCP Server
bash
uv run src/lightrag_mcp/main.py --host localhost --port 9621 --api-key your_api_key
Programmatic Usage
python
import requests

def call_mcp_tool(tool_name, params):
    """POST a tool call to the MCP endpoint and return the decoded JSON."""
    response = requests.post(
        'http://localhost:9621/mcp/call',
        json={
            'tool': tool_name,
            'parameters': params
        }
    )
    response.raise_for_status()  # surface HTTP errors instead of decoding an error body
    return response.json()

result = call_mcp_tool('analyze', {'input': 'sample data'})
print(result)
Use Cases
Using the LightRAG MCP Server to search documents for information retrieval in AI tools.
Uploading documents and creating indexes for easy access later.
Managing relationships between entities using a knowledge graph for analysis.
Monitoring the API status to ensure system health.
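All of the use cases above reduce to the same request shape as the earlier call_mcp_tool example. A stdlib-only sketch (the tool names 'query' and 'health' are illustrative assumptions, not confirmed tool identifiers) separates payload construction from transport so the payload can be tested without a running server:

```python
import json
import urllib.request

def build_tool_payload(tool_name, params):
    """Build the JSON body expected by the /mcp/call endpoint."""
    return {"tool": tool_name, "parameters": params}

def call_mcp_tool(tool_name, params, base_url="http://localhost:9621"):
    """POST a tool call to the MCP endpoint and return the decoded response."""
    body = json.dumps(build_tool_payload(tool_name, params)).encode("utf-8")
    req = urllib.request.Request(
        f"{base_url}/mcp/call",
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())

# Illustrative calls -- 'query' and 'health' are assumed tool names:
# call_mcp_tool("query", {"query": "What entities relate to X?"})
# call_mcp_tool("health", {})
```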