Installation

Prerequisites

- Python 3.7 or higher
- pip (latest version)

Installation Steps
1. Clone Repository
```bash
git clone https://github.com/jlowin/fastmcp.git
cd fastmcp
```
2. Install Dependencies
```bash
pip install -r requirements.txt
```
3. Start Server
```bash
fastmcp run server.py
```
Troubleshooting
Common Issues
Issue: Server won't start
Solution: Check your Python version and reinstall the dependencies.

Issue: Client cannot connect to the server
Solution: Verify the server URL and port.
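When the server won't start, the interpreter version is the first thing to rule out. A minimal check against the documented minimum (Python 3.7, per the Prerequisites above):

```python
import sys

# Fail fast if the interpreter is older than the documented minimum (3.7).
if sys.version_info < (3, 7):
    raise RuntimeError(
        f"Python 3.7+ required, found {sys.version_info.major}.{sys.version_info.minor}"
    )
print("Python version OK:", sys.version.split()[0])
```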
Configuration
Basic Configuration
Server Setup
Server configuration is done in the server.py file. Here's a basic setup example:
```python
from fastmcp import FastMCP

mcp = FastMCP("Demo 🚀")

@mcp.tool
def add(a: int, b: int) -> int:
    """Add two numbers"""
    return a + b

if __name__ == "__main__":
    mcp.run()
```
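Conceptually, the @mcp.tool decorator registers the function in a name-to-callable table so the server can dispatch incoming tool calls by name. A plain-Python sketch of that idea (the registry and dispatch names here are illustrative, not FastMCP's actual internals):

```python
from typing import Any, Callable, Dict

# Illustrative registry; FastMCP keeps its own internal equivalent.
TOOLS: Dict[str, Callable[..., Any]] = {}

def tool(fn: Callable[..., Any]) -> Callable[..., Any]:
    """Register a function under its name, mimicking @mcp.tool."""
    TOOLS[fn.__name__] = fn
    return fn

@tool
def add(a: int, b: int) -> int:
    """Add two numbers"""
    return a + b

def dispatch(name: str, params: Dict[str, Any]) -> Any:
    """Look up a registered tool and call it with keyword arguments."""
    return TOOLS[name](**params)

print(dispatch("add", {"a": 5, "b": 3}))  # 8
```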
Environment Variables
Set the following environment variables as needed:

```bash
export API_KEY="your-api-key"
export DEBUG="true"
```
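On the Python side these variables can be read with the standard library. A minimal sketch (the variable names match the exports above; the defaults are illustrative assumptions):

```python
import os

# Read the variables exported above; defaults apply when they are unset.
api_key = os.environ.get("API_KEY", "")
debug = os.environ.get("DEBUG", "false").lower() == "true"

print("debug enabled:", debug)
```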
Advanced Configuration
Security Settings
Examples
Basic Usage
Here are basic usage examples for the MCP server:

Running the Server
```bash
fastmcp run server.py
```
Programmatic Usage
```python
# Python example
import requests

def call_mcp_tool(tool_name, params):
    response = requests.post(
        'http://localhost:3000/mcp/call',
        json={
            'tool': tool_name,
            'parameters': params
        }
    )
    return response.json()

# Usage example
result = call_mcp_tool('add', {'a': 5, 'b': 3})
print(result)  # 8
```
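Before sending a request it can help to validate the payload shape separately from the network call. A small sketch (the payload keys mirror the example above and are assumptions for illustration, not a documented FastMCP wire format):

```python
def build_mcp_payload(tool_name, params):
    """Validate inputs and build the JSON body used by call_mcp_tool above."""
    if not isinstance(tool_name, str) or not tool_name:
        raise ValueError("tool_name must be a non-empty string")
    if not isinstance(params, dict):
        raise TypeError("params must be a dict of parameter names to values")
    return {'tool': tool_name, 'parameters': params}

print(build_mcp_payload('add', {'a': 5, 'b': 3}))
```

Keeping payload construction in a pure function like this makes it easy to unit-test without a running server.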
Use Cases
Additional Resources