open-responses-server
Wraps any OpenAI-compatible API as a Responses API with MCP support, so it can back Codex, and adds the missing stateful features. Compatible with Ollama and vLLM.
GitHub stats: 87 stars · 11 forks · 10 open issues
Installation

Difficulty: Beginner · Estimated time: 5-10 minutes
Prerequisites
Python: 3.7 or higher
pip: Latest version
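You can check both prerequisites from a shell before installing:

```shell
# Confirm Python 3.7+ and a working pip are on the PATH
python3 --version
python3 -m pip --version
```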
Installation Steps
1. Install from PyPI
```bash
pip install open-responses-server
```
2. Install from Source
```bash
# From a checkout of the repository
uv venv
uv pip install .
uv pip install -e ".[dev]"  # with dev dependencies
```
3. Start the Server
```bash
# Using the CLI tool (after installation)
otc start

# Or directly from source
uv run src/open_responses_server/cli.py start
```
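Once the server is up, you can exercise it with a Responses-style request. This is a sketch only: the `/v1/responses` path, the port `8080`, and the model name are assumptions based on the defaults shown in the Docker step, not verified against this project.

```shell
# Build a minimal Responses API request body.
# The model name is an assumption; use one your backing LLM API actually serves.
BODY='{"model":"gpt-4o-mini","input":"Say hello"}'
echo "$BODY"

# Send it to the proxy once the server is running (path and port are assumptions):
# curl -s http://localhost:8080/v1/responses \
#   -H "Content-Type: application/json" \
#   -H "Authorization: Bearer your-api-key" \
#   -d "$BODY"
```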
4. Docker Deployment
```bash
# Run with Docker
docker run -p 8080:8080 \
  -e OPENAI_BASE_URL_INTERNAL=http://your-llm-api:8000 \
  -e OPENAI_BASE_URL=http://localhost:8080 \
  -e OPENAI_API_KEY=your-api-key \
  ghcr.io/teabranch/open-responses-server:latest
```
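Since the project advertises Ollama compatibility, the container can be pointed at a local Ollama instance. This is a sketch under assumptions: Ollama's OpenAI-compatible API defaults to `http://localhost:11434/v1`, and `host.docker.internal` resolves to the host on Docker Desktop (on Linux you may need `--add-host` or host networking instead).

```shell
# Backend URL for a local Ollama instance (11434 is Ollama's default port;
# /v1 is its OpenAI-compatible prefix).
BACKEND_URL="http://host.docker.internal:11434/v1"
echo "$BACKEND_URL"

# Ollama ignores the API key, so any placeholder value works:
# docker run -p 8080:8080 \
#   -e OPENAI_BASE_URL_INTERNAL="$BACKEND_URL" \
#   -e OPENAI_BASE_URL=http://localhost:8080 \
#   -e OPENAI_API_KEY=ollama \
#   ghcr.io/teabranch/open-responses-server:latest
```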