Turbular
Turbular is an open-source Model Context Protocol (MCP) server that enables seamless database connectivity for Language
Models (LLMs). It provides a unified API interface to interact with various database types, making it perfect for AI
applications that need to work with multiple data sources.
✨ Features
- 🔌 Multi-Database Support: Connect to various database types through a single API
- 🔄 Schema Normalization: Automatically normalize database schemas to consistent naming conventions for LLM compatibility
- 🔒 Secure Connections: Support for SSL and various authentication methods
- 🚀 High Performance: Optimizes your LLM-generated queries
- 📝 Query Transformation: Let the LLM generate queries against the normalized layout and transform them back into their unnormalized form (see the sketch after this list)
- 🐳 Docker Support: Easy deployment with Docker and Docker Compose
- 🔧 Easy to Extend: Add new database providers by extending the BaseDBConnector interface
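To make schema normalization and query transformation concrete, here is a minimal illustrative sketch; the table, column names, and mapping shown are hypothetical examples, not output from the server:

```python
# Hypothetical legacy schema with identifiers that are awkward for an LLM
raw_schema = {"CUSTOMER TBL": ["CustID#", "FullName", "SignupDT"]}

# Normalized view of the same schema that the LLM reasons about
normalized_schema = {"customer_tbl": ["cust_id", "full_name", "signup_dt"]}

# The LLM writes its query against the normalized names ...
llm_query = "SELECT cust_id, full_name FROM customer_tbl"

# ... and Turbular maps it back to the real, unnormalized identifiers
# before executing it against the database.
transformed_query = 'SELECT "CustID#", "FullName" FROM "CUSTOMER TBL"'
```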
🗄️ Supported Databases
| Database | Status |
|---|---|
| PostgreSQL | ✅ |
| MySQL | ✅ |
| SQLite | ✅ |
| BigQuery | ✅ |
| Oracle | ✅ |
| MS SQL | ✅ |
| Redshift | ✅ |
🚀 Quick Start
Using Docker (Recommended)
Clone the repository:

```bash
git clone https://github.com/raeudigerRaeffi/turbular.git
cd turbular
```
Start the development environment:

```bash
docker-compose -f docker-compose.dev.yml up --build
```

Test the connection:

```bash
./scripts/test_connection.py
```
Manual Installation
Install Python 3.11 or higher
Install dependencies:

```bash
pip install -r requirements.txt
```

Run the server:

```bash
uvicorn app.main:app --reload
```
🔌 API Reference
Database Operations
Get Database Schema
POST /get_schema
Retrieve the schema of a connected database for your LLM agent.
Parameters:
- `db_info`: Database connection arguments
- `return_normalize_schema` (optional): Return the schema in an LLM-friendly format
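As a minimal sketch of calling this endpoint with `requests`: the parameter names come from the list above, but the exact request-body layout and response shape are assumptions, and the connection values are placeholders.

```python
import requests

# Placeholder connection arguments (see the Connection Examples section below)
db_info = {
    "database_type": "PostgreSQL",
    "username": "user",
    "password": "password",
    "host": "localhost",
    "port": 5432,
    "database_name": "mydb",
    "ssl": False,
}

# Request the schema in an LLM-friendly (normalized) form
response = requests.post(
    "http://localhost:8000/get_schema",
    json={"db_info": db_info, "return_normalize_schema": True},
)
print(response.json())
```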
Execute Query
POST /execute_query
Optimizes the query and then executes it on the connected database.
Parameters:
- `db_info`: Database connection arguments
- `query`: SQL query string
- `normalized_query`: Boolean indicating if the query is written against the normalized schema
- `max_rows`: Maximum number of rows to return
- `autocommit`: Boolean for autocommit mode
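A similar hedged sketch for executing a query; a JSON body keyed by the documented parameter names is assumed, and the table queried is hypothetical:

```python
import requests

# Placeholder connection arguments, as in the /get_schema sketch
db_info = {
    "database_type": "PostgreSQL",
    "username": "user",
    "password": "password",
    "host": "localhost",
    "port": 5432,
    "database_name": "mydb",
    "ssl": False,
}

response = requests.post(
    "http://localhost:8000/execute_query",
    json={
        "db_info": db_info,
        "query": "SELECT id, name FROM users LIMIT 10",  # hypothetical table
        "normalized_query": False,  # query uses the real, unnormalized names
        "max_rows": 100,
        "autocommit": True,
    },
)
print(response.json())
```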
File Management
Upload BigQuery Key
POST /upload-bigquery-key
Upload a BigQuery service account key file.
Parameters:
- `project_id`: BigQuery project ID
- `key_file`: JSON key file
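A sketch of the upload as a multipart form request; the field names follow the documented parameters, but the exact encoding the server expects is an assumption, and the key path is a placeholder:

```python
import requests

# Hypothetical path to a BigQuery service account key file
with open("/path/to/credentials.json", "rb") as key_file:
    response = requests.post(
        "http://localhost:8000/upload-bigquery-key",
        data={"project_id": "my-project"},
        files={"key_file": ("credentials.json", key_file, "application/json")},
    )
print(response.status_code, response.json())
```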
Upload SQLite Database
POST /upload-sqlite-file
Upload a SQLite database file.
Parameters:
- `database_name`: Name to identify the database
- `db_file`: SQLite database file (.db or .sqlite)
Utility Endpoints
Health Check
GET /health
Verify if the API is running.
List Supported Databases
GET /supported-databases
Get a list of all supported database types.
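Both utility endpoints are plain GET requests; a quick check against the default development server might look like this:

```python
import requests

# Verify that the API is running
print(requests.get("http://localhost:8000/health").json())

# List the database types this server supports
print(requests.get("http://localhost:8000/supported-databases").json())
```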
🔧 Development Setup
Fork and clone the repository
Create a development environment:
```bash
docker-compose -f docker-compose.dev.yml up --build
```
The development server includes:
- FastAPI server with hot reload
- PostgreSQL test database
- Pre-configured test data
Access the API documentation:
- Swagger UI: http://localhost:8000/docs
- ReDoc: http://localhost:8000/redoc
🤝 Contributing
We welcome contributions! Here's how you can help:
- Check out our contribution guidelines
- Look for open issues
- Submit pull requests with improvements
- Help with documentation
- Share your feedback
Development Guidelines
- Follow PEP 8 style guide
- Write tests for new features
- Update documentation as needed
- Use meaningful commit messages
Roadmap
- Add more testing, formatting and commit hooks
- Add SSH support for database connection
- Add APIs as datasources using steampipe
- Enable local schema saving for databases to which the server has already connected
- Add more datasources (snowflake, mongodb, excel, etc.)
- Add authentication protection to routes
🧪 Testing
Run the test suite:
```bash
pytest
```
For development tests with the included PostgreSQL:
```bash
./scripts/test_connection.py
```
📚 Documentation
📝 Connection Examples
PostgreSQL
```python
connection_info = {
    "database_type": "PostgreSQL",
    "username": "user",
    "password": "password",
    "host": "localhost",
    "port": 5432,
    "database_name": "mydb",
    "ssl": False
}
```
BigQuery
```python
connection_info = {
    "database_type": "BigQuery",
    "path_cred": "/path/to/credentials.json",
    "project_id": "my-project",
    "dataset_id": "my_dataset"
}
```
SQLite
```python
connection_info = {
    "type": "SQLite",
    "database_name": "my_database"
}
```
🙏 Acknowledgments
- FastAPI for the amazing framework
- SQLAlchemy for database support
- @henryclickclack Henry Albert Jupiter Hommel as Co-Developer ❤️
- All our contributors and users
📞 Support
- Create an issue
- Email: raffael@turbular.com