TuriX-CUA
TuriX is an open-source tool that uses AI models to perform actions directly on your desktop. You can easily swap in your own model via a config file, and it is free for personal and research use. It ships with a computer-use agent that achieves over 68% accuracy, combining ease of use with strong performance.
TuriX · Desktop Actions, Driven by AI
Talk to your computer, watch it work.
📞 Contact & Community
Join our Discord community for support, discussions, and updates:
Or contact us by email: contact@turix.ai
TuriX lets your powerful AI models take real, hands‑on actions directly on your desktop.
It ships with a state‑of‑the‑art computer‑use agent (passes > 68 % of our internal OSWorld‑style test set) yet stays 100 % open‑source and cost‑free for personal & research use.
Prefer your own model? Change it in config.json and go.
Table of Contents
- 📞 Contact & Community
- 🖼️ Demos
- ✨ Key Features
- 📊 Model Performance
- 🚀 Quick‑Start (macOS 15)
- 🤝 Contributing
- 🗺️ Roadmap
🖼️ Demos
MacOS Demo
Book a flight, hotel, and Uber.
Search for the iPhone price, create a Pages document, and send it to a contact
Generate a bar chart from the Numbers file my boss sent on Discord, insert it in the right place in my PowerPoint, and reply to my boss.
Windows Demo
Search for video content on YouTube and like it
MCP with Claude Demo
Claude searches for AI news, calls TuriX via MCP, writes the research results to a Pages document, and sends it to a contact
✨ Key Features
| Capability | What it means |
|---|---|
| SOTA default model | Outperforms previous open‑source agents (e.g. UI‑TARS) on success rate and speed on Mac |
| No app‑specific APIs | If a human can click it, TuriX can too—WhatsApp, Excel, Outlook, in‑house tools… |
| Hot‑swappable "brains" | Replace the VLM policy without touching code (config.json) |
| MCP‑ready | Hook up Claude for Desktop or any agent via the Model Context Protocol (MCP) |
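If you use Claude for Desktop, MCP servers are registered in its claude_desktop_config.json. The entry below is only a sketch: the command and script path used to launch TuriX's MCP server are assumptions, so check the MCP instructions in this repository for the exact values.
{
  "mcpServers": {
    "turix": {
      "command": "python",
      "args": ["/path/to/turix_mcp_server.py"]
    }
  }
}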
📊 Model Performance
Our agent achieves state-of-the-art performance on desktop automation tasks:
For more details, check our report.
🚀 Quick‑Start (macOS 15)
We never collect data—install, grant permissions, and hack away.
0. Windows Users: Switch to the windows branch for Windows-specific setup and installation instructions:
git checkout windows
1. Download the App
For easier use, download the app.
Or follow the manual setup below:
2. Create a Python 3.12 Environment
First, clone the repository and run:
conda create -n turix_env python=3.12
conda activate turix_env # requires conda ≥ 22.9
pip install -r requirements.txt
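If you have not cloned the repository yet, do that before creating the environment; the URL below is a placeholder, so substitute the actual TuriX repository.
git clone https://github.com/<org-or-user>/TuriX-CUA.git   # placeholder URL
cd TuriX-CUA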
3. Grant macOS Permissions
3.1 Accessibility
- Open System Settings ▸ Privacy & Security ▸ Accessibility
- Click +, then add Terminal, Visual Studio Code, or any IDE you use
- If the agent still fails, also add /usr/bin/python3
3.2 Safari Automation
- Safari ▸ Settings ▸ Advanced → enable Show features for web developers
- In the new Develop menu, enable
- Allow Remote Automation
- Allow JavaScript from Apple Events
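If you prefer the command line, the Allow Remote Automation toggle above can usually also be switched on with Apple's safaridriver tool (it prompts for an administrator password); Allow JavaScript from Apple Events still has to be enabled from the Develop menu.
# Equivalent to enabling "Allow Remote Automation" in the Develop menu
sudo safaridriver --enable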
Trigger the Permission Dialogs (run once per shell)
# macOS Terminal
osascript -e 'tell application "Safari" to do JavaScript "alert(\"Triggering accessibility request\")" in document 1'
# VS Code integrated terminal (repeat to grant VS Code)
osascript -e 'tell application "Safari" to do JavaScript "alert(\"Triggering accessibility request\")" in document 1'
Click "Allow" on every dialog so the agent can drive Safari.
4. Configure & Run
4.1 Edit Task Configuration
[!IMPORTANT]
Task Configuration is Critical: The quality of your task instructions directly impacts success rate. Clear, specific prompts lead to better automation results.
Edit the task in examples/config.json:
{
"agent": {
"task": "open system settings, switch to Dark Mode"
}
}
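As a rule of thumb, name the application, the exact content, and the end state you expect. A more specific task might look like this (the wording is only illustrative, not a required format):
{
  "agent": {
    "task": "Open Safari, search for 'best coffee shops in Seattle', and paste the top three results into a new Notes note titled 'Coffee'"
  }
}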
4.2 Edit API Configuration
Get the key for free from our official web page. Log in to our website; the key is shown at the bottom of the page.
Edit the API settings in examples/config.json:
"llm": {
"provider": "turix",
"api_key": "YOUR_API_KEY",
"base_url": "https://llm.turixapi.io/v1"
}
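Taken together, a minimal examples/config.json combining the two fragments above would look roughly like the sketch below; the shipped config may contain additional fields, so edit the existing file rather than replacing it.
{
  "agent": {
    "task": "open system settings, switch to Dark Mode"
  },
  "llm": {
    "provider": "turix",
    "api_key": "YOUR_API_KEY",
    "base_url": "https://llm.turixapi.io/v1"
  }
}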
4.3 Configure Custom Models (Optional)
If you want to use a model that is not handled by the build_llm function in main.py, you need to define it there first, then set up the config.
main.py:
if provider == "name_you_want":
    return ChatOpenAI(
        model="gpt-4.1-mini", api_key=api_key, temperature=0.3
    )
Switch between ChatOpenAI, ChatGoogleGenerativeAI, and ChatAnthropic based on your LLM, and change the model name accordingly.
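Put together, a custom branch in build_llm might look like the sketch below. The function signature, provider name, and model are placeholders, and the import path assumes the current LangChain packaging; match whatever main.py already imports and keep the existing built-in providers as they are.
# Sketch of a custom provider branch in examples/main.py (names are placeholders)
from langchain_openai import ChatOpenAI

def build_llm(provider: str, api_key: str):
    if provider == "name_you_want":
        # Any ChatOpenAI-compatible model; adjust the model name and temperature as needed
        return ChatOpenAI(
            model="gpt-4.1-mini", api_key=api_key, temperature=0.3
        )
    # ... keep the existing providers (e.g. "turix") here ...
    raise ValueError(f"Unknown provider: {provider}")
Then set "provider": "name_you_want" in the llm section of examples/config.json.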
4.4 Start the Agent
python examples/main.py
Enjoy hands‑free computing 🎉
🤝 Contributing
We welcome contributions! Please read our Contributing Guide to get started.
Quick links:
For bug reports and feature requests, please open an issue.
🗺️ Roadmap
| Quarter | Feature | Description |
|---|---|---|
| 2025 Q3 | ✅ Windows Support | Cross-platform compatibility bringing TuriX automation to Windows environments (Now Available) |
| 2025 Q3 | ✅ Enhanced MCP Integration | Deeper Model Context Protocol support for seamless third-party agent connectivity (Now Available) |
| 2025 Q4 | Next-Gen AI Model | Significantly improved reasoning and task execution capabilities |
| 2025 Q4 | Planner | Understands user intent and makes step-by-step plans to complete tasks |
| 2025 Q4 | Workflow Automation | Record, edit, and replay complex multi-step automation sequences |
| 2026 Q1 | Offline Model Option | Fully local inference for maximum privacy and zero API dependency |
| 2026 Q1 | Persistent Memory | Learn user preferences and maintain task history across sessions |
| 2026 Q2 | Learning by Demonstration | Train the agent by showing it your preferred methods and workflows |
| 2026 Q2 | Windows-Optimized Model | Native Windows model architecture for superior performance on Microsoft platforms |