SynapticFlow - AI Agent Platform for Everyone
Project Overview
SynapticFlow: Making MCP Tool Integration Accessible to All Users
SynapticFlow is a desktop AI agent platform designed to solve two critical problems in the AI ecosystem:
Accessibility Gap: MCP (Model Context Protocol) tools are powerful but primarily accessible to developers. We make these tools available to general users through an intuitive interface.
LLM Provider Lock-in: Users shouldn't be restricted to a few major LLM providers. SynapticFlow provides freedom to choose from multiple providers, including increasingly powerful local LLMs.
What Problems We Solve
MCP Tool Integration Made Simple
- Problem: MCP tools require technical setup and command-line knowledge
- Solution: Built-in tools and easy MCP server management with GUI
- Benefit: Anyone can use powerful tools without technical barriers
Freedom from LLM Vendor Lock-in
- Problem: Most AI platforms tie users to specific LLM providers
- Solution: Support for multiple providers (OpenAI, Anthropic, Groq, local models, etc.)
- Benefit: Choose the best model for your needs and budget
Personalized AI Agents
- Problem: Generic AI assistants don't fit specific workflows
- Solution: Create custom agents with unique personalities and tool access
- Benefit: AI that works exactly how you want it to
What We Provide
Comprehensive Built-in Tool Ecosystem
Secure File Management:
- SecureFileManager: Advanced path validation and sandboxed operations
- Content Store: Upload, index, and full-text search across PDF, DOCX, XLSX files
- File Attachments: Smart MIME type handling with preview capabilities
- Document Processing: Extract and analyze content from multiple formats
Code Execution & Development:
- Python Sandbox: Secure code execution with real-time result capture
- TypeScript Runtime: JavaScript/TypeScript evaluation environment
- Output Management: Comprehensive execution logging and error handling
- Development Tools: Built-in debugging and testing utilities
Browser Automation:
- Interactive Browser Server: Automated web interactions and scraping
- Session Management: Persistent browser sessions with state management
- Content Extraction: Clean markdown conversion from web pages
- Web Integration: Seamless web data processing pipeline
Advanced MCP Integration:
- Dual Backend Support: Both Rust Tauri and Web Worker implementations
- Security Validation: Built-in SecurityValidator with comprehensive protection
- Tool Execution Context: Unified calling interface across all backends
- Error Normalization: Robust error handling and reporting system
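Because tool calls can be served by either backend, they need one calling surface and one error shape. The sketch below illustrates what such a unified execution context with error normalization could look like; all type and function names here are illustrative assumptions, not SynapticFlow's actual API:

```typescript
// Illustrative sketch only: names and shapes are assumptions,
// not SynapticFlow's real tool-execution API.
interface ToolCall {
  serverId: string;
  toolName: string;
  args: Record<string, unknown>;
}

interface ToolResult {
  ok: boolean;
  output?: unknown;
  error?: string; // normalized, backend-agnostic error message
}

// A backend is anything that can execute a tool call
// (Tauri IPC to Rust, a Web Worker, ...).
type Backend = (call: ToolCall) => Promise<unknown>;

async function executeTool(backend: Backend, call: ToolCall): Promise<ToolResult> {
  try {
    const output = await backend(call);
    return { ok: true, output };
  } catch (err) {
    // Error normalization: collapse backend-specific failures into one shape.
    const message = err instanceof Error ? err.message : String(err);
    return { ok: false, error: `${call.serverId}/${call.toolName}: ${message}` };
  }
}
```

The point of the pattern is that callers never see which backend ran the tool; they only ever handle `ToolResult`.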
Advanced Multi-LLM Ecosystem
8 Major Providers, 50+ Models:
- OpenAI: GPT-4.1 series, o3/o4-mini reasoning models, GPT-4o variants
- Anthropic: Claude 4 Opus/Sonnet, Claude 3.5 series with advanced tool calling
- Google: Gemini 2.5 Pro/Flash (2M context), Gemini 2.0 agentic models
- Groq: Llama 3.3 70B, DeepSeek R1 Distill, Qwen3 32B reasoning (1,800+ tokens/sec)
- Fireworks: DeepSeek R1, Qwen3 235B MoE, Llama 4 Maverick/Scout
- Cerebras: Ultra-fast inference with industry-leading speed
- Ollama: Local models with zero cost (Llama, Mistral, Qwen, CodeLlama)
- Empty: Custom provider configurations
Advanced Features:
- Reasoning Models: o3, DeepSeek R1, Qwen3 thinking models for complex problem-solving
- Cost Optimization: Real-time cost tracking and model comparison
- Massive Context: Up to 2M tokens (Gemini 2.5 Pro)
- Multimodal: Vision, document processing, and code understanding
User-Friendly Features
- Custom Agents: Create AI assistants with specific roles and tool access
- Multi-Agent Collaboration: Multiple agents working together on complex tasks
- Session Management: Organize conversations with file attachments and context
- Export/Import: Share agent configurations and setups with others
- Modern UI: Clean, terminal-style interface that's both powerful and intuitive
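A shareable agent configuration is, at its core, just serialized JSON that another installation can validate and load. A hypothetical sketch of export/import (field names are assumed for illustration; SynapticFlow's real schema may differ):

```typescript
// Hypothetical agent-config shape; the actual SynapticFlow schema may differ.
interface AgentConfig {
  name: string;
  systemPrompt: string;
  provider: string;       // e.g. "openai", "anthropic", "ollama"
  model: string;
  enabledTools: string[]; // tool identifiers this agent may call
}

function exportAgent(agent: AgentConfig): string {
  return JSON.stringify(agent, null, 2);
}

function importAgent(json: string): AgentConfig {
  const parsed = JSON.parse(json) as Partial<AgentConfig>;
  // Minimal validation before trusting a file someone shared with you.
  if (typeof parsed.name !== "string" || typeof parsed.model !== "string") {
    throw new Error("Invalid agent configuration");
  }
  return {
    name: parsed.name,
    systemPrompt: parsed.systemPrompt ?? "",
    provider: parsed.provider ?? "openai",
    model: parsed.model,
    enabledTools: parsed.enabledTools ?? [],
  };
}
```

Validating on import matters because shared configurations come from untrusted sources; defaults fill any optional fields the sender omitted.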
Advanced Technology Stack
Core Framework:
- Tauri 2.x: Latest cross-platform framework with enhanced security and performance
- React 18.3: Modern UI with concurrent features and advanced hooks
- TypeScript 5.6: Latest language features with strict type safety
- RMCP 0.2.1: Rust-based Model Context Protocol with child process transport
Backend Technologies:
- Rust: High-performance native operations with async/await architecture
- Tokio: Advanced async runtime for concurrent MCP server management
- SecurityValidator: Built-in path validation and process sandboxing
- Warp: HTTP server infrastructure for browser automation capabilities
Frontend Technologies:
- Tailwind CSS 4.x: Latest utility-first styling with performance optimizations
- Radix UI: Accessible component primitives for robust UI
- Dexie: TypeScript-friendly IndexedDB wrapper for local data
- Zustand: Lightweight, scalable state management solution
Security & Performance
Built-in Security:
- SecurityValidator: Advanced path traversal protection and sandboxed operations
- MIME Type Validation: Safe file handling across all supported formats
- Process Isolation: MCP servers run in isolated child processes for maximum security
- API Key Management: Secure in-app credential storage with encryption
- Content Sanitization: Automatic cleaning and validation of all user inputs
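Path traversal protection of the kind listed above usually boils down to resolving a requested path against a sandbox root and rejecting anything that escapes it. A minimal sketch of that check (not SynapticFlow's actual SecurityValidator, which lives in the Rust backend and is more thorough):

```typescript
import * as path from "node:path";

// Minimal path-traversal check: a requested path is allowed only if,
// after resolution, it still lies inside the sandbox root.
function isPathAllowed(sandboxRoot: string, requested: string): boolean {
  const root = path.resolve(sandboxRoot);
  const target = path.resolve(root, requested);
  // The root itself is allowed; anything else must be strictly inside it.
  return target === root || target.startsWith(root + path.sep);
}
```

Resolving first is the key step: naive string checks miss `..` segments and absolute paths, both of which `path.resolve` normalizes away before the prefix comparison.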
Performance Optimizations:
- Streaming Responses: Real-time AI model outputs with minimal latency
- Concurrent Tool Execution: Parallel MCP server operations for faster results
- Smart Caching: Intelligent resource caching for improved response times
- Memory Management: Optimized for long-running sessions and large datasets
- Ultra-Fast Models: Cerebras integration delivering 1,800+ tokens/second
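Concurrent tool execution, as described above, is essentially fan-out over independent tool calls. A hedged sketch of the pattern (illustrative, not SynapticFlow's internal scheduler):

```typescript
// Illustrative only: run independent async tasks in parallel and collect
// results without letting one failure abort the whole batch.
async function runConcurrently<T>(
  tasks: Array<() => Promise<T>>,
): Promise<Array<T | Error>> {
  const settled = await Promise.allSettled(tasks.map((t) => t()));
  return settled.map((s) =>
    s.status === "fulfilled" ? s.value : new Error(String(s.reason)),
  );
}
```

Using `Promise.allSettled` rather than `Promise.all` is the design choice that matters here: one slow or failing MCP server should degrade a single result, not the entire batch.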
Project Structure
synaptic-flow/
├── src/                          # React Frontend (Feature-Driven Architecture)
│   ├── app/                      # App entry, root layout, global providers
│   │   ├── App.tsx               # Main application component
│   │   ├── main.tsx              # React entry point
│   │   └── globals.css           # Global styles
│   ├── assets/                   # Static assets (images, svgs)
│   ├── components/               # Shared UI components (20+ shadcn/ui components)
│   │   ├── ui/                   # shadcn/ui component library
│   │   ├── layout/               # App layout components
│   │   └── common/               # Reusable common components
│   ├── features/                 # Feature modules (7 major features)
│   │   ├── assistant/            # AI agent management and configuration
│   │   ├── chat/                 # Real-time chat interface with tool calling
│   │   ├── group/                # Multi-agent collaboration system
│   │   ├── history/              # Conversation history and search
│   │   ├── prompts/              # Prompt management and templates
│   │   ├── session/              # Session management with file attachments
│   │   ├── settings/             # Configuration and API key management
│   │   └── tools/                # Built-in tool ecosystem and MCP integration
│   ├── context/                  # React context system (8 specialized contexts)
│   │   ├── AssistantContext.tsx  # Agent state management
│   │   ├── BuiltInToolContext.tsx # Tool execution context
│   │   ├── MCPServerContext.tsx  # MCP server management
│   │   └── ...                   # Additional contexts
│   ├── hooks/                    # Custom React hooks
│   │   ├── use-rust-backend.ts   # Tauri backend integration
│   │   ├── use-mcp-server.ts     # MCP server management
│   │   └── ...                   # Feature-specific hooks
│   ├── lib/                      # Service layer and business logic
│   │   ├── ai-service.ts         # LLM provider integration
│   │   ├── logger.ts             # Centralized logging system
│   │   ├── secure-file-manager.ts # Advanced file operations
│   │   ├── rust-backend-client.ts # Backend communication layer
│   │   └── ...                   # Additional services
│   ├── models/                   # TypeScript type definitions
│   │   ├── chat.ts               # Chat and message types
│   │   ├── mcp-types.ts          # MCP protocol types (680+ lines)
│   │   └── llm-config.ts         # LLM configuration types
│   └── config/                   # Configuration files
│       └── llm-providers.json    # LLM provider definitions
├── src-tauri/                    # Rust Backend (Advanced Architecture)
│   ├── src/
│   │   ├── lib.rs                # Main Tauri application
│   │   ├── mcp/                  # MCP server integration modules
│   │   ├── security/             # Security validation and sandboxing
│   │   ├── tools/                # Built-in tool implementations
│   │   └── commands/             # Tauri command definitions
│   ├── Cargo.toml                # Rust dependencies
│   └── tauri.conf.json           # Tauri 2.x configuration
├── docs/                         # Documentation and guides
│   └── history/                  # Refactoring and change history
├── package.json                  # Node.js dependencies and scripts
├── tailwind.config.js            # Tailwind CSS 4.x configuration
└── vite.config.ts                # Vite build configuration
Getting Started
Ready to use SynapticFlow? Here's how to get up and running:
Option 1: Download Release (Recommended)
Visit our Releases page to download the latest version for your operating system.
Option 2: Build from Source
Prerequisites: Ensure you have Rust, Node.js (v18+), and pnpm installed.
Install Dependencies:
pnpm install
Development Commands:
# Development
pnpm tauri dev       # Start development server with hot reload
pnpm dev             # Frontend-only development mode

# Code Quality
pnpm lint            # ESLint checking with strict rules
pnpm lint:fix        # Auto-fix lint issues
pnpm format          # Prettier formatting
pnpm format:check    # Check formatting compliance

# Testing & Building
pnpm test            # Run comprehensive test suite
pnpm build           # Production build optimization
pnpm tauri build     # Create optimized desktop app bundle

# Diagnostics
pnpm diagnose        # System diagnostic for troubleshooting
Next Steps
- Configure Your First LLM: Open Settings and add your preferred AI provider's API key
- Create an Agent: Set up your first AI assistant with specific tools and personality
- Connect MCP Tools: Add external MCP servers or use our built-in tools
- Start Collaborating: Begin conversations with your AI agents
Performance Highlights
Speed & Efficiency:
- Ultra-Fast Models: Cerebras delivering 1,800+ tokens/second
- Cost Optimization: 60-80% cost reduction with smart model selection
- Concurrent Operations: Parallel tool execution for faster results
- Massive Context: Handle up to 2M tokens in single conversations
Documentation
- User Guide: Complete setup and usage instructions
- Architecture: Technical details for developers
- MCP Integration: How to connect and use MCP servers
- Troubleshooting: Common issues and solutions
- Refactoring History: Detailed change logs and improvements
Contributing
We welcome contributions! Here's how you can help:
- Report Issues: Found a bug? Open an issue
- Suggest Features: Have ideas? Share them in our discussions
- Submit Code: Read our Contributing Guide to get started
- Improve Docs: Help make our documentation even better
License
This project is licensed under the MIT License - see the LICENSE file for details.
Support
If SynapticFlow helps you work more efficiently with AI tools, consider:
- Star this repository to show your support
- Share with others who might find it useful
- Report issues to help us improve
- Join discussions to shape the future of the project
Ready to experience a high-freedom AI agent platform? SynapticFlow combines built-in security, fast inference, and the freedom to choose any LLM in one powerful desktop application!