synaptic-flow

Lightning-fast AI companion. SynapticFlow: a high-freedom AI agent platform with multi-agent orchestration and shareable configurations.

🚀 SynapticFlow - AI Agent Platform for Everyone
📋 Project Overview

SynapticFlow: Making MCP Tool Integration Accessible to All Users

SynapticFlow is a desktop AI agent platform designed to solve two critical problems in the AI ecosystem:

  1. Accessibility Gap: MCP (Model Context Protocol) tools are powerful but primarily accessible to developers. We make these tools available to general users through an intuitive interface.

  2. LLM Provider Lock-in: Users shouldn't be restricted to a few major LLM providers. SynapticFlow provides freedom to choose from multiple providers, including increasingly powerful local LLMs.

🎯 What Problems We Solve
🔧 MCP Tool Integration Made Simple
  • Problem: MCP tools require technical setup and command-line knowledge
  • Solution: Built-in tools and easy MCP server management with GUI
  • Benefit: Anyone can use powerful tools without technical barriers
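As a sketch of what "easy MCP server management" can mean under the hood, a GUI ultimately just persists a launch description for each server and spawns it on demand. The interface and field names below are illustrative assumptions, not SynapticFlow's actual schema:

```typescript
// Hypothetical shape of an MCP server entry as a GUI might persist it.
// Field names are illustrative, not SynapticFlow's real configuration type.
interface McpServerEntry {
  name: string;     // display name shown in the GUI
  command: string;  // executable to spawn, e.g. "npx"
  args: string[];   // arguments passed to the child process
  enabled: boolean; // whether the server is started with the app
}

// Render the entry as the command line the backend would execute.
function toLaunchCommand(entry: McpServerEntry): string {
  return [entry.command, ...entry.args].join(" ");
}

const example: McpServerEntry = {
  name: "filesystem",
  command: "npx",
  args: ["-y", "@modelcontextprotocol/server-filesystem", "/tmp"],
  enabled: true,
};
```

The point is that a user fills in three text fields instead of hand-editing a JSON config and a shell command.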
🔓 Freedom from LLM Vendor Lock-in
  • Problem: Most AI platforms tie users to specific LLM providers
  • Solution: Support for multiple providers (OpenAI, Anthropic, Groq, local models, etc.)
  • Benefit: Choose the best model for your needs and budget
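One common way to implement this kind of provider freedom (shown here as a hedged sketch, not SynapticFlow's actual code) is a registry of OpenAI-compatible endpoints, so that switching providers is just a lookup. The names, URLs, and model ids below are illustrative:

```typescript
// Illustrative provider registry: the same chat request can be routed to any
// OpenAI-compatible endpoint by swapping the provider entry. All values here
// are examples, not SynapticFlow's real configuration.
interface ProviderConfig {
  name: string;
  baseUrl: string;
  model: string;
}

const providers: Record<string, ProviderConfig> = {
  openai: { name: "OpenAI", baseUrl: "https://api.openai.com/v1", model: "gpt-4o" },
  groq:   { name: "Groq",   baseUrl: "https://api.groq.com/openai/v1", model: "llama-3.3-70b-versatile" },
  ollama: { name: "Ollama", baseUrl: "http://localhost:11434/v1", model: "llama3" },
};

// Build the chat-completions URL for a given provider key.
function chatEndpoint(key: string): string {
  return `${providers[key].baseUrl}/chat/completions`;
}
```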
🤖 Personalized AI Agents
  • Problem: Generic AI assistants don't fit specific workflows
  • Solution: Create custom agents with unique personalities and tool access
  • Benefit: AI that works exactly how you want it to
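Conceptually, a custom agent boils down to a persona plus a tool allow-list. The `AgentConfig` shape below is a hypothetical illustration, not the platform's real type:

```typescript
// Hypothetical agent definition: a system prompt plus an allow-list of tools.
// Tool names are made-up examples.
interface AgentConfig {
  name: string;
  systemPrompt: string;   // the agent's personality and role
  allowedTools: string[]; // which tools this agent may call
}

function canUseTool(agent: AgentConfig, tool: string): boolean {
  return agent.allowedTools.includes(tool);
}

const researcher: AgentConfig = {
  name: "Researcher",
  systemPrompt: "You summarize web pages into concise notes.",
  allowedTools: ["browser.fetch", "content_store.search"],
};
```

Restricting tool access per agent is what lets several specialized agents coexist safely in one workspace.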
🛠 What We Provide
✅ Comprehensive Built-in Tool Ecosystem

🛡️ Secure File Management:

  • SecureFileManager: Advanced path validation and sandboxed operations
  • Content Store: Upload, index, and full-text search across PDF, DOCX, XLSX files
  • File Attachments: Smart MIME type handling with preview capabilities
  • Document Processing: Extract and analyze content from multiple formats
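The core idea behind sandboxed file operations like those listed above can be sketched in a few lines: resolve every requested path against the sandbox root and reject anything that escapes it. This is a minimal illustration, not SynapticFlow's `SecureFileManager` implementation:

```typescript
import * as path from "path";

// Minimal path-traversal check: resolve the requested path relative to the
// sandbox root and verify the result is still inside that root.
// Illustrative sketch only (POSIX paths assumed).
function isInsideSandbox(sandboxRoot: string, requested: string): boolean {
  const root = path.resolve(sandboxRoot);
  const target = path.resolve(root, requested);
  return target === root || target.startsWith(root + path.sep);
}
```

A request like `../etc/passwd` resolves outside the root and is rejected before any filesystem call happens.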

⚡ Code Execution & Development:

  • Python Sandbox: Secure code execution with real-time result capture
  • TypeScript Runtime: JavaScript/TypeScript evaluation environment
  • Output Management: Comprehensive execution logging and error handling
  • Development Tools: Built-in debugging and testing utilities

🌐 Browser Automation:

  • Interactive Browser Server: Automated web interactions and scraping
  • Session Management: Persistent browser sessions with state management
  • Content Extraction: Clean markdown conversion from web pages
  • Web Integration: Seamless web data processing pipeline
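As a toy illustration of "clean markdown conversion", the essence is mapping a handful of HTML elements to markdown and discarding the rest; a production extractor is far more thorough. Everything below is a simplified assumption, not the project's actual converter:

```typescript
// Toy HTML-to-markdown conversion: map headings and links, strip other tags.
// Regex-based parsing is shown only for illustration; real converters use a
// proper HTML parser.
function htmlToMarkdown(html: string): string {
  return html
    .replace(/<h1[^>]*>(.*?)<\/h1>/g, "# $1\n")
    .replace(/<a [^>]*href="([^"]*)"[^>]*>(.*?)<\/a>/g, "[$2]($1)")
    .replace(/<[^>]+>/g, "")
    .trim();
}
```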

🔗 Advanced MCP Integration:

  • Dual Backend Support: Both Rust Tauri and Web Worker implementations
  • Security Validation: Built-in SecurityValidator with comprehensive protection
  • Tool Execution Context: Unified calling interface across all backends
  • Error Normalization: Robust error handling and reporting system
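Error normalization across heterogeneous backends typically means collapsing whatever a tool throws into one uniform shape the caller can rely on. The `NormalizedError` type below is an assumed example, not the project's actual API:

```typescript
// Sketch of error normalization: whatever a backend throws (Error, string,
// arbitrary object), the caller always sees one uniform shape.
// The type and code values are illustrative assumptions.
interface NormalizedError {
  code: string;
  message: string;
}

function normalizeError(err: unknown): NormalizedError {
  if (err instanceof Error) return { code: "TOOL_ERROR", message: err.message };
  if (typeof err === "string") return { code: "TOOL_ERROR", message: err };
  return { code: "UNKNOWN", message: JSON.stringify(err) };
}
```

With this in place, the Rust backend and the Web Worker backend can fail in different ways while the UI handles a single error shape.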
✅ Advanced Multi-LLM Ecosystem

8 Major Providers, 50+ Models:

  • 🤖 OpenAI: GPT-4.1 series, o3/o4-mini reasoning models, GPT-4o variants
  • 🧠 Anthropic: Claude 4 Opus/Sonnet, Claude 3.5 series with advanced tool calling
  • 🚀 Google: Gemini 2.5 Pro/Flash (2M context), Gemini 2.0 agentic models
  • ⚡ Groq: Llama 3.3 70B, DeepSeek R1 Distill, Qwen3 32B reasoning (1,800+ tokens/sec)
  • 🔥 Fireworks: DeepSeek R1, Qwen3 235B MoE, Llama 4 Maverick/Scout
  • 🧠 Cerebras: Ultra-fast inference with industry-leading speed
  • 🏠 Ollama: Local models with zero cost (Llama, Mistral, Qwen, CodeLlama)
  • 🎯 Empty: Custom provider configurations

Advanced Features:

  • 🤔 Reasoning Models: o3, DeepSeek R1, Qwen3 thinking models for complex problem-solving
  • 💰 Cost Optimization: Real-time cost tracking and model comparison
  • 📚 Massive Context: Up to 2M tokens (Gemini 2.5 Pro)
  • 👁️ Multimodal: Vision, document processing, and code understanding
✅ User-Friendly Features
  • 🤖 Custom Agents: Create AI assistants with specific roles and tool access
  • 👥 Multi-Agent Collaboration: Multiple agents working together on complex tasks
  • 💬 Session Management: Organize conversations with file attachments and context
  • 📤 Export/Import: Share agent configurations and setups with others
  • 🎨 Modern UI: Clean, terminal-style interface that's both powerful and intuitive
🛠 Advanced Technology Stack

Core Framework:

  • Tauri 2.x: Latest cross-platform framework with enhanced security and performance
  • React 18.3: Modern UI with concurrent features and advanced hooks
  • TypeScript 5.6: Latest language features with strict type safety
  • RMCP 0.2.1: Rust-based Model Context Protocol with child process transport

Backend Technologies:

  • Rust: High-performance native operations with async/await architecture
  • Tokio: Advanced async runtime for concurrent MCP server management
  • SecurityValidator: Built-in path validation and process sandboxing
  • Warp: HTTP server infrastructure for browser automation capabilities

Frontend Technologies:

  • Tailwind CSS 4.x: Latest utility-first styling with performance optimizations
  • Radix UI: Accessible component primitives for robust UI
  • Dexie: TypeScript-friendly IndexedDB wrapper for local data
  • Zustand: Lightweight, scalable state management solution
πŸ›‘οΈ Security & Performance

Built-in Security:

  • SecurityValidator: Advanced path traversal protection and sandboxed operations
  • MIME Type Validation: Safe file handling across all supported formats
  • Process Isolation: MCP servers run in isolated child processes for maximum security
  • API Key Management: Secure in-app credential storage with encryption
  • Content Sanitization: Automatic cleaning and validation of all user inputs

Performance Optimizations:

  • Streaming Responses: Real-time AI model outputs with minimal latency
  • Concurrent Tool Execution: Parallel MCP server operations for faster results
  • Smart Caching: Intelligent resource caching for improved response times
  • Memory Management: Optimized for long-running sessions and large datasets
  • Ultra-Fast Models: Cerebras integration delivering 1,800+ tokens/second
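Streaming responses from OpenAI-style APIs arrive as server-sent events, one JSON chunk per `data:` line, terminated by `[DONE]`. A minimal parser for that format (an illustrative sketch, not SynapticFlow's implementation) looks like this:

```typescript
// Parse an OpenAI-style SSE body: each "data:" line carries a JSON chunk
// whose choices[0].delta.content is a text fragment; "data: [DONE]" ends
// the stream. Illustrative sketch; a real client reads chunks incrementally.
function extractDeltas(sseBody: string): string[] {
  const deltas: string[] = [];
  for (const line of sseBody.split("\n")) {
    if (!line.startsWith("data: ")) continue;
    const payload = line.slice(6).trim();
    if (payload === "[DONE]") break;
    const delta = JSON.parse(payload).choices?.[0]?.delta?.content;
    if (typeof delta === "string") deltas.push(delta);
  }
  return deltas;
}
```

Rendering each delta as it arrives, rather than waiting for the full completion, is what keeps perceived latency low.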
πŸ“ Project Structure
synaptic-flow/
β”œβ”€β”€ src/                        # React Frontend (Feature-Driven Architecture)
β”‚   β”œβ”€β”€ app/                    # App entry, root layout, global providers
β”‚   β”‚   β”œβ”€β”€ App.tsx             # Main application component
β”‚   β”‚   β”œβ”€β”€ main.tsx            # React entry point
β”‚   β”‚   └── globals.css         # Global styles
β”‚   β”œβ”€β”€ assets/                 # Static assets (images, svgs)
β”‚   β”œβ”€β”€ components/             # Shared UI components (20+ shadcn/ui components)
β”‚   β”‚   β”œβ”€β”€ ui/                 # shadcn/ui component library
β”‚   β”‚   β”œβ”€β”€ layout/             # App layout components
β”‚   β”‚   └── common/             # Reusable common components
β”‚   β”œβ”€β”€ features/               # Feature modules (7 major features)
β”‚   β”‚   β”œβ”€β”€ assistant/          # AI agent management and configuration
β”‚   β”‚   β”œβ”€β”€ chat/               # Real-time chat interface with tool calling
β”‚   β”‚   β”œβ”€β”€ group/              # Multi-agent collaboration system
β”‚   β”‚   β”œβ”€β”€ history/            # Conversation history and search
β”‚   β”‚   β”œβ”€β”€ prompts/            # Prompt management and templates
β”‚   β”‚   β”œβ”€β”€ session/            # Session management with file attachments
β”‚   β”‚   β”œβ”€β”€ settings/           # Configuration and API key management
β”‚   β”‚   └── tools/              # Built-in tool ecosystem and MCP integration
β”‚   β”œβ”€β”€ context/                # React context system (8 specialized contexts)
β”‚   β”‚   β”œβ”€β”€ AssistantContext.tsx   # Agent state management
β”‚   β”‚   β”œβ”€β”€ BuiltInToolContext.tsx # Tool execution context
β”‚   β”‚   β”œβ”€β”€ MCPServerContext.tsx   # MCP server management
β”‚   β”‚   └── ...                   # Additional contexts
β”‚   β”œβ”€β”€ hooks/                  # Custom React hooks
β”‚   β”‚   β”œβ”€β”€ use-rust-backend.ts    # Tauri backend integration
β”‚   β”‚   β”œβ”€β”€ use-mcp-server.ts      # MCP server management
β”‚   β”‚   └── ...                   # Feature-specific hooks
β”‚   β”œβ”€β”€ lib/                    # Service layer and business logic
β”‚   β”‚   β”œβ”€β”€ ai-service.ts          # LLM provider integration
β”‚   β”‚   β”œβ”€β”€ logger.ts              # Centralized logging system
β”‚   β”‚   β”œβ”€β”€ secure-file-manager.ts # Advanced file operations
β”‚   β”‚   β”œβ”€β”€ rust-backend-client.ts # Backend communication layer
β”‚   β”‚   └── ...                   # Additional services
β”‚   β”œβ”€β”€ models/                 # TypeScript type definitions
β”‚   β”‚   β”œβ”€β”€ chat.ts               # Chat and message types
β”‚   β”‚   β”œβ”€β”€ mcp-types.ts          # MCP protocol types (680+ lines)
β”‚   β”‚   └── llm-config.ts         # LLM configuration types
β”‚   └── config/                 # Configuration files
β”‚       └── llm-providers.json    # LLM provider definitions
β”œβ”€β”€ src-tauri/                 # Rust Backend (Advanced Architecture)
β”‚   β”œβ”€β”€ src/
β”‚   β”‚   β”œβ”€β”€ lib.rs                 # Main Tauri application
β”‚   β”‚   β”œβ”€β”€ mcp/                   # MCP server integration modules
β”‚   β”‚   β”œβ”€β”€ security/              # Security validation and sandboxing
β”‚   β”‚   β”œβ”€β”€ tools/                 # Built-in tool implementations
β”‚   β”‚   └── commands/              # Tauri command definitions
β”‚   β”œβ”€β”€ Cargo.toml             # Rust dependencies
β”‚   └── tauri.conf.json        # Tauri 2.x configuration
β”œβ”€β”€ docs/                      # Documentation and guides
β”‚   └── history/               # Refactoring and change history
β”œβ”€β”€ package.json               # Node.js dependencies and scripts
β”œβ”€β”€ tailwind.config.js         # Tailwind CSS 4.x configuration
└── vite.config.ts             # Vite build configuration
🚀 Getting Started

Ready to use SynapticFlow? Here's how to get up and running:

Option 1: Download Release (Recommended)

Visit our Releases page to download the latest version for your operating system.

Option 2: Build from Source
  1. Prerequisites: Ensure you have Rust, Node.js (v18+), and pnpm installed.

  2. Install Dependencies:

    pnpm install
    
  3. Development Commands:

    # Development
    pnpm tauri dev             # Start development server with hot reload
    pnpm dev                   # Frontend-only development mode

    # Code Quality
    pnpm lint                  # ESLint checking with strict rules
    pnpm lint:fix              # Auto-fix lint issues
    pnpm format                # Prettier formatting
    pnpm format:check          # Check formatting compliance

    # Testing & Building
    pnpm test                  # Run comprehensive test suite
    pnpm build                 # Production build optimization
    pnpm tauri build           # Create optimized desktop app bundle

    # Diagnostics
    pnpm diagnose              # System diagnostic for troubleshooting
    
Next Steps
  1. Configure Your First LLM: Open Settings and add your preferred AI provider's API key
  2. Create an Agent: Set up your first AI assistant with specific tools and personality
  3. Connect MCP Tools: Add external MCP servers or use our built-in tools
  4. Start Collaborating: Begin conversations with your AI agents
🔥 Performance Highlights

Speed & Efficiency:

  • ⚡ Ultra-Fast Models: Cerebras delivering 1,800+ tokens/second
  • 💰 Cost Optimization: 60-80% cost reduction with smart model selection
  • 🚀 Concurrent Operations: Parallel tool execution for faster results
  • 🤯 Massive Context: Handle up to 2M tokens in single conversations
📚 Documentation
🤝 Contributing

We welcome contributions! Here's how you can help:

  • πŸ› Report Issues: Found a bug? Open an issue
  • πŸ’‘ Suggest Features: Have ideas? Share them in our discussions
  • πŸ”§ Submit Code: Read our Contributing Guide to get started
  • πŸ“š Improve Docs: Help make our documentation even better
πŸ“„ License

This project is licensed under the MIT License - see the LICENSE file for details.

🌟 Support

If SynapticFlow helps you work more efficiently with AI tools, consider:

  • ⭐ Star this repository to show your support
  • 🗣️ Share with others who might find it useful
  • 🐛 Report issues to help us improve
  • 💬 Join discussions to shape the future of the project

Ready to experience the most advanced AI agent platform? SynapticFlow combines enterprise-grade security, lightning-fast performance, and unlimited LLM freedom in one powerful desktop application! 🚀

Download SynapticFlow | View Source | Join Community