NoLLMChat

Not-Only LLM Chat. An AI application that enhances creativity and user experience beyond just LLM chat. Note: this is a beta version; if you run into database issues, please clear the site data via your browser's developer tools.

🚀 NoLLM Chat

A revolutionary AI interaction platform that enhances traditional LLM experiences with a versatile, visual interface for exploring AI technologies directly in your browser.



🎯 Overview

NoLLM Chat revolutionizes AI interaction by providing a platform that goes beyond basic chat interactions. It enables users to interact with language models in ways that boost creativity and enrich their experience through:

  • 🎨 Visual Interface: Node-based workflow creation and management
  • 🔧 Browser-Based: Runs locally and free of charge with optional cloud extensions
  • 🔄 Workflow Automation: Create custom AI workflows tailored to your needs
  • 📚 Comprehensive Learning: Interactive tools for LLMs, prompt engineering, and vector databases
✨ Live Demo


🌐 Try Live Demo 🌐

Development Progress: [■■■□□□□□□□] 30%


🎥 Features Showcase
🖼️ Platform Overview

Intro Image

🎬 Interactive Demo

Demo

💬 Built-in Chat Application

Demo Chat Application

📝 Built-in Document Editor

Demo Document Editor

👨‍💻 Built-in Code Editor with Sandbox

VSLite Application


🎯 Vision
🚀 Enhanced AI Interaction

Move beyond traditional LLM chat with a platform offering a more flexible and visual interface. Users can directly edit and guide AI to improve response quality, enabling richer interaction experiences.

⚡ Automated Personal Workflows

Empowers users to create custom AI workflows tailored to their needs, enhancing productivity and personalization.

🧠 Comprehensive AI Learning

Utilize node-based tools that facilitate interaction with and learning about AI technologies. The platform supports LLMs, prompt engineering, function calls, and vector databases, allowing users to experiment and see the impact of different AI components.

🆓 Free and Browser-Based

Operates locally and free of charge, with the option to extend capabilities using services like OpenAI. This ensures accessibility and ease of use directly from the browser.


📁 Project Structure
src/
│
├── assets/         # Static assets like images and fonts
├── components/     # Reusable React components
├── constants/      # Constant values and configuration settings
├── contexts/       # React context providers for global state management
├── css/            # Styling files (CSS or preprocessor files)
├── hooks/          # Custom React hooks
├── i18n/           # Internationalization setup and resources
├── lib/            # Utility libraries and third-party integrations
├── pages/          # Page components for different routes
├── services/       # API calls and service functions
├── states/         # State management files (e.g., Zustand)
├── utils/          # Utility functions and helpers
│
├── App.tsx         # Main application component
├── main.tsx        # Entry point of the application
└── routes.tsx      # Route configurations
🏗️ Project Architecture

The architecture is designed to efficiently handle different tasks by dividing them into separate threads. This ensures smooth operation and responsiveness while managing complex processes in the background.

🧵 Thread Architecture
| Thread | Responsibility | Technologies |
|---|---|---|
| 🎨 Main Thread | UI application logic and responsive interface | React, ReactFlow, Zustand |
| 🗃️ Database Worker | Data storage and retrieval operations | TypeORM, PGLite |
| 🤖 LLM Thread | Large language model processes and AI computations | WebLLM, Langchain |
| 🔍 Embedding Thread | Vector database and embedding model processing | Memory Vector DB, Voy |
graph LR
    A[Main Thread] <--> C[Database Worker Thread]
    C -->|uses| I((TypeORM))
    I -->|Wraps| D((PGLite))
    A <--> E[LLM Thread]
    E -->|Uses| J((Langchain))
    J -->|Wraps| F((WebLLM))
    A <--> G[(Memory Vector Database)]
    G --> K[Embedding Thread]
    K -->|Uses| L((Embedding Model))
    
    A -->|Handle| B((UI Application Logic))
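The database worker communicates with the main thread via message passing. The sketch below shows what such a request/response protocol could look like in TypeScript; all type and function names here are illustrative stand-ins, not the project's actual code, and a plain function substitutes for the real Worker and PGLite instance.

```typescript
// Illustrative message protocol for a database worker (names are hypothetical).
type DbRequest =
  | { id: number; op: 'query'; sql: string; params?: unknown[] }
  | { id: number; op: 'close' }

type DbResponse =
  | { id: number; ok: true; rows: unknown[] }
  | { id: number; ok: false; error: string }

// A tiny in-memory stand-in for the worker side, so the routing
// logic can be exercised without a real Worker or PGLite instance.
function handleRequest(req: DbRequest, store: Map<string, unknown[]>): DbResponse {
  switch (req.op) {
    case 'query': {
      const rows = store.get(req.sql) ?? []
      return { id: req.id, ok: true, rows }
    }
    case 'close':
      return { id: req.id, ok: true, rows: [] }
  }
}

const store = new Map<string, unknown[]>([['SELECT 1', [{ column: 1 }]]])
const res = handleRequest({ id: 1, op: 'query', sql: 'SELECT 1' }, store)
console.log(res.ok) // true
```

In the real architecture the `id` field lets the main thread correlate each asynchronous worker response with its pending request.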

🔄 Flow Machine Engine

The Flow Machine is the core orchestration engine that powers NoLLM Chat's workflow capabilities. It provides a sophisticated two-phase execution system for managing complex AI workflows with dynamic data sharing and dependency management.

🎯 Key Features
  • ⚡ Two-Phase Execution: Separate prepare and execute phases for optimal performance
  • 🔗 Dynamic Dependency Resolution: Automatic discovery of upstream node dependencies
  • 📊 Shared Session State: Seamless data sharing between connected nodes
  • 🏗️ Modular Handler System: Extensible architecture for custom node types
  • 🔄 Topological Sorting: Ensures proper execution order with cycle detection
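The topological sorting with cycle detection mentioned above can be sketched with Kahn's algorithm; the string-based node and edge shapes here are simplified stand-ins, not the engine's actual graph types.

```typescript
// Kahn's algorithm: returns nodes in dependency order, or throws on a cycle.
function topologicalSort(nodes: string[], edges: Array<[string, string]>): string[] {
  const inDegree = new Map<string, number>(nodes.map((n) => [n, 0]))
  const adjacency = new Map<string, string[]>(nodes.map((n) => [n, []]))
  for (const [from, to] of edges) {
    adjacency.get(from)!.push(to)
    inDegree.set(to, (inDegree.get(to) ?? 0) + 1)
  }
  // Start from nodes with no upstream dependencies.
  const queue = nodes.filter((n) => inDegree.get(n) === 0)
  const order: string[] = []
  while (queue.length > 0) {
    const node = queue.shift()!
    order.push(node)
    for (const next of adjacency.get(node) ?? []) {
      const remaining = inDegree.get(next)! - 1
      inDegree.set(next, remaining)
      if (remaining === 0) queue.push(next)
    }
  }
  // Any node left unvisited is part of a cycle.
  if (order.length !== nodes.length) throw new Error('Cycle detected in flow graph')
  return order
}

console.log(topologicalSort(['prompt', 'llm', 'output'], [['prompt', 'llm'], ['llm', 'output']]))
// ['prompt', 'llm', 'output']
```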
🏛️ Flow Machine Integration Architecture
graph TB
    subgraph "🎨 UI Layer (Main Thread)"
        RF[ReactFlow Canvas]
        CHAT[Chat Interface]
        EDITOR[Document Editor]
        VSCODE[Code Editor]
    end
    
    subgraph "🔄 Flow Machine Core"
        FM[FlowMachine]
        FDS[FlowDataService]
        HANDLERS[Node Handlers]
        SESSION[Session State]
    end
    
    subgraph "🗃️ Data Layer"
        DB[(Database Worker)]
        VECTOR[(Vector Database)]
        FILES[File System]
    end
    
    subgraph "🤖 AI Processing"
        LLM[LLM Thread]
        EMBED[Embedding Thread]
        TOOLS[Tool Handlers]
    end
    
    subgraph "🔧 Node Types"
        PROMPT[Prompt Nodes]
        LLMNODE[LLM Nodes]
        DATA[Data Nodes]
        SCHEMA[Schema Nodes]
        AGENT[Agent Nodes]
    end
    
    %% UI to Flow Machine connections
    RF --> FM
    CHAT --> FM
    EDITOR --> FM
    VSCODE --> FM
    
    %% Flow Machine internal connections
    FM --> FDS
    FM --> HANDLERS
    FM --> SESSION
    
    %% Data layer connections
    FDS --> DB
    FDS --> VECTOR
    FDS --> FILES
    
    %% AI processing connections
    HANDLERS --> LLM
    HANDLERS --> EMBED
    HANDLERS --> TOOLS
    
    %% Node type connections
    HANDLERS --> PROMPT
    HANDLERS --> LLMNODE
    HANDLERS --> DATA
    HANDLERS --> SCHEMA
    HANDLERS --> AGENT
    
    %% Styling
    classDef uiLayer fill:#e3f2fd
    classDef flowCore fill:#f3e5f5
    classDef dataLayer fill:#e8f5e8
    classDef aiLayer fill:#fff3e0
    classDef nodeTypes fill:#fce4ec
    
    class RF,CHAT,EDITOR,VSCODE uiLayer
    class FM,FDS,HANDLERS,SESSION flowCore
    class DB,VECTOR,FILES dataLayer
    class LLM,EMBED,TOOLS aiLayer
    class PROMPT,LLMNODE,DATA,SCHEMA,AGENT nodeTypes
📖 Flow Machine Execution Flow
sequenceDiagram
    participant UI as 🎨 UI Interface
    participant FM as 🔄 FlowMachine
    participant FDS as 📊 FlowDataService
    participant Handler as 🔧 NodeHandler
    participant AI as 🤖 AI Thread
    participant DB as 🗃️ Database
    
    Note over UI,DB: Workflow Execution Pipeline
    
    UI->>FM: Execute Target Node
    FM->>FDS: Get Connected Nodes
    FDS->>DB: Query Node Dependencies
    DB-->>FDS: Return Graph Data
    FDS-->>FM: FlowGraph Structure
    
    Note over FM,Handler: Phase 1: Preparation
    loop For Each Dependency
        FM->>Handler: prepare(node, context)
        Handler->>AI: Process AI Task
        AI-->>Handler: Return Result
        Handler-->>FM: Preparation Complete
    end
    
    Note over FM,Handler: Phase 2: Execution
    FM->>Handler: execute(targetNode, context)
    Handler->>AI: Execute Main Logic
    AI-->>Handler: Final Result
    Handler-->>FM: Execution Complete
    FM-->>UI: Workflow Results
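The prepare/execute pipeline in the diagram above can be approximated in a few lines of TypeScript. The handler interface and session shape below are illustrative simplifications, not the actual FlowMachine API: dependencies write into a shared session during phase 1, and the target node consumes that state in phase 2.

```typescript
// Simplified two-phase execution (illustrative sketch only).
interface NodeHandler {
  prepare?(nodeId: string, session: Map<string, unknown>): Promise<void>
  execute(nodeId: string, session: Map<string, unknown>): Promise<unknown>
}

async function runFlow(
  dependencies: string[], // upstream nodes, already topologically sorted
  target: string,
  handlers: Record<string, NodeHandler>,
): Promise<unknown> {
  const session = new Map<string, unknown>()
  // Phase 1: preparation — each dependency contributes to the shared session.
  for (const nodeId of dependencies) {
    await handlers[nodeId]?.prepare?.(nodeId, session)
  }
  // Phase 2: execution — the target node consumes the prepared state.
  return handlers[target].execute(target, session)
}

// Tiny demo with stub handlers standing in for prompt/LLM nodes.
const handlers: Record<string, NodeHandler> = {
  prompt: {
    prepare: async (_id, session) => { session.set('prompt', 'Hello') },
    execute: async () => null,
  },
  llm: {
    execute: async (_id, session) => `echo: ${session.get('prompt')}`,
  },
}

runFlow(['prompt'], 'llm', handlers).then((result) => console.log(result)) // "echo: Hello"
```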
🔗 Integration Points
| Component | Integration Purpose | Flow Machine Role |
|---|---|---|
| ReactFlow Canvas | Visual workflow creation | Executes user-designed node graphs |
| Chat Interface | Conversational AI flows | Orchestrates message processing pipelines |
| Document Editor | AI-assisted writing | Manages content generation workflows |
| Code Editor | AI code assistance | Handles code analysis and generation flows |
| Vector Database | Semantic search workflows | Coordinates embedding and retrieval operations |
| LLM Thread | Language model processing | Manages prompt-to-response workflows |
📚 Complete Documentation

For detailed technical documentation about the Flow Machine architecture, including:

  • Implementation Details: Core classes and interfaces
  • Node Handler Development: Creating custom node types
  • Execution Context: Session state management
  • Advanced Examples: Complex workflow patterns

👉 Read the Complete Flow Machine Documentation
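Custom node types typically plug into an engine like this through a handler registry. The sketch below is a hypothetical illustration of that pattern, not the project's real registration API; the class and method names are invented for the example.

```typescript
// Hypothetical registry mapping node type names to handler functions.
type Handler = (input: Record<string, unknown>) => Promise<unknown>

class HandlerRegistry {
  private handlers = new Map<string, Handler>()

  register(nodeType: string, handler: Handler): void {
    if (this.handlers.has(nodeType)) {
      throw new Error(`Handler already registered for "${nodeType}"`)
    }
    this.handlers.set(nodeType, handler)
  }

  resolve(nodeType: string): Handler {
    const handler = this.handlers.get(nodeType)
    if (!handler) throw new Error(`No handler for node type "${nodeType}"`)
    return handler
  }
}

const registry = new HandlerRegistry()
registry.register('uppercase', async (input) => String(input.text).toUpperCase())
registry.resolve('uppercase')({ text: 'flow' }).then((result) => console.log(result)) // "FLOW"
```

Rejecting duplicate registrations and throwing on unknown types surfaces wiring mistakes early instead of failing silently mid-workflow.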


🛠️ Libraries and Tools
🏗️ Core Framework
| Technology | Purpose | Description |
|---|---|---|
| Vite | Build Tool | Fast and modern build tool for web projects |
| React | UI Library | Popular JavaScript library for building user interfaces |
| ReactFlow | Node Editor | Library for building node-based applications |
🗄️ Data & Storage
| Technology | Purpose | Description |
|---|---|---|
| PGLite | Database | Lightweight WASM build of PostgreSQL that runs in the browser |
| TypeORM | ORM | Object-relational mapping with SQLite WASM support |
| Voy | Vector Search | WASM vector similarity search engine written in Rust |
| Memory Vector Database | Vector Store | In-memory embeddings with linear search |
🤖 AI & LLM Integration
| Technology | Purpose | Description |
|---|---|---|
| WebLLM | LLM Runtime | Run large language models in the browser without servers |
| Langchain | AI Framework | Framework for developing LLM-powered applications |
| Langgraph | Agent Orchestration | Graph-based framework for building stateful agent workflows |
🎨 UI & Styling
| Technology | Purpose | Description |
|---|---|---|
| shadcn UI | UI Components | Modern React component library |
| Tailwind CSS | CSS Framework | Utility-first CSS framework |
| magicui | Components | Additional UI component library |
| kokonut | Components | Specialized component collection |
⚙️ Development Tools
| Technology | Purpose | Description |
|---|---|---|
| React Router | Routing | Declarative routing for React applications |
| Zustand | State Management | Small, fast, and scalable state management |
| i18next | Internationalization | Framework for browser internationalization |
| ESLint | Code Linting | Pluggable linter for JavaScript and TypeScript |
| Prettier | Code Formatting | Opinionated code formatter for consistency |

🚀 Getting Started

Get up and running with NoLLM Chat in just a few steps:

📦 Installation
  1. Clone the Repository

    git clone git@github.com:zrg-team/NoLLMChat.git
    
  2. Install Dependencies

    cd NoLLMChat
    yarn install
    
  3. Start Development Server

    yarn dev
    
  4. Open in Browser
    Visit the URL printed in the terminal (typically http://localhost:5173 for Vite) to start exploring AI workflows!

🤖 Local LLM Support

NoLLM Chat provides native browser-based language model inference without requiring external APIs:

  • 🌐 WebLLM: High-performance inference using WebGPU/WebAssembly with MLC models
  • ⚡ Wllama: Lightweight WASM-based inference with HuggingFace models
  • 🔗 OpenAI-Compatible API: Unified interface for both providers
  • 🎯 Structured Output: JSON schema support and function calling (WebLLM)
  • 💻 Privacy-First: All processing happens locally in your browser
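The OpenAI-compatible unified interface mentioned above can be thought of as a small provider abstraction. The sketch below is illustrative only: it uses a stub backend rather than the real WebLLM or Wllama engines, and all type and class names are invented for the example.

```typescript
// Minimal OpenAI-style chat interface with swappable backends (sketch).
interface ChatMessage {
  role: 'system' | 'user' | 'assistant'
  content: string
}

interface ChatProvider {
  complete(messages: ChatMessage[]): Promise<ChatMessage>
}

// Stub provider standing in for a local engine such as WebLLM or Wllama.
class EchoProvider implements ChatProvider {
  async complete(messages: ChatMessage[]): Promise<ChatMessage> {
    const last = messages[messages.length - 1]
    return { role: 'assistant', content: `You said: ${last.content}` }
  }
}

// Application code depends only on the interface, so either engine
// (or a cloud service like OpenAI) can be swapped in behind it.
async function chat(provider: ChatProvider, text: string): Promise<string> {
  const reply = await provider.complete([{ role: 'user', content: text }])
  return reply.content
}

chat(new EchoProvider(), 'hello').then((reply) => console.log(reply)) // "You said: hello"
```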

👉 Complete Local LLM Documentation

🎯 Quick Start Guide
  1. Explore the Demo: Try the live demo first
  2. Create Your First Workflow: Use the visual node editor to build AI pipelines
  3. Connect Data Sources: Import your data using CSV, JSONL, or vector databases
  4. Deploy Locally: Run everything in your browser without external dependencies

🤝 Contributing

We welcome contributions from the community! Whether it's:

  • 🐛 Bug fixes
  • ✨ New features
  • 📖 Documentation improvements
  • 💡 Ideas and suggestions

Your help is greatly appreciated! Please check our contribution guidelines for more information.


📄 License

This project is licensed under the MIT License. See the LICENSE file for more details.


📞 Contact

Got questions, feedback, or suggestions? We'd love to hear from you!


Built with ❤️ for the AI community
