NoLLMChat
Not-Only LLM Chat: an AI application that enhances creativity and user experience beyond plain LLM chat. Note: this is currently a beta version; if you run into a database issue, please clear the site data from your browser's developer tools.
# NoLLM Chat
A revolutionary AI interaction platform that enhances traditional LLM experiences with a versatile, visual interface for exploring AI technologies directly in your browser.
## Table of Contents

- Overview
- Live Demo
- Features Showcase
- Vision
- Project Architecture
- Flow Machine Engine
- Project Structure
- Libraries and Tools
- Getting Started
- Contributing
- License
- Contact
## Overview
NoLLM Chat revolutionizes AI interaction by providing a platform that goes beyond basic chat interactions. It enables users to interact with language models in ways that boost creativity and enrich their experience through:
- Visual Interface: Node-based workflow creation and management
- Browser-Based: Runs locally and free of charge, with optional cloud extensions
- Workflow Automation: Create custom AI workflows tailored to your needs
- Comprehensive Learning: Interactive tools for LLMs, prompt engineering, and vector databases
## Live Demo

Development Progress: `[■■■□□□□□□□] 30%`
## Features Showcase

### Platform Overview

### Interactive Demo

### Built-in Chat Application

### Built-in Document Editor

### Built-in Code Editor with Sandbox
## Vision

### Enhanced AI Interaction
Move beyond traditional LLM chat with a platform offering a more flexible and visual interface. Users can directly edit and guide AI to improve response quality, enabling richer interaction experiences.
### Automated Personal Workflows

The platform empowers users to create custom AI workflows tailored to their needs, enhancing productivity and personalization.
### Comprehensive AI Learning
Utilize node-based tools that facilitate interaction with and learning about AI technologies. The platform supports LLMs, prompt engineering, function calls, and vector databases, allowing users to experiment and see the impact of different AI components.
### Free and Browser-Based
Operates locally and free of charge, with the option to extend capabilities using services like OpenAI. This ensures accessibility and ease of use directly from the browser.
## Project Structure
```
src/
│
├── assets/          # Static assets like images and fonts
├── components/      # Reusable React components
├── constants/       # Constant values and configuration settings
├── contexts/        # React context providers for global state management
├── css/             # Styling files (CSS or preprocessor files)
├── hooks/           # Custom React hooks
├── i18n/            # Internationalization setup and resources
├── lib/             # Utility libraries and third-party integrations
├── pages/           # Page components for different routes
├── services/        # API calls and service functions
├── states/          # State management files (e.g., Zustand)
├── utils/           # Utility functions and helpers
│
├── App.tsx          # Main application component
├── main.tsx         # Entry point of the application
└── routes.tsx       # Route configurations
```
## Project Architecture
The architecture is designed to efficiently handle different tasks by dividing them into separate threads. This ensures smooth operation and responsiveness while managing complex processes in the background.
### Thread Architecture

| Thread | Responsibility | Technologies |
|---|---|---|
| Main Thread | UI application logic and responsive interface | React, ReactFlow, Zustand |
| Database Worker | Data storage and retrieval operations | TypeORM, PGLite |
| LLM Thread | Large language model processes and AI computations | WebLLM, Langchain |
| Embedding Thread | Vector database and embedding model processing | Memory Vector DB, Voy |
```mermaid
graph LR
    A[Main Thread] <--> C[Database Worker Thread]
    C -->|Uses| I((TypeORM))
    I -->|Wraps| D((PGLite))
    A <--> E[LLM Thread]
    E -->|Uses| J((Langchain))
    J -->|Wraps| F((WebLLM))
    A <--> G[(Memory Vector Database)]
    G --> K[Embedding Thread]
    K -->|Uses| L((Embedding Model))
    A -->|Handles| B((UI Application Logic))
```
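The thread split above hinges on asynchronous message passing between the main thread and its workers. The sketch below shows one way to wrap a `postMessage`-style transport in a promise-based client; all names here (`RpcRequest`, `createRpcClient`) are illustrative assumptions, not the project's actual API.

```typescript
// Illustrative promise-based RPC over a postMessage-style transport.
// In the browser, `post` would forward to `worker.postMessage` and
// `subscribe` would hook `worker.onmessage`; both are abstracted here
// so the pattern is testable anywhere.

type RpcRequest = { id: number; method: string; params: unknown };
type RpcResponse = { id: number; result?: unknown; error?: string };

function createRpcClient(
  post: (req: RpcRequest) => void,
  subscribe: (handler: (res: RpcResponse) => void) => void,
) {
  let nextId = 0;
  const pending = new Map<
    number,
    { resolve: (v: unknown) => void; reject: (e: Error) => void }
  >();

  // Route each response back to the caller that issued the matching id.
  subscribe((res) => {
    const entry = pending.get(res.id);
    if (!entry) return;
    pending.delete(res.id);
    if (res.error) entry.reject(new Error(res.error));
    else entry.resolve(res.result);
  });

  return {
    call(method: string, params: unknown): Promise<unknown> {
      const id = nextId++;
      // Register the pending promise before posting, in case the
      // transport replies synchronously (as a test double might).
      return new Promise((resolve, reject) => {
        pending.set(id, { resolve, reject });
        post({ id, method, params });
      });
    },
  };
}
```

With this shape, the database worker side only needs to dispatch on `method` and post back a response carrying the same `id`.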
## Flow Machine Engine
The Flow Machine is the core orchestration engine that powers NoLLM Chat's workflow capabilities. It provides a sophisticated two-phase execution system for managing complex AI workflows with dynamic data sharing and dependency management.
### Key Features

- Two-Phase Execution: Separate prepare and execute phases for optimal performance
- Dynamic Dependency Resolution: Automatic discovery of upstream node dependencies
- Shared Session State: Seamless data sharing between connected nodes
- Modular Handler System: Extensible architecture for custom node types
- Topological Sorting: Ensures proper execution order with cycle detection
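The two-phase contract described above can be sketched in a few lines of TypeScript. `FlowNode`, `NodeHandler`, and `runFlow` are illustrative names and make no claim to match the project's actual interfaces: dependencies are prepared first (writing into the shared session), then only the target node is executed.

```typescript
// Illustrative two-phase flow execution (names are assumptions).
interface FlowNode { id: string; type: string; deps: string[] }

interface ExecutionContext {
  // Shared session state that dependency results are written into.
  session: Map<string, unknown>;
}

interface NodeHandler {
  // Phase 1: runs for every upstream dependency.
  prepare(node: FlowNode, ctx: ExecutionContext): Promise<void>;
  // Phase 2: runs once, for the target node only.
  execute(node: FlowNode, ctx: ExecutionContext): Promise<unknown>;
}

async function runFlow(
  target: FlowNode,
  nodes: Map<string, FlowNode>,
  handlers: Map<string, NodeHandler>,
): Promise<unknown> {
  const ctx: ExecutionContext = { session: new Map() };
  const visited = new Set<string>();

  // Phase 1: depth-first preparation of all transitive dependencies.
  // Marking nodes visited before recursing keeps this loop-safe.
  async function prepareDeps(node: FlowNode): Promise<void> {
    for (const depId of node.deps) {
      if (visited.has(depId)) continue;
      visited.add(depId);
      const dep = nodes.get(depId);
      if (!dep) throw new Error(`unknown node ${depId}`);
      await prepareDeps(dep); // prepare upstream nodes first
      await handlers.get(dep.type)!.prepare(dep, ctx);
    }
  }
  await prepareDeps(target);

  // Phase 2: execute only the target node against the filled session.
  return handlers.get(target.type)!.execute(target, ctx);
}
```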
### Flow Machine Integration Architecture

```mermaid
graph TB
    subgraph "UI Layer (Main Thread)"
        RF[ReactFlow Canvas]
        CHAT[Chat Interface]
        EDITOR[Document Editor]
        VSCODE[Code Editor]
    end
    subgraph "Flow Machine Core"
        FM[FlowMachine]
        FDS[FlowDataService]
        HANDLERS[Node Handlers]
        SESSION[Session State]
    end
    subgraph "Data Layer"
        DB[(Database Worker)]
        VECTOR[(Vector Database)]
        FILES[File System]
    end
    subgraph "AI Processing"
        LLM[LLM Thread]
        EMBED[Embedding Thread]
        TOOLS[Tool Handlers]
    end
    subgraph "Node Types"
        PROMPT[Prompt Nodes]
        LLMNODE[LLM Nodes]
        DATA[Data Nodes]
        SCHEMA[Schema Nodes]
        AGENT[Agent Nodes]
    end

    %% UI to Flow Machine connections
    RF --> FM
    CHAT --> FM
    EDITOR --> FM
    VSCODE --> FM

    %% Flow Machine internal connections
    FM --> FDS
    FM --> HANDLERS
    FM --> SESSION

    %% Data layer connections
    FDS --> DB
    FDS --> VECTOR
    FDS --> FILES

    %% AI processing connections
    HANDLERS --> LLM
    HANDLERS --> EMBED
    HANDLERS --> TOOLS

    %% Node type connections
    HANDLERS --> PROMPT
    HANDLERS --> LLMNODE
    HANDLERS --> DATA
    HANDLERS --> SCHEMA
    HANDLERS --> AGENT

    %% Styling
    classDef uiLayer fill:#e3f2fd
    classDef flowCore fill:#f3e5f5
    classDef dataLayer fill:#e8f5e8
    classDef aiLayer fill:#fff3e0
    classDef nodeTypes fill:#fce4ec

    class RF,CHAT,EDITOR,VSCODE uiLayer
    class FM,FDS,HANDLERS,SESSION flowCore
    class DB,VECTOR,FILES dataLayer
    class LLM,EMBED,TOOLS aiLayer
    class PROMPT,LLMNODE,DATA,SCHEMA,AGENT nodeTypes
```
### Flow Machine Execution Flow

```mermaid
sequenceDiagram
    participant UI as UI Interface
    participant FM as FlowMachine
    participant FDS as FlowDataService
    participant Handler as NodeHandler
    participant AI as AI Thread
    participant DB as Database

    Note over UI,DB: Workflow Execution Pipeline
    UI->>FM: Execute Target Node
    FM->>FDS: Get Connected Nodes
    FDS->>DB: Query Node Dependencies
    DB-->>FDS: Return Graph Data
    FDS-->>FM: FlowGraph Structure

    Note over FM,Handler: Phase 1: Preparation
    loop For Each Dependency
        FM->>Handler: prepare(node, context)
        Handler->>AI: Process AI Task
        AI-->>Handler: Return Result
        Handler-->>FM: Preparation Complete
    end

    Note over FM,Handler: Phase 2: Execution
    FM->>Handler: execute(targetNode, context)
    Handler->>AI: Execute Main Logic
    AI-->>Handler: Final Result
    Handler-->>FM: Execution Complete
    FM-->>UI: Workflow Results
```
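The "Get Connected Nodes" step has to turn the dependency graph into an executable order. A standard way to do that, including the cycle detection mentioned in the feature list, is Kahn's algorithm; the sketch below is illustrative, not the project's code.

```typescript
// Illustrative topological sort with cycle detection (Kahn's algorithm).
// `edges` maps each node id to the ids it depends on.
function topoSort(edges: Map<string, string[]>): string[] {
  const indegree = new Map<string, number>();   // unmet-dependency count
  const dependents = new Map<string, string[]>(); // reverse edges

  // Register every node, including ones that only appear as a dependency.
  for (const [node, deps] of edges) {
    if (!indegree.has(node)) indegree.set(node, 0);
    for (const dep of deps) {
      if (!indegree.has(dep)) indegree.set(dep, 0);
      if (!dependents.has(dep)) dependents.set(dep, []);
      dependents.get(dep)!.push(node);
      indegree.set(node, indegree.get(node)! + 1);
    }
  }

  // Repeatedly emit nodes whose dependencies are all satisfied.
  const queue = [...indegree.entries()]
    .filter(([, d]) => d === 0)
    .map(([n]) => n);
  const order: string[] = [];
  while (queue.length > 0) {
    const node = queue.shift()!;
    order.push(node);
    for (const next of dependents.get(node) ?? []) {
      indegree.set(next, indegree.get(next)! - 1);
      if (indegree.get(next) === 0) queue.push(next);
    }
  }

  // Any node never emitted is stuck behind a cycle.
  if (order.length !== indegree.size) {
    throw new Error("cycle detected in flow graph");
  }
  return order;
}
```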
### Integration Points

| Component | Integration Purpose | Flow Machine Role |
|---|---|---|
| ReactFlow Canvas | Visual workflow creation | Executes user-designed node graphs |
| Chat Interface | Conversational AI flows | Orchestrates message processing pipelines |
| Document Editor | AI-assisted writing | Manages content generation workflows |
| Code Editor | AI code assistance | Handles code analysis and generation flows |
| Vector Database | Semantic search workflows | Coordinates embedding and retrieval operations |
| LLM Thread | Language model processing | Manages prompt-to-response workflows |
### Complete Documentation

Detailed technical documentation about the Flow Machine architecture covers:

- Implementation Details: Core classes and interfaces
- Node Handler Development: Creating custom node types
- Execution Context: Session state management
- Advanced Examples: Complex workflow patterns

Read the Complete Flow Machine Documentation
## Libraries and Tools

### Core Framework

| Technology | Purpose | Description |
|---|---|---|
| Vite | Build Tool | Fast and modern build tool for web projects |
| React | UI Library | Popular JavaScript library for building user interfaces |
| ReactFlow | Node Editor | Library for building node-based applications |
### Data & Storage

| Technology | Purpose | Description |
|---|---|---|
| PGLite | Database | Lightweight PostgreSQL client for browsers |
| TypeORM | ORM | Object-relational mapping with SQLite WASM support |
| Voy | Vector Search | WASM vector similarity search engine in Rust |
| Memory Vector Database | Vector Store | In-memory embeddings with linear search |
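To make the "linear search" in the in-memory vector store concrete, here is a minimal brute-force cosine-similarity search over stored embeddings. This is an illustrative sketch, not the project's implementation; the `EmbeddedDoc` shape is an assumption.

```typescript
// Linear (brute-force) similarity search over in-memory embeddings.
type EmbeddedDoc = { id: string; embedding: number[] };

// Cosine similarity: dot product normalized by both vector lengths.
function cosine(a: number[], b: number[]): number {
  let dot = 0;
  let normA = 0;
  let normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

// Scores every document against the query and returns the top k.
function similaritySearch(
  docs: EmbeddedDoc[],
  query: number[],
  k: number,
): EmbeddedDoc[] {
  return docs
    .map((doc) => ({ doc, score: cosine(doc.embedding, query) }))
    .sort((x, y) => y.score - x.score)
    .slice(0, k)
    .map(({ doc }) => doc);
}
```

Linear scan is O(n) per query, which is fine for small in-browser collections; engines like Voy exist to index larger sets.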
### AI & LLM Integration

| Technology | Purpose | Description |
|---|---|---|
| WebLLM | LLM Runtime | Run large language models in the browser without servers |
| Langchain | AI Framework | Framework for developing LLM-powered applications |
| Langgraph | Graph Models | Graph-based language model framework |
### UI & Styling

| Technology | Purpose | Description |
|---|---|---|
| shadcn UI | UI Components | Modern React component library |
| Tailwind CSS | CSS Framework | Utility-first CSS framework |
| magicui | Components | Additional UI component library |
| kokonut | Components | Specialized component collection |
### Development Tools

| Technology | Purpose | Description |
|---|---|---|
| React Router | Routing | Declarative routing for React applications |
| Zustand | State Management | Small, fast, and scalable state management |
| i18next | Internationalization | Framework for browser internationalization |
| ESLint | Code Linting | Pluggable linter for JavaScript patterns |
| Prettier | Code Formatting | Opinionated code formatter for consistency |
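Zustand's store model is small enough to illustrate with a self-contained sketch of the same pattern. This is not Zustand's implementation or API, just the getState/setState/subscribe shape it popularized:

```typescript
// Minimal store in the getState/setState/subscribe style popularized
// by Zustand (illustrative sketch, not the real library).
type Listener<T> = (state: T) => void;

function createStore<T extends object>(initial: T) {
  let state = initial;
  const listeners = new Set<Listener<T>>();
  return {
    getState: () => state,
    // Merges a partial update into the state and notifies subscribers.
    setState(partial: Partial<T>) {
      state = Object.assign({}, state, partial);
      listeners.forEach((listener) => listener(state));
    },
    subscribe(listener: Listener<T>) {
      listeners.add(listener);
      return () => listeners.delete(listener); // returns an unsubscriber
    },
  };
}
```

In React, a hook layer on top of `subscribe` is what makes components re-render on relevant changes; the store itself is framework-agnostic, which is why it also works from worker message handlers.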
## Getting Started
Get up and running with NoLLM Chat in just a few steps:
### Installation

1. Clone the repository

   ```shell
   git clone git@github.com:zrg-team/NoLLMChat.git
   ```

2. Install dependencies

   ```shell
   cd NoLLMChat
   yarn install
   ```

3. Start the development server

   ```shell
   yarn dev
   ```

4. Open in your browser

   Visit `http://localhost:PORT` to start exploring AI workflows!
### Local LLM Support

NoLLM Chat provides native browser-based language model inference without requiring external APIs:

- WebLLM: High-performance inference using WebGPU/WebAssembly with MLC models
- Wllama: Lightweight WASM-based inference with HuggingFace models
- OpenAI-Compatible API: Unified interface for both providers
- Structured Output: JSON schema support and function calling (WebLLM)
- Privacy-First: All processing happens locally in your browser
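An OpenAI-compatible interface means requests follow the standard chat-completions shape regardless of which local provider serves them. The helper below builds such a payload; the model name in the usage comment is a placeholder, and the `engine.chat.completions.create` call is only indicative of the OpenAI-style surface, not a documented NoLLM Chat API.

```typescript
// Builds a request body in the OpenAI-compatible chat-completions shape.
type ChatMessage = { role: "system" | "user" | "assistant"; content: string };

function buildChatRequest(
  model: string,
  messages: ChatMessage[],
  temperature = 0.7,
) {
  return { model, messages, temperature, stream: false };
}

// Usage against an OpenAI-style engine (illustrative, placeholder model):
// const reply = await engine.chat.completions.create(
//   buildChatRequest("some-local-model", [{ role: "user", content: "Hi" }]),
// );
```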
Read the Complete Local LLM Documentation
### Quick Start Guide

- Explore the Demo: Try the live demo first
- Create Your First Workflow: Use the visual node editor to build AI pipelines
- Connect Data Sources: Import your data using CSV, JSONL, or vector databases
- Deploy Locally: Run everything in your browser without external dependencies
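For the data-import step, a format like JSONL (one JSON value per line) is simple enough to sketch a parser for. This is illustrative only; the project's actual importer may differ.

```typescript
// Parses JSON Lines: one JSON value per non-empty line.
function parseJsonl(text: string): unknown[] {
  return text
    .split(/\r?\n/)
    .map((line) => line.trim())
    .filter((line) => line.length > 0)
    .map((line) => JSON.parse(line));
}
```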
## Contributing

We welcome contributions from the community! Whether it's:

- Bug fixes
- New features
- Documentation improvements
- Ideas and suggestions

Your help is greatly appreciated! Please check our contribution guidelines for more information.
## License
This project is licensed under the MIT License. See the LICENSE file for more details.
## Contact
Got questions, feedback, or suggestions? We'd love to hear from you!
- Email: zerglingno2@outlook.com
- Issues: Open an issue on GitHub
- Discussions: Join our community discussions
Built with ❤️ for the AI community