Insight_Platform

Insight Platform is a modular, agent-driven system for automated AI insight generation. It combines LangGraph agents, structured protocols (MCP, A2A, ACP), and local LLMs via Ollama, with a Streamlit UI and file-based storage.

Insight Platform — AI Signal Connector Prototype
Agent System, Protocol Design, and Modular LLM Architecture
Modular Agentic System for Curated AI Insights Using LangGraph + Ollama

What This Is

A modular Python project combining:

  • LangGraph Agent Flow:
    Scraper → Summarizer → Tagger → Connector → Publisher
  • Ollama for Local Summarization:
    Using /api/generate endpoint with streamed response handling
  • Streamlit UI for End-User Display:
    Browse markdown-formatted insights with tags and source URLs
  • Markdown + JSON File Storage:
    Insights saved as .md + .meta.json pairs
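The Ollama `/api/generate` endpoint streams newline-delimited JSON chunks, each carrying a `response` fragment and a `done` flag. A minimal sketch of the streamed-response handling (the prompt wording is illustrative, not the repo's actual summarizer prompt):

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # default Ollama endpoint

def parse_chunk(line: bytes) -> str:
    """Extract the text fragment from one streamed NDJSON chunk."""
    return json.loads(line).get("response", "")

def summarize(text: str, model: str = "llama3") -> str:
    """Stream a summary from Ollama, concatenating chunks until done."""
    payload = json.dumps({
        "model": model,
        "prompt": f"Summarize the following article:\n\n{text}",
        "stream": True,  # default: one JSON object per line
    }).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL, data=payload,
        headers={"Content-Type": "application/json"},
    )
    parts = []
    with urllib.request.urlopen(req) as resp:
        for line in resp:
            chunk = json.loads(line)
            parts.append(chunk.get("response", ""))
            if chunk.get("done"):  # final chunk carries timing stats, no text
                break
    return "".join(parts)
```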

Project Structure
backend/
├── app.py (FastAPI routes)
├── core/orchestrator.py (LangGraph flow)
├── agents/ (scraper_agent, summarizer_agent, etc.)
├── protocols/ (mcp, a2a, acp structures)
├── utils/
streamlit_ui/
├── app.py (Streamlit frontend)
docker-compose.yml
README.md

How to Set Up Locally
Clone the Repo
git clone https://github.com/yourusername/insight-platform.git
cd insight-platform
Pull Required Ollama Model

Start only the Ollama service first:

docker compose up ollama

Then pull the model:

curl -X POST http://localhost:11434/api/pull \
  -H "Content-Type: application/json" \
  -d '{"name": "llama3"}'
Build and Run Backend + UI
docker compose up --build
  • Backend available at: http://localhost:8000
  • Streamlit UI at: http://localhost:8501

Workflow Summary
  1. Send a URL to /run_pipeline:
    • The pipeline scrapes → summarizes → tags → connects → publishes.
  2. Visit Streamlit UI or call /list_insights API to view insights.
  3. Insights include:
    • Title, tags, content preview, original source URL.
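The five-stage flow above can be sketched as a plain chain of node functions passing a shared state dict forward (a simplified stand-in for the LangGraph graph in core/orchestrator.py; the stage bodies here are placeholders):

```python
def scraper(state: dict) -> dict:
    # Placeholder: the real agent fetches and cleans the page at state["url"]
    state["text"] = f"content of {state['url']}"
    return state

def summarizer(state: dict) -> dict:
    # Placeholder: the real agent streams a summary from Ollama
    state["summary"] = state["text"][:80]
    return state

def tagger(state: dict) -> dict:
    state["tags"] = ["ai"]  # placeholder tag extraction
    return state

def connector(state: dict) -> dict:
    state["related"] = []  # placeholder: link to existing insights
    return state

def publisher(state: dict) -> dict:
    state["published"] = True  # placeholder: write the .md + .meta.json pair
    return state

def run_pipeline(url: str) -> dict:
    """Run the Scraper → Summarizer → Tagger → Connector → Publisher chain."""
    state = {"url": url}
    for node in (scraper, summarizer, tagger, connector, publisher):
        state = node(state)
    return state
```

In the actual project these nodes are wired as a LangGraph graph rather than a simple loop, which allows conditional edges and per-node state typing.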

Current Scope + Limitations
  • No CI/CD or Kubernetes deployment yet.
  • Models must be pulled manually the first time.
  • Local file storage only; no database integration yet.

Future Enhancements
  • CI/CD pipelines (GitHub Actions, Docker Registry)
  • Full Kubernetes Helm Charts
  • OAuth2 login on Streamlit UI
  • Ollama model pre-pull automation

Contribution Guidelines
  • Fork → Clone → Submit PR
  • Focus on modular, readable, community-friendly code.
Author
Sai Sandeep Kantareddy

Data Scientist. Graduate student at ASU (2023); former undergraduate at Dr. MGR University (CS, 2017). Based in Austin.