chat

An open-source SDK for agentic workflows based on MCPs. Integrated LLM cost management and one-click deploy to Cloudflare.

decocms.com

decocms is an open-source foundation for building AI-native software.
We equip developers, engineers, and AI enthusiasts with robust tools to rapidly
prototype, develop, and deploy AI-powered applications.

Official docs: https://docs.deco.page

[!TIP]
If you have questions or want to learn more, join our Discord
community: https://decocms.com/discord

Who is it for?
  • Vibecoders prototyping ideas
  • Agentic engineers deploying scalable, secure, and sustainable production
    systems
Why deco CMS?

Our goal is simple: empower teams with Generative AI by giving builders the
tools to create AI applications that scale beyond the initial demo to
thousands of users, securely and cost-effectively.

Core capabilities
  • Open-source Runtime – Easily compose tools, workflows, and views within a
    single codebase
  • MCP Mesh (Model Context Protocol) – Securely integrate models, data
    sources, and APIs, with observability and cost control
  • Unified TypeScript Stack – Combine backend logic and custom React/Tailwind
    frontends seamlessly using typed RPC
  • Global, Modular Infrastructure – Built on Cloudflare for low-latency,
    infinitely scalable deployments. Self-host with your Cloudflare API Key
  • Visual Workspace – Build agents, connect tools, manage permissions, and
    orchestrate everything you build in code
Creating a new Deco project

A Deco project extends a standard Cloudflare Worker with our building blocks and
defaults for MCP servers.
It runs a type-safe API out of the box and can also serve views — front-end apps
deployed alongside the server.

Currently, views can be any Vite app that outputs a static build. Soon, they’ll
support components declared as tools, callable by app logic or LLMs.
Views can call server-side tools via typed RPC.
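The typed-RPC idea can be sketched in plain TypeScript. This is not the Deco client API (the real client is generated for you and shipped via @deco/workers-runtime/client); it only illustrates how a shared type map lets a view call server tools with end-to-end type checking:

```typescript
// Illustrative sketch only: a shared type map gives views compile-time
// checking of tool inputs and outputs. In a real project these types are
// generated by `deco gen` rather than written by hand.
interface Tools {
  MY_TOOL: { input: { query: string }; output: { answer: string } };
}

// Stand-in for the server side of the RPC boundary.
const handlers: {
  [K in keyof Tools]: (input: Tools[K]["input"]) => Tools[K]["output"];
} = {
  MY_TOOL: ({ query }) => ({ answer: `echo: ${query}` }),
};

// Typed call: passing the wrong input shape is a compile-time error.
function callTool<K extends keyof Tools>(
  id: K,
  input: Tools[K]["input"],
): Tools[K]["output"] {
  return handlers[id](input);
}

const res = callTool("MY_TOOL", { query: "hello" });
console.log(res.answer); // → "echo: hello"
```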

Requirements
  • A JavaScript runtime and package manager: Node.js (with npm) or Bun
Quick Start
  1. Create your project
npm create deco

or

bun create deco

This will prompt you to log in or to create an account on decocms.com.

  2. Enter the project directory and start the dev server
cd <my-project-directory>
npm install
npm run dev               # → http://localhost:8787 (hot reload)

Need pre-built MCP integrations? Explore
deco-cx/apps.

Project Layout
my-project/
├── server/         # MCP tools & workflows (Cloudflare Workers)
│   ├── main.ts
│   ├── deco.gen.ts  # Typed bindings (auto-generated)
│   └── wrangler.toml
├── view/           # React + Tailwind UI (optional)
│   └── src/
├── package.json    # Root workspace scripts
└── README.md

Skip view/ if you don’t need a frontend.
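The root package.json wires up the workspace scripts used throughout this README. The exact contents vary by template; a minimal sketch (field values here are an assumption, not the template's literal output) looks like:

```json
{
  "name": "my-project",
  "private": true,
  "workspaces": ["server", "view"],
  "scripts": {
    "dev": "deco dev",
    "gen": "deco gen",
    "gen:self": "deco gen:self",
    "deploy": "deco deploy"
  }
}
```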

CLI Essentials
Command        Purpose
deco dev       Run server & UI with hot reload
deco deploy    Deploy to Cloudflare Workers
deco gen       Generate types for external integrations
deco gen:self  Generate types for your own tools

For full command list: deco --help or see the
CLI README

Building Blocks

A Deco project is built using tools and workflows — the core primitives
for connecting integrations, APIs, models, and business logic.

Tools

Atomic functions that call external APIs, databases, or AI models. All templates
include the necessary imports from the Deco Workers runtime.

import { createTool, Env, z } from "deco/mod.ts";

// A tool is an atomic, typed function; input and output are zod schemas.
const createMyTool = (env: Env) =>
  createTool({
    id: "MY_TOOL",
    description: "Describe what it does",
    inputSchema: z.object({ query: z.string() }),
    outputSchema: z.object({ answer: z.string() }),
    execute: async ({ context }) => {
      // `context` is the validated input; `env` exposes typed integrations.
      const res = await env.OPENAI.CHAT_COMPLETIONS({
        model: "gpt-4o",
        messages: [{ role: "user", content: context.query }],
      });
      return { answer: res.choices[0].message.content ?? "" };
    },
  });

Tools can be used independently or within workflows. Golden rule: one tool
call per step — keep logic in the workflow.


Workflows

Orchestrate tools using Mastra operators like .then, .parallel,
.branch, and .dountil.

Tip: Add Mastra docs to your AI code
assistant for autocomplete and examples.

import { createStepFromTool, createWorkflow, z } from "deco/mod.ts";

const createHelloWorkflow = (env: Env) =>
  createWorkflow({
    id: "HELLO_WORLD",
    inputSchema: z.object({ name: z.string() }),
    outputSchema: z.object({ greeting: z.string() }),
  })
    .map(({ inputData }) => ({ query: inputData.name }))
    .then(createStepFromTool(createMyTool(env)))
    .map(({ inputData }) => ({ greeting: `Hello, ${inputData.answer}!` }))
    .commit();
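Conceptually, each operator feeds the previous step's output into the next, and .commit() seals the pipeline. A minimal plain-TypeScript sketch of that chaining pattern (this is not the Mastra API, just the idea behind it):

```typescript
// Illustrative chaining sketch: each operator wraps the previous function,
// and .commit() returns the composed pipeline.
type Step<I, O> = (input: I) => O;

class Chain<I, O> {
  constructor(private run: Step<I, O>) {}
  then<N>(step: Step<O, N>): Chain<I, N> {
    return new Chain((input: I) => step(this.run(input)));
  }
  map<N>(fn: (args: { inputData: O }) => N): Chain<I, N> {
    return new Chain((input: I) => fn({ inputData: this.run(input) }));
  }
  commit(): Step<I, O> {
    return this.run;
  }
}

const hello = new Chain((input: { name: string }) => input)
  .then(({ name }) => ({ answer: name }))
  .map(({ inputData }) => ({ greeting: `Hello, ${inputData.answer}!` }))
  .commit();

console.log(hello({ name: "world" }).greeting); // → "Hello, world!"
```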

Views

Build React + Tailwind frontends served by the same Cloudflare Worker.

  • Routing with TanStack Router
  • Typed RPC via @deco/workers-runtime/client
  • Preconfigured with shadcn/ui and lucide-react

Development Flow
  1. Add an integration via the decocms.com dashboard
    (improved UX coming soon)

  2. Run npm run gen → updates deco.gen.ts with typed clients

  3. Write tools in server/main.ts

  4. Compose workflows using .map, .branch, .parallel, etc.

  5. (Optional) Run npm run gen:self → typed RPC clients for your tools

  6. Build views in /view and call workflows via the typed client

  7. Run locally

    npm run dev   # → http://localhost:8787
    
  8. Deploy to Cloudflare

    npm run deploy
    

How to Contribute (WIP)

We welcome contributions! Check out CONTRIBUTING.md for
guidelines and tips.


Made with ❤️ by the Deco community — helping teams build AI-native systems that
scale.

Author Information
deco.cx

Design, develop and deliver complete experiences.

Brazil
