sonata-twilio-server

💬 WhatsApp AI assistant for government services: appointments, social security, tax queries - powered by 🎼 Sonata MCP

Sonata WhatsApp Bot with Twilio

A WhatsApp bot built with Twilio and Express, powered by an LLM with tool support and MCP server integration.

Features
  • 🤖 OpenAI integration (extensible to other LLMs)
  • 🛠️ Extensible tool system with Zod validation
  • 🌐 Automatic multilingual support
  • 💬 Per-user conversation context management
  • 🔌 MCP server integration
  • 💾 Persistence with Redis and SQLite
Architecture

The bot uses a modular agent-based architecture:

  • Agent System: Each user has their own agent with independent context
  • Chat Completion Service: Abstract interface to support multiple LLMs
  • Tool System: Extensible tool system with Zod validation
  • Storage System: Persistence with Redis (cache) and SQLite (structured data)
  • MCP Integration: Support for Model Context Protocol servers
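The per-user agent pattern can be sketched as follows. This is an illustrative sketch only; the class and method names (`getOrCreate`, the minimal `Agent` shape) are assumptions, not the actual `AgentManager.ts` API:

```typescript
// Minimal sketch: one agent instance per WhatsApp user, created lazily on
// first contact and reused for every subsequent message from that user.
class Agent {
  readonly history: string[] = [];
  constructor(readonly userId: string) {}
}

class AgentManager {
  private agents = new Map<string, Agent>();

  // Return the existing agent for this user, or create one on first contact.
  getOrCreate(userId: string): Agent {
    let agent = this.agents.get(userId);
    if (!agent) {
      agent = new Agent(userId);
      this.agents.set(userId, agent);
    }
    return agent;
  }
}
```

Keying agents by user ID is what gives each conversation an independent context, since WhatsApp identifies users by phone number.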
Structure
src/
├── index.ts                    # Main server
├── agent/
│   ├── Agent.ts               # Agent implementation with loop
│   ├── AgentManager.ts        # Per-user agent manager
│   ├── PromptManager.ts       # Agent prompt management
│   └── types.ts               # Agent system types
├── config/
│   └── constants.ts           # Configuration and constants
├── services/
│   ├── chat/
│   │   ├── ChatCompletionService.ts       # Abstract base class
│   │   └── OpenAIChatCompletionService.ts # OpenAI implementation
│   ├── mcp/
│   │   ├── MCPClient.ts       # Individual MCP client
│   │   ├── MCPManager.ts      # Multiple MCP clients manager
│   │   └── index.ts           # Module exports
│   └── storage/
│       ├── StorageService.ts       # Abstract base class
│       ├── RedisStorageService.ts  # Redis implementation
│       ├── SQLiteStorageService.ts # SQLite implementation
│       ├── AgentStorageService.ts  # Agent-specific service
│       └── index.ts               # Module exports
├── tools/
│   ├── basicTools.ts          # Tools with Zod
│   └── schemas.ts             # Validation schemas
└── types/
    ├── chat.ts                # Chat completion types
    ├── mcp.ts                 # MCP types
    ├── storage.ts             # Storage types
    └── tool.ts                # Tool types
Setup
  1. Install dependencies:

    npm install
    
  2. Configure environment variables:
    Create a .env file with:

    # Server port
    PORT=3000
    
    # Twilio Configuration
    TWILIO_ACCOUNT_SID=your_account_sid_here
    TWILIO_AUTH_TOKEN=your_auth_token_here
    TWILIO_PHONE_NUMBER=whatsapp:+14155238886
    
    # LLM Configuration
    OPENAI_API_KEY=your_openai_api_key_here
    
    # Storage Configuration
    REDIS_URL=redis://localhost:6379
    SQLITE_PATH=./data/agent.db
    
    # MCP Servers (JSON array)
    MCP_SERVERS=[]
    
  3. Configure Twilio:

    • In your Twilio console, set the WhatsApp webhook to:
      https://your-domain.com/webhook/whatsapp
    • Use ngrok for local development: ngrok http 3000
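Twilio delivers each inbound WhatsApp message as a form-encoded HTTP POST (`From`, `Body` fields) to that webhook and expects a TwiML document in reply. A minimal sketch of building that reply, shown inline instead of using Twilio's `MessagingResponse` helper so the wire format is visible (the function names are assumptions, not part of this repo):

```typescript
// Escape the characters that are unsafe inside TwiML/XML text content.
function escapeXml(text: string): string {
  return text
    .replace(/&/g, '&amp;')
    .replace(/</g, '&lt;')
    .replace(/>/g, '&gt;');
}

// Build a TwiML <Response> that sends one WhatsApp message back to the sender.
function buildTwimlReply(body: string): string {
  return `<?xml version="1.0" encoding="UTF-8"?><Response><Message>${escapeXml(body)}</Message></Response>`;
}
```

The server returns this string with `Content-Type: text/xml`; Twilio then delivers the `<Message>` body to the user over WhatsApp.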
Development
# Development mode
npm run dev

# Build
npm run build

# Production
npm start

# Type checking
npm run typecheck
Extending the System
Create New Tools
import { createTool } from './types/tool';
import { z } from 'zod';

const myToolSchema = z.object({
  param1: z.string().describe('Parameter description'),
  param2: z.number().optional(),
});

export const myTool = createTool({
  name: 'my_tool',
  description: 'My custom tool',
  schema: myToolSchema,
  execute: async (params) => {
    // Tool logic
    return { result: 'success' };
  },
});
Add a New LLM Provider
import { ChatCompletionService } from './services/chat/ChatCompletionService';
import { ChatCompletionRequest, ChatCompletionResponse } from './types/chat';

export class AnthropicChatCompletionService extends ChatCompletionService {
  protected modelName = 'claude-3-opus-20240229';
  protected serviceName = 'Anthropic';
  
  async complete(request: ChatCompletionRequest): Promise<ChatCompletionResponse> {
    // Implement Anthropic-specific logic:
    // 1. Convert the request to Anthropic's format
    // 2. Call the API
    // 3. Convert the response back to the common format
    throw new Error('Not implemented');
  }
}
Configure MCP Servers
// Set MCP_SERVERS in .env to a JSON array like this (collapsed onto a single line in the actual .env file):
[
  {
    "name": "filesystem",
    "command": "npx",
    "args": ["-y", "@modelcontextprotocol/server-filesystem", "/tmp"]
  },
  {
    "name": "github",
    "command": "npx",
    "args": ["-y", "@modelcontextprotocol/server-github"],
    "env": {
      "GITHUB_TOKEN": "your-token"
    }
  }
]
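At startup the server has to turn that environment string into typed configs. A hedged sketch of that parsing step, assuming a config shape that mirrors the example above (the actual validation in this repo may differ):

```typescript
// Shape of one MCP server entry, mirroring the example configuration.
interface MCPServerConfig {
  name: string;
  command: string;
  args?: string[];
  env?: Record<string, string>;
}

// Parse the MCP_SERVERS JSON array, rejecting malformed entries so a typo
// fails fast at startup instead of surfacing later as a missing tool.
function parseMcpServers(raw: string | undefined): MCPServerConfig[] {
  if (!raw || raw.trim() === '') return [];
  const parsed = JSON.parse(raw);
  if (!Array.isArray(parsed)) {
    throw new Error('MCP_SERVERS must be a JSON array');
  }
  for (const entry of parsed) {
    if (typeof entry.name !== 'string' || typeof entry.command !== 'string') {
      throw new Error('Each MCP server needs a "name" and a "command"');
    }
  }
  return parsed as MCPServerConfig[];
}
```

An unset or empty variable simply yields no MCP servers, matching the `MCP_SERVERS=[]` default in the setup section.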
Storage System

The system includes two storage options:

Redis
  • Used for cache and temporary data
  • Fast read/write operations
  • Automatic TTL support
  • Ideal for conversation contexts
SQLite
  • Used for persistent and structured data
  • Support for complex SQL queries
  • Database migrations
  • Ideal for analytics and historical data
Storage Usage
// Use Redis
const redisStorage = new RedisStorageService();
await redisStorage.connect();

// Use SQLite
const sqliteStorage = new SQLiteStorageService('./data/bot.db');
await sqliteStorage.connect();

// Use AgentStorageService (high-level abstraction)
const agentStorage = new AgentStorageService(redisStorage);
await agentStorage.saveAgentContext(userId, context);
Agent Loop Features

The agent runs a tool-use loop:

  1. Receives the user message
  2. Queries the LLM to decide what to do next
  3. Executes any tools the LLM requests
  4. Feeds the results back to the LLM
  5. Repeats until the LLM produces a final response

Beyond the loop itself, the agent manages tasks and context automatically, enforces a configurable iteration limit to prevent infinite loops, and persists state so conversations continue across sessions.
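The loop can be sketched as follows. The LLM call and tool registry are stubbed out here; the real `Agent.ts` differs in its types and bookkeeping:

```typescript
// One LLM turn either requests a tool call or returns the final answer.
type Turn =
  | { kind: 'tool'; name: string; args: unknown }
  | { kind: 'final'; text: string };

async function runAgentLoop(
  userMessage: string,
  callLlm: (transcript: string[]) => Promise<Turn>,
  tools: Map<string, (args: unknown) => Promise<string>>,
  maxIterations = 10, // configurable limit that prevents infinite loops
): Promise<string> {
  const transcript = [`user: ${userMessage}`];
  for (let i = 0; i < maxIterations; i++) {
    const turn = await callLlm(transcript);           // ask the LLM what to do
    if (turn.kind === 'final') return turn.text;      // done: final response
    const tool = tools.get(turn.name);                // execute requested tool
    const result = tool ? await tool(turn.args) : `unknown tool: ${turn.name}`;
    transcript.push(`tool ${turn.name}: ${result}`);  // feed the result back
  }
  throw new Error('Agent loop exceeded maxIterations');
}
```

Reporting an unknown tool back to the LLM (rather than throwing) lets the model recover by choosing a different tool or answering directly.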
Available Tools
  • get_weather: Get weather for a city
  • get_current_time: Get current time
  • calculate: Perform mathematical calculations
  • Plus tools exposed by configured MCP servers
Contributing
  1. Fork the repository
  2. Create your feature branch (git checkout -b feature/amazing-feature)
  3. Commit your changes (git commit -m 'Add amazing feature')
  4. Push to the branch (git push origin feature/amazing-feature)
  5. Open a Pull Request
License

This project is licensed under the MIT License.

