sonata-twilio-server
💬 WhatsApp AI assistant for government services: appointments, social security, tax queries - powered by 🎼 Sonata MCP
# Sonata WhatsApp Bot with Twilio
A WhatsApp bot built with Twilio, Express, and an LLM, with tool support and MCP servers.
## Features
- 🤖 OpenAI integration (extensible to other LLMs)
- 🛠️ Extensible tool system with Zod validation
- 🌐 Automatic multilingual support
- 💬 Per-user conversation context management
- 🔌 MCP server integration
- 💾 Persistence with Redis and SQLite
## Architecture
The bot uses a modular agent-based architecture:
- Agent System: Each user has their own agent with independent context
- Chat Completion Service: Abstract interface to support multiple LLMs
- Tool System: Extensible tool system with Zod validation
- Storage System: Persistence with Redis (cache) and SQLite (structured data)
- MCP Integration: Support for Model Context Protocol servers
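The per-user agent pattern can be sketched roughly as follows. Names such as `SimpleAgentManager` are illustrative, not the project's actual API:

```typescript
// Sketch of a per-user agent manager: each user ID maps to its own
// agent, so conversation contexts never mix between users.
// All names here are illustrative, not the project's actual API.

interface ChatMessage {
  role: 'system' | 'user' | 'assistant';
  content: string;
}

class SimpleAgent {
  private history: ChatMessage[] = [];

  // Record the user message and return the context that would be
  // sent to the LLM on the next completion call.
  addUserMessage(text: string): ChatMessage[] {
    this.history.push({ role: 'user', content: text });
    return this.history;
  }
}

class SimpleAgentManager {
  private agents = new Map<string, SimpleAgent>();

  // Lazily create one agent per user and reuse it across messages.
  getAgent(userId: string): SimpleAgent {
    let agent = this.agents.get(userId);
    if (!agent) {
      agent = new SimpleAgent();
      this.agents.set(userId, agent);
    }
    return agent;
  }
}
```

Keying agents by user ID is what keeps one WhatsApp conversation from leaking into another.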
## Structure

```
src/
├── index.ts                                # Main server
├── agent/
│   ├── Agent.ts                            # Agent implementation with loop
│   ├── AgentManager.ts                     # Per-user agent manager
│   ├── PromptManager.ts                    # Agent prompt management
│   └── types.ts                            # Agent system types
├── config/
│   └── constants.ts                        # Configuration and constants
├── services/
│   ├── chat/
│   │   ├── ChatCompletionService.ts        # Abstract base class
│   │   └── OpenAIChatCompletionService.ts  # OpenAI implementation
│   ├── mcp/
│   │   ├── MCPClient.ts                    # Individual MCP client
│   │   ├── MCPManager.ts                   # Multiple MCP clients manager
│   │   └── index.ts                        # Module exports
│   └── storage/
│       ├── StorageService.ts               # Abstract base class
│       ├── RedisStorageService.ts          # Redis implementation
│       ├── SQLiteStorageService.ts         # SQLite implementation
│       ├── AgentStorageService.ts          # Agent-specific service
│       └── index.ts                        # Module exports
├── tools/
│   ├── basicTools.ts                       # Tools with Zod
│   └── schemas.ts                          # Validation schemas
└── types/
    ├── chat.ts                             # Chat completion types
    ├── mcp.ts                              # MCP types
    ├── storage.ts                          # Storage types
    └── tool.ts                             # Tool types
```
## Setup

Install dependencies:

```bash
npm install
```

Configure environment variables by creating a `.env` file with:

```bash
# Server port
PORT=3000

# Twilio Configuration
TWILIO_ACCOUNT_SID=your_account_sid_here
TWILIO_AUTH_TOKEN=your_auth_token_here
TWILIO_PHONE_NUMBER=whatsapp:+14155238886

# LLM Configuration
OPENAI_API_KEY=your_openai_api_key_here

# Storage Configuration
REDIS_URL=redis://localhost:6379
SQLITE_PATH=./data/agent.db

# MCP Servers (JSON array)
MCP_SERVERS=[]
```
Configure Twilio:

- In your Twilio console, set the WhatsApp webhook to:

  `https://your-domain.com/webhook/whatsapp`

- Use ngrok for local development:

  ```bash
  ngrok http 3000
  ```
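Twilio delivers each incoming WhatsApp message as a form-encoded POST to that webhook and expects a TwiML `<Response>` body in return. A minimal, dependency-free reply builder might look like this (`buildTwimlReply` is an illustrative name; the project may use the official `twilio` helper library instead):

```typescript
// Escape the five XML special characters so user text cannot break
// the TwiML document Twilio parses.
function escapeXml(text: string): string {
  return text
    .replace(/&/g, '&amp;')
    .replace(/</g, '&lt;')
    .replace(/>/g, '&gt;')
    .replace(/"/g, '&quot;')
    .replace(/'/g, '&apos;');
}

// Wrap the bot's answer in the TwiML <Response><Message> envelope
// that Twilio expects from a WhatsApp webhook.
function buildTwimlReply(message: string): string {
  return (
    '<?xml version="1.0" encoding="UTF-8"?>' +
    `<Response><Message>${escapeXml(message)}</Message></Response>`
  );
}
```

The Express handler would send this string back with `Content-Type: text/xml`.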
## Development

```bash
# Development mode
npm run dev

# Build
npm run build

# Production
npm start

# Type checking
npm run typecheck
```
## Extending the System

### Create New Tools

```typescript
import { createTool } from './types/tool';
import { z } from 'zod';

const myToolSchema = z.object({
  param1: z.string().describe('Parameter description'),
  param2: z.number().optional(),
});

export const myTool = createTool({
  name: 'my_tool',
  description: 'My custom tool',
  schema: myToolSchema,
  execute: async (params) => {
    // Tool logic
    return { result: 'success' };
  },
});
```
### Add a New LLM Provider

```typescript
import { ChatCompletionService } from './services/chat/ChatCompletionService';
import { ChatCompletionRequest, ChatCompletionResponse } from './types/chat';

export class AnthropicChatCompletionService extends ChatCompletionService {
  protected modelName = 'claude-3-opus-20240229';
  protected serviceName = 'Anthropic';

  async complete(request: ChatCompletionRequest): Promise<ChatCompletionResponse> {
    // Implement the Anthropic-specific logic:
    // 1. Convert the request to Anthropic's format
    // 2. Call the API
    // 3. Convert the response back to the common format
    throw new Error('Not implemented yet');
  }
}
```
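The "convert to Anthropic format" step mostly means hoisting the system prompt, since Anthropic's Messages API takes `system` as a top-level field rather than as a message role. A rough sketch, with types simplified from the project's own:

```typescript
// Simplified stand-ins for the project's chat types.
interface CommonMessage {
  role: 'system' | 'user' | 'assistant';
  content: string;
}

interface AnthropicRequestShape {
  system: string;
  messages: { role: 'user' | 'assistant'; content: string }[];
}

// Anthropic's Messages API takes the system prompt as a top-level
// field, so system-role messages are pulled out of the message list.
function toAnthropicRequest(messages: CommonMessage[]): AnthropicRequestShape {
  const system = messages
    .filter((m) => m.role === 'system')
    .map((m) => m.content)
    .join('\n');
  const rest = messages
    .filter((m) => m.role !== 'system')
    .map((m) => ({ role: m.role as 'user' | 'assistant', content: m.content }));
  return { system, messages: rest };
}
```

The reverse conversion (Anthropic response back to the common `ChatCompletionResponse`) follows the same idea in the other direction.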
### Configure MCP Servers

In `.env`, set `MCP_SERVERS` to a JSON array:

```json
[
  {
    "name": "filesystem",
    "command": "npx",
    "args": ["-y", "@modelcontextprotocol/server-filesystem", "/tmp"]
  },
  {
    "name": "github",
    "command": "npx",
    "args": ["-y", "@modelcontextprotocol/server-github"],
    "env": {
      "GITHUB_TOKEN": "your-token"
    }
  }
]
```
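At startup, that environment string has to be turned into typed configuration before the MCP clients can be launched. A minimal, dependency-free parse with basic shape checks might look like this (the real project may validate with Zod instead; `parseMcpServers` is an illustrative name):

```typescript
// Shape of one entry in the MCP_SERVERS JSON array.
interface MCPServerConfig {
  name: string;
  command: string;
  args?: string[];
  env?: Record<string, string>;
}

// Parse the MCP_SERVERS env var into typed configs, rejecting
// entries that are missing the required fields.
function parseMcpServers(raw: string | undefined): MCPServerConfig[] {
  if (!raw || raw.trim() === '') return []; // unset or empty → no servers
  const parsed: unknown = JSON.parse(raw);
  if (!Array.isArray(parsed)) {
    throw new Error('MCP_SERVERS must be a JSON array');
  }
  return parsed.map((entry, i) => {
    if (
      typeof entry !== 'object' || entry === null ||
      typeof (entry as any).name !== 'string' ||
      typeof (entry as any).command !== 'string'
    ) {
      throw new Error(`MCP_SERVERS[${i}] needs string "name" and "command"`);
    }
    return entry as MCPServerConfig;
  });
}
```

Failing fast here gives a clear error at boot rather than a cryptic one when a server process fails to spawn.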
## Storage System
The system includes two storage options:
### Redis
- Used for cache and temporary data
- Fast read/write operations
- Automatic TTL support
- Ideal for conversation contexts
### SQLite
- Used for persistent and structured data
- Support for complex SQL queries
- Database migrations
- Ideal for analytics and historical data
### Storage Usage

```typescript
import {
  RedisStorageService,
  SQLiteStorageService,
  AgentStorageService,
} from './services/storage';

// Use Redis
const redisStorage = new RedisStorageService();
await redisStorage.connect();

// Use SQLite
const sqliteStorage = new SQLiteStorageService('./data/bot.db');
await sqliteStorage.connect();

// Use AgentStorageService (high-level abstraction)
const agentStorage = new AgentStorageService(redisStorage);
await agentStorage.saveAgentContext(userId, context);
```
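Because both backends sit behind one abstract base class, an in-memory stand-in is useful as a test double. A sketch of what that contract might look like, with TTL semantics mimicking Redis (the method names here are assumptions, not the project's exact interface):

```typescript
// A hypothetical minimal storage contract shared by the Redis and
// SQLite implementations; method names are assumptions.
interface KeyValueStorage {
  connect(): Promise<void>;
  get(key: string): Promise<string | null>;
  set(key: string, value: string, ttlSeconds?: number): Promise<void>;
  delete(key: string): Promise<void>;
}

// In-memory implementation, handy as a test double for either backend.
class InMemoryStorageService implements KeyValueStorage {
  private store = new Map<string, { value: string; expiresAt?: number }>();

  async connect(): Promise<void> {
    // Nothing to connect to in memory.
  }

  async get(key: string): Promise<string | null> {
    const entry = this.store.get(key);
    if (!entry) return null;
    if (entry.expiresAt !== undefined && Date.now() >= entry.expiresAt) {
      this.store.delete(key); // expired, mimicking Redis TTL
      return null;
    }
    return entry.value;
  }

  async set(key: string, value: string, ttlSeconds?: number): Promise<void> {
    const expiresAt =
      ttlSeconds !== undefined ? Date.now() + ttlSeconds * 1000 : undefined;
    this.store.set(key, { value, expiresAt });
  }

  async delete(key: string): Promise<void> {
    this.store.delete(key);
  }
}
```

Anything written against `KeyValueStorage` can then run unchanged against Redis, SQLite, or the in-memory double.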
## Agent Loop Features

The agent implements an intelligent loop that:

1. Receives the user message
2. Queries the LLM to determine what to do
3. Executes any tools the LLM requests
4. Feeds the results back to the LLM
5. Repeats until it gets a final response

The loop also:

- Automatically manages tasks and context
- Prevents infinite loops with configurable limits
- Persists state for continuity between sessions
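The steps above can be sketched as a single loop. Here the `llm` callback and `tools` map stand in for the real chat-completion and tool services, and all names are illustrative:

```typescript
// What the LLM returns on each turn: either a tool request or a
// final answer (types are illustrative stand-ins).
interface LoopStep {
  toolCall?: { name: string; args: unknown };
  finalAnswer?: string;
}

type ToolFn = (args: unknown) => Promise<string>;
type LlmFn = (transcript: string[]) => Promise<LoopStep>;

// Core agent loop: ask the LLM, run any requested tool, feed the
// result back, and stop on a final answer or the iteration cap.
async function runAgentLoop(
  userMessage: string,
  llm: LlmFn,
  tools: Map<string, ToolFn>,
  maxIterations = 5,
): Promise<string> {
  const transcript = [`user: ${userMessage}`];
  for (let i = 0; i < maxIterations; i++) {
    const step = await llm(transcript);
    if (step.finalAnswer !== undefined) return step.finalAnswer;
    if (step.toolCall) {
      const tool = tools.get(step.toolCall.name);
      const result = tool
        ? await tool(step.toolCall.args)
        : `unknown tool: ${step.toolCall.name}`;
      transcript.push(`tool(${step.toolCall.name}): ${result}`);
    }
  }
  // The configurable cap is what prevents infinite tool loops.
  return 'Sorry, I could not finish within the iteration limit.';
}
```

Persisting `transcript` between sessions is what gives the continuity mentioned above.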
## Available Tools

- `get_weather`: Get weather for a city
- `get_current_time`: Get the current time
- `calculate`: Perform mathematical calculations
- Plus tools exposed by configured MCP servers
## Contributing

1. Fork the repository
2. Create your feature branch (`git checkout -b feature/amazing-feature`)
3. Commit your changes (`git commit -m 'Add amazing feature'`)
4. Push to the branch (`git push origin feature/amazing-feature`)
5. Open a Pull Request
## License
This project is licensed under the MIT License.