# Deployment

## Quick Start
See README.md for the 3-tier installation overview. The interactive wizard (`pnpm wizard`) handles all configuration — API keys, interfaces, storage paths, and scheduler. It writes a `.env` file with `0600` permissions and creates the required data directories.
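With the repository already cloned, the quick start is just two commands:

```shell
# Install workspace dependencies, then launch the interactive wizard.
# The wizard writes .env (mode 0600) and creates the data directories.
pnpm install
pnpm wizard
```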
## Prerequisites

- Node.js 20+, pnpm 9+
- Docker and Docker Compose (for Docker deployments)
- `.env` file — generated by `pnpm wizard` or manually from `.env.example`
## Docker Compose

- `redis`: Redis 7 for the BullMQ job queue (required)
- `echos`: the EchOS application (healthcheck: `GET /health`)

Application data is persisted under `./data/`.
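A typical bring-up, assuming the compose file lives at the repository root:

```shell
docker compose up -d          # start redis + echos in the background
docker compose ps             # both services should report a healthy status
docker compose logs -f echos  # follow the application logs
```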
### With nginx + Let’s Encrypt (HTTPS)
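One common route on Ubuntu/Debian is certbot's nginx plugin. The sketch below assumes you already have an nginx server block proxying to the app's port; `echos.example.com` is a placeholder for your own domain:

```shell
sudo apt install -y nginx certbot python3-certbot-nginx
# Obtain a certificate and let certbot add the HTTPS configuration
# to the existing nginx server block for this domain (placeholder).
sudo certbot --nginx -d echos.example.com
```

Certbot also installs a timer that renews the certificate automatically.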
## Manual Deployment
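Without Docker, the shape is the usual pnpm workspace flow. The `build` and `start` script names below are assumptions — check the root `package.json` — and Redis must already be reachable at `REDIS_URL`:

```shell
pnpm install    # install all workspace dependencies
pnpm -r build   # build every workspace package (assumed script name)
pnpm start      # launch the daemon (assumed script name)
```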
## Troubleshooting

### Module Not Found Errors
If you see `Cannot find package '@echos/shared'` (or `'@echos/plugin-youtube'`, or similar):
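A common cause is stale or missing workspace builds. A first-resort fix, using standard pnpm workspace commands:

```shell
pnpm install    # re-link the @echos/* workspace packages
pnpm -r build   # rebuild so each package's dist/ output exists
```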
### LanceDB Native Module Errors

If you see `Cannot find module '@lancedb/lancedb-darwin-x64'` or similar:
- Intel Macs: the project uses LanceDB 0.22.3 (configured in `packages/core/package.json`)
- ARM Macs / Linux: LanceDB should install the correct native binding automatically
- Force reinstall: `pnpm install --force`
### Telegram Bot Conflicts

Error: `Conflict: terminated by other getUpdates request`
This means another instance of your bot is already running. Only one instance can poll Telegram.
Solution:
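Stop the duplicate poller before starting a new one. Which command applies depends on how the other instance was started; the `echos` service name here is an assumption:

```shell
docker compose down          # if the duplicate runs under Docker
sudo systemctl stop echos    # if it runs under systemd (assumed service name)
pgrep -af node               # otherwise, locate the stray process and kill it
```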
## VPS Deployment (Ubuntu/Debian)

### One-liner install

### systemd service (production)
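A minimal unit file sketch. The install path `/opt/echos` and the `pnpm start` entry point are assumptions; adjust both to your setup:

```shell
sudo tee /etc/systemd/system/echos.service >/dev/null <<'EOF'
[Unit]
Description=EchOS daemon
After=network-online.target redis-server.service

[Service]
WorkingDirectory=/opt/echos
EnvironmentFile=/opt/echos/.env
ExecStart=/usr/bin/pnpm start
Restart=on-failure

[Install]
WantedBy=multi-user.target
EOF
sudo systemctl daemon-reload
sudo systemctl enable --now echos
```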
SSH / VPS CLI Access
The daemon runs via systemd or Docker as normal. When you SSH in, run queries directly against the live data without interrupting the daemon:Oracle Cloud Deployment
- Syncs project files via rsync
- Installs dependencies on remote
- Builds the project
- Restarts via docker-compose
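The steps above can be sketched as a small deploy script; the remote host and path are placeholders:

```shell
#!/usr/bin/env bash
set -euo pipefail
REMOTE=ubuntu@my-oracle-vm   # placeholder host
DEST=/opt/echos              # placeholder path

# Sync project files, then install, build, and restart on the remote.
rsync -az --exclude node_modules --exclude .git ./ "$REMOTE:$DEST/"
ssh "$REMOTE" "cd $DEST && pnpm install && pnpm run build && docker-compose up -d --build"
```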
## Important: Dev vs Production

Before starting locally, ensure your production deployment is stopped.

## Backup
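The state worth backing up is the local data directory plus the `.env` file. A minimal, cron-friendly sketch (paths assume the defaults above):

```shell
# Timestamped archive of EchOS state; restore by extracting into the project root.
tar -czf "echos-backup-$(date +%F).tar.gz" ./data .env
```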
## Environment Variables

See `.env.example` for the full list. Required:

- `TELEGRAM_BOT_TOKEN` - get from @BotFather on Telegram
- `ALLOWED_USER_IDS` - comma-separated Telegram user IDs (get yours from @userinfobot)
- At least one LLM key:
  - `ANTHROPIC_API_KEY` - Anthropic pay-as-you-go key from https://console.anthropic.com/
  - `LLM_API_KEY` - API key for any other provider (Groq, Mistral, xAI, etc.)
- `DEFAULT_MODEL` - model spec: a bare ID (`claude-haiku-4-5-20251001`), `provider/model` (`groq/llama-3.3-70b-versatile`), or any ID when using `LLM_BASE_URL`
- `LLM_BASE_URL` - custom OpenAI-compatible endpoint (e.g. DeepInfra); requires `LLM_API_KEY`
- `OPENAI_API_KEY` - required for embeddings and Whisper transcription
- `CACHE_RETENTION` - prompt cache TTL for Anthropic models: `long` (1h, default), `short` (5min), `none` (off); always `none` for custom endpoints
- `REDIS_URL` - Redis connection URL (default: `redis://localhost:6379`)
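Putting the required pieces together, a minimal `.env` might look like this (all values are placeholders):

```shell
TELEGRAM_BOT_TOKEN=123456:ABC...       # from @BotFather
ALLOWED_USER_IDS=11111111,22222222     # from @userinfobot
ANTHROPIC_API_KEY=sk-ant-...           # or LLM_API_KEY + LLM_BASE_URL instead
OPENAI_API_KEY=sk-...                  # embeddings + Whisper transcription
DEFAULT_MODEL=claude-haiku-4-5-20251001
REDIS_URL=redis://localhost:6379
```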