Multi-MCP Telegram Bot System
AI-powered Telegram bot with task management and mobile automation via multiple MCP (Model Context Protocol) servers.
Overview
This system integrates three specialized MCP servers with a Telegram bot powered by OpenRouter AI, providing a unified conversational interface for:
Task Management - Weeek task tracker integration
Information - Random facts generator
Mobile Automation - Android/iOS device control
Features
🤖 AI-Powered Telegram Bot
Natural language interface via OpenRouter (deepseek-v3.1)
Supports up to 15 chained tool calls per conversation
Per-user conversation history (max 50 messages)
Real-time "Думаю..." ("Thinking...") status indicator
Automatic MCP usage indicator
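The 50-message cap could be enforced with a bounded deque per user; a minimal sketch (the HistoryStore name and structure are illustrative, not the project's actual code):

```python
from collections import defaultdict, deque

MAX_CONVERSATION_HISTORY = 50  # limit stated above

class HistoryStore:
    """Illustrative per-user history: a bounded deque silently drops the
    oldest messages once a user exceeds the cap."""

    def __init__(self, limit: int = MAX_CONVERSATION_HISTORY):
        self._store = defaultdict(lambda: deque(maxlen=limit))

    def add(self, user_id: int, role: str, content: str) -> None:
        self._store[user_id].append({"role": role, "content": content})

    def messages(self, user_id: int) -> list:
        return list(self._store[user_id])
```

A `deque(maxlen=...)` keeps trimming O(1), and `defaultdict` creates the buffer lazily on a user's first message.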
📋 Task Management (Weeek MCP)
Retrieve tasks with states: Backlog, In Progress, Done
Periodic task monitoring (every 30 seconds)
AI-generated summaries every 2 minutes
User subscription management for summaries
📱 Mobile Automation (mobile-mcp)
19 tools for comprehensive mobile device control:
Device & app management
Screen interaction (tap, swipe, type)
Screenshots and UI element listing
Hardware button simulation
Orientation control
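Under the hood, Android gestures of this kind boil down to `adb shell input` commands. A hedged sketch of what a tap or swipe reduces to (function names are hypothetical; the `run` parameter is injectable for testing and defaults to actually invoking adb):

```python
import subprocess

def tap(x: int, y: int, run=subprocess.run):
    """Tap at screen coordinates via `adb shell input tap`
    (assumes adb on PATH and one connected device)."""
    cmd = ["adb", "shell", "input", "tap", str(x), str(y)]
    return run(cmd, check=True)

def swipe(x1: int, y1: int, x2: int, y2: int,
          duration_ms: int = 300, run=subprocess.run):
    """Swipe between two points; duration_ms controls gesture speed."""
    cmd = ["adb", "shell", "input", "swipe",
           *map(str, (x1, y1, x2, y2, duration_ms))]
    return run(cmd, check=True)
```

mobile-mcp wraps operations like these (plus screenshots, UI dumps, and hardware buttons) behind MCP tools so the AI never shells out directly.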
🎲 Random Facts (facts-mcp)
On-demand interesting facts via the /fact command
📚 Document Embeddings & RAG (Ollama + FAISS + Reranking)
Generate vector embeddings from markdown files
Uses local Ollama with nomic-embed-text model (768 dimensions)
Paragraph-based chunking for optimal embedding quality
RAG (Retrieval Augmented Generation) - 4-stage pipeline for context-aware responses:
Query embedding generation (Ollama)
FAISS vector search (top-10) + similarity filtering (≥0.71)
Cross-encoder reranking (BGE reranker model)
Query augmentation with top-3 reranked chunks
Per-user RAG mode toggle with persistent state
Comprehensive logging for all pipeline stages
Automatic fallback handling for robustness
JSON output with timestamps for easy integration
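The retrieval and augmentation stages can be sketched in plain Python (function names and vector shapes are illustrative; the thresholds are the ones documented above; the real pipeline uses a FAISS index, whose inner product on normalized vectors equals the cosine similarity computed here):

```python
import math

SIM_THRESHOLD = 0.71   # similarity floor used by the pipeline
TOP_K_RETRIEVE = 10    # FAISS candidate count
TOP_K_AUGMENT = 3      # chunks kept after reranking

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.hypot(*a) * math.hypot(*b))

def retrieve(query_vec, doc_vecs, chunks):
    # Stage 2: score every chunk, keep the top-10, drop anything below 0.71.
    scored = sorted(
        ((cosine(query_vec, v), c) for v, c in zip(doc_vecs, chunks)),
        reverse=True,
    )[:TOP_K_RETRIEVE]
    return [(c, s) for s, c in scored if s >= SIM_THRESHOLD]

def augment(query, ranked):
    # Stage 4: prepend the top-3 chunks (after Stage-3 reranking) as context.
    context = "\n\n".join(c for c, _ in ranked[:TOP_K_AUGMENT])
    return f"Context:\n{context}\n\nQuestion: {query}"
```

Stage 1 (Ollama embedding) and Stage 3 (BGE cross-encoder reranking) are omitted here since both are external model calls.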
Architecture
Prerequisites
Python 3.14+
Node.js v22+ (for mobile-mcp)
Android Platform Tools (adb) for mobile automation
Telegram Bot Token (from @BotFather)
OpenRouter API Key (from openrouter.ai)
Weeek API Token (configured in mcp_tasks/weeek_api.py)
Ollama with nomic-embed-text (optional, for /docs_embed and RAG features)
sentence-transformers (optional, for RAG reranking; installs with PyTorch)
Installation
1. Clone and Navigate
2. Install Node.js v22+ (if needed)
3. Verify Android Platform Tools
4. Setup Python Environment
5. Configure Environment Variables
6. Install Ollama (Optional - for /docs_embed)
Running the System
macOS OpenMP Workaround (Required)
Set this environment variable to avoid OpenMP library conflicts:
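Besides exporting `KMP_DUPLICATE_LIB_OK=TRUE` in the shell, the same workaround can be applied at the top of the entry script; a sketch (the key point is that it must run before faiss or torch is imported):

```python
import os

# Must run before faiss or torch is imported: the duplicate-libomp check
# fires at import time on macOS when both ship their own OpenMP runtime.
os.environ["KMP_DUPLICATE_LIB_OK"] = "TRUE"
```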
Start All MCP Servers + Bot
First run:
May take 30-40 seconds while mobile-mcp downloads and initializes
BGE reranker model (~280MB) downloads automatically on first RAG query
Test Individual MCP Servers (Optional)
Usage
Telegram Commands
/start - Welcome message
/tasks [query] - Query Weeek tasks
/fact - Get a random fact
/rag [true|false|on|off] - Toggle RAG mode for context-aware responses
/docs_embed - Generate embeddings and FAISS index from docs/ markdown files
/subscribe - Enable periodic task summaries
/unsubscribe - Disable summaries
Natural Language Examples
System Configuration
client/config.py
MAX_CONVERSATION_HISTORY - Message limit per user (default: 50)
TOOL_CALL_TIMEOUT - MCP tool timeout (default: 30s)
TASK_FETCH_INTERVAL - Task monitoring interval (default: 30s)
SUMMARY_INTERVAL - Summary delivery interval (default: 120s)
MCP_SERVERS - List of MCP server configurations
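In client/config.py these settings would look roughly like this (the values are the documented defaults; the exact MCP_SERVERS schema is project-specific, so only a placeholder is shown):

```python
# client/config.py -- defaults as documented above
MAX_CONVERSATION_HISTORY = 50   # messages kept per user
TOOL_CALL_TIMEOUT = 30          # seconds per MCP tool call
TASK_FETCH_INTERVAL = 30        # seconds between Weeek task polls
SUMMARY_INTERVAL = 120          # seconds between AI-generated summaries

# One entry per MCP server (launch command + args); schema is
# project-specific, so this is left as a placeholder.
MCP_SERVERS = []
```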
client/bot.py
max_iterations- Max chained tool calls (default: 15)
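A chained tool-call loop of this shape can be sketched as follows (a minimal illustration, not the bot's actual code; `model_step` and `execute_tool` are hypothetical callables standing in for the OpenRouter call and the MCP tool dispatch):

```python
MAX_ITERATIONS = 15  # default documented above

def run_tool_loop(model_step, execute_tool, user_message,
                  max_iterations=MAX_ITERATIONS):
    """Feed tool results back to the model until it produces a final
    answer or the iteration budget is exhausted."""
    messages = [{"role": "user", "content": user_message}]
    for _ in range(max_iterations):
        reply = model_step(messages)            # assumed: dict with either
        if "tool_call" not in reply:            # "content" or "tool_call"
            return reply["content"]             # final answer, stop chaining
        result = execute_tool(reply["tool_call"])
        messages.append({"role": "tool", "content": result})
    return "Tool-call limit reached."
```

The cap prevents a looping model from calling tools indefinitely; v2.0 raised it from 5 to 15 to allow longer mobile-automation chains.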
Project Statistics
Languages: Python 3.14, Node.js v22+
MCP Servers: 3 (Weeek Tasks, Random Facts, Mobile MCP)
Total Tools: 21
1 task management (get_tasks)
1 information (get_fact)
19 mobile automation (mobile_*)
API Integrations: Telegram, OpenRouter, Weeek
Transport: stdio (MCP), HTTPS (APIs)
Project Structure
Troubleshooting
Bot Conflict Error
Solution: Only one bot instance can run at a time. Kill all instances:
Mobile MCP Connection Errors
Verify Node.js v22+: node --version
Verify adb: adb --version
Check device connected: adb devices
First run takes 30-40 seconds (it downloads the package)
MCP Server Errors
Check logs for initialization errors
Verify all prerequisites installed
Ensure environment variables configured correctly
Technology Stack
MCP Servers
MCP SDK (Python) - Model Context Protocol implementation
httpx - Async HTTP client for Weeek API
@mobilenext/mobile-mcp - Node.js mobile automation
Client
python-telegram-bot - Telegram bot framework
MCP SDK - MCP client implementation
httpx - OpenRouter API client
AsyncExitStack - Multi-context manager for MCP connections
FAISS - Vector similarity search for RAG
NumPy - Vector operations and normalization
sentence-transformers - Cross-encoder reranking (BGE model)
PyTorch - Deep learning backend for reranking
External Tools
Node.js v22+ - Runtime for mobile-mcp
npx - Package runner
Android Platform Tools (adb) - Device communication
Recent Updates
v2.3 - RAG Enhancement: Reranking & Filtering Pipeline
✅ Added 4-stage RAG pipeline for improved retrieval accuracy
✅ Stage 1: Query embedding generation with Ollama (768 dims)
✅ Stage 2: FAISS retrieval (top-10) + cosine similarity filtering (≥0.71)
✅ Stage 3: Cross-encoder reranking with BGE reranker model
✅ Stage 4: Query augmentation with top-3 reranked chunks
✅ Integrated sentence-transformers library for reranking
✅ Added reranker.py module with lazy model initialization
✅ Comprehensive logging for all 4 pipeline stages with data printing
✅ Configurable thresholds and top-k values in config.py
✅ Automatic fallback handling (reranking → FAISS → standard query)
✅ Added OpenMP workaround for macOS (KMP_DUPLICATE_LIB_OK)
✅ Updated documentation with detailed pipeline flow diagrams
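The lazy initialization mentioned for reranker.py keeps the ~280 MB BGE model off the startup path; a minimal sketch with the loader injected so the heavy sentence-transformers import itself is deferred (names are illustrative, not the module's actual code):

```python
def lazy_loader(load_model):
    """Return a zero-argument getter that loads the model on first call only.
    In reranker.py, load_model would construct the BGE cross-encoder via
    sentence-transformers; deferring it keeps bot startup fast."""
    cache = []

    def get():
        if not cache:          # first RAG query pays the load cost
            cache.append(load_model())
        return cache[0]        # every later call reuses the instance

    return get
```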
v2.2 - RAG (Retrieval Augmented Generation) System
✅ Added /rag command for per-user RAG mode toggle
✅ FAISS vector search integration (IndexFlatIP with cosine similarity)
✅ Automatic context retrieval from document embeddings (top-3 chunks)
✅ RAG-specific system prompt for context-aware AI responses
✅ Per-user RAG state persistence across bot restarts
✅ Comprehensive logging for embeddings, chunks, and augmented queries
✅ Graceful fallback to standard mode on errors
✅ Critical bug fix: Augmented queries now correctly sent to AI model
✅ Enhanced /docs_embed to create FAISS index alongside JSON export
v2.1 - Document Embeddings Feature
✅ Added /docs_embed command for generating vector embeddings
✅ Integrated Ollama with nomic-embed-text model (768 dimensions)
✅ Paragraph-based chunking for optimal embedding quality
✅ JSON output with timestamps for easy integration
✅ Comprehensive error handling and logging
✅ Updated welcome message and documentation
v2.0 - Mobile Automation Integration
✅ Added mobile-mcp server (19 Android/iOS automation tools)
✅ Refactored MCP manager with AsyncExitStack for stable multi-server connections
✅ Increased tool call iteration limit from 5 to 15
✅ Fixed environment variable inheritance for Node.js processes
✅ Updated documentation with mobile automation examples
Credits
Weeek API - Task management data
Anthropic - MCP protocol specification
OpenRouter - AI model access
python-telegram-bot - Telegram integration
mobile-mcp - Mobile automation capabilities
License
This project demonstrates MCP integration with AI-powered conversational interfaces.