Provides read-only access to PostgreSQL databases for e-commerce analytics, enabling schema inspection, sample data retrieval, and SQL query execution with automatic safety limits and parallel query support.
VTION E-Commerce MCP Server
A Model Context Protocol (MCP) server providing secure, read-only access to VTION e-commerce analytics data. Built with FastAPI and PostgreSQL, supporting both MCP native protocol and REST API.
Features
MCP Protocol Support: Full implementation of Model Context Protocol for AI agent integration
Multiple Transport Modes:
FastMCP (stdio) for direct MCP client integration
HTTP/SSE for web-based clients
REST API for traditional HTTP clients
Secure by Design: Read-only access, query validation, connection pooling
Progressive Context Loading: Efficient data discovery with 4 context levels
Parallel Query Execution: Multiple queries execute concurrently for optimal performance
Auto-limiting: Raw queries limited to 5 rows, aggregated queries to 1,000 rows
Rich Query Tools: Schema inspection, sample data, flexible querying
Architecture
Quick Start
1. Installation
2. Configuration
Required Environment Variables:
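As an illustration, a minimal `.env` might look like the fragment below. `DATASET_1_DICTIONARY` is referenced later in this README; the other variable names are assumptions and may differ from the actual configuration:

```
# Connection string for the Supabase PostgreSQL database (name is an assumption)
DATABASE_URL=postgresql://postgres:url-encoded-password@host:5432/postgres

# Dataset configuration; DATASET_1_DICTIONARY is the name used elsewhere in this README
DATASET_1_ID=ecommerce
DATASET_1_DICTIONARY={"products": "Product catalog", "orders": "Order history"}
```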
3. Run the Server
Option A: FastMCP Mode (for MCP clients)
Option B: HTTP/SSE Mode (for web clients)
Option C: Production Deployment
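Assuming the entry points named elsewhere in this README (`vtion_ecom_mcp.py` and `server.py`), the three modes might be started as follows; the uvicorn invocation for Option C is an assumption, not a documented command:

```
# Option A: FastMCP stdio mode, for direct MCP client integration
python vtion_ecom_mcp.py

# Option B: HTTP/SSE mode, for web-based clients
python server.py

# Option C: production deployment (assumes server.py exposes a FastAPI app named "app")
uvicorn server:app --host 0.0.0.0 --port 8000
```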
Database Configuration
The MCP server connects to your Supabase PostgreSQL database. The connection string is already configured in .env.example:
Important: The password is URL-encoded (Vtion@2023# → Vtion%402023%23)
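Rather than encoding by hand, the same result can be produced with Python's standard library, which avoids mistakes with special characters; the DSN assembly below uses placeholder host and database names:

```python
from urllib.parse import quote_plus

# URL-encode the password so '@' and '#' don't break the connection string
password = quote_plus("Vtion@2023#")
print(password)  # Vtion%402023%23

# Illustrative DSN assembly (host and database names are placeholders)
dsn = f"postgresql://postgres:{password}@db.example.supabase.co:5432/postgres"
```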
Expected Schema
The server works with any PostgreSQL schema. Common e-commerce tables include:
products - Product catalog with inventory
orders - Order history and transactions
customers - Customer profiles and demographics
cart_items - Shopping cart data
user_sessions - User engagement metrics
The server will automatically discover your schema at runtime.
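Runtime discovery typically means reading PostgreSQL's information_schema catalog; a query along these lines (the exact one the server uses is not shown in this README) lists every table and column in the public schema:

```sql
SELECT table_name, column_name, data_type, is_nullable
FROM information_schema.columns
WHERE table_schema = 'public'
ORDER BY table_name, ordinal_position;
```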
Usage
MCP Tools
The server provides 5 MCP tools:
1. get_context(level, dataset_id?)
Progressive context loading:
Level 0: Global rules and guidelines
Level 1: List all datasets
Level 2: Schema for specific dataset (requires dataset_id)
Level 3: Full details with sample data (requires dataset_id)
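The four levels can be sketched as a simple dispatch; the in-memory catalog here is hypothetical and stands in for the server's real dataset configuration:

```python
from typing import Optional

# Hypothetical in-memory catalog standing in for the real dataset config
CATALOG = {
    "ecommerce": {
        "schema": {"products": ["id", "name", "price"]},
        "samples": {"products": [{"id": 1, "name": "Widget", "price": 9.99}]},
    }
}

def get_context(level: int, dataset_id: Optional[str] = None) -> dict:
    """Progressive context loading: each level adds detail."""
    if level == 0:
        return {"rules": "Read-only SELECT queries; raw results capped at 5 rows."}
    if level == 1:
        return {"datasets": list(CATALOG)}
    if dataset_id is None:
        raise ValueError("levels 2 and 3 require dataset_id")
    entry = CATALOG[dataset_id]
    if level == 2:
        return {"schema": entry["schema"]}
    # Level 3: schema plus sample rows
    return {"schema": entry["schema"], "samples": entry["samples"]}
```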
2. list_available_datasets()
List all configured datasets with metadata.
3. get_dataset_schema(dataset_id)
Get complete schema for a dataset (equivalent to get_context(level=2)).
4. query_dataset(dataset_id, query, response_format?)
Execute SQL SELECT queries on a dataset.
Parallel Execution: Call query_dataset() multiple times; the calls execute in parallel automatically.
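The concurrency claim can be illustrated with asyncio.gather; the query coroutine below is a stand-in that simulates I/O, not the server's actual implementation:

```python
import asyncio

async def query_dataset(dataset_id: str, query: str) -> dict:
    """Stand-in for the real tool: pretend each query takes 0.1 s of I/O."""
    await asyncio.sleep(0.1)
    return {"dataset": dataset_id, "query": query, "rows": []}

async def main() -> list:
    # Three calls issued together finish in roughly 0.1 s total, not 0.3 s,
    # because the awaits overlap on the event loop.
    return await asyncio.gather(
        query_dataset("ecommerce", "SELECT COUNT(*) FROM orders"),
        query_dataset("ecommerce", "SELECT COUNT(*) FROM products"),
        query_dataset("ecommerce", "SELECT COUNT(*) FROM customers"),
    )

results = asyncio.run(main())
print(len(results))  # 3
```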
5. get_dataset_sample(dataset_id, table_name, limit?)
Get sample rows from a specific table.
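Internally this amounts to a LIMIT query against a validated table name. A sketch follows; the identifier-validation approach and the clamp range are assumptions, not the server's documented behavior:

```python
import re

def build_sample_query(table_name: str, limit: int = 5) -> str:
    """Build a safe sample query; reject anything but a plain identifier."""
    if not re.fullmatch(r"[A-Za-z_][A-Za-z0-9_]*", table_name):
        raise ValueError(f"invalid table name: {table_name!r}")
    limit = max(1, min(int(limit), 100))  # clamp to a sane range
    return f'SELECT * FROM "{table_name}" LIMIT {limit}'

print(build_sample_query("products"))  # SELECT * FROM "products" LIMIT 5
```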
REST API Endpoints
When running server.py, these HTTP endpoints are available:
Health Check
Response:
List Datasets
Execute Query
MCP Protocol Endpoint
Implements full MCP protocol over HTTP with JSON-RPC 2.0.
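A request against this endpoint follows the standard MCP JSON-RPC 2.0 shape; for example, listing the available tools (the exact endpoint path is not documented here):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/list",
  "params": {}
}
```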
Security
Query Restrictions
Only SELECT allowed: INSERT, UPDATE, DELETE, DROP, etc. are blocked
Automatic limits: Raw queries max 5 rows, aggregated queries max 1,000 rows
Connection pooling: Prevents resource exhaustion
Timeout protection: 60-second query timeout
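The SELECT-only check and the automatic limits can be sketched roughly as follows; the server's real validation lives in query_dataset(), and the aggregation heuristic here is a simplified illustration:

```python
import re

BLOCKED = ("insert", "update", "delete", "drop", "alter", "truncate", "grant")

def validate_and_limit(sql: str) -> str:
    """Allow only SELECT statements and append an automatic LIMIT."""
    stripped = sql.strip().rstrip(";")
    lowered = stripped.lower()
    if not lowered.startswith("select"):
        raise ValueError("only SELECT statements are allowed")
    if any(re.search(rf"\b{kw}\b", lowered) for kw in BLOCKED):
        raise ValueError("statement contains a blocked keyword")
    if re.search(r"\blimit\b", lowered):
        return stripped  # caller set an explicit limit
    # Heuristic: aggregated queries (GROUP BY / aggregate functions) get 1,000
    # rows, raw row scans get 5.
    aggregated = bool(re.search(r"\b(group\s+by|count|sum|avg|min|max)\b", lowered))
    return f"{stripped} LIMIT {1000 if aggregated else 5}"
```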
Authentication
⚠️ Important: This server does not include authentication. For production:
Add authentication middleware (JWT, API keys, OAuth)
Use environment-specific credentials
Enable database row-level security (RLS)
Run behind a reverse proxy (nginx, Cloudflare)
Development
Testing Connection
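One lightweight way to sanity-check connectivity without installing a PostgreSQL driver is to parse the DSN and probe the host and port; these helpers are illustrative, not part of the server:

```python
import socket
from urllib.parse import urlsplit

def parse_dsn(dsn: str) -> dict:
    """Extract host/port/database from a postgresql:// DSN."""
    parts = urlsplit(dsn)
    return {
        "host": parts.hostname,
        "port": parts.port or 5432,  # PostgreSQL default port
        "database": parts.path.lstrip("/"),
    }

def can_reach(dsn: str, timeout: float = 3.0) -> bool:
    """TCP-level reachability check (does not authenticate)."""
    info = parse_dsn(dsn)
    try:
        with socket.create_connection((info["host"], info["port"]), timeout):
            return True
    except OSError:
        return False
```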
Adding Multiple Datasets
Edit .env to add more datasets:
Customizing Business Logic
The server inherits business logic from indian-analytics-mcp:
Query validation: Modify query_dataset() in vtion_ecom_mcp.py
Response formatting: Update the format_markdown_table() helper
Custom tools: Add new tools with the @mcp.tool() decorator
Schema customization: Edit DATASET_1_DICTIONARY in .env
Deployment
Render
Create new Web Service
Connect GitHub repository
Set build command: pip install -r requirements.txt
Set start command: python server.py
Add environment variables from .env
Docker
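A minimal Dockerfile sketch, assuming the requirements.txt and server.py layout described above; credentials are supplied at runtime rather than baked into the image:

```dockerfile
FROM python:3.11-slim
WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY . .
# DATABASE_URL and dataset variables are passed at runtime, not baked in
EXPOSE 8000
CMD ["python", "server.py"]
```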
Railway / Fly.io
Both support automatic deployment from GitHub with environment variables.
Troubleshooting
Connection Issues
No Datasets Found
Check environment variables are set:
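A quick way to spot missing configuration; DATASET_1_DICTIONARY appears elsewhere in this README, while DATABASE_URL is an assumed variable name:

```python
import os

# DATABASE_URL is an assumed name; DATASET_1_DICTIONARY is used in this README
REQUIRED = ("DATABASE_URL", "DATASET_1_DICTIONARY")

def missing_vars(env: dict) -> list:
    """Return the required variables that are absent or empty."""
    return [name for name in REQUIRED if not env.get(name)]

print(missing_vars(dict(os.environ)))
```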
Query Errors
Verify table names with get_dataset_schema()
Check that column names match the schema
Ensure the query is a valid SQL SELECT statement
Import Errors
Credits
Based on indian-analytics-mcp by @adityac7.
License
MIT License - see LICENSE file for details
Support
For issues and questions:
GitHub Issues: /issues
Email: support@vtion.com