VTION E-Commerce MCP Server
A Model Context Protocol (MCP) server providing secure, read-only access to VTION e-commerce analytics data. Built with FastAPI and PostgreSQL, supporting both MCP native protocol and REST API.
Features
MCP Protocol Support: Full implementation of Model Context Protocol for AI agent integration
Multiple Transport Modes:
FastMCP (stdio) for direct MCP client integration
HTTP/SSE for web-based clients
REST API for traditional HTTP clients
Secure by Design: Read-only access, query validation, connection pooling
Progressive Context Loading: Efficient data discovery with 4 context levels
Parallel Query Execution: Multiple queries execute concurrently for optimal performance
Auto-limiting: Raw queries limited to 5 rows, aggregated queries to 1,000 rows
Rich Query Tools: Schema inspection, sample data, flexible querying
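The parallel-execution feature above can be illustrated with a small sketch: each query is an awaitable, and `asyncio.gather` runs them concurrently. This is a hypothetical illustration, not the server's actual code; the real server would await an asyncpg pool rather than the sleep used here to simulate latency.

```python
import asyncio

async def run_query(sql):
    # Stand-in for a database round-trip (the real server would use asyncpg).
    await asyncio.sleep(0.05)
    return {"query": sql, "rows": []}

async def run_parallel(queries):
    # All queries are dispatched at once and awaited together.
    return await asyncio.gather(*(run_query(q) for q in queries))

results = asyncio.run(run_parallel([
    "SELECT category, COUNT(*) FROM products GROUP BY category",
    "SELECT status, COUNT(*) FROM orders GROUP BY status",
]))
```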
Architecture
```
VTION-ECOM/
├── vtion_ecom_mcp.py    # Main MCP server with FastMCP
├── server.py            # Standalone HTTP/SSE server
├── requirements.txt     # Python dependencies
├── .env.example         # Configuration template
├── .gitignore           # Git ignore rules
└── README.md            # This file
```

Quick Start
1. Installation
```bash
# Clone the repository
git clone <your-repo-url>
cd VTION-ECOM

# Install dependencies
pip install -r requirements.txt
```

2. Configuration
```bash
# Copy environment template
cp .env.example .env

# Edit .env with your database credentials
nano .env
```

Required Environment Variables:
```env
DATASET_1_NAME=vtion_ecom
DATASET_1_DESC=VTION E-commerce platform analytics data
DATASET_1_CONNECTION=postgresql://postgres:PASSWORD@host:port/db?sslmode=require
DATASET_1_DICTIONARY={"table1":"desc","table2":"desc"}
```

3. Run the Server
Option A: FastMCP Mode (for MCP clients)

```bash
python vtion_ecom_mcp.py
```

Option B: HTTP/SSE Mode (for web clients)

```bash
python server.py
# Server runs on http://localhost:10000
```

Option C: Production Deployment

```bash
uvicorn server:app --host 0.0.0.0 --port 10000 --workers 4
```

Database Configuration
The MCP server connects to your Supabase PostgreSQL database. The connection string is already configured in .env.example:
```
postgresql://postgres:Vtion%402023%23@db.yjiotntmzaukbmgxeqvq.supabase.co:5432/postgres?sslmode=require
```

Important: The password is URL-encoded (Vtion@2023# → Vtion%402023%23).
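Special characters in a password must be percent-encoded before being placed in a connection URL. Python's standard library handles this:

```python
from urllib.parse import quote

# Percent-encode a database password for use inside a connection URL.
# safe="" ensures every reserved character (including '@' and '#') is encoded.
password = quote("Vtion@2023#", safe="")
print(password)  # Vtion%402023%23
```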
Expected Schema
The server works with any PostgreSQL schema. Common e-commerce tables include:
products - Product catalog with inventory
orders - Order history and transactions
customers - Customer profiles and demographics
cart_items - Shopping cart data
user_sessions - User engagement metrics
The server will automatically discover your schema at runtime.
Usage
MCP Tools
The server provides 5 MCP tools:
1. get_context(level, dataset_id?)
Progressive context loading:
Level 0: Global rules and guidelines
Level 1: List all datasets
Level 2: Schema for specific dataset (requires dataset_id)
Level 3: Full details with sample data (requires dataset_id)
```python
# Get global rules
get_context(level=0)

# List all datasets
get_context(level=1)

# Get schema for dataset 1
get_context(level=2, dataset_id=1)

# Get full details with samples
get_context(level=3, dataset_id=1)
```

2. list_available_datasets()
List all configured datasets with metadata.
```python
list_available_datasets()
```

3. get_dataset_schema(dataset_id)
Get complete schema for a dataset (equivalent to get_context(level=2)).
```python
get_dataset_schema(dataset_id=1)
```

4. query_dataset(dataset_id, query, response_format?)
Execute SQL SELECT queries on a dataset.
```python
# Simple query
query_dataset(
    dataset_id=1,
    query="SELECT * FROM products WHERE category = 'Electronics' LIMIT 10"
)

# Aggregated query
query_dataset(
    dataset_id=1,
    query="SELECT category, COUNT(*) as count, AVG(price) as avg_price FROM products GROUP BY category"
)

# JSON response format
query_dataset(
    dataset_id=1,
    query="SELECT * FROM orders WHERE status = 'completed'",
    response_format="json"
)
```

Parallel Execution: Call query_dataset() multiple times; the calls execute in parallel automatically.
```python
# These three queries execute concurrently:
query_dataset(1, "SELECT category, COUNT(*) FROM products GROUP BY category")
query_dataset(1, "SELECT status, COUNT(*) FROM orders GROUP BY status")
query_dataset(1, "SELECT gender, COUNT(*) FROM customers GROUP BY gender")
```

5. get_dataset_sample(dataset_id, table_name, limit?)
Get sample rows from a specific table.
```python
get_dataset_sample(
    dataset_id=1,
    table_name="products",
    limit=20
)
```

REST API Endpoints
When running server.py, these HTTP endpoints are available:
Health Check
```bash
curl http://localhost:10000/
# or
curl http://localhost:10000/health
```

Response:
```json
{
  "status": "ok",
  "service": "VTION E-Commerce MCP Server",
  "datasets": 1,
  "version": "1.0",
  "mcp_endpoint": "/mcp",
  "mcp_protocol_version": "2025-06-18"
}
```

List Datasets

```bash
curl http://localhost:10000/datasets
```

Execute Query
```bash
curl -X POST http://localhost:10000/query \
  -H "Content-Type: application/json" \
  -d '{
    "dataset_id": 1,
    "query": "SELECT * FROM products LIMIT 5"
  }'
```

MCP Protocol Endpoint

POST /mcp

Implements the full MCP protocol over HTTP with JSON-RPC 2.0.
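For illustration, a tools/call request to /mcp could be built like this. The tool name and arguments mirror the query_dataset tool described above; treat this as an illustrative JSON-RPC envelope, not a captured wire trace.

```python
import json

# Build a JSON-RPC 2.0 envelope for an MCP tools/call request.
payload = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "query_dataset",
        "arguments": {
            "dataset_id": 1,
            "query": "SELECT * FROM products LIMIT 5",
        },
    },
}
body = json.dumps(payload)
```

The resulting body would be POSTed to /mcp with Content-Type: application/json.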
Security
Query Restrictions
Only SELECT allowed: INSERT, UPDATE, DELETE, DROP, etc. are blocked
Automatic limits: Raw queries max 5 rows, aggregated queries max 1,000 rows
Connection pooling: Prevents resource exhaustion
Timeout protection: 60-second query timeout
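A minimal sketch of how the SELECT-only check and auto-limits might be enforced; the actual implementation lives in vtion_ecom_mcp.py and may differ in detail:

```python
import re

# Keywords that indicate a write/DDL statement (illustrative list).
BLOCKED_KEYWORDS = ("insert", "update", "delete", "drop", "alter", "truncate", "grant")
# Rough heuristic for an aggregated query.
AGGREGATE_RE = re.compile(r"\b(count|sum|avg|min|max)\s*\(|\bgroup\s+by\b", re.IGNORECASE)

def validate_and_limit(sql: str) -> str:
    """Reject non-SELECT statements and append the documented row limits."""
    stmt = sql.strip().rstrip(";")
    lowered = stmt.lower()
    if not lowered.startswith("select"):
        raise ValueError("Only SELECT statements are allowed")
    if any(re.search(rf"\b{kw}\b", lowered) for kw in BLOCKED_KEYWORDS):
        raise ValueError("Write/DDL keywords are not permitted")
    if re.search(r"\blimit\b", lowered):
        return stmt  # the query already caps its own result size
    max_rows = 1000 if AGGREGATE_RE.search(stmt) else 5
    return f"{stmt} LIMIT {max_rows}"
```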
Authentication
⚠️ Important: This server does not include authentication. For production:
Add authentication middleware (JWT, API keys, OAuth)
Use environment-specific credentials
Enable database row-level security (RLS)
Run behind a reverse proxy (nginx, Cloudflare)
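As one small building block for the API-key option, a middleware could compare a client-supplied key in constant time. This is a hypothetical helper; the 'x-api-key' header name is an illustrative choice, not part of this server:

```python
import hmac

def check_api_key(headers: dict, expected_key: str) -> bool:
    """Constant-time comparison of a client-supplied API key."""
    supplied = headers.get("x-api-key", "")
    # hmac.compare_digest avoids leaking key length/content via timing.
    return hmac.compare_digest(supplied.encode(), expected_key.encode())
```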
Development
Testing Connection
```bash
# Test database connectivity
python -c "
import asyncio
import asyncpg

async def test():
    conn = await asyncpg.connect('postgresql://...')
    print('Connected!')
    tables = await conn.fetch('SELECT table_name FROM information_schema.tables WHERE table_schema = \\'public\\'')
    print('Tables:', [t['table_name'] for t in tables])
    await conn.close()

asyncio.run(test())
"
```

Adding Multiple Datasets
Edit .env to add more datasets:
```env
# Dataset 1
DATASET_1_NAME=vtion_ecom
DATASET_1_CONNECTION=postgresql://...
DATASET_1_DESC=Main e-commerce data
DATASET_1_DICTIONARY={"products":"Product catalog"}

# Dataset 2
DATASET_2_NAME=analytics
DATASET_2_CONNECTION=postgresql://...
DATASET_2_DESC=Analytics warehouse
DATASET_2_DICTIONARY={"events":"User events"}
```

Customizing Business Logic
The server inherits business logic from indian-analytics-mcp:

Query validation: modify query_dataset() in vtion_ecom_mcp.py
Response formatting: update the format_markdown_table() helper
Custom tools: add new tools with the @mcp.tool() decorator
Schema customization: edit DATASET_1_DICTIONARY in .env
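The numbered DATASET_N_* convention used in .env lends itself to a simple discovery loop. This is a hypothetical sketch of such a loader, not the server's actual parsing code:

```python
import os

def load_datasets(env=None):
    """Collect DATASET_N_* variables into a list of dataset configs.

    Hypothetical loader: the real server's parsing may differ.
    """
    env = os.environ if env is None else env
    datasets = []
    n = 1
    # Datasets are numbered consecutively from 1; stop at the first gap.
    while f"DATASET_{n}_NAME" in env:
        datasets.append({
            "id": n,
            "name": env[f"DATASET_{n}_NAME"],
            "description": env.get(f"DATASET_{n}_DESC", ""),
            "connection": env.get(f"DATASET_{n}_CONNECTION", ""),
        })
        n += 1
    return datasets
```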
Deployment
Render
1. Create a new Web Service
2. Connect your GitHub repository
3. Set the build command: pip install -r requirements.txt
4. Set the start command: python server.py
5. Add environment variables from .env
Docker
```dockerfile
FROM python:3.11-slim
WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY . .
ENV PORT=10000
EXPOSE 10000
CMD ["python", "server.py"]
```

```bash
docker build -t vtion-mcp .
docker run -p 10000:10000 --env-file .env vtion-mcp
```

Railway / Fly.io
Both support automatic deployment from GitHub with environment variables.
Troubleshooting
Connection Issues
```bash
# Test database connection
psql "postgresql://postgres:Vtion%402023%23@db.yjiotntmzaukbmgxeqvq.supabase.co:5432/postgres?sslmode=require"
```

No Datasets Found
Check environment variables are set:
```bash
env | grep DATASET_
```

Query Errors
Verify table names with get_dataset_schema()
Check that column names match the schema
Ensure the query is a valid SQL SELECT statement
Import Errors
```bash
pip install --upgrade -r requirements.txt
```

Credits
Based on indian-analytics-mcp by @adityac7.
License
MIT License - see LICENSE file for details
Support
For issues and questions:
GitHub Issues: /issues
Email: support@vtion.com