VTION E-Commerce MCP Server

by adityac7


A Model Context Protocol (MCP) server providing secure, read-only access to VTION e-commerce analytics data. Built with FastAPI and PostgreSQL, supporting both MCP native protocol and REST API.

Features

  • MCP Protocol Support: Full implementation of Model Context Protocol for AI agent integration

  • Multiple Transport Modes:

    • FastMCP (stdio) for direct MCP client integration

    • HTTP/SSE for web-based clients

    • REST API for traditional HTTP clients

  • Secure by Design: Read-only access, query validation, connection pooling

  • Progressive Context Loading: Efficient data discovery with 4 context levels

  • Parallel Query Execution: Multiple queries execute concurrently for optimal performance

  • Auto-limiting: Raw queries limited to 5 rows, aggregated queries to 1,000 rows

  • Rich Query Tools: Schema inspection, sample data, flexible querying

Architecture

```
VTION-ECOM/
├── vtion_ecom_mcp.py   # Main MCP server with FastMCP
├── server.py           # Standalone HTTP/SSE server
├── requirements.txt    # Python dependencies
├── .env.example        # Configuration template
├── .gitignore          # Git ignore rules
└── README.md           # This file
```

Quick Start

1. Installation

```shell
# Clone the repository
git clone <your-repo-url>
cd VTION-ECOM

# Install dependencies
pip install -r requirements.txt
```

2. Configuration

```shell
# Copy environment template
cp .env.example .env

# Edit .env with your database credentials
nano .env
```

Required Environment Variables:

```
DATASET_1_NAME=vtion_ecom
DATASET_1_DESC=VTION E-commerce platform analytics data
DATASET_1_CONNECTION=postgresql://postgres:PASSWORD@host:port/db?sslmode=require
DATASET_1_DICTIONARY={"table1":"desc","table2":"desc"}
```
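The numbered `DATASET_N_*` convention can be read into a config list with a small loop. This is a hypothetical helper for illustration, not necessarily the server's actual loader:

```python
import json
import os

def load_datasets(env=os.environ):
    """Collect DATASET_<n>_* environment variables into dataset configs."""
    datasets = []
    n = 1
    while f"DATASET_{n}_NAME" in env:
        datasets.append({
            "id": n,
            "name": env[f"DATASET_{n}_NAME"],
            "description": env.get(f"DATASET_{n}_DESC", ""),
            "connection": env[f"DATASET_{n}_CONNECTION"],
            # DICTIONARY is a JSON object mapping table names to descriptions
            "dictionary": json.loads(env.get(f"DATASET_{n}_DICTIONARY", "{}")),
        })
        n += 1
    return datasets
```

Because dataset IDs are discovered by probing `DATASET_1_NAME`, `DATASET_2_NAME`, and so on, the numbering must be contiguous.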

3. Run the Server

Option A: FastMCP Mode (for MCP clients)

python vtion_ecom_mcp.py

Option B: HTTP/SSE Mode (for web clients)

```shell
python server.py
# Server runs on http://localhost:10000
```

Option C: Production Deployment

uvicorn server:app --host 0.0.0.0 --port 10000 --workers 4

Database Configuration

The MCP server connects to your Supabase PostgreSQL database. The connection string is already configured in .env.example:

postgresql://postgres:Vtion%402023%23@db.yjiotntmzaukbmgxeqvq.supabase.co:5432/postgres?sslmode=require

Important: The password is URL-encoded (`Vtion@2023#` becomes `Vtion%402023%23`).

Expected Schema

The server works with any PostgreSQL schema. Common e-commerce tables include:

  • products - Product catalog with inventory

  • orders - Order history and transactions

  • customers - Customer profiles and demographics

  • cart_items - Shopping cart data

  • user_sessions - User engagement metrics

The server will automatically discover your schema at runtime.
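Runtime schema discovery typically boils down to one `information_schema` query plus a grouping step. The sketch below uses hypothetical helper names, not the server's exact code:

```python
# Query the server might run to discover tables and columns in the public schema:
DISCOVERY_SQL = """
    SELECT table_name, column_name, data_type
    FROM information_schema.columns
    WHERE table_schema = 'public'
    ORDER BY table_name, ordinal_position
"""

def build_schema(rows):
    """Group (table_name, column_name, data_type) rows into {table: [(col, type), ...]}."""
    schema = {}
    for table, column, dtype in rows:
        schema.setdefault(table, []).append((column, dtype))
    return schema
```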

Usage

MCP Tools

The server provides 5 MCP tools:

1. get_context(level, dataset_id?)

Progressive context loading:

  • Level 0: Global rules and guidelines

  • Level 1: List all datasets

  • Level 2: Schema for specific dataset (requires dataset_id)

  • Level 3: Full details with sample data (requires dataset_id)

```python
# Get global rules
get_context(level=0)

# List all datasets
get_context(level=1)

# Get schema for dataset 1
get_context(level=2, dataset_id=1)

# Get full details with samples
get_context(level=3, dataset_id=1)
```

2. list_available_datasets()

List all configured datasets with metadata.

list_available_datasets()

3. get_dataset_schema(dataset_id)

Get complete schema for a dataset (equivalent to get_context(level=2)).

get_dataset_schema(dataset_id=1)

4. query_dataset(dataset_id, query, response_format?)

Execute SQL SELECT queries on a dataset.

```python
# Simple query
query_dataset(
    dataset_id=1,
    query="SELECT * FROM products WHERE category = 'Electronics' LIMIT 10"
)

# Aggregated query
query_dataset(
    dataset_id=1,
    query="SELECT category, COUNT(*) as count, AVG(price) as avg_price FROM products GROUP BY category"
)

# JSON response format
query_dataset(
    dataset_id=1,
    query="SELECT * FROM orders WHERE status = 'completed'",
    response_format="json"
)
```

Parallel Execution: Call query_dataset() multiple times - they execute in parallel automatically!

```python
# These three queries execute concurrently:
query_dataset(1, "SELECT category, COUNT(*) FROM products GROUP BY category")
query_dataset(1, "SELECT status, COUNT(*) FROM orders GROUP BY status")
query_dataset(1, "SELECT gender, COUNT(*) FROM customers GROUP BY gender")
```
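Under the hood, concurrent execution of independent queries is the standard `asyncio.gather` pattern. The sketch below uses a stand-in query runner to show the shape; the real server runs each query against the connection pool:

```python
import asyncio

async def run_query(dataset_id, query):
    """Stand-in for the server-side query runner (hypothetical)."""
    await asyncio.sleep(0.01)  # simulate database round-trip latency
    return f"results for: {query}"

async def main():
    # Independent queries are awaited together, so total wall time is
    # roughly the slowest query, not the sum of all three:
    return await asyncio.gather(
        run_query(1, "SELECT category, COUNT(*) FROM products GROUP BY category"),
        run_query(1, "SELECT status, COUNT(*) FROM orders GROUP BY status"),
        run_query(1, "SELECT gender, COUNT(*) FROM customers GROUP BY gender"),
    )

results = asyncio.run(main())
```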

5. get_dataset_sample(dataset_id, table_name, limit?)

Get sample rows from a specific table.

```python
get_dataset_sample(
    dataset_id=1,
    table_name="products",
    limit=20
)
```

REST API Endpoints

When running server.py, these HTTP endpoints are available:

Health Check

```shell
curl http://localhost:10000/
# or
curl http://localhost:10000/health
```

Response:

```json
{
  "status": "ok",
  "service": "VTION E-Commerce MCP Server",
  "datasets": 1,
  "version": "1.0",
  "mcp_endpoint": "/mcp",
  "mcp_protocol_version": "2025-06-18"
}
```

List Datasets

curl http://localhost:10000/datasets

Execute Query

```shell
curl -X POST http://localhost:10000/query \
  -H "Content-Type: application/json" \
  -d '{
    "dataset_id": 1,
    "query": "SELECT * FROM products LIMIT 5"
  }'
```

MCP Protocol Endpoint

POST /mcp

Implements full MCP protocol over HTTP with JSON-RPC 2.0.
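A JSON-RPC 2.0 request body for this endpoint can be sketched in Python. The `tools/call` method name follows the MCP specification; adjust if your server version differs:

```python
import json

def jsonrpc_call(method, params, request_id=1):
    """Build a JSON-RPC 2.0 request envelope for POSTing to /mcp."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": method,
        "params": params,
    })

# e.g. invoke the list_available_datasets tool:
body = jsonrpc_call("tools/call", {"name": "list_available_datasets", "arguments": {}})
```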

Security

Query Restrictions

  • Only SELECT allowed: INSERT, UPDATE, DELETE, DROP, etc. are blocked

  • Automatic limits: Raw queries max 5 rows, aggregated queries max 1,000 rows

  • Connection pooling: Prevents resource exhaustion

  • Timeout protection: 60-second query timeout
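The SELECT-only rule and auto-limits can be sketched as plain validation helpers. These are hypothetical names for illustration; the actual checks live inside `query_dataset()` in `vtion_ecom_mcp.py`:

```python
import re

BLOCKED = ("insert", "update", "delete", "drop", "alter", "truncate", "grant", "create")

def validate_query(sql):
    """Reject anything that is not a single SELECT statement."""
    stripped = sql.strip().rstrip(";")
    if ";" in stripped:
        raise ValueError("multiple statements are not allowed")
    if not stripped.lower().startswith("select"):
        raise ValueError("only SELECT statements are allowed")
    for word in BLOCKED:
        if re.search(rf"\b{word}\b", stripped.lower()):
            raise ValueError(f"blocked keyword: {word}")
    return stripped

def apply_auto_limit(sql):
    """Cap raw queries at 5 rows and aggregated queries at 1,000 rows."""
    lowered = sql.lower()
    aggregated = "group by" in lowered or any(
        fn in lowered for fn in ("count(", "sum(", "avg(", "min(", "max(")
    )
    if "limit" in lowered:
        return sql  # caller already set an explicit limit
    return f"{sql} LIMIT {1000 if aggregated else 5}"
```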

Authentication

⚠️ Important: This server does not include authentication. For production:

  1. Add authentication middleware (JWT, API keys, OAuth)

  2. Use environment-specific credentials

  3. Enable database row-level security (RLS)

  4. Run behind a reverse proxy (nginx, Cloudflare)
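A minimal API-key gate for option 1 could look like the following. The header name (`X-API-Key`) and environment variable (`MCP_API_KEY`) are illustrative choices, not part of this project:

```python
def is_authorized(headers, expected_key):
    """Allow the request when no key is configured (open mode) or the header matches."""
    if not expected_key:
        return True
    return headers.get("X-API-Key") == expected_key

# Wiring this into the FastAPI app in server.py would look roughly like:
#
#   @app.middleware("http")
#   async def require_api_key(request, call_next):
#       if not is_authorized(request.headers, os.environ.get("MCP_API_KEY")):
#           return JSONResponse({"error": "unauthorized"}, status_code=401)
#       return await call_next(request)
```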

Development

Testing Connection

```shell
# Test database connectivity
python -c "
import asyncio
import asyncpg

async def test():
    conn = await asyncpg.connect('postgresql://...')
    print('Connected!')
    tables = await conn.fetch('SELECT table_name FROM information_schema.tables WHERE table_schema = \\'public\\'')
    print('Tables:', [t['table_name'] for t in tables])
    await conn.close()

asyncio.run(test())
"
```

Adding Multiple Datasets

Edit .env to add more datasets:

```shell
# Dataset 1
DATASET_1_NAME=vtion_ecom
DATASET_1_CONNECTION=postgresql://...
DATASET_1_DESC=Main e-commerce data
DATASET_1_DICTIONARY={"products":"Product catalog"}

# Dataset 2
DATASET_2_NAME=analytics
DATASET_2_CONNECTION=postgresql://...
DATASET_2_DESC=Analytics warehouse
DATASET_2_DICTIONARY={"events":"User events"}
```

Customizing Business Logic

The server inherits business logic from indian-analytics-mcp:

  • Query validation: Modify query_dataset() in vtion_ecom_mcp.py

  • Response formatting: Update format_markdown_table() helper

  • Add custom tools: Use @mcp.tool() decorator

  • Schema customization: Edit DATASET_1_DICTIONARY in .env
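A custom tool following the FastMCP decorator pattern might look like the sketch below. The tool name, `run_query`, and the `k` parameter are hypothetical; only the `@mcp.tool()` registration pattern comes from this project:

```python
def top_categories(rows, k=3):
    """Return the k categories with the highest counts from (category, count) rows."""
    return sorted(rows, key=lambda r: r[1], reverse=True)[:k]

# Registering this as an MCP tool in vtion_ecom_mcp.py would look roughly like:
#
#   @mcp.tool()
#   async def get_top_categories(dataset_id: int, k: int = 3) -> str:
#       rows = await run_query(dataset_id,
#           "SELECT category, COUNT(*) FROM products GROUP BY category")
#       return format_markdown_table(top_categories(rows, k))
```

Keeping the data shaping in a plain helper like `top_categories` makes the logic testable without an MCP client.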

Deployment

Render

  1. Create new Web Service

  2. Connect GitHub repository

  3. Set build command: pip install -r requirements.txt

  4. Set start command: python server.py

  5. Add environment variables from .env

Docker

```dockerfile
FROM python:3.11-slim

WORKDIR /app

COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

COPY . .

ENV PORT=10000
EXPOSE 10000

CMD ["python", "server.py"]
```

```shell
docker build -t vtion-mcp .
docker run -p 10000:10000 --env-file .env vtion-mcp
```

Railway / Fly.io

Both support automatic deployment from GitHub with environment variables.

Troubleshooting

Connection Issues

```shell
# Test database connection
psql "postgresql://postgres:Vtion%402023%23@db.yjiotntmzaukbmgxeqvq.supabase.co:5432/postgres?sslmode=require"
```

No Datasets Found

Check environment variables are set:

env | grep DATASET_

Query Errors

  • Verify table names with get_dataset_schema()

  • Check column names match schema

  • Ensure the query is a valid SQL SELECT statement

Import Errors

pip install --upgrade -r requirements.txt

Credits

Based on indian-analytics-mcp by @adityac7.

License

MIT License - see LICENSE file for details

Support

For issues and questions, please open an issue on the repository.


