# DP-MCP Server API Reference
This document provides a detailed API reference for all tools available in the DP-MCP server.
## Overview
The DP-MCP server provides 11 tools organized into three categories:
- **PostgreSQL Tools**: Database operations and introspection
- **MinIO Tools**: Object storage management
- **Combined Tools**: Cross-platform operations
All tools follow the MCP (Model Context Protocol) specification and return JSON-RPC 2.0 formatted responses.
## PostgreSQL Tools
### execute_sql_query
Execute SQL queries on the PostgreSQL database with automatic result formatting.
**Parameters:**
- `query` (string, required): SQL query to execute
- `limit` (integer, optional): Maximum rows to return (default: 1,000)
**Example:**
```json
{
  "tool": "execute_sql_query",
  "arguments": {
    "query": "SELECT id, username, email FROM users WHERE is_active = true",
    "limit": 50
  }
}
```
**Response Format:**
- **SELECT queries**: Formatted table with headers and data
- **DML queries**: Row count affected
- **DDL queries**: Success/failure message
**Error Handling:**
- SQL syntax errors
- Connection timeouts
- Permission denied
- Table/column not found
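The "formatted table" shape returned for SELECT queries can be sketched as follows. This is an illustrative formatter, not the server's actual implementation: column widths are derived from the header labels and the stringified cell values.

```python
def format_result_table(columns, rows):
    """Render query results as a plain-text table with a header row."""
    # Stringify every cell so width calculation is uniform.
    str_rows = [[str(cell) for cell in row] for row in rows]
    widths = [
        max(len(col), *(len(r[i]) for r in str_rows)) if str_rows else len(col)
        for i, col in enumerate(columns)
    ]
    header = " | ".join(col.ljust(w) for col, w in zip(columns, widths))
    separator = "-+-".join("-" * w for w in widths)
    body = [" | ".join(cell.ljust(w) for cell, w in zip(r, widths)) for r in str_rows]
    return "\n".join([header, separator, *body])

print(format_result_table(
    ["id", "username", "email"],
    [(1, "alice", "alice@example.com"), (2, "bob", "bob@example.com")],
))
```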
### list_db_tables
List all tables in a specified database schema.
**Parameters:**
- `schema` (string, optional): Schema name (default: "public")
**Example:**
```json
{
  "tool": "list_db_tables",
  "arguments": {
    "schema": "public"
  }
}
```
**Response:**
Returns a formatted list of tables with their types (TABLE, VIEW, etc.).
### describe_db_table
Get detailed structure information about a database table.
**Parameters:**
- `table_name` (string, required): Name of the table to describe
- `schema` (string, optional): Schema name (default: "public")
**Example:**
```json
{
  "tool": "describe_db_table",
  "arguments": {
    "table_name": "users",
    "schema": "public"
  }
}
```
**Response:**
Returns column information including:
- Column names
- Data types
- Null constraints
- Default values
- Primary keys and indexes
### export_table_csv
Export table data in CSV format for analysis or backup.
**Parameters:**
- `table_name` (string, required): Name of the table to export
- `limit` (integer, optional): Maximum rows to export (default: 10,000)
- `where_clause` (string, optional): SQL WHERE clause for filtering
**Example:**
```json
{
  "tool": "export_table_csv",
  "arguments": {
    "table_name": "orders",
    "limit": 5000,
    "where_clause": "order_date >= '2025-01-01' AND status = 'completed'"
  }
}
```
**Response:**
Returns CSV-formatted data with headers.
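How the `where_clause` and `limit` parameters compose into the export, and how rows become CSV with headers, can be sketched as follows. This is an assumption about the mechanism, not the server's actual code; note the injection caveat in the comment.

```python
import csv
import io

def build_export_query(table_name, where_clause=None, limit=10_000):
    """Compose the export SELECT. NOTE: table_name and where_clause are
    interpolated directly here for illustration only -- a real server must
    validate and quote identifiers to avoid SQL injection."""
    sql = f"SELECT * FROM {table_name}"
    if where_clause:
        sql += f" WHERE {where_clause}"
    return sql + f" LIMIT {int(limit)}"

def export_rows_to_csv(columns, rows):
    """Serialize result rows as CSV text with a header row."""
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(columns)
    writer.writerows(rows)
    return buf.getvalue()

print(build_export_query("orders", "status = 'completed'", 5000))
# -> SELECT * FROM orders WHERE status = 'completed' LIMIT 5000
```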
## MinIO Object Storage Tools
### list_minio_buckets
List all available buckets in the MinIO instance.
**Parameters:**
None
**Example:**
```json
{
  "tool": "list_minio_buckets",
  "arguments": {}
}
```
**Response:**
Returns bucket names with creation dates.
### list_bucket_objects
List objects within a specific bucket with optional filtering.
**Parameters:**
- `bucket_name` (string, required): Name of the bucket
- `prefix` (string, optional): Object name prefix for filtering
- `max_keys` (integer, optional): Maximum objects to return (default: 1,000)
**Example:**
```json
{
  "tool": "list_bucket_objects",
  "arguments": {
    "bucket_name": "data-lake",
    "prefix": "backups/2025/",
    "max_keys": 100
  }
}
```
**Response:**
Returns formatted list with object names, sizes, and modification dates.
### upload_to_minio
Upload data to MinIO object storage.
**Parameters:**
- `bucket_name` (string, required): Target bucket name
- `object_name` (string, required): Object key/path
- `data` (string, required): Data content to upload
- `content_type` (string, optional): MIME type (default: "text/plain")
**Example:**
```json
{
  "tool": "upload_to_minio",
  "arguments": {
    "bucket_name": "data-lake",
    "object_name": "reports/monthly_report_2025_01.json",
    "data": "{\"total_sales\": 125000, \"orders\": 1250}",
    "content_type": "application/json"
  }
}
```
**Response:**
Confirms successful upload with object details.
### download_from_minio
Download and retrieve object data from MinIO.
**Parameters:**
- `bucket_name` (string, required): Source bucket name
- `object_name` (string, required): Object key/path to download
**Example:**
```json
{
  "tool": "download_from_minio",
  "arguments": {
    "bucket_name": "data-lake",
    "object_name": "reports/monthly_report_2025_01.json"
  }
}
```
**Response:**
Returns the object content as text.
### create_minio_bucket
Create a new bucket in MinIO.
**Parameters:**
- `bucket_name` (string, required): Name for the new bucket
- `region` (string, optional): S3-compatible region name
**Example:**
```json
{
  "tool": "create_minio_bucket",
  "arguments": {
    "bucket_name": "new-data-bucket",
    "region": "us-east-1"
  }
}
```
**Response:**
Confirms bucket creation or reports if bucket already exists.
### delete_minio_object
Delete an object from MinIO storage.
**Parameters:**
- `bucket_name` (string, required): Source bucket name
- `object_name` (string, required): Object key/path to delete
**Example:**
```json
{
  "tool": "delete_minio_object",
  "arguments": {
    "bucket_name": "data-lake",
    "object_name": "temp/old_file.txt"
  }
}
```
**Response:**
Confirms successful deletion.
## Combined Operations
### backup_table_to_minio
Automated backup of PostgreSQL table data to MinIO object storage.
**Parameters:**
- `table_name` (string, required): Database table to backup
- `bucket_name` (string, optional): Target bucket (default: "backups")
- `schema` (string, optional): Database schema (default: "public")
- `limit` (integer, optional): Maximum rows to backup (default: 10,000)
- `where_clause` (string, optional): SQL filtering condition
**Example:**
```json
{
  "tool": "backup_table_to_minio",
  "arguments": {
    "table_name": "user_activities",
    "bucket_name": "daily-backups",
    "schema": "analytics",
    "limit": 50000,
    "where_clause": "activity_date >= CURRENT_DATE - INTERVAL '30 days'"
  }
}
```
**Workflow:**
1. Executes SQL query to extract table data
2. Converts results to CSV format
3. Generates timestamped object name
4. Uploads CSV data to specified MinIO bucket
5. Returns confirmation with backup location
**Object Naming:**
`{schema}_{table_name}_{timestamp}.csv`
Example: `public_users_20250129_143022.csv`
**Response:**
Confirms successful backup with full object path and upload details.
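The naming scheme can be reproduced with a short helper. This is a sketch; the server's internal implementation is not shown here.

```python
from datetime import datetime

def backup_object_name(schema, table_name, now=None):
    """Build the backup object key: {schema}_{table_name}_{timestamp}.csv"""
    now = now or datetime.now()
    return f"{schema}_{table_name}_{now.strftime('%Y%m%d_%H%M%S')}.csv"

print(backup_object_name("public", "users", datetime(2025, 1, 29, 14, 30, 22)))
# -> public_users_20250129_143022.csv
```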
## Error Handling
All tools implement comprehensive error handling:
### Database Errors
- **Connection Failed**: PostgreSQL server unreachable
- **Authentication Failed**: Invalid credentials
- **Permission Denied**: Insufficient database privileges
- **SQL Syntax Error**: Invalid query syntax
- **Table Not Found**: Referenced table doesn't exist
- **Timeout**: Query execution timeout
### Storage Errors
- **Connection Failed**: MinIO server unreachable
- **Authentication Failed**: Invalid access keys
- **Bucket Not Found**: Referenced bucket doesn't exist
- **Object Not Found**: Referenced object doesn't exist
- **Permission Denied**: Insufficient storage privileges
- **Storage Full**: Insufficient storage space
### Validation Errors
- **Missing Required Parameter**: Required argument not provided
- **Invalid Parameter Type**: Wrong data type for parameter
- **Invalid Parameter Value**: Parameter value out of acceptable range
- **Malformed Request**: Invalid JSON-RPC format
### System Errors
- **Network Timeout**: Network communication timeout
- **Memory Error**: Insufficient memory for operation
- **Disk Space**: Insufficient local disk space
- **Internal Server Error**: Unexpected server error
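The categories above map naturally onto the reserved error codes from the JSON-RPC 2.0 specification. The mapping sketched here is an assumption: apart from the implementation-defined `-32000` tool-failure code, the server's exact per-category code choices are not documented.

```python
# Reserved JSON-RPC 2.0 error codes (from the JSON-RPC 2.0 specification),
# plus the implementation-defined tool-failure code used by this server.
JSONRPC_ERRORS = {
    "parse_error": -32700,       # Malformed Request (invalid JSON)
    "invalid_request": -32600,   # Not a valid JSON-RPC request object
    "method_not_found": -32601,  # Unknown tool or method
    "invalid_params": -32602,    # Missing or invalid parameters
    "internal_error": -32603,    # Internal JSON-RPC error
    "tool_failed": -32000,       # Implementation-defined: tool execution failed
}

def error_response(request_id, kind, message, data=None):
    """Build a JSON-RPC 2.0 error response object."""
    err = {"code": JSONRPC_ERRORS[kind], "message": message}
    if data is not None:
        err["data"] = data
    return {"jsonrpc": "2.0", "id": request_id, "error": err}
```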
## Response Formats
### Success Response
```json
{
  "jsonrpc": "2.0",
  "id": "request-id",
  "result": {
    "content": [
      {
        "type": "text",
        "text": "Tool execution result"
      }
    ]
  }
}
```
### Error Response
```json
{
  "jsonrpc": "2.0",
  "id": "request-id",
  "error": {
    "code": -32000,
    "message": "Tool execution failed",
    "data": {
      "tool": "tool_name",
      "error": "Detailed error message",
      "parameters": "Request parameters"
    }
  }
}
```
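A client consuming these responses might dispatch on the presence of the `error` member. This is a minimal sketch; `handle_response` is a hypothetical helper, not part of the server or of any MCP SDK.

```python
def handle_response(resp):
    """Return the concatenated text content of a success response;
    raise RuntimeError for an error response."""
    if "error" in resp:
        err = resp["error"]
        raise RuntimeError(f"{err['code']}: {err['message']}")
    return "".join(
        item["text"]
        for item in resp["result"]["content"]
        if item["type"] == "text"
    )
```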
## Rate Limiting and Performance
### Default Limits
- **Query Results**: 1,000 rows (configurable via `limit` parameter)
- **Object Listing**: 1,000 objects (configurable via `max_keys`)
- **CSV Export**: 10,000 rows (configurable via `limit`)
- **Object Size**: No built-in limit (depends on available memory)
### Performance Considerations
- Large result sets are automatically paginated
- CSV exports stream data to minimize memory usage
- Database connections are pooled for efficiency
- Object uploads use streaming to handle large files
### Timeout Settings
- **Database Query**: 60 seconds
- **Object Upload**: 300 seconds
- **Object Download**: 300 seconds
- **Connection Establishment**: 10 seconds
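One way a client might mirror these limits is a simple lookup table. A sketch under the assumption that a client wants to apply matching (or slightly larger) timeouts so the server's own timeout error surfaces first.

```python
# Client-side mirror of the server's documented timeout settings (seconds).
TIMEOUTS = {
    "query": 60,       # database query execution
    "upload": 300,     # object upload
    "download": 300,   # object download
    "connect": 10,     # connection establishment
}

def timeout_for(operation, default=10):
    """Return the timeout (seconds) to apply for a given operation."""
    return TIMEOUTS.get(operation, default)
```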
## Security Considerations
### Database Security
- SSL connections supported (configurable)
- Parameterized queries prevent SQL injection
- Connection pooling with authentication
- Read-only database roles recommended for production
### Object Storage Security
- HTTPS connections supported (configurable)
- Access key authentication required
- Bucket-level permissions respected
- Object encryption supported
### Network Security
- Server binds to localhost by default
- Production deployments should use reverse proxy
- CORS headers configurable
- Request logging available for audit
## MCP Protocol Compliance
The DP-MCP server fully complies with the MCP v1.0 specification:
- **JSON-RPC 2.0**: All communications use JSON-RPC format
- **Tool Registration**: Tools are properly registered with schemas
- **Error Handling**: Standard error codes and formats
- **Transport Agnostic**: Supports multiple transport protocols
- **Streaming Support**: Compatible with streaming HTTP transport
- **Type Safety**: Full parameter validation and type checking
For more information about the MCP specification, visit: https://spec.modelcontextprotocol.io/