Enables querying of Google Cloud SQL (PostgreSQL) and BigQuery datasets, with intelligent routing between data sources for transactional and analytical queries.
Provides tools for executing SQL queries against Cloud SQL PostgreSQL databases, including schema discovery and table listing for customer, order, and vendor data.
Click on "Install Server".
Wait a few minutes for the server to deploy. Once ready, it will show a "Started" state.
In the chat, type @ followed by the MCP server name and your instructions, e.g., "@GCP Sales Analytics MCP Server show me total sales for last quarter".
That's it! The server will respond to your query, and you can continue using it as needed.
Here is a step-by-step guide with screenshots.
GCP Sales Analytics POC
A proof of concept demonstrating an intelligent AI agent that can query both Google Cloud SQL (PostgreSQL) and BigQuery public datasets, automatically choosing the right data source based on the question.
Architecture
Features
Dual Data Source Access: Query both Cloud SQL and BigQuery seamlessly
Intelligent Routing: Agent automatically determines which data source to use
MCP Server: Standards-compliant Model Context Protocol server
Synthetic Data: Pre-populated with 50 customers, 50 vendors, and 50 orders
Infrastructure as Code: Terraform for Cloud SQL deployment
Production-Ready: Error handling, logging, and security best practices
Data Sources
Cloud SQL (PostgreSQL)
Contains transactional data with three tables:
customers: Customer information (name, email, address, etc.)
orders: Order details (amounts, dates, status, products)
vendors: Vendor information
Use for queries about:
Specific customer information
Recent order details
Vendor data
Current sales transactions
BigQuery (thelook_ecommerce)
Public e-commerce analytics dataset with comprehensive data:
products, users, events
inventory_items, order_items
distribution_centers
Use for queries about:
Product analytics
User behavior patterns
Inventory analysis
Historical trends
Large-scale analytics
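For orientation, the public dataset can also be explored directly with the BigQuery Python client. The sketch below is only illustrative (it assumes application-default credentials and uses the public thelook_ecommerce tables; it is not part of the POC code):

```python
from google.cloud import bigquery

# Uses application-default credentials; run `gcloud auth application-default login` first.
client = bigquery.Client()

# Illustrative query: units sold per product category in the public dataset.
query = """
    SELECT p.category, COUNT(*) AS items_sold
    FROM `bigquery-public-data.thelook_ecommerce.order_items` AS oi
    JOIN `bigquery-public-data.thelook_ecommerce.products` AS p
      ON oi.product_id = p.id
    GROUP BY p.category
    ORDER BY items_sold DESC
    LIMIT 10
"""
for row in client.query(query).result():
    print(row.category, row.items_sold)
```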
Prerequisites
Google Cloud Platform account with billing enabled
GCP Project with appropriate permissions
Tools installed:
gcloud CLI
terraform (>= 1.0)
python3 (>= 3.9)
psql (PostgreSQL client)
Anthropic API key for Claude
Setup
1. Clone and Configure
2. Install Python Dependencies
3. Authenticate with GCP
4. Deploy Infrastructure
The deployment script will:
Deploy Cloud SQL instance with Terraform
Create database schema (customers, orders, vendors)
Seed database with synthetic data
Verify BigQuery access
This process takes approximately 5-10 minutes.
Usage
Running the MCP Server
The MCP server provides tools for:
query_cloudsql - Execute SQL against Cloud SQL
query_bigquery - Execute SQL against BigQuery
list_cloudsql_tables - List Cloud SQL tables
list_bigquery_tables - List BigQuery tables
get_cloudsql_schema - Get table schema from Cloud SQL
get_bigquery_schema - Get table schema from BigQuery
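The repository's server code is not reproduced here, but a tool of this shape could be written with the official MCP Python SDK's FastMCP helper. This is a minimal sketch, not the POC's actual implementation; the connection settings, environment variable names, and SELECT-only guard are assumptions:

```python
import os
import psycopg2
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("gcp-sales-analytics")

@mcp.tool()
def query_cloudsql(sql: str) -> list[dict]:
    """Execute a read-only SQL query against the Cloud SQL PostgreSQL database."""
    if not sql.lstrip().lower().startswith("select"):
        raise ValueError("Only SELECT queries are allowed")
    conn = psycopg2.connect(
        host=os.environ["CLOUDSQL_HOST"],        # hypothetical variable names
        dbname=os.environ["CLOUDSQL_DATABASE"],
        user=os.environ["CLOUDSQL_USER"],
        password=os.environ["CLOUDSQL_PASSWORD"],
    )
    try:
        with conn.cursor() as cur:
            cur.execute(sql)
            columns = [col.name for col in cur.description]
            return [dict(zip(columns, row)) for row in cur.fetchall()]
    finally:
        conn.close()

if __name__ == "__main__":
    mcp.run()  # serves over stdio by default
```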
Running the Agent (Interactive Mode)
Example interactions:
Running Tests
Project Structure
How It Works
Agent Decision Making
The agent uses Claude's function calling capabilities with a specialized system prompt that guides data source selection:
Question Analysis: Agent analyzes the user's question
Schema Discovery: May first list tables to understand available data
Source Selection: Chooses Cloud SQL or BigQuery based on:
Keywords (customers, vendors → Cloud SQL)
Query type (analytics, trends → BigQuery)
Data recency requirements
Query Execution: Formulates and executes appropriate SQL
Result Presentation: Formats and explains results
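The steps above amount to a standard Anthropic tool-use loop driven by a routing-oriented system prompt. The sketch below is a simplified illustration, not the POC's exact code; the model name, prompt wording, and tool wiring are assumptions:

```python
import anthropic

SYSTEM_PROMPT = """You can query two data sources.
Use the Cloud SQL tools for specific customers, vendors, and recent orders.
Use the BigQuery tools for product analytics, user behavior, inventory, and historical trends.
Inspect table schemas before writing SQL if you are unsure of column names."""

client = anthropic.Anthropic()  # reads ANTHROPIC_API_KEY from the environment

def run_agent(question: str, tools: list[dict], execute_tool) -> str:
    """Loop until the model answers in plain text instead of requesting a tool."""
    messages = [{"role": "user", "content": question}]
    while True:
        response = client.messages.create(
            model="claude-sonnet-4-20250514",  # assumed model; use whichever the POC configures
            max_tokens=1024,
            system=SYSTEM_PROMPT,
            tools=tools,
            messages=messages,
        )
        if response.stop_reason != "tool_use":
            return "".join(b.text for b in response.content if b.type == "text")
        # Execute each requested tool call and feed the results back to the model.
        messages.append({"role": "assistant", "content": response.content})
        results = [
            {"type": "tool_result", "tool_use_id": b.id,
             "content": str(execute_tool(b.name, b.input))}
            for b in response.content if b.type == "tool_use"
        ]
        messages.append({"role": "user", "content": results})
```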
Example Decision Flow
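An illustrative flow (the question is hypothetical; the repository may document a different example):

The user asks, "What did customer Jane Doe order last week?" The agent spots "customer" and a recent time frame, so it targets Cloud SQL. It calls get_cloudsql_schema on customers and orders to confirm column names, runs query_cloudsql with a SELECT joining the two tables filtered to the last seven days, and summarizes the returned rows in plain language.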
Configuration
Environment Variables (.env)
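The variable list itself is kept in the repository's .env file and is not reproduced here. As a sketch of how such settings are typically loaded with python-dotenv, with hypothetical key names:

```python
import os
from dotenv import load_dotenv

# Load variables from the project's .env file into the process environment.
load_dotenv()

# Hypothetical names for illustration; check the repository's .env template for the actual keys.
ANTHROPIC_API_KEY = os.environ["ANTHROPIC_API_KEY"]
GCP_PROJECT_ID = os.environ["GCP_PROJECT_ID"]
CLOUDSQL_HOST = os.environ["CLOUDSQL_HOST"]
CLOUDSQL_DATABASE = os.environ["CLOUDSQL_DATABASE"]
CLOUDSQL_USER = os.environ["CLOUDSQL_USER"]
CLOUDSQL_PASSWORD = os.environ["CLOUDSQL_PASSWORD"]
```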
Terraform Variables
See terraform/terraform.tfvars.example
Security Considerations
This is a proof of concept with simplified security:
Cloud SQL uses a public IP (restricted via authorized networks)
Database credentials are stored in environment variables
No VPC or private networking
SQL injection prevention is limited to allowing SELECT-only queries
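The SELECT-only restriction can be as simple as a guard run before query execution. A minimal sketch of that idea (not the repository's exact check):

```python
def assert_select_only(sql: str) -> None:
    """Reject anything other than a single read-only SELECT statement."""
    statement = sql.strip().rstrip(";").strip()
    if not statement.lower().startswith("select"):
        raise ValueError("Only SELECT queries are allowed")
    if ";" in statement:
        raise ValueError("Multiple statements are not allowed")
    # Crude keyword screen; a production system should parse the SQL instead.
    forbidden = ("insert", "update", "delete", "drop", "alter", "truncate", "grant")
    lowered = f" {statement.lower()} "
    if any(f" {word} " in lowered for word in forbidden):
        raise ValueError("Write or DDL keywords are not allowed")
```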
For production:
Use Cloud SQL Proxy or private IP
Store credentials in Secret Manager
Implement VPC and private networking
Add query validation and sanitization
Enable Cloud SQL backups
Use IAM authentication
Implement rate limiting
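As one illustration of these recommendations, credentials could be read from Secret Manager instead of environment variables. A minimal sketch assuming the google-cloud-secret-manager client library; the secret name is hypothetical and is not created by this POC:

```python
from google.cloud import secretmanager

def get_db_password(project_id: str) -> str:
    # Access the latest version of a secret; "cloudsql-db-password" is a hypothetical name.
    client = secretmanager.SecretManagerServiceClient()
    name = f"projects/{project_id}/secrets/cloudsql-db-password/versions/latest"
    response = client.access_secret_version(request={"name": name})
    return response.payload.data.decode("utf-8")
```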
Cost Estimation
Approximate costs for running this POC:
Cloud SQL (db-f1-micro): ~$7-10/month
BigQuery: Pay per query (on-demand pricing is roughly $5-6 per TB scanned; the first 1 TB scanned each month falls under the free tier, and public datasets incur no storage cost)
Anthropic API: Pay per token
Important: Run ./scripts/cleanup.sh when done to avoid ongoing charges.
Cleanup
To destroy all resources, run ./scripts/cleanup.sh.
This will:
Destroy the Cloud SQL instance
Remove all data
Clean up Terraform state
Troubleshooting
Connection Issues
API Access Issues
Terraform Issues
Extending the POC
Ideas for enhancement:
Add More Data Sources
Cloud Spanner
Firestore
External APIs
Enhanced Agent Capabilities
Data visualization
Report generation
Predictive analytics
Production Features
Caching layer
Query optimization
Audit logging
Monitoring and alerts
Advanced MCP Features
Streaming responses
Batch operations
Transaction support
Resources
License
This is a proof of concept for demonstration purposes.
Support
For issues or questions:
Check the troubleshooting section
Review logs in the terminal
Check GCP Console for resource status