Amazon Security Lake MCP Server

by kebabmane


An MCP (Model Context Protocol) server for querying Amazon Security Lake data using AWS Athena. This server provides structured access to OCSF-normalized security data stored in Security Lake, enabling AI assistants and applications to search for IP addresses, GuardDuty findings, and explore available data sources.

Features

  • IP Address Search: Search for IP addresses across Security Lake data sources (source and destination)
  • GuardDuty Findings: Query GuardDuty security findings with filtering by ID, severity, and type
  • Data Source Discovery: List and analyze available Security Lake data sources and tables
  • OCSF Compliance: Built-in OCSF (Open Cybersecurity Schema Framework) schema validation
  • AWS Integration: Seamless integration with AWS Athena, S3, and IAM
  • Security-First: Input validation, query sanitization, and least-privilege access

Requirements

  • Python 3.10 or higher
  • AWS account with Amazon Security Lake configured
  • AWS credentials configured (IAM role, profile, or environment variables)
  • Access to AWS Athena and the Security Lake database

Installation

Option 1: From PyPI

pip install amazon-security-lake-mcp

Option 2: From source

git clone <repository-url>
cd amazon-security-lake-mcp
pip install -e .

Option 3: Using Docker

docker build -t amazon-security-lake-mcp .
docker run -it amazon-security-lake-mcp

Configuration

The MCP server can automatically discover your Security Lake resources. Simply configure your AWS credentials and the server will find the following (a minimal boto3 sketch of the kind of lookups involved appears after this list):

  • Security Lake S3 buckets for Athena query results
  • Security Lake Glue database names
  • Appropriate IAM permissions and configurations
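
The server ships its own discovery logic (the Test Discovery snippet below exercises it). Purely as an illustration of the kind of lookups involved, here is a minimal boto3 sketch; the naming conventions it matches on are assumptions rather than the server's actual rules:

import boto3


def sketch_discover_security_lake(region: str = "us-east-1") -> dict:
    """Illustrative only: guess at Security Lake resources by naming convention."""
    session = boto3.Session(region_name=region)
    found = {"databases": [], "athena_result_buckets": []}

    # Security Lake registers its tables in Glue databases whose names
    # typically begin with "amazon_security_lake_glue_db".
    glue = session.client("glue")
    for page in glue.get_paginator("get_databases").paginate():
        for db in page["DatabaseList"]:
            if db["Name"].startswith("amazon_security_lake_glue_db"):
                found["databases"].append(db["Name"])

    # Athena needs an S3 output location; a common convention is a bucket
    # named aws-athena-query-results-<account>-<region>.
    s3 = session.client("s3")
    for bucket in s3.list_buckets()["Buckets"]:
        if bucket["Name"].startswith("aws-athena-query-results"):
            found["athena_result_buckets"].append(f"s3://{bucket['Name']}/")

    return found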

Minimal Configuration:

# Only AWS region is required for auto-discovery
export ASL_MCP_AWS_REGION="us-east-1"
export ASL_MCP_AWS_PROFILE="your-aws-profile"   # Optional if using default credentials

Test Discovery:

# After installation, test resource discovery
python -c "
from asl_mcp_server.aws.discovery import AWSResourceDiscovery
discovery = AWSResourceDiscovery('us-east-1')
print(discovery.get_discovery_summary())
"

Manual Configuration

If you prefer manual configuration or auto-discovery fails:

# Required
export ASL_MCP_ATHENA_OUTPUT_LOCATION="s3://your-athena-results-bucket/path/"

# Optional (with defaults)
export ASL_MCP_AWS_REGION="us-east-1"
export ASL_MCP_AWS_PROFILE="your-aws-profile"
export ASL_MCP_SECURITY_LAKE_DATABASE="amazon_security_lake_glue_db"
export ASL_MCP_ATHENA_WORKGROUP="primary"
export ASL_MCP_AUTO_DISCOVER_RESOURCES="false"   # Disable auto-discovery
export ASL_MCP_LOG_LEVEL="INFO"
export ASL_MCP_MAX_QUERY_RESULTS="1000"
export ASL_MCP_QUERY_TIMEOUT_SECONDS="300"

Configuration File

Create a .env file in your project directory:

# Minimal configuration (auto-discovery enabled)
ASL_MCP_AWS_REGION=us-east-1
ASL_MCP_AWS_PROFILE=security-lake-user

# Or full manual configuration
ASL_MCP_ATHENA_OUTPUT_LOCATION=s3://my-athena-results/
ASL_MCP_AWS_REGION=us-east-1
ASL_MCP_SECURITY_LAKE_DATABASE=amazon_security_lake_glue_db
ASL_MCP_AUTO_DISCOVER_RESOURCES=false
ASL_MCP_LOG_LEVEL=INFO

AWS Permissions

The MCP server requires the following AWS permissions (a quick read-only smoke test follows the policy):

{ "Version": "2012-10-17", "Statement": [ { "Effect": "Allow", "Action": [ "athena:StartQueryExecution", "athena:GetQueryExecution", "athena:GetQueryResults", "athena:ListDataCatalogs", "athena:ListDatabases", "athena:ListTableMetadata" ], "Resource": "*" }, { "Effect": "Allow", "Action": [ "s3:GetObject", "s3:ListBucket", "s3:PutObject" ], "Resource": [ "arn:aws:s3:::aws-security-data-lake-*", "arn:aws:s3:::aws-security-data-lake-*/*", "arn:aws:s3:::your-athena-results-bucket", "arn:aws:s3:::your-athena-results-bucket/*" ] }, { "Effect": "Allow", "Action": [ "glue:GetDatabase", "glue:GetTable", "glue:GetPartitions" ], "Resource": "*" } ] }

Usage

Integrating with Claude Desktop

To use this MCP server with Claude Desktop, you need to configure it in Claude Desktop's settings.

Step 1: Install the MCP Server

Option A: Using pipx (Recommended for Claude Desktop)

# Install pipx if you don't have it
brew install pipx               # macOS
# or: pip install --user pipx   # Other systems

# Install the MCP server globally
pipx install amazon-security-lake-mcp

# Or install from source
git clone <repository-url>
cd amazon-security-lake-mcp
pipx install -e .

Option B: Using pip with virtual environment

# Create virtual environment
python3 -m venv asl-mcp-env
source asl-mcp-env/bin/activate   # Linux/macOS
# or: asl-mcp-env\Scripts\activate   # Windows

# Install the package
pip install amazon-security-lake-mcp

Find the installed command path:

# For pipx installation
which asl-mcp-server
# Typical output: /Users/username/.local/bin/asl-mcp-server

# For pip installation in venv (with venv activated)
which asl-mcp-server

Step 2: Configure Claude Desktop

Add the MCP server to your Claude Desktop configuration. The configuration file location depends on your operating system:

  • macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
  • Windows: %APPDATA%\Claude\claude_desktop_config.json
  • Linux: ~/.config/Claude/claude_desktop_config.json

Option A: Using Full Path to Command (Recommended)

{ "mcpServers": { "amazon-security-lake": { "command": "/Users/username/.local/bin/asl-mcp-server", "env": { "ASL_MCP_AWS_REGION": "us-east-1", "ASL_MCP_AWS_PROFILE": "your-aws-profile", "ASL_MCP_LOG_LEVEL": "INFO" } } } }

Replace /Users/username/.local/bin/asl-mcp-server with the actual path reported by which asl-mcp-server.

Option B: Using Python Module (Most Reliable)

{ "mcpServers": { "amazon-security-lake": { "command": "/opt/homebrew/bin/python3", "args": ["-m", "asl_mcp_server.server"], "cwd": "/path/to/amazon-security-lake-mcp", "env": { "ASL_MCP_AWS_REGION": "us-east-1", "ASL_MCP_AWS_PROFILE": "your-aws-profile", "PYTHONPATH": "/path/to/amazon-security-lake-mcp/src", "ASL_MCP_LOG_LEVEL": "INFO" } } } }

Option C: Using Python Path (if not installed globally)

{ "mcpServers": { "amazon-security-lake": { "command": "python", "args": ["-m", "asl_mcp_server.server"], "cwd": "/path/to/amazon-security-lake-mcp", "env": { "ASL_MCP_AWS_REGION": "us-east-1", "ASL_MCP_AWS_PROFILE": "your-aws-profile" } } } }
Step 3: Restart Claude Desktop

After saving the configuration file, restart Claude Desktop completely (quit and relaunch the application).

Step 4: Verify Connection

In Claude Desktop, you can now ask questions like:

  • "Can you discover my Security Lake resources?"
  • "Search for IP address 192.168.1.100 in my security data"
  • "Show me recent high-severity GuardDuty findings"
  • "What data sources are available in my Security Lake?"
Example Claude Desktop Conversation
User: Can you help me investigate security incidents in my AWS environment?

Claude: I can help you investigate security incidents using your Amazon Security Lake data! Let me start by discovering your Security Lake resources and then we can search for specific indicators.

[Claude uses the discover_aws_resources tool]

I found your Security Lake setup:
- Database: amazon_security_lake_glue_db_us_east_1
- Region: us-east-1
- Athena results: s3://aws-athena-query-results-123456789012-us-east-1/

What would you like to investigate? I can:
1. Search for specific IP addresses
2. Look up GuardDuty findings
3. Show available data sources
4. Search by time ranges or severity levels

User: Search for any activity related to IP address 203.0.113.45

Claude: [Uses search_ip_addresses tool to query Security Lake data]

I found 3 security events involving IP 203.0.113.45: ...

Advanced Claude Desktop Configuration

For production use, consider these additional configurations:

Environment-Specific Configuration:

{ "mcpServers": { "security-lake-prod": { "command": "asl-mcp-server", "env": { "ASL_MCP_AWS_REGION": "us-east-1", "ASL_MCP_AWS_PROFILE": "prod-security-readonly", "ASL_MCP_MAX_QUERY_RESULTS": "500", "ASL_MCP_QUERY_TIMEOUT_SECONDS": "180" } }, "security-lake-dev": { "command": "asl-mcp-server", "env": { "ASL_MCP_AWS_REGION": "us-west-2", "ASL_MCP_AWS_PROFILE": "dev-security", "ASL_MCP_LOG_LEVEL": "DEBUG" } } } }

Starting the MCP Server (Standalone)

You can also run the server standalone for testing:

# Using the installed command
asl-mcp-server

# Or using Python module
python -m asl_mcp_server.server

Available Tools

1. Search IP Addresses

Search for IP addresses in Security Lake data:

{ "tool": "search_ip_addresses", "arguments": { "ip_address": "192.168.1.100", "start_time": "2024-01-15T00:00:00Z", "end_time": "2024-01-15T23:59:59Z", "sources": ["guardduty", "cloudtrail"], "limit": 100 } }

Parameters:

  • ip_address (required): IP address to search for (IPv4 or IPv6)
  • start_time (optional): Start time in ISO format (default: 7 days ago)
  • end_time (optional): End time in ISO format (default: now)
  • sources (optional): Data sources to search (guardduty, cloudtrail, vpcflow, etc.)
  • limit (optional): Maximum results to return (default: 100, max: 1000)
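
For illustration, here is one way to assemble these arguments in Python so the timestamps are in the expected ISO format; the 7-day window mirrors the documented default, and the sources list is just an example:

from datetime import datetime, timedelta, timezone

end = datetime.now(timezone.utc)
start = end - timedelta(days=7)  # matches the documented default window

arguments = {
    "ip_address": "203.0.113.45",
    "start_time": start.strftime("%Y-%m-%dT%H:%M:%SZ"),
    "end_time": end.strftime("%Y-%m-%dT%H:%M:%SZ"),
    "sources": ["vpcflow", "cloudtrail"],  # restrict to the tables you need
    "limit": 100,
}
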
2. Search GuardDuty Findings

Query GuardDuty security findings:

{ "tool": "search_guardduty_findings", "arguments": { "finding_id": "12345abc-def0-1234-5678-90abcdef1234", "severity": "High", "finding_type": "Trojan:EC2/MaliciousIP", "start_time": "2024-01-15T00:00:00Z", "end_time": "2024-01-15T23:59:59Z", "limit": 50 } }

Parameters:

  • finding_id (optional): Specific GuardDuty finding ID
  • severity (optional): Severity level (Critical, High, Medium, Low, Informational)
  • finding_type (optional): Type of finding to search for
  • start_time (optional): Start time in ISO format (default: 7 days ago)
  • end_time (optional): End time in ISO format (default: now)
  • limit (optional): Maximum results to return (default: 100, max: 1000)
3. List Data Sources

Discover available Security Lake data sources:

{ "tool": "list_data_sources", "arguments": { "include_schema": true } }

Parameters:

  • include_schema (optional): Include detailed table schema information (default: false)
4. Discover AWS Resources

Automatically discover Security Lake resources in your AWS account:

{ "tool": "discover_aws_resources", "arguments": {} }

This tool scans your AWS account to find:

  • Security Lake S3 buckets and databases
  • Athena output locations
  • Configuration recommendations
  • Setup validation

5. Universal Security Search

Intelligent search across all available Security Lake data sources:

{ "tool": "universal_security_search", "arguments": { "query_type": "findings", "filters": { "severity": "High", "start_time": "2024-01-15T00:00:00Z", "end_time": "2024-01-15T23:59:59Z" }, "limit": 50 } }

Query Types:

  • findings: Search security findings (GuardDuty, Security Hub)
  • network: Search network activity (VPC Flow, DNS, Route53)
  • api_calls: Search API activity (CloudTrail)
  • ip_search: Search by IP address across all sources

Key Features:

  • Automatically adapts to available data sources
  • Supports both OCSF 1.0 and 2.0 schemas
  • Intelligent fallback (e.g., Security Hub for GuardDuty data)
  • Unified result format across different sources
6. Test Connection

Verify connectivity to AWS services:

{ "tool": "test_connection", "arguments": {} }

Response Format

All tools return responses in a consistent format:

{ "success": true, "error": null, "results": [...], "metadata": { "query_info": {...}, "summary": {...} }, "count": 10 }
{ "success": true, "results": [ { "timestamp": "2024-01-15T10:30:00Z", "event_type": "Network Activity", "severity": "Medium", "ip_context": { "role": "source", "direction": "outbound" }, "network_info": { "source_ip": "192.168.1.100", "destination_ip": "203.0.113.45", "source_port": 3456, "destination_port": 443 }, "aws_context": { "account_id": "123456789012", "region": "us-east-1" }, "product_info": { "name": "VPC Flow Logs", "vendor": "AWS" } } ], "metadata": { "summary": { "total_events": 1, "most_common_event_type": "Network Activity", "highest_severity": "Medium" } }, "count": 1 }

Development

Setting up Development Environment

# Clone the repository
git clone <repository-url>
cd amazon-security-lake-mcp

# Install development dependencies
pip install -e ".[dev]"

# Install pre-commit hooks
pre-commit install

Running Tests

# Run all tests
pytest

# Run with coverage
pytest --cov=asl_mcp_server --cov-report=html

# Run specific test categories
pytest -m unit          # Unit tests only
pytest -m integration   # Integration tests only
pytest -m slow          # Long-running tests

Code Quality

# Format code
black src tests

# Lint code
ruff src tests

# Type checking
mypy src

Security Considerations

Input Validation

  • All IP addresses are validated before querying (see the sketch after this list)
  • Query parameters are sanitized to prevent injection attacks
  • Time ranges are validated for reasonableness
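
As an illustration of the first point, syntactic IP validation can be done with the Python standard library before any query text is built. This sketch shows the general approach and is not the package's actual code:

import ipaddress

def is_valid_ip(value: str) -> bool:
    """Accept any syntactically valid IPv4 or IPv6 address, reject everything else."""
    try:
        ipaddress.ip_address(value)
        return True
    except ValueError:
        return False

assert is_valid_ip("192.168.1.100")
assert is_valid_ip("2001:db8::1")
assert not is_valid_ip("192.168.1.1; DROP TABLE events")  # rejected before any SQL is built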

Access Control

  • Uses AWS IAM for authentication and authorization
  • Supports least-privilege access patterns
  • Query results are limited to prevent resource exhaustion

Data Protection

  • No sensitive data is logged
  • Query results can be filtered to remove sensitive information
  • Supports encryption in transit and at rest through AWS services

Troubleshooting

Common Issues

1. Command Not Found (ENOENT Error)
Error: spawn asl-mcp-server ENOENT

Solution: Claude Desktop cannot find the asl-mcp-server command. Use one of these fixes:

Fix A: Use Full Command Path

# Find the command location
which asl-mcp-server
# Output: /Users/username/.local/bin/asl-mcp-server

# Update Claude Desktop config with full path
{
  "mcpServers": {
    "amazon-security-lake": {
      "command": "/Users/username/.local/bin/asl-mcp-server",
      "env": { ... }
    }
  }
}

Fix B: Use Python Module (Most Reliable)

{ "mcpServers": { "amazon-security-lake": { "command": "/opt/homebrew/bin/python3", "args": ["-m", "asl_mcp_server.server"], "cwd": "/path/to/amazon-security-lake-mcp", "env": { "PYTHONPATH": "/path/to/amazon-security-lake-mcp/src" } } } }
2. AWS Credentials Not Found
Error: AWS credentials not configured

Solution: Configure AWS credentials using one of these methods:

  • Set AWS_PROFILE environment variable
  • Configure ~/.aws/credentials file
  • Use IAM roles (recommended for EC2/Lambda)
  • Set AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY environment variables
3. Auto-Discovery Failed
Error: athena_output_location could not be auto-discovered and was not provided

Solution: Either enable manual configuration or create required resources:

Option A: Use Manual Configuration

export ASL_MCP_AUTO_DISCOVER_RESOURCES="false"
export ASL_MCP_ATHENA_OUTPUT_LOCATION="s3://your-bucket/athena-results/"
export ASL_MCP_SECURITY_LAKE_DATABASE="amazon_security_lake_glue_db"

Option B: Create Athena Results Bucket

# Create a bucket for Athena results
aws s3 mb s3://aws-athena-query-results-$(aws sts get-caller-identity --query Account --output text)-us-east-1

Option C: Run Discovery Tool

Use the discover_aws_resources tool to see what's missing and get specific recommendations.

4. Security Lake Database Not Found
Error: Database 'amazon_security_lake_glue_db' is not available

Solution:

  • Verify Security Lake is enabled in your AWS account
  • Check that the database name is correct
  • Ensure you have permissions to access the Glue catalog
5. Query Timeout
Error: Query timeout after 300 seconds

Solution:

  • Increase ASL_MCP_QUERY_TIMEOUT_SECONDS
  • Use more specific time ranges to reduce data scanned
  • Add appropriate WHERE clauses to filter results

Performance Optimization

  1. Use specific time ranges: Always specify start_time and end_time to leverage partitioning (see the example after this list)
  2. Limit results: Use appropriate limit values to avoid large result sets
  3. Filter by source: Specify sources parameter to query only relevant tables
  4. Monitor costs: Large queries can incur significant Athena charges
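
For example, a GuardDuty findings query that follows this advice keeps the window to a single day, filters by severity, and caps the result count (parameter names are taken from the tool reference above):

scoped_arguments = {
    "severity": "High",                     # filter instead of scanning everything
    "start_time": "2024-01-15T00:00:00Z",   # one-day window leverages partitioning
    "end_time": "2024-01-16T00:00:00Z",
    "limit": 50,                            # keep result sets small
}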

Debugging

Enable debug logging:

export ASL_MCP_LOG_LEVEL=DEBUG

This will provide detailed information about:

  • Query construction and execution
  • AWS API calls
  • Result processing
  • Error details

Contributing

  1. Fork the repository
  2. Create a feature branch (git checkout -b feature/amazing-feature)
  3. Make your changes
  4. Add tests for new functionality
  5. Ensure all tests pass (pytest)
  6. Commit your changes (git commit -m 'Add amazing feature')
  7. Push to the branch (git push origin feature/amazing-feature)
  8. Open a Pull Request

License

This project is licensed under the MIT License - see the LICENSE file for details.

Support

For issues and questions:

  • Open an issue on GitHub
  • Check the troubleshooting section above
  • Review AWS Security Lake documentation
  • Consult the OCSF specification

Changelog

v0.1.0 (Initial Release)

  • IP address search functionality
  • GuardDuty findings search
  • Data source discovery
  • OCSF schema validation
  • Comprehensive test suite
  • AWS integration with proper error handling
