
retrieve_logs

Access and analyze logs from Supabase services like PostgreSQL, API Gateway, and Edge Functions for debugging and monitoring. Query logs by collection, filters, or custom SQL to identify issues and track performance.

Instructions

Retrieve logs from your Supabase project's services for debugging and monitoring.

Returns log entries from various Supabase services with timestamps, messages, and metadata. This tool provides access to the same logs available in the Supabase dashboard's Logs & Analytics section.

AVAILABLE LOG COLLECTIONS:

  • postgres: Database server logs including queries, errors, warnings, and system messages

  • api_gateway: API requests, responses, and errors processed by the Kong API gateway

  • auth: Authentication and authorization logs for sign-ups, logins, and token operations

  • postgrest: Logs from the RESTful API service that exposes your PostgreSQL database

  • pooler: Connection pooling logs from pgbouncer and supavisor services

  • storage: Object storage service logs for file uploads, downloads, and permissions

  • realtime: Logs from the real-time subscription service for WebSocket connections

  • edge_functions: Serverless function execution logs including invocations and errors

  • cron: Scheduled job logs (can be queried through postgres logs with specific filters)

  • pgbouncer: Connection pooler logs

PARAMETERS:

  • collection: The log collection to query (required, one of the values listed above)

  • limit: Maximum number of log entries to return (default: 20)

  • hours_ago: Retrieve logs from the last N hours (default: 1)

  • filters: List of filter objects with field, operator, and value (default: []). Format: [{"field": "field_name", "operator": "=", "value": "value"}]

  • search: Text to search for in event messages (default: "")

  • custom_query: Complete custom SQL query to execute instead of the pre-built queries (default: "")
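To make the filters format above concrete, the sketch below shows how each filter object could be translated into an SQL predicate. The helper name build_where_clause and the operator whitelist are hypothetical illustrations, not part of the server's API:

```python
from typing import Any

# Hypothetical operator whitelist; the server's actual validation may differ.
ALLOWED_OPERATORS = {"=", "!=", ">", ">=", "<", "<=", "LIKE"}

def build_where_clause(filters: list[dict[str, Any]]) -> str:
    """Join {field, operator, value} objects into an SQL predicate string."""
    predicates = []
    for f in filters:
        if f["operator"] not in ALLOWED_OPERATORS:
            raise ValueError(f"Unsupported operator: {f['operator']}")
        value = f["value"]
        # Quote string values; render numbers as-is.
        rendered = f"'{value}'" if isinstance(value, str) else str(value)
        predicates.append(f"{f['field']} {f['operator']} {rendered}")
    return " AND ".join(predicates)
```

For instance, the filter [{"field": "parsed.error_severity", "operator": "=", "value": "ERROR"}] would render as parsed.error_severity = 'ERROR'.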

HOW IT WORKS: This tool makes a request to the Supabase Management API endpoint for logs, sending either a pre-built optimized query for the selected collection or your custom query. Each log collection has a specific table structure and metadata format that requires appropriate CROSS JOIN UNNEST operations to access nested fields.
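As an illustration of the CROSS JOIN UNNEST pattern just described, a pre-built query for the postgres collection might look like the following sketch. The table and column names are assumptions based on the Supabase log schema, not taken from the server source:

```python
def postgres_logs_query(hours_ago: int = 1, limit: int = 20) -> str:
    """Sketch of a pre-built postgres-logs query (schema names assumed).

    Two UNNEST steps expose the nested metadata: metadata -> m, then
    m.parsed -> parsed, so filters can reference parsed.error_severity.
    """
    return (
        "SELECT id, postgres_logs.timestamp, event_message, parsed.error_severity "
        "FROM postgres_logs "
        "CROSS JOIN unnest(metadata) AS m "
        "CROSS JOIN unnest(m.parsed) AS parsed "
        "WHERE postgres_logs.timestamp > "
        f"TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL {hours_ago} HOUR) "
        "ORDER BY timestamp DESC "
        f"LIMIT {limit}"
    )
```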

EXAMPLES:

  1. Using pre-built parameters:
     collection: "postgres"
     limit: 20
     hours_ago: 24
     filters: [{"field": "parsed.error_severity", "operator": "=", "value": "ERROR"}]
     search: "connection"

  2. Using a custom query:
     collection: "edge_functions"
     custom_query: "SELECT id, timestamp, event_message, m.function_id, m.execution_time_ms FROM function_edge_logs CROSS JOIN unnest(metadata) AS m WHERE m.execution_time_ms > 1000 ORDER BY timestamp DESC LIMIT 10"
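Example 1 above corresponds to the following arguments payload as an MCP client would send it. The commented-out call sketches one way to invoke the tool via session.call_tool from the MCP Python client SDK; other clients will differ:

```python
# Arguments for example 1, as an MCP client would send them.
arguments = {
    "collection": "postgres",
    "limit": 20,
    "hours_ago": 24,
    "filters": [
        {"field": "parsed.error_severity", "operator": "=", "value": "ERROR"}
    ],
    "search": "connection",
}
# With an active MCP client session (sketch, not the only client API):
# result = await session.call_tool("retrieve_logs", arguments)
```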

METADATA STRUCTURE: Each collection nests its metadata differently, which determines how fields are referenced in filters:

  • postgres_logs: Use "parsed.field_name" for fields like error_severity, query, application_name

  • edge_logs: Use "request.field_name" or "response.field_name" for HTTP details

  • function_edge_logs: Use "function_id", "execution_time_ms" for function metrics

NOTE FOR LLM CLIENTS: When encountering errors with field access, examine the error message to see what fields are actually available in the structure. Start with basic fields before accessing nested metadata.

SAFETY CONSIDERATIONS:

  • This is a low-risk read operation that can be executed in SAFE mode

  • Requires a valid Supabase Personal Access Token to be configured

  • Not available for local Supabase instances (requires cloud deployment)

Input Schema

  Name          Required  Description                                          Default
  collection    Yes       The log collection to query                          (none)
  custom_query  No        Complete custom SQL query to execute                 ""
  filters       No        List of filter objects (field, operator, value)      []
  hours_ago     No        Retrieve logs from the last N hours                  1
  limit         No        Maximum number of log entries to return              20
  search        No        Text to search for in event messages                 ""

Implementation Reference

  • Primary MCP handler function for the 'retrieve_logs' tool, registered with @mcp.tool decorator. Defines input schema via type annotations and delegates execution to FeatureManager.
    @mcp.tool(description=tool_manager.get_description(ToolName.RETRIEVE_LOGS))  # type: ignore
    async def retrieve_logs(
        collection: str,
        limit: int = 20,
        hours_ago: int = 1,
        filters: list[dict[str, Any]] = [],
        search: str = "",
        custom_query: str = "",
    ) -> dict[str, Any]:
        """Retrieve logs from your Supabase project's services for debugging and monitoring."""
        return await feature_manager.execute_tool(
            ToolName.RETRIEVE_LOGS,
            services_container=services_container,
            collection=collection,
            limit=limit,
            hours_ago=hours_ago,
            filters=filters,
            search=search,
            custom_query=custom_query,
        )
  • FeatureManager's retrieve_logs method, which logs the call and delegates to ApiManager for actual log retrieval.
    async def retrieve_logs(
        self,
        container: "ServicesContainer",
        collection: str,
        limit: int = 20,
        hours_ago: int = 1,
        filters: list[dict[str, Any]] = [],
        search: str = "",
        custom_query: str = "",
    ) -> dict[str, Any]:
        """Retrieve logs from your Supabase project's services for debugging and monitoring."""
        logger.info(
            f"Tool called: retrieve_logs(collection={collection}, limit={limit}, hours_ago={hours_ago}, "
            f"filters={filters}, search={search}, custom_query={'<custom>' if custom_query else None})"
        )
        api_manager = container.api_manager
        result = await api_manager.retrieve_logs(
            collection=collection,
            limit=limit,
            hours_ago=hours_ago,
            filters=filters,
            search=search,
            custom_query=custom_query,
        )
        logger.info(f"Tool completed: retrieve_logs - Retrieved log entries for collection={collection}")
        return result
  • Core implementation of retrieve_logs in ApiManager: constructs SQL query using LogManager and executes it against Supabase Logs API endpoint.
    async def retrieve_logs(
        self,
        collection: str,
        limit: int = 20,
        hours_ago: int | None = 1,
        filters: list[dict[str, Any]] | None = None,
        search: str | None = None,
        custom_query: str | None = None,
    ) -> dict[str, Any]:
        """Retrieve logs from a Supabase service.

        Args:
            collection: The log collection to query
            limit: Maximum number of log entries to return
            hours_ago: Retrieve logs from the last N hours
            filters: List of filter objects with field, operator, and value
            search: Text to search for in event messages
            custom_query: Complete custom SQL query to execute

        Returns:
            The query result

        Raises:
            ValueError: If the collection is unknown
        """
        log_manager = self.log_manager

        # Build the SQL query using LogManager
        sql = log_manager.build_logs_query(
            collection=collection,
            limit=limit,
            hours_ago=hours_ago,
            filters=filters,
            search=search,
            custom_query=custom_query,
        )
        logger.debug(f"Executing log query: {sql}")

        # Make the API request
        try:
            response = await self.execute_request(
                method="GET",
                path="/v1/projects/{ref}/analytics/endpoints/logs.all",
                path_params={},
                request_params={"sql": sql},
                request_body={},
            )
            return response
        except Exception as e:
            logger.error(f"Error retrieving logs: {e}")
            raise
  • Definition of ToolName.RETRIEVE_LOGS enum value used for tool identification and dispatch.
    RETRIEVE_LOGS = "retrieve_logs"
