get_application_profiling
Retrieve CPU and memory profiling data, including flame graphs, to identify performance bottlenecks and optimize application performance. Use time filters to manage large data responses.
Instructions
Get CPU and memory profiling data for an application.
Retrieves profiling data including flame graphs for CPU usage and memory allocation patterns to help identify performance bottlenecks and optimization opportunities.
⚠️ WARNING: This endpoint can return extremely large responses (180k+ tokens) for applications with extensive profiling data. Consider using time filters to limit the response size to specific time windows.
Args:
- project_id: Project ID
- app_id: Application ID (format: namespace/kind/name)
- from_timestamp: Start timestamp (optional, strongly recommended)
- to_timestamp: End timestamp (optional, strongly recommended)
- query: Search query (optional)
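The following is a minimal usage sketch of the time-filter recommendation above. It assumes the tool can be awaited as a plain async function; the project and application IDs are hypothetical, and the timestamps are assumed to be Unix-epoch seconds.

```python
# Minimal sketch: bound the profiling window to the last 30 minutes so the
# response stays small. "demo-project" and "default/Deployment/web" are
# hypothetical identifiers; epoch-second timestamps are an assumption.
import asyncio
import time

async def main() -> None:
    now = int(time.time())
    result = await get_application_profiling(
        project_id="demo-project",
        app_id="default/Deployment/web",
        from_timestamp=now - 1800,  # strongly recommended: limit the window
        to_timestamp=now,
    )
    if result.get("success"):
        # The handler wraps the raw profiling payload under "profiling".
        print(list(result["profiling"].keys()))

asyncio.run(main())
```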
Input Schema
| Name | Required | Description | Default |
|---|---|---|---|
| app_id | Yes | Application ID (format: namespace/kind/name) | |
| from_timestamp | No | Start timestamp (strongly recommended) | |
| project_id | Yes | Project ID | |
| query | No | Search query | |
| to_timestamp | No | End timestamp (strongly recommended) | |
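For reference, an argument payload matching this schema might look like the sketch below; every value is illustrative only, and the timestamp units are assumed to be Unix-epoch seconds.

```python
# Illustrative arguments for the tool call; all values are made-up examples.
arguments = {
    "project_id": "demo-project",
    "app_id": "default/Deployment/web",
    "from_timestamp": 1700000000,
    "to_timestamp": 1700001800,
    "query": "cpu",
}
```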
Implementation Reference
- src/mcp_coroot/server.py:1084-1112 (handler): MCP tool registration and handler function that delegates to the implementation wrapper calling the Coroot client.

```python
@mcp.tool()
async def get_application_profiling(
    project_id: str,
    app_id: str,
    from_timestamp: int | None = None,
    to_timestamp: int | None = None,
    query: str | None = None,
) -> dict[str, Any]:
    """Get CPU and memory profiling data for an application.

    Retrieves profiling data including flame graphs for CPU usage
    and memory allocation patterns to help identify performance
    bottlenecks and optimization opportunities.

    ⚠️ WARNING: This endpoint can return extremely large responses
    (180k+ tokens) for applications with extensive profiling data.
    Consider using time filters to limit the response size to
    specific time windows.

    Args:
        project_id: Project ID
        app_id: Application ID (format: namespace/kind/name)
        from_timestamp: Start timestamp (optional, strongly recommended)
        to_timestamp: End timestamp (optional, strongly recommended)
        query: Search query (optional)
    """
    return await get_application_profiling_impl(  # type: ignore[no-any-return]
        project_id, app_id, from_timestamp, to_timestamp, query
    )
```
- src/mcp_coroot/server.py:1066-1082 (helper): Helper implementation that calls the CorootClient.get_application_profiling method and formats the response.

```python
@handle_errors
async def get_application_profiling_impl(
    project_id: str,
    app_id: str,
    from_timestamp: int | None = None,
    to_timestamp: int | None = None,
    query: str | None = None,
) -> dict[str, Any]:
    """Get profiling data for an application."""
    profiling = await get_client().get_application_profiling(
        project_id, app_id, from_timestamp, to_timestamp, query
    )
    return {
        "success": True,
        "profiling": profiling,
    }
```
- src/mcp_coroot/client.py:849-888 (helper): CorootClient method implementing the HTTP API request to retrieve application profiling data from the Coroot server.

```python
async def get_application_profiling(
    self,
    project_id: str,
    app_id: str,
    from_timestamp: int | None = None,
    to_timestamp: int | None = None,
    query: str | None = None,
) -> dict[str, Any]:
    """Get profiling data for an application.

    Args:
        project_id: Project ID.
        app_id: Application ID (format: namespace/kind/name).
        from_timestamp: Start timestamp.
        to_timestamp: End timestamp.
        query: Search query.

    Returns:
        Profiling data and flame graphs.
    """
    # URL encode the app_id since it contains slashes
    from urllib.parse import quote

    encoded_app_id = quote(app_id, safe="")

    params = {}
    if from_timestamp:
        params["from"] = str(from_timestamp)
    if to_timestamp:
        params["to"] = str(to_timestamp)
    if query:
        params["query"] = query

    response = await self._request(
        "GET",
        f"/api/project/{project_id}/app/{encoded_app_id}/profiling",
        params=params,
    )
    data: dict[str, Any] = response.json()
    return data
```
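Because app_id contains slashes, the client percent-encodes it with `quote(app_id, safe="")` before building the request path. The short sketch below shows the resulting path for a hypothetical application; the project name is illustrative.

```python
# Demonstrates the URL encoding the client applies to app_id.
# quote(..., safe="") also encodes the "/" separators as %2F.
from urllib.parse import quote

app_id = "default/Deployment/web"          # hypothetical app_id
encoded = quote(app_id, safe="")
print(encoded)  # default%2FDeployment%2Fweb
print(f"/api/project/demo-project/app/{encoded}/profiling")
# /api/project/demo-project/app/default%2FDeployment%2Fweb/profiling
```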