
get_prediction

Retrieve the status and results of AI model predictions from the Replicate API. Use this tool to monitor inference progress and access generated outputs.

Instructions

Get the status and results of a prediction.

Input Schema

Name           Required  Description                            Default
prediction_id  Yes       The ID of the prediction to retrieve   -
wait           No        -                                      -
max_retries    No        -                                      -
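
For orientation, here is a minimal sketch of calling this tool from a Python MCP client over stdio. The launch command (uvx mcp-server-replicate) and the prediction ID are assumptions for illustration, and the optional wait and max_retries parameters are omitted.

    import asyncio

    from mcp import ClientSession, StdioServerParameters
    from mcp.client.stdio import stdio_client

    async def main() -> None:
        # Assumed launch command for the server; adjust to your installation.
        params = StdioServerParameters(command="uvx", args=["mcp-server-replicate"])
        async with stdio_client(params) as (read, write):
            async with ClientSession(read, write) as session:
                await session.initialize()
                # "r8_example123" is a placeholder prediction ID.
                result = await session.call_tool(
                    "get_prediction",
                    {"prediction_id": "r8_example123"},
                )
                print(result.content)

    asyncio.run(main())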

Implementation Reference

  • The main asynchronous handler function that executes the get_prediction tool. It retrieves the prediction status from the ReplicateClient and constructs a Prediction object.
    async def get_prediction(prediction_id: str) -> Prediction:
        """Get the current status and results of a prediction.

        Args:
            prediction_id: The ID of the prediction to retrieve

        Returns:
            Prediction object containing the current status and results

        Raises:
            RuntimeError: If the Replicate client fails to initialize
            ValueError: If the prediction is not found
            Exception: If the status check fails
        """
        async with ReplicateClient() as client:
            result = client.get_prediction_status(prediction_id)
            return Prediction(**result)
  • The @mcp.tool decorator that registers the get_prediction function as an MCP tool with the specified name and description.
    @mcp.tool(
        name="get_prediction",
        description="Get the current status and results of a prediction.",
    )
  • Pydantic BaseModel defining the output schema for the get_prediction tool response.
    class Prediction(BaseModel):
        """A prediction (model run) on Replicate."""

        id: str = Field(..., description="Unique identifier for this prediction")
        version: str = Field(..., description="Model version used for this prediction")
        status: PredictionStatus = Field(..., description="Current status of the prediction")
        input: Dict[str, Any] = Field(..., description="Input parameters used for the prediction")
        output: Optional[Any] = Field(None, description="Output from the prediction if completed")
        error: Optional[str] = Field(None, description="Error message if prediction failed")
        logs: Optional[str] = Field(None, description="Execution logs from the prediction")
        created_at: datetime
        started_at: Optional[datetime] = None
        completed_at: Optional[datetime] = None
        urls: Dict[str, str] = Field(..., description="Related API URLs for this prediction")
        metrics: Optional[Dict[str, float]] = Field(None, description="Performance metrics if available")
        stream_url: Optional[str] = Field(None, description="URL for streaming output if requested")
  • Helper method in ReplicateClient that fetches and formats the raw prediction data from the Replicate API, used by the get_prediction tool.
    def get_prediction_status(self, prediction_id: str) -> dict[str, Any]:
        """Get the status of a prediction.

        Args:
            prediction_id: ID of the prediction to check

        Returns:
            Dict containing current status and output of the prediction

        Raises:
            ValueError: If the prediction is not found
            Exception: If the API request fails
        """
        if not self.client:
            raise RuntimeError("Client not initialized. Check error property for details.")

        try:
            # Get prediction
            prediction = self.client.predictions.get(prediction_id)
            if not prediction:
                raise ValueError(f"Prediction not found: {prediction_id}")

            # Return prediction status and output
            return {
                "id": prediction.id,
                "status": prediction.status,
                "output": prediction.output,
                "error": prediction.error,
                "created_at": prediction.created_at.isoformat() if prediction.created_at else None,
                "started_at": prediction.started_at.isoformat() if prediction.started_at else None,
                "completed_at": prediction.completed_at.isoformat() if prediction.completed_at else None,
                "urls": prediction.urls,
                "metrics": prediction.metrics,
            }
        except ValueError as err:
            logger.error(f"Validation error: {str(err)}")
            raise
        except Exception as err:
            logger.error(f"Failed to get prediction status: {str(err)}")
            raise Exception(f"Failed to get prediction status: {str(err)}") from err
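
For context, the helper above wraps the official replicate Python client. A minimal sketch of the equivalent direct call, assuming REPLICATE_API_TOKEN is set and using a placeholder prediction ID:

    import os

    import replicate

    # Assumes the REPLICATE_API_TOKEN environment variable is set.
    client = replicate.Client(api_token=os.environ["REPLICATE_API_TOKEN"])

    # "r8_example123" is a placeholder; use a real prediction ID.
    prediction = client.predictions.get("r8_example123")
    print(prediction.status)   # e.g. "starting", "processing", "succeeded"
    print(prediction.output)   # model output once the prediction has completed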
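
To illustrate the output schema defined by the Prediction model above, a hypothetical instance might look like the following. All values are placeholders, and it assumes PredictionStatus accepts Replicate's standard status strings such as "succeeded".

    from datetime import datetime, timezone

    # Hypothetical values; assumes PredictionStatus validates the string "succeeded".
    example = Prediction(
        id="r8_example123",
        version="a1b2c3d4",
        status="succeeded",
        input={"prompt": "a photo of an astronaut riding a horse"},
        output=["https://replicate.delivery/example/output.png"],
        created_at=datetime(2024, 1, 1, tzinfo=timezone.utc),
        urls={"get": "https://api.replicate.com/v1/predictions/r8_example123"},
    )
    print(example.status, example.output)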

MCP directory API

We provide all the information about MCP servers via our MCP directory API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/gerred/mcp-server-replicate'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.