Glama
by andydukes

create_prediction

Send questions to Flowise chatflows or assistants to generate AI predictions and receive JSON responses.

Instructions

Create a prediction by sending a question to a specific chatflow or assistant.

Args:
    chatflow_id (str, optional): The ID of the chatflow to use. Defaults to FLOWISE_CHATFLOW_ID.
    question (str): The question or prompt to send to the chatflow.

Returns:
    str: The raw JSON response from Flowise API or an error message if something goes wrong.

Input Schema

JSON Schema

Name         Required  Description                                      Default
chatflow_id  No        The ID of the chatflow to use.                   FLOWISE_CHATFLOW_ID
question     Yes       The question or prompt to send to the chatflow.  (none)
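The parameter table above implies a JSON Schema along the following lines. This is a sketch inferred from the table and the docstring, not necessarily the exact schema the server emits; the `validate` helper is purely illustrative:

```python
# Sketch of the input schema implied by the parameter table.
# The exact schema emitted by the MCP server may differ.
input_schema = {
    "type": "object",
    "properties": {
        "chatflow_id": {
            "type": "string",
            "description": "The ID of the chatflow to use. "
                           "Defaults to FLOWISE_CHATFLOW_ID.",
        },
        "question": {
            "type": "string",
            "description": "The question or prompt to send to the chatflow.",
        },
    },
    "required": ["question"],  # chatflow_id is optional
}

def validate(payload: dict) -> list[str]:
    """Return a list of validation errors (empty if the payload is valid)."""
    errors = []
    for field in input_schema["required"]:
        if field not in payload:
            errors.append(f"missing required field: {field}")
    for field, value in payload.items():
        if field in input_schema["properties"] and not isinstance(value, str):
            errors.append(f"{field} must be a string")
    return errors

print(validate({"question": "What is Flowise?"}))  # []
print(validate({"chatflow_id": "abc123"}))         # ['missing required field: question']
```

Because `chatflow_id` is optional, a payload containing only `question` is valid; the server then falls back to its pre-configured IDs, as shown in the implementation below.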

Implementation Reference

  • Handler function for the 'create_prediction' tool, registered via @mcp.tool(). Handles input parameters, determines chatflow_id, calls flowise_predict, and returns the result or error.
    @mcp.tool()
    def create_prediction(*, chatflow_id: str | None = None, question: str) -> str:
        """
        Create a prediction by sending a question to a specific chatflow or assistant.

        Args:
            chatflow_id (str, optional): The ID of the chatflow to use. Defaults to FLOWISE_CHATFLOW_ID.
            question (str): The question or prompt to send to the chatflow.

        Returns:
            str: The raw JSON response from the Flowise API, or a JSON error message if something goes wrong.
        """
        logger.debug(f"create_prediction called with chatflow_id={chatflow_id}, question={question}")
        # Fall back to the pre-configured chatflow ID when none is supplied.
        chatflow_id = chatflow_id or FLOWISE_CHATFLOW_ID

        if not chatflow_id and not FLOWISE_ASSISTANT_ID:
            logger.error("No chatflow_id or assistant_id provided or pre-configured.")
            return json.dumps({"error": "chatflow_id or assistant_id is required"})

        try:
            # Prefer the chatflow ID; fall back to the pre-configured assistant ID.
            target_chatflow_id = chatflow_id or FLOWISE_ASSISTANT_ID

            # Call the prediction function and return the raw JSON result.
            result = flowise_predict(target_chatflow_id, question)
            logger.debug(f"Prediction result: {result}")
            return result  # Raw JSON as a string
        except Exception as e:
            logger.error(f"Unhandled exception in create_prediction: {e}", exc_info=True)
            return json.dumps({"error": str(e)})
  • Supporting utility function that performs the actual HTTP POST request to Flowise API's /prediction endpoint to get the prediction result.
    def flowise_predict(chatflow_id: str, question: str) -> str:
        """
        Sends a question to a specific chatflow ID via the Flowise API and returns the response JSON text.

        Args:
            chatflow_id (str): The ID of the Flowise chatflow to be used.
            question (str): The question or prompt to send to the chatflow.

        Returns:
            str: The raw JSON response text from the Flowise API, or a JSON error message if something goes wrong.
        """
        logger = logging.getLogger(__name__)

        # Construct the Flowise API URL for predictions
        url = f"{FLOWISE_API_ENDPOINT.rstrip('/')}/api/v1/prediction/{chatflow_id}"
        headers = {
            "Content-Type": "application/json",
        }
        if FLOWISE_API_KEY:
            headers["Authorization"] = f"Bearer {FLOWISE_API_KEY}"

        payload = {"question": question}
        logger.debug(f"Sending prediction request to {url} with payload: {payload}")

        try:
            # Send POST request to the Flowise API. Non-2xx responses are not
            # raised as exceptions; the body is returned as-is so the caller
            # can see Flowise's own error message.
            response = requests.post(url, json=payload, headers=headers, timeout=30)
            logger.debug(f"Prediction response code: HTTP {response.status_code}")
            logger.debug(f"Raw prediction response: {response.text}")

            # Return the raw JSON response text
            return response.text

        except Exception as e:
            # Log and return an error message as JSON
            logger.error(f"Error during prediction: {e}")
            return json.dumps({"error": str(e)})
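The ID-fallback precedence and the error envelope used by the functions above can be exercised without a running Flowise instance. In the sketch below, `resolve_target_id` and `parse_prediction` are hypothetical caller-side helpers (not part of the server) that replicate the handler's fallback order and show how to distinguish a normal response from the `{"error": ...}` envelope:

```python
import json

def resolve_target_id(chatflow_id, default_chatflow_id, assistant_id):
    """Replicates the handler's precedence: explicit argument first, then
    FLOWISE_CHATFLOW_ID, then FLOWISE_ASSISTANT_ID; None if all are unset."""
    return chatflow_id or default_chatflow_id or assistant_id or None

def parse_prediction(raw: str):
    """Parse the raw string returned by create_prediction/flowise_predict.
    Errors come back as a JSON object with an "error" key."""
    data = json.loads(raw)
    if isinstance(data, dict) and "error" in data:
        raise RuntimeError(f"Flowise error: {data['error']}")
    return data

# An explicit argument wins over both pre-configured IDs.
print(resolve_target_id("abc", "env-flow", "env-assistant"))  # abc
# With no explicit ID, the pre-configured chatflow is used next.
print(resolve_target_id(None, "env-flow", "env-assistant"))   # env-flow
# A JSON error envelope is surfaced as an exception.
try:
    parse_prediction('{"error": "chatflow_id or assistant_id is required"}')
except RuntimeError as e:
    print(e)  # Flowise error: chatflow_id or assistant_id is required
```

Keeping errors in-band as JSON strings (rather than raising) matches the MCP tool contract here, where the tool always returns a string; the caller decides how to surface failures.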