create_prediction

Generate predictions by sending questions or prompts to a specified chatflow or assistant via the MCP-Flowise MCP server, which provides dynamic interaction with the Flowise API.

Instructions

Create a prediction by sending a question to a specific chatflow or assistant.

Args:
    chatflow_id (str, optional): The ID of the chatflow to use. Defaults to FLOWISE_CHATFLOW_ID.
    question (str): The question or prompt to send to the chatflow.

Returns:
    str: The raw JSON response from the Flowise API, or an error message if something goes wrong.

Input Schema

Name         Required  Description                                       Default
chatflow_id  No        The ID of the chatflow to use.                    FLOWISE_CHATFLOW_ID
question     Yes       The question or prompt to send to the chatflow.   (none)
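
As an illustration, a client tool call might supply only the required question and rely on a pre-configured FLOWISE_CHATFLOW_ID. The sketch below uses hypothetical placeholder values, not real IDs.

    # Hypothetical tool-call arguments (placeholder values)
    arguments = {
        "question": "What can this chatflow do?",
        # Optional; omit to fall back to the pre-configured FLOWISE_CHATFLOW_ID
        "chatflow_id": "1234abcd-0000-0000-0000-000000000000",
    }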

Implementation Reference

  • Handler function for the create_prediction tool, including the @mcp.tool() registration decorator. It accepts the input parameters, determines the chatflow ID from the environment or the arguments, calls the flowise_predict helper, and returns the prediction result or an error as a JSON string. A usage sketch follows this list.
    @mcp.tool()
    def create_prediction(*, chatflow_id: str = None, question: str) -> str:
        """
        Create a prediction by sending a question to a specific chatflow or assistant.

        Args:
            chatflow_id (str, optional): The ID of the chatflow to use. Defaults to FLOWISE_CHATFLOW_ID.
            question (str): The question or prompt to send to the chatflow.

        Returns:
            str: The raw JSON response from Flowise API or an error message if something goes wrong.
        """
        logger.debug(f"create_prediction called with chatflow_id={chatflow_id}, question={question}")
        chatflow_id = chatflow_id or FLOWISE_CHATFLOW_ID

        if not chatflow_id and not FLOWISE_ASSISTANT_ID:
            logger.error("No chatflow_id or assistant_id provided or pre-configured.")
            return json.dumps({"error": "chatflow_id or assistant_id is required"})

        try:
            # Determine which chatflow ID to use
            target_chatflow_id = chatflow_id or FLOWISE_ASSISTANT_ID

            # Call the prediction function and return the raw JSON result
            result = flowise_predict(target_chatflow_id, question)
            logger.debug(f"Prediction result: {result}")
            return result  # Returning raw JSON as a string
        except Exception as e:
            logger.error(f"Unhandled exception in create_prediction: {e}", exc_info=True)
            return json.dumps({"error": str(e)})
  • Core helper utility that makes the HTTP POST request to the Flowise /prediction endpoint, authenticates with the API key when one is provided, and returns the raw JSON response text or an error. An equivalent standalone request sketch follows this list.
    def flowise_predict(chatflow_id: str, question: str) -> str:
        """
        Sends a question to a specific chatflow ID via the Flowise API and returns the response JSON text.

        Args:
            chatflow_id (str): The ID of the Flowise chatflow to be used.
            question (str): The question or prompt to send to the chatflow.

        Returns:
            str: The raw JSON response text from the Flowise API, or an error message if something goes wrong.
        """
        logger = logging.getLogger(__name__)

        # Construct the Flowise API URL for predictions
        url = f"{FLOWISE_API_ENDPOINT.rstrip('/')}/api/v1/prediction/{chatflow_id}"
        headers = {
            "Content-Type": "application/json",
        }
        if FLOWISE_API_KEY:
            headers["Authorization"] = f"Bearer {FLOWISE_API_KEY}"

        payload = {"question": question}
        logger.debug(f"Sending prediction request to {url} with payload: {payload}")

        try:
            # Send POST request to the Flowise API
            response = requests.post(url, json=payload, headers=headers, timeout=30)
            logger.debug(f"Prediction response code: HTTP {response.status_code}")
            # response.raise_for_status()

            # Log the raw response text for debugging
            logger.debug(f"Raw prediction response: {response.text}")

            # Return the raw JSON response text
            return response.text
        #except requests.exceptions.RequestException as e:
        except Exception as e:
            # Log and return an error message
            logger.error(f"Error during prediction: {e}")
            return json.dumps({"error": str(e)})
  • The @mcp.tool() decorator registers the create_prediction function as an MCP tool.
    @mcp.tool()
    def create_prediction(*, chatflow_id: str = None, question: str) -> str:
  • The type hints and docstring define the input schema (chatflow_id: optional str; question: required str) and the output (a JSON string). The flowise_predict helper declares a similar schema.
    def create_prediction(*, chatflow_id: str = None, question: str) -> str:
        """
        Create a prediction by sending a question to a specific chatflow or assistant.

        Args:
            chatflow_id (str, optional): The ID of the chatflow to use. Defaults to FLOWISE_CHATFLOW_ID.
            question (str): The question or prompt to send to the chatflow.

        Returns:
            str: The raw JSON response from Flowise API or an error message if something goes wrong.
        """
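
For orientation, here is a minimal usage sketch of the handler above. It assumes the decorated function is still directly callable (this depends on the MCP SDK version) and that FLOWISE_CHATFLOW_ID or FLOWISE_ASSISTANT_ID is configured; the question text is illustrative only.

    import json

    # Omit chatflow_id so the pre-configured FLOWISE_CHATFLOW_ID
    # (or FLOWISE_ASSISTANT_ID) is used.
    raw = create_prediction(question="Summarize what this chatflow does.")

    # Both the handler and flowise_predict return {"error": ...} on failure.
    parsed = json.loads(raw)
    if isinstance(parsed, dict) and "error" in parsed:
        print("Prediction failed:", parsed["error"])
    else:
        print("Flowise response:", parsed)

The helper itself boils down to a single authenticated POST. The following standalone sketch reproduces that request with the requests library; the endpoint, API key, and chatflow ID are placeholder assumptions.

    import requests

    endpoint = "http://localhost:3000"                     # placeholder Flowise endpoint
    api_key = "YOUR_FLOWISE_API_KEY"                       # placeholder; only needed if the server requires a key
    chatflow_id = "1234abcd-0000-0000-0000-000000000000"   # placeholder chatflow ID

    response = requests.post(
        f"{endpoint.rstrip('/')}/api/v1/prediction/{chatflow_id}",
        json={"question": "Hello, Flowise!"},
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
        timeout=30,
    )
    print(response.status_code, response.text)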

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/andydukes/mcp-flowise'
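
If you prefer Python over curl, the same request might look like this (a sketch assuming the requests library):

import requests

# Fetch this server's metadata from the MCP directory API.
resp = requests.get("https://glama.ai/api/mcp/v1/servers/andydukes/mcp-flowise", timeout=30)
print(resp.text)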

If you have feedback or need assistance with the MCP directory API, please join our Discord server.