
log-query

Analyzes conversation variations and logs unusual or noteworthy interactions and patterns for later monitoring and analysis.

Instructions

Conversation Variation Analysis

Continuously monitor our conversation and automatically log unusual or noteworthy interactions based on the following criteria:

1. Probability Classifications:

   HIGH (Not Logged):
   - Common questions and responses
   - Standard technical inquiries
   - Regular clarifications
   - Normal conversation flow

   MEDIUM (Logged):
   - Unexpected but plausible technical issues
   - Unusual patterns in user behavior
   - Noteworthy insights or connections
   - Edge cases in normal usage
   - Uncommon but valid use cases

   LOW (Logged with Priority):
   - Highly unusual technical phenomena
   - Potentially problematic patterns
   - Critical edge cases
   - Unexpected system behaviors
   - Novel or unique use cases
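For illustration, the arguments for a single MEDIUM-class log entry might look like the sketch below. Every value is an invented example rather than data from the project, and the interaction_type label is only an assumption about how that free-form field might be used.

    # Hypothetical arguments for one log-query call (illustrative values only).
    log_query_arguments = {
        "session_id": "20240124_u1_001",          # <date>_<user>_<sequence>, i.e. YYYYMMDD_u<N>_NNN
        "user_id": "u1",
        "interaction_type": "technical_inquiry",  # assumed label; the schema only requires a string
        "probability_class": "MEDIUM",            # must be HIGH, MEDIUM, or LOW
        "message_content": "Why does the build fail only on the second run?",
        "response_content": "A cached artifact is reused between runs; clearing it avoids the failure.",
        "context_summary": "User reported an intermittent build failure related to caching.",
        "reasoning": "Unexpected but plausible technical issue, which maps to the MEDIUM class.",
    }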

Input Schema

| Name | Required | Description | Default |
| ---- | -------- | ----------- | ------- |
| session_id | Yes | Unique identifier for the chat session. Format: <date>_<user>_<sequence>, where date is YYYYMMDD, user is 'u' + user number, and sequence is a 3-digit sequential number. Valid examples: 20240124_u1_001, 20240124_u1_002, 20240125_u2_001 | |
| user_id | Yes | Identifier for the user | |
| interaction_type | Yes | Type of interaction being monitored | |
| probability_class | Yes | Classification of interaction probability | |
| message_content | Yes | The user's message content | |
| response_content | Yes | The system's response content | |
| context_summary | Yes | Summary of interaction context | |
| reasoning | Yes | Explanation for the probability classification | |
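The session_id format is validated in the tool's input schema with the regular expression ^\d{8}_u\d+_\d{3}$ (see the Implementation Reference below). A rough sketch of what that pattern accepts and rejects, using a helper name of our own rather than anything from the project:

    import re

    # Pattern copied from the tool's input schema: 8-digit date, 'u' plus user number, 3-digit sequence.
    SESSION_ID_PATTERN = re.compile(r"^\d{8}_u\d+_\d{3}$")

    def is_valid_session_id(session_id: str) -> bool:
        """Return True if session_id matches the <date>_<user>_<sequence> format."""
        return bool(SESSION_ID_PATTERN.match(session_id))

    print(is_valid_session_id("20240124_u1_001"))  # True
    print(is_valid_session_id("2024-01-24_u1_1"))  # False: dashes in the date and a 1-digit sequence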

Implementation Reference

  • Handler for the 'log-query' tool: extracts the input arguments and delegates to the LogDatabase.add_log method.
    elif name == "log-query":
        # Existing log-query logic
        session_id = arguments.get("session_id", "")
        user_id = arguments.get("user_id", "")
        interaction_type = arguments.get("interaction_type", "")
        probability_class = arguments.get("probability_class", "")
        message_content = arguments.get("message_content", "")
        response_content = arguments.get("response_content", "")
        context_summary = arguments.get("context_summary", "")
        reasoning = arguments.get("reasoning", "")

        success = db.add_log(
            session_id=session_id,
            user_id=user_id,
            interaction_type=interaction_type,
            probability_class=probability_class,
            message_content=message_content,
            response_content=response_content,
            context_summary=context_summary,
            reasoning=reasoning
        )

        return [types.TextContent(
            type="text",
            text="Log entry added successfully" if success else "Failed to add log entry"
        )]
  • JSON Schema defining the input parameters and validation for the 'log-query' tool.
    inputSchema={
        "type": "object",
        "properties": {
            "session_id": {
                "type": "string",
                "description": """Unique identifier for the chat session.
                    Format: <date>_<user>_<sequence>
                    Example: 20240124_u1_001
                    Components:
                    - date: YYYYMMDD
                    - user: 'u' + user number
                    - sequence: 3-digit sequential number
                    Valid examples:
                    - 20240124_u1_001
                    - 20240124_u1_002
                    - 20240125_u2_001""",
                "pattern": "^\\d{8}_u\\d+_\\d{3}$"  # Regex pattern to validate format
            },
            "user_id": {
                "type": "string",
                "description": "Identifier for the user"
            },
            "interaction_type": {
                "type": "string",
                "description": "Type of interaction being monitored"
            },
            "probability_class": {
                "type": "string",
                "enum": ["HIGH", "MEDIUM", "LOW"],
                "description": "Classification of interaction probability"
            },
            "message_content": {
                "type": "string",
                "description": "The user's message content"
            },
            "response_content": {
                "type": "string",
                "description": "The system's response content"
            },
            "context_summary": {
                "type": "string",
                "description": "Summary of interaction context"
            },
            "reasoning": {
                "type": "string",
                "description": "Explanation for the probability classification"
            }
        },
        "required": [
            "session_id", "user_id", "interaction_type", "probability_class",
            "message_content", "response_content", "context_summary", "reasoning"
        ]
    },
  • Tool registration in the @server.list_tools() handler, specifying name, description, and schema.
    types.Tool(
        name="log-query",
        description="""
        Conversation Variation Analysis

        Continuously monitor our conversation and automatically log unusual or noteworthy
        interactions based on the following criteria:

        1. Probability Classifications:

           HIGH (Not Logged):
           - Common questions and responses
           - Standard technical inquiries
           - Regular clarifications
           - Normal conversation flow

           MEDIUM (Logged):
           - Unexpected but plausible technical issues
           - Unusual patterns in user behavior
           - Noteworthy insights or connections
           - Edge cases in normal usage
           - Uncommon but valid use cases

           LOW (Logged with Priority):
           - Highly unusual technical phenomena
           - Potentially problematic patterns
           - Critical edge cases
           - Unexpected system behaviors
           - Novel or unique use cases
        """,
        inputSchema={
            "type": "object",
            "properties": {
                "session_id": {
                    "type": "string",
                    "description": """Unique identifier for the chat session.
                        Format: <date>_<user>_<sequence>
                        Example: 20240124_u1_001
                        Components:
                        - date: YYYYMMDD
                        - user: 'u' + user number
                        - sequence: 3-digit sequential number
                        Valid examples:
                        - 20240124_u1_001
                        - 20240124_u1_002
                        - 20240125_u2_001""",
                    "pattern": "^\\d{8}_u\\d+_\\d{3}$"  # Regex pattern to validate format
                },
                "user_id": {
                    "type": "string",
                    "description": "Identifier for the user"
                },
                "interaction_type": {
                    "type": "string",
                    "description": "Type of interaction being monitored"
                },
                "probability_class": {
                    "type": "string",
                    "enum": ["HIGH", "MEDIUM", "LOW"],
                    "description": "Classification of interaction probability"
                },
                "message_content": {
                    "type": "string",
                    "description": "The user's message content"
                },
                "response_content": {
                    "type": "string",
                    "description": "The system's response content"
                },
                "context_summary": {
                    "type": "string",
                    "description": "Summary of interaction context"
                },
                "reasoning": {
                    "type": "string",
                    "description": "Explanation for the probability classification"
                }
            },
            "required": [
                "session_id", "user_id", "interaction_type", "probability_class",
                "message_content", "response_content", "context_summary", "reasoning"
            ]
        },
    ),
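Once registered, the tool can be called by any MCP client. The snippet below is a minimal sketch using the MCP Python SDK's stdio client; the launch command for the server is an assumption and should be replaced with however mcp-variance-log is actually started in your setup.

    import asyncio

    from mcp import ClientSession, StdioServerParameters
    from mcp.client.stdio import stdio_client

    async def main() -> None:
        # Assumed launch command for the server; adjust to match your installation.
        server = StdioServerParameters(command="uv", args=["run", "mcp-variance-log"])
        async with stdio_client(server) as (read_stream, write_stream):
            async with ClientSession(read_stream, write_stream) as session:
                await session.initialize()
                # Log one MEDIUM-class interaction (illustrative values only).
                result = await session.call_tool("log-query", arguments={
                    "session_id": "20240124_u1_001",
                    "user_id": "u1",
                    "interaction_type": "technical_inquiry",
                    "probability_class": "MEDIUM",
                    "message_content": "Example user message",
                    "response_content": "Example system response",
                    "context_summary": "Illustrative context summary",
                    "reasoning": "Unexpected but plausible technical issue",
                })
                print(result)

    asyncio.run(main())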
  • Core helper function in LogDatabase that performs the SQL INSERT to store the log data.
    def add_log(self, session_id: str, user_id: str, interaction_type: str,
                probability_class: str, message_content: str, response_content: str,
                context_summary: str, reasoning: str) -> bool:
        """
        Add a new log entry to the database.

        Args:
            session_id (str): Unique identifier for the chat session
            user_id (str): Identifier for the user
            interaction_type (str): Type of interaction being monitored
            probability_class (str): Classification (HIGH, MEDIUM, LOW)
            message_content (str): The user's message content
            response_content (str): The system's response content
            context_summary (str): Summary of interaction context
            reasoning (str): Explanation for the classification

        Returns:
            bool: True if successful, False otherwise
        """
        try:
            with sqlite3.connect(self.db_path) as conn:
                cursor = conn.cursor()
                cursor.execute('''
                    INSERT INTO chat_monitoring (
                        session_id, user_id, interaction_type, probability_class,
                        message_content, response_content, context_summary, reasoning
                    ) VALUES (?, ?, ?, ?, ?, ?, ?, ?)
                ''', (session_id, user_id, interaction_type, probability_class,
                      message_content, response_content, context_summary, reasoning))
                return True
        except Exception as e:
            print(f"Error adding log: {e}")
            return False
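The INSERT above assumes a chat_monitoring table with those eight columns. The DDL itself is not shown on this page, so the following is only a plausible reconstruction: the id and timestamp columns and the database file name are assumptions, not details confirmed by the snippet.

    import sqlite3

    # Plausible schema for the chat_monitoring table implied by the INSERT above.
    # Only the eight named columns are confirmed; id and timestamp are assumptions.
    CREATE_CHAT_MONITORING = """
    CREATE TABLE IF NOT EXISTS chat_monitoring (
        id INTEGER PRIMARY KEY AUTOINCREMENT,            -- assumed surrogate key
        timestamp DATETIME DEFAULT CURRENT_TIMESTAMP,    -- assumed insertion time
        session_id TEXT NOT NULL,
        user_id TEXT NOT NULL,
        interaction_type TEXT NOT NULL,
        probability_class TEXT NOT NULL
            CHECK (probability_class IN ('HIGH', 'MEDIUM', 'LOW')),
        message_content TEXT NOT NULL,
        response_content TEXT NOT NULL,
        context_summary TEXT NOT NULL,
        reasoning TEXT NOT NULL
    )
    """

    with sqlite3.connect("variance_log.db") as conn:  # database path is an assumption
        conn.execute(CREATE_CHAT_MONITORING)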

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/truaxki/mcp-variance-log'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.