OpenXAI MCP Server

by Cappybara12

generate_explanation

Generate explanations for AI model predictions using methods like LIME, SHAP, or integrated gradients to understand how models make decisions.

Instructions

Generate explanations for model predictions using OpenXAI explainers

Input Schema

| Name | Required | Description | Default |
| --- | --- | --- | --- |
| method | Yes | Explanation method to use (lime, shap, integrated_gradients, etc.) | |
| data_sample | Yes | JSON string of the input data sample to explain | |
| model_info | Yes | Information about the model being explained | |
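
For illustration, here is a minimal client-side sketch of calling this tool over MCP. It assumes the standard `@modelcontextprotocol/sdk` client, a stdio launch of the server via `node index.js`, and purely illustrative argument values (`german`, `ann`, and the sample features are placeholders, not values required by the server):

    // Hypothetical example: invoking generate_explanation from an MCP client.
    // The import paths and callTool usage follow the standard MCP TypeScript SDK;
    // the server command and argument values below are illustrative assumptions.
    import { Client } from "@modelcontextprotocol/sdk/client/index.js";
    import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

    const transport = new StdioClientTransport({ command: "node", args: ["index.js"] });
    const client = new Client({ name: "example-client", version: "1.0.0" });
    await client.connect(transport);

    // Arguments must satisfy the input schema above: method, data_sample, model_info.
    const result = await client.callTool({
      name: "generate_explanation",
      arguments: {
        method: "lime",
        data_sample: JSON.stringify({ feature_0: 0.5, feature_1: 1.2 }), // illustrative sample
        model_info: { data_name: "german", ml_model: "ann" },            // illustrative model info
      },
    });

    console.log(result.content[0].text);
    await client.close();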

Implementation Reference

  • The handler function that implements the core logic for the 'generate_explanation' tool. It validates the explanation method and returns a structured response with a Python code example for using OpenXAI's Explainer.
    async generateExplanation(method, dataSample, modelInfo) {
      const methods = ['lime', 'shap', 'integrated_gradients', 'gradcam', 'guided_backprop'];
      if (!methods.includes(method)) {
        throw new Error(`Method '${method}' not supported. Available methods: ${methods.join(', ')}`);
      }

      const codeExample = `
    # Example usage with OpenXAI:
    from openxai import Explainer
    from openxai import LoadModel
    from openxai.dataloader import ReturnLoaders

    # Load the model and data
    model = LoadModel(data_name='${modelInfo.data_name}', ml_model='${modelInfo.ml_model}', pretrained=True)
    trainloader, testloader = ReturnLoaders(data_name='${modelInfo.data_name}', download=True)

    # Initialize the explainer
    explainer = Explainer(method='${method}', model=model)

    # Generate explanations
    inputs, labels = next(iter(testloader))
    explanations = explainer.get_explanations(inputs)

    print(f"Explanation shape: {explanations.shape}")
    print(f"Explanation values: {explanations}")
    `;

      return {
        content: [
          {
            type: 'text',
            text:
              `Generated explanation using ${method.toUpperCase()}\n\n` +
              `Method: ${method}\n` +
              `Dataset: ${modelInfo.data_name}\n` +
              `Model: ${modelInfo.ml_model}\n` +
              `Data sample: ${dataSample}\n\n` +
              `Python code example:\n\`\`\`python${codeExample}\`\`\``
          }
        ]
      };
    }
  • The input schema definition for the 'generate_explanation' tool, including properties for method, data_sample, and model_info, as returned in the list_tools response.
    {
      name: 'generate_explanation',
      description: 'Generate explanations for model predictions using OpenXAI explainers',
      inputSchema: {
        type: 'object',
        properties: {
          method: {
            type: 'string',
            description: 'Explanation method to use (lime, shap, integrated_gradients, etc.)',
            enum: ['lime', 'shap', 'integrated_gradients', 'gradcam', 'guided_backprop']
          },
          data_sample: {
            type: 'string',
            description: 'JSON string of the input data sample to explain'
          },
          model_info: {
            type: 'object',
            description: 'Information about the model being explained',
            properties: {
              data_name: { type: 'string' },
              ml_model: { type: 'string' }
            }
          }
        },
        required: ['method', 'data_sample', 'model_info']
      }
    },
  • index.js:270-272 (registration)
    The switch case in the CallToolRequest handler that dispatches calls for the 'generate_explanation' tool to the generateExplanation handler.
    case 'generate_explanation':
      return await this.generateExplanation(args.method, args.data_sample, args.model_info);
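
For orientation, the sketch below shows how this case typically sits inside the server's CallToolRequest handler. Only the 'generate_explanation' case is taken from the excerpt above; the class name, server metadata, and surrounding structure are assumptions based on the standard MCP SDK pattern and may differ from the actual index.js:

    // Assumed surrounding structure (standard MCP SDK pattern), not verbatim from index.js.
    import { Server } from "@modelcontextprotocol/sdk/server/index.js";
    import { CallToolRequestSchema } from "@modelcontextprotocol/sdk/types.js";

    class OpenXAIServer {                                   // hypothetical class name
      constructor() {
        this.server = new Server(
          { name: "openxai-mcp", version: "1.0.0" },        // illustrative metadata
          { capabilities: { tools: {} } }
        );

        // Route incoming tools/call requests to the matching handler method.
        this.server.setRequestHandler(CallToolRequestSchema, async (request) => {
          const { name, arguments: args } = request.params;
          switch (name) {
            case 'generate_explanation':
              return await this.generateExplanation(args.method, args.data_sample, args.model_info);
            // ...cases for the server's other tools...
            default:
              throw new Error(`Unknown tool: ${name}`);
          }
        });
      }

      // generateExplanation(method, dataSample, modelInfo) as shown in the handler excerpt above.
    }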

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/Cappybara12/mcpopenxAI'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.