
OpenXAI MCP Server

by Cappybara12

load_model

Load a pre-trained machine learning model for AI-explanation benchmarking by specifying a dataset and model type.

Instructions

Load a pre-trained model from OpenXAI

Input Schema

| Name | Required | Description | Default |
| --- | --- | --- | --- |
| data_name | Yes | Name of the dataset the model was trained on | |
| ml_model | Yes | Type of machine learning model (ann, lr, rf, svm, xgb) | |
| pretrained | No | Whether to load a pretrained model | true |
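Given this schema, the arguments for a `load_model` call might look like the following. This is an illustrative sketch; the dataset name 'german' is a placeholder and should be replaced with a dataset name that OpenXAI actually supports.

```javascript
// Illustrative arguments for a 'load_model' tool call.
// 'german' is a hypothetical dataset name used only for this example.
const args = {
  data_name: 'german',   // required string
  ml_model: 'lr',        // required, one of: ann, lr, rf, svm, xgb
  pretrained: true       // optional boolean, defaults to true
};

// A minimal pre-dispatch check mirroring the schema's constraints:
const allowedModels = ['ann', 'lr', 'rf', 'svm', 'xgb'];
if (typeof args.data_name !== 'string' || !allowedModels.includes(args.ml_model)) {
  throw new Error('Invalid load_model arguments');
}
console.log('arguments valid');
```

In practice an MCP client sends these arguments inside a `tools/call` request, and the server applies the schema's `required` and `enum` constraints before dispatching.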

Implementation Reference

  • The core handler function for the 'load_model' tool. It validates the model type, generates a Python code snippet that uses OpenXAI's LoadModel class, and returns a formatted response containing the model information and a usage example.
      async loadModel(dataName, mlModel, pretrained = true) {
        const modelInfo = {
          ann: 'Artificial Neural Network',
          lr: 'Logistic Regression',
          rf: 'Random Forest',
          svm: 'Support Vector Machine',
          xgb: 'XGBoost'
        };
    
        const modelName = modelInfo[mlModel];
        if (!modelName) {
          throw new Error(`Model type '${mlModel}' not supported. Available models: ${Object.keys(modelInfo).join(', ')}`);
        }
    
        const codeExample = `
    # Example usage with OpenXAI:
    from openxai import LoadModel
    
    # Load the pre-trained model
    model = LoadModel(data_name='${dataName}', ml_model='${mlModel}', pretrained=${pretrained})
    
    # Use the model for predictions
    # predictions = model.predict(input_data)
    `;
    
        return {
          content: [
            {
              type: 'text',
              text: `Model loaded successfully!\n\n` +
                    `Dataset: ${dataName}\n` +
                    `Model type: ${modelName} (${mlModel})\n` +
                    `Pretrained: ${pretrained}\n\n` +
                    `Python code example:\n\`\`\`python${codeExample}\`\`\``
            }
          ]
        };
      }
  • Input schema defining the parameters for the 'load_model' tool: data_name (required string), ml_model (required enum), pretrained (optional boolean).
    inputSchema: {
      type: 'object',
      properties: {
        data_name: {
          type: 'string',
          description: 'Name of the dataset the model was trained on'
        },
        ml_model: {
          type: 'string',
          description: 'Type of machine learning model (ann, lr, rf, svm, xgb)',
          enum: ['ann', 'lr', 'rf', 'svm', 'xgb']
        },
        pretrained: {
          type: 'boolean',
          description: 'Whether to load a pretrained model',
          default: true
        }
      },
      required: ['data_name', 'ml_model']
    }
  • index.js:91-114 (registration)
    Tool registration in the list of available tools returned by ListToolsRequestSchema, including name, description, and input schema.
    {
      name: 'load_model',
      description: 'Load a pre-trained model from OpenXAI',
      inputSchema: {
        type: 'object',
        properties: {
          data_name: {
            type: 'string',
            description: 'Name of the dataset the model was trained on'
          },
          ml_model: {
            type: 'string',
            description: 'Type of machine learning model (ann, lr, rf, svm, xgb)',
            enum: ['ann', 'lr', 'rf', 'svm', 'xgb']
          },
          pretrained: {
            type: 'boolean',
            description: 'Whether to load a pretrained model',
            default: true
          }
        },
        required: ['data_name', 'ml_model']
      }
    },
  • Dispatch case in the main CallToolRequestSchema handler that routes 'load_model' calls to the loadModel method.
    case 'load_model':
      return await this.loadModel(args.data_name, args.ml_model, args.pretrained);
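Putting the pieces together, the dispatch path can be exercised without the MCP transport. The sketch below is a minimal stand-in, assuming a handler object with a `loadModel` method like the one shown above; the SDK request/response plumbing and the other tool cases are elided, and the `default` branch for unknown tools is an assumption about typical handler structure.

```javascript
// Minimal stand-in for the handler class; only the pieces needed to
// exercise the 'load_model' dispatch path (SDK plumbing elided).
const handlers = {
  async loadModel(dataName, mlModel, pretrained = true) {
    const modelInfo = {
      ann: 'Artificial Neural Network',
      lr: 'Logistic Regression',
      rf: 'Random Forest',
      svm: 'Support Vector Machine',
      xgb: 'XGBoost'
    };
    const modelName = modelInfo[mlModel];
    if (!modelName) {
      throw new Error(`Model type '${mlModel}' not supported.`);
    }
    return {
      content: [{ type: 'text', text: `Model type: ${modelName} (${mlModel})` }]
    };
  }
};

// Dispatch mirroring the 'case' shown above, plus an assumed default
// branch that rejects unknown tool names.
async function callTool(name, args) {
  switch (name) {
    case 'load_model':
      return await handlers.loadModel(args.data_name, args.ml_model, args.pretrained);
    default:
      throw new Error(`Unknown tool: ${name}`);
  }
}

// Example: routes to loadModel and returns a single text content block.
(async () => {
  const result = await callTool('load_model', { data_name: 'german', ml_model: 'rf' });
  console.log(result.content[0].text); // → Model type: Random Forest (rf)
})();
```

Because `pretrained` is passed through as `undefined` when the caller omits it, the handler's default parameter (`pretrained = true`) supplies the schema's default value.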
