Enhanced MCP MSSQL Server

by michaelyuwh

mssql_bulk_insert

Insert multiple rows into Microsoft SQL Server tables efficiently using bulk operations for improved performance in data migration and batch processing scenarios.

Instructions

Insert multiple rows efficiently using bulk operations

Input Schema

Name | Required | Description | Default
---- | -------- | ----------- | -------
server | Yes | MSSQL Server hostname or IP address | —
port | No | Port number | 1433
user | Yes | Username for authentication | —
password | Yes | Password for authentication | —
database | Yes | Database name | —
table | Yes | Target table name | —
data | Yes | Array of objects representing rows to insert | —
encrypt | No | Use encrypted connection | true
trustServerCertificate | No | Trust server certificate | true
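
A minimal invocation might pass arguments shaped like the following (a sketch: the server, database, table, and row values are illustrative placeholders, not values from this project):

```typescript
// Example arguments for an mssql_bulk_insert call (illustrative values).
const args = {
  server: 'sql.example.com',
  port: 1433,
  user: 'app_user',
  password: process.env.MSSQL_PASSWORD ?? '',
  database: 'SalesDb',
  table: 'Orders',
  // Each object is one row; keys must match column names in the target table.
  data: [
    { OrderId: 1001, CustomerName: 'Acme', Total: 250.0 },
    { OrderId: 1002, CustomerName: 'Globex', Total: 99.5 },
  ],
  encrypt: true,
  trustServerCertificate: true,
};

console.log(args.data.length); // 2
```

Note that every row in `data` should use the same set of keys: the handler derives the column list from the first row only.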

Implementation Reference

  • The main handler function that executes the mssql_bulk_insert tool logic. It validates input, checks table structure, validates data columns, performs batched INSERT operations using parameterized queries, and returns insertion statistics.
    private async handleBulkInsert(args: any) {
      const config = ConnectionSchema.parse(args);
      const { table, data } = args;
    
      if (!Array.isArray(data) || data.length === 0) {
        throw new MCPError('Data must be a non-empty array', 'INVALID_DATA');
      }
    
      try {
        const pool = await this.getConnection(config);
        
        // Get table structure to validate data (the table name is bound as a
        // parameter rather than interpolated into the query string)
        const structureRequest = pool.request();
        structureRequest.input('table', table);
        const structureResult = await structureRequest.query(`
          USE [${config.database}];
          SELECT 
            COLUMN_NAME,
            DATA_TYPE,
            IS_NULLABLE,
            COLUMN_DEFAULT
          FROM INFORMATION_SCHEMA.COLUMNS 
          WHERE TABLE_NAME = @table
          ORDER BY ORDINAL_POSITION
        `);
    
        if (structureResult.recordset.length === 0) {
          throw new MCPError(`Table '${table}' not found`, 'TABLE_NOT_FOUND');
        }
    
        const columns = structureResult.recordset;
        const columnNames = columns.map(col => col.COLUMN_NAME);
    
        // Validate data structure
        const firstRow = data[0];
        const dataColumns = Object.keys(firstRow);
        const invalidColumns = dataColumns.filter(col => !columnNames.includes(col));
        
        if (invalidColumns.length > 0) {
          throw new MCPError(
            `Invalid columns: ${invalidColumns.join(', ')}. Valid columns: ${columnNames.join(', ')}`,
            'INVALID_COLUMNS'
          );
        }
    
        // Build the column list once; values are bound as named parameters below
        const insertColumns = dataColumns.join(', ');

        // For large datasets, insert in batches to keep the per-query
        // parameter count manageable
        const batchSize = 100;
        let totalInserted = 0;

        for (let i = 0; i < data.length; i += batchSize) {
          const batch = data.slice(i, i + batchSize);
          const request = pool.request();

          // Bind each value as a named parameter and build matching placeholders
          const batchPlaceholders = batch
            .map((row, rowIndex) =>
              `(${dataColumns
                .map((col, colIndex) => {
                  const paramName = `param${rowIndex * dataColumns.length + colIndex}`;
                  request.input(paramName, row[col]);
                  return `@${paramName}`;
                })
                .join(', ')})`
            )
            .join(', ');

          const insertQuery = `
            USE [${config.database}];
            INSERT INTO [${table}] (${insertColumns})
            VALUES ${batchPlaceholders}
          `;

          const result = await request.query(insertQuery);
          totalInserted += result.rowsAffected[0] || 0;
        }
    
        return {
          content: [
            {
              type: 'text',
              text: JSON.stringify({
                server: config.server,
                database: config.database,
                table: table,
                totalRowsProcessed: data.length,
                totalRowsInserted: totalInserted,
                status: 'success'
              }, null, 2),
            },
          ],
        };
      } catch (error) {
        throw new MCPError(
          `Bulk insert failed: ${error instanceof Error ? error.message : String(error)}`,
          'BULK_INSERT_ERROR',
          { table, rowCount: data.length }
        );
      }
    }
  • The input schema defining parameters for the mssql_bulk_insert tool, including connection details, table name, data array, and optional connection flags.
    inputSchema: {
      type: 'object',
      properties: {
        server: { type: 'string', description: 'MSSQL Server hostname or IP address' },
        port: { type: 'number', description: 'Port number (default: 1433)', default: 1433 },
        user: { type: 'string', description: 'Username for authentication' },
        password: { type: 'string', description: 'Password for authentication' },
        database: { type: 'string', description: 'Database name' },
        table: { type: 'string', description: 'Target table name' },
        data: { type: 'array', description: 'Array of objects representing rows to insert' },
        encrypt: { type: 'boolean', description: 'Use encrypted connection (default: true)', default: true },
        trustServerCertificate: { type: 'boolean', description: 'Trust server certificate (default: true)', default: true },
      },
      required: ['server', 'user', 'password', 'database', 'table', 'data'],
    },
  • src/index.ts:408-426 (registration)
    The tool registration object that defines the name, description, and input schema for mssql_bulk_insert, added to the tools array.
    {
      name: 'mssql_bulk_insert',
      description: 'Insert multiple rows efficiently using bulk operations',
      inputSchema: {
        type: 'object',
        properties: {
          server: { type: 'string', description: 'MSSQL Server hostname or IP address' },
          port: { type: 'number', description: 'Port number (default: 1433)', default: 1433 },
          user: { type: 'string', description: 'Username for authentication' },
          password: { type: 'string', description: 'Password for authentication' },
          database: { type: 'string', description: 'Database name' },
          table: { type: 'string', description: 'Target table name' },
          data: { type: 'array', description: 'Array of objects representing rows to insert' },
          encrypt: { type: 'boolean', description: 'Use encrypted connection (default: true)', default: true },
          trustServerCertificate: { type: 'boolean', description: 'Trust server certificate (default: true)', default: true },
        },
        required: ['server', 'user', 'password', 'database', 'table', 'data'],
      },
    },
  • src/index.ts:453-454 (registration)
    The switch case in the tool request handler that routes calls to the mssql_bulk_insert tool to its handler function.
    case 'mssql_bulk_insert':
      return await this.handleBulkInsert(args);
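
The batching strategy used by the handler can be sketched in isolation: rows are split into fixed-size chunks, and each value receives a sequential named parameter within its batch. This is a simplified, self-contained sketch of the scheme, not the server's actual code (`buildBatches` is a hypothetical helper):

```typescript
// Split rows into batches and generate named-parameter placeholders,
// mirroring the batching approach in handleBulkInsert.
function buildBatches(
  rows: Record<string, unknown>[],
  columns: string[],
  batchSize: number
): { placeholders: string; params: Record<string, unknown> }[] {
  const batches: { placeholders: string; params: Record<string, unknown> }[] = [];
  for (let i = 0; i < rows.length; i += batchSize) {
    const batch = rows.slice(i, i + batchSize);
    const params: Record<string, unknown> = {};
    const placeholders = batch
      .map((row, r) =>
        `(${columns
          .map((col, c) => {
            // Parameter names restart at param0 for each batch
            const name = `param${r * columns.length + c}`;
            params[name] = row[col];
            return `@${name}`;
          })
          .join(', ')})`
      )
      .join(', ');
    batches.push({ placeholders, params });
  }
  return batches;
}

const batches = buildBatches(
  [{ id: 1, name: 'a' }, { id: 2, name: 'b' }, { id: 3, name: 'c' }],
  ['id', 'name'],
  2
);
console.log(batches.length);          // 2
console.log(batches[0].placeholders); // (@param0, @param1), (@param2, @param3)
```

Keeping parameter names per-batch matters because each batch gets a fresh `request`; a batch size of 100 also stays well under SQL Server's 1000-row limit for a single `VALUES` row constructor.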
Behavior: 2/5

Does the description disclose side effects, auth requirements, rate limits, or destructive behavior?

With no annotations provided, the description carries the full burden of disclosure but mentions only efficiency, omitting critical behavioral traits such as authentication requirements, potential data overwriting, transaction handling, error behavior, and rate limits. It also does not explain what 'bulk operations' entail beyond the basic insert action.

Agents need to know what a tool does to the world before calling it. Descriptions should go beyond structured annotations to explain consequences.

Conciseness: 5/5

Is the description appropriately sized, front-loaded, and free of redundancy?

The description is a single, efficient sentence that front-loads the core purpose ('Insert multiple rows efficiently') without unnecessary words. It earns its place by conveying key information concisely.

Shorter descriptions cost fewer tokens and are easier for agents to parse. Every sentence should earn its place.

Completeness: 2/5

Given the tool's complexity, does the description cover enough for an agent to succeed on first attempt?

Given the tool's complexity (mutation with 9 parameters, no annotations, no output schema), the description is insufficient. It lacks details on authentication, data format expectations, error handling, and output behavior, leaving significant gaps for safe and effective use by an AI agent.

Complex tools with many parameters or behaviors need more documentation. Simple tools need less. This dimension scales expectations accordingly.

Parameters: 3/5

Does the description clarify parameter syntax, constraints, interactions, or defaults beyond what the schema provides?

Schema description coverage is 100%, so the schema fully documents all 9 parameters. The description adds no additional meaning beyond implying 'multiple rows' relates to the 'data' parameter, but it doesn't clarify parameter interactions or usage nuances, meeting the baseline for high schema coverage.

Input schemas describe structure but not intent. Descriptions should explain non-obvious parameter relationships and valid value ranges.

Purpose: 4/5

Does the description clearly state what the tool does and how it differs from similar tools?

The description clearly states the action ('Insert multiple rows') and resource ('using bulk operations'), specifying it's for efficiency. It distinguishes from basic insert operations but doesn't explicitly differentiate from sibling tools like mssql_query, which might also handle inserts.

Agents choose between tools based on descriptions. A clear purpose with a specific verb and resource helps agents select the right tool.

Usage Guidelines: 2/5

Does the description explain when to use this tool, when not to, or what alternatives exist?

The description mentions 'efficiently' for bulk operations, implying usage for large datasets, but provides no explicit guidance on when to use this tool versus alternatives like mssql_query or other siblings. No exclusions or prerequisites are stated.

Agents often have multiple tools that could apply. Explicit usage guidance like "use X instead of Y when Z" prevents misuse.

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/michaelyuwh/mcp-mssql-connector'
