
Loop MCP Server

An MCP (Model Context Protocol) server that enables LLMs to process arrays item by item, or in configurable batches, according to a specific task.

Overview

This MCP server provides tools for:

  • Initializing an array with a task description

  • Fetching items one by one or in batches for processing

  • Storing results for each processed item or batch

  • Retrieving all results (only after all items are processed)

  • Optional result summarization

  • Configurable batch size for efficient processing

Installation

npm install

Usage

Running the Server

npm start

Available Tools

  1. initialize_array - Set up the array and task

    • array: The array of items to process

    • task: Description of what to do with each item

    • batchSize (optional): Number of items to process in each batch (default: 1)

  2. get_next_item - Get the next item to process

    • Returns: Current item, index, task, and remaining count (see the response sketch after this list)

  3. get_next_batch - Get the next batch of items based on batch size

    • Returns: Array of items, indices, task, and remaining count

  4. store_result - Store the result of processing

    • result: The processing result (single value or array for batch processing)

  5. get_all_results - Get all results after completion

    • summarize (optional): Include a summary

    • Note: This will error if processing is not complete

  6. reset - Clear the current processing state

Example Workflows

Single Item Processing

// 1. Initialize
await callTool('initialize_array', {
  array: [1, 2, 3, 4, 5],
  task: 'Square each number'
});

// 2. Process each item
while (true) {
  const item = await callTool('get_next_item');
  if (item.text === 'All items have been processed.') break;

  // Process the item (e.g., square it)
  const result = item.value * item.value;
  await callTool('store_result', { result });
}

// 3. Get final results
const results = await callTool('get_all_results', { summarize: true });

Batch Processing

// 1. Initialize with batch size
await callTool('initialize_array', {
  array: [1, 2, 3, 4, 5, 6, 7, 8, 9, 10],
  task: 'Double each number',
  batchSize: 3
});

// 2. Process in batches
while (true) {
  const batch = await callTool('get_next_batch');
  if (batch.text === 'All items have been processed.') break;

  // Process the batch
  const results = batch.items.map(item => item * 2);
  await callTool('store_result', { result: results });
}

// 3. Get final results
const results = await callTool('get_all_results', { summarize: true });

Running the Example

node example-client.js
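
If you prefer to write your own standalone client rather than use the bundled example, a minimal sketch along these lines should work, assuming the server communicates over stdio and the @modelcontextprotocol/sdk package is installed. The exact callTool shape can vary between SDK versions, so treat this as a starting point, not the definitive client.

// Save as e.g. my-client.mjs and run with: node my-client.mjs
import { Client } from '@modelcontextprotocol/sdk/client/index.js';
import { StdioClientTransport } from '@modelcontextprotocol/sdk/client/stdio.js';

// Spawn the server as a child process and talk to it over stdio.
const transport = new StdioClientTransport({
  command: 'node',
  args: ['server.js'],
});

const client = new Client(
  { name: 'loop-example-client', version: '1.0.0' },
  { capabilities: {} }
);

await client.connect(transport);

// Set up the array and task, then fetch the first item.
await client.callTool({
  name: 'initialize_array',
  arguments: { array: [1, 2, 3], task: 'Square each number' },
});
const first = await client.callTool({ name: 'get_next_item', arguments: {} });
console.log(first);

await client.close();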

Integration with Claude Desktop

Add to your Claude Desktop configuration:

{ "mcpServers": { "loop-processor": { "command": "node", "args": ["/path/to/loop_mcp/server.js"] } } }