fetch_multiple_feeds
Batch fetch multiple RSS feeds, either in parallel or sequentially, with a success/error status reported for each URL, enabling efficient monitoring and management of feed data.
Instructions
Batch fetch multiple RSS feeds with success/error status for each
Input Schema
| Name | Required | Description | Default |
|---|---|---|---|
| parallel | No | If 'true', fetch feeds in parallel; otherwise, fetch sequentially | true |
| urls | Yes | Array of RSS feed URLs to fetch | |
Input Schema (JSON Schema)
{
  "additionalProperties": false,
  "properties": {
    "parallel": {
      "default": "true",
      "description": "If 'true', fetch feeds in parallel; otherwise, fetch sequentially",
      "type": "string"
    },
    "urls": {
      "description": "Array of RSS feed URLs to fetch",
      "items": {
        "type": "string"
      },
      "type": "array"
    }
  },
  "required": [
    "urls"
  ],
  "type": "object"
}
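For orientation, here is a minimal arguments object that satisfies this schema. The feed URLs are placeholders; note that `parallel` is a string ("true"/"false"), not a boolean.

```typescript
// Hypothetical arguments for a fetch_multiple_feeds call.
// The URLs are placeholders; `parallel` must be the string "true" or "false",
// not a boolean, per the schema above.
const args = {
  urls: [
    "https://example.com/feed.xml",
    "https://example.org/rss",
  ],
  parallel: "false", // fetch one feed at a time
};
```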
Implementation Reference
- src/index.ts:93-166 (handler): the execute handler for `fetch_multiple_feeds`. It fetches each URL (consulting `feedCache` first), supports parallel mode with a concurrency limit as well as sequential mode, and returns aggregated results with a per-feed status. A standalone sketch of the concurrency-limiting pattern appears after this list.

      execute: async (args, context) => {
        logger.info(
          `Fetching ${args.urls.length} feeds (parallel: ${args.parallel})`
        );

        const fetchFeed = async (url: string): Promise<MultiFeedResult> => {
          try {
            // Check cache first
            const cached = feedCache.get(url);
            if (cached) {
              return { url, success: true, data: cached };
            }

            const result = await rssReader.fetchFeed(url);
            feedCache.set(url, result);

            return { url, success: true, data: result };
          } catch (error: any) {
            logger.error(`Failed to fetch ${url}: ${error.message}`);
            return {
              url,
              success: false,
              error: {
                url,
                error: error.message,
                code: error.code,
                timestamp: Date.now(),
              },
            };
          }
        };

        let results: MultiFeedResult[];

        if (args.parallel === 'true') {
          // Parallel fetching with concurrency limit
          const chunks: string[][] = [];
          for (
            let i = 0;
            i < args.urls.length;
            i += config.rssMaxConcurrentFetches
          ) {
            chunks.push(args.urls.slice(i, i + config.rssMaxConcurrentFetches));
          }

          results = [];
          for (const chunk of chunks) {
            const chunkResults = await Promise.all(chunk.map(fetchFeed));
            results.push(...chunkResults);
          }
        } else {
          // Sequential fetching
          results = [];
          for (const url of args.urls) {
            results.push(await fetchFeed(url));
          }
        }

        const successCount = results.filter((r) => r.success).length;
        logger.info(
          `Fetched ${successCount}/${args.urls.length} feeds successfully`
        );

        return JSON.stringify(
          {
            total: args.urls.length,
            successful: successCount,
            failed: args.urls.length - successCount,
            results,
          },
          null,
          2
        );
      },
- src/index.ts:77-86 (schema): Zod schema defining the input parameters for the `fetch_multiple_feeds` tool: the `urls` array and the optional `parallel` flag. A short validation example follows the list.

      const FetchMultipleFeedsSchema = z.object({
        urls: z.array(z.string()).describe("Array of RSS feed URLs to fetch"),
        parallel: z
          .string()
          .optional()
          .default("true")
          .describe(
            "If 'true', fetch feeds in parallel; otherwise, fetch sequentially"
          ),
      });
- src/index.ts:88-167 (registration): registration of the `fetch_multiple_feeds` tool with the FastMCP server via `server.addTool`, wiring together the name, description, parameters schema, and execute handler. The execute handler is the same code excerpted above (src/index.ts:93-166), so its body is elided here rather than repeated.

      server.addTool({
        name: "fetch_multiple_feeds",
        description:
          "Batch fetch multiple RSS feeds with success/error status for each",
        parameters: FetchMultipleFeedsSchema,
        execute: async (args, context) => {
          // ... handler body as excerpted above (src/index.ts:93-166) ...
        },
      });
- src/types.ts:85-88 (schema): TypeScript interface defining the parameters for `fetch_multiple_feeds`, matching the Zod schema.

      export interface FetchMultipleFeedsParams {
        urls: string[];
        parallel?: 'true' | 'false';
      }
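The parallel branch of the handler bounds concurrency by slicing the URL list into chunks of `config.rssMaxConcurrentFetches` and awaiting each chunk with `Promise.all`. The sketch below isolates that pattern in a generic helper; `mapInChunks`, `chunkSize`, and `worker` are illustrative names, not identifiers from the source.

```typescript
// Minimal sketch of the chunked-concurrency pattern used by the handler.
// `chunkSize` stands in for config.rssMaxConcurrentFetches and `worker`
// stands in for the per-URL fetchFeed helper; both names are hypothetical.
async function mapInChunks<T, R>(
  items: T[],
  chunkSize: number,
  worker: (item: T) => Promise<R>
): Promise<R[]> {
  const results: R[] = [];
  for (let i = 0; i < items.length; i += chunkSize) {
    const chunk = items.slice(i, i + chunkSize);
    // Each chunk runs concurrently; chunks run one after another, so at most
    // `chunkSize` requests are in flight at any time.
    const chunkResults = await Promise.all(chunk.map(worker));
    results.push(...chunkResults);
  }
  return results;
}
```

A consequence of chunking is that one slow feed delays the start of the next chunk; a sliding-window pool would keep the concurrency level topped up, but chunking is simpler and matches the excerpted handler.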
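The Zod schema can also be exercised on its own. Below is a minimal validation sketch assuming the schema shown above; the inputs are hypothetical and `safeParse` is standard Zod.

```typescript
import { z } from "zod";

// Assumed to match the schema excerpted above (src/index.ts:77-86).
const FetchMultipleFeedsSchema = z.object({
  urls: z.array(z.string()).describe("Array of RSS feed URLs to fetch"),
  parallel: z.string().optional().default("true"),
});

// Valid input: `parallel` is omitted, so the default "true" is applied.
const ok = FetchMultipleFeedsSchema.safeParse({
  urls: ["https://example.com/feed.xml"],
});
console.log(ok.success, ok.success && ok.data.parallel); // true "true"

// Invalid input: `urls` is required, so parsing fails.
const bad = FetchMultipleFeedsSchema.safeParse({ parallel: "true" });
console.log(bad.success); // false
```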
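Finally, the handler returns values of a `MultiFeedResult` type that is not excerpted on this page. Judging only from how the handler constructs those objects, its shape is roughly the following; this is an inference, not the definition from src/types.ts.

```typescript
// Inferred from the handler excerpt above; the actual definition lives in the
// project's types module and may differ in detail (field types are guesses).
interface MultiFeedResult {
  url: string;
  success: boolean;
  data?: unknown; // parsed feed on success (the rssReader result)
  error?: {
    url: string;
    error: string;
    code?: string;
    timestamp: number;
  };
}
```

The handler then serializes an aggregate of these results with `JSON.stringify`, so callers receive a JSON string of the form `{ total, successful, failed, results }`.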