read_omi_conversations

Retrieve a user's conversations from Omi, with pagination and optional filtering by status or inclusion of discarded conversations.

Instructions

Retrieves user conversations from Omi with pagination and filtering options

Input Schema

Name               Required  Description                                                  Default
include_discarded  No        Whether to include discarded conversations                   false
limit              No        Maximum number of conversations to return (max: 1000)        100
offset             No        Number of conversations to skip for pagination               0
statuses           No        Comma-separated list of statuses to filter conversations by  -
user_id            Yes       The user ID to fetch conversations for                       -
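
For example, a client might pass arguments like the following; the user ID and status values are purely illustrative and not taken from the Omi API documentation:

    // Illustrative arguments for read_omi_conversations; all values below are made up.
    const exampleArgs = {
      user_id: 'user_123',              // required
      limit: 50,                        // optional; max 1000, defaults to 100
      offset: 0,                        // optional; defaults to 0
      include_discarded: false,         // optional; defaults to false
      statuses: 'completed,processing', // optional; these status names are assumptions
    };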

Implementation Reference

  • src/index.ts:74-140 (registration)
    Complete registration of the 'read_omi_conversations' MCP tool using server.tool(). Includes tool name, description, Zod input schema for parameters (user_id, limit, offset, include_discarded, statuses), and the inline async handler that builds the API request to Omi, fetches conversations, and returns them as JSON text content.
    server.tool(
      'read_omi_conversations',
      'Retrieves user conversations from Omi with pagination and filtering options',
      {
        user_id: z.string().describe('The user ID to fetch conversations for'),
        limit: z.number().optional().describe('Maximum number of conversations to return (max: 1000, default: 100)'),
        offset: z.number().optional().describe('Number of conversations to skip for pagination (default: 0)'),
        include_discarded: z.boolean().optional().describe('Whether to include discarded conversations (default: false)'),
        statuses: z.string().optional().describe('Comma-separated list of statuses to filter conversations by'),
      },
      async ({ user_id, limit, offset, include_discarded, statuses }) => {
        try {
          log(`Using appId: ${APP_ID}`);
          log(`User ID: ${user_id}`);

          // Construct URL with query parameters
          const url = new URL(`https://api.omi.me/v2/integrations/${APP_ID}/conversations`);
          const params = new URLSearchParams();
          params.append('uid', user_id);
          if (typeof limit === 'number') {
            params.append('limit', String(limit));
          }
          if (typeof offset === 'number') {
            params.append('offset', String(offset));
          }
          if (typeof include_discarded === 'boolean') {
            params.append('include_discarded', String(include_discarded));
          }
          if (typeof statuses === 'string' && statuses.length > 0) {
            params.append('statuses', statuses);
          }
          url.search = params.toString();

          const fetchUrl = url.toString();
          log(`Fetching from URL: ${fetchUrl}`);

          const response = await fetch(fetchUrl, {
            method: 'GET',
            headers: {
              Authorization: `Bearer ${API_KEY}`,
              'Content-Type': 'application/json',
            },
          });

          log(`Response status: ${response.status}`);

          if (!response.ok) {
            const errorText = await response.text();
            throw new Error(`Failed to fetch conversations: ${response.status} ${response.statusText} - ${errorText}`);
          }

          const data = (await response.json()) as ConversationsResponse;
          log('Data received');

          const conversations = data.conversations || [];

          return {
            content: [{ type: 'text', text: JSON.stringify({ conversations }) }],
          };
        } catch (error) {
          log(`Error fetching conversations: ${error}`);
          throw new Error(`Failed to read conversations: ${error instanceof Error ? error.message : String(error)}`);
        }
      }
    );
  • TypeScript interface for the ConversationsResponse returned by the Omi API, used to type the fetched data in the tool handler.
    export interface ConversationsResponse { conversations: Conversation[]; }
  • The core handler logic, shown inline in the registration above, constructs query parameters for pagination and filtering, makes an authenticated GET request to the Omi API endpoint https://api.omi.me/v2/integrations/{APP_ID}/conversations, parses the response as ConversationsResponse, extracts the conversations array, and returns it as an MCP text content block containing JSON. Client-side sketches of invoking the tool and parsing that output follow below.
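
To put the registration in context, here is a rough client-side sketch of invoking the tool with the MCP TypeScript SDK. The transport command, entry point path, and user ID are assumptions, and exact SDK option shapes may differ by version:

    import { Client } from '@modelcontextprotocol/sdk/client/index.js';
    import { StdioClientTransport } from '@modelcontextprotocol/sdk/client/stdio.js';

    // Spawn the server over stdio; the entry point path is a placeholder.
    const transport = new StdioClientTransport({ command: 'node', args: ['dist/index.js'] });
    const client = new Client({ name: 'example-client', version: '1.0.0' }, { capabilities: {} });
    await client.connect(transport);

    // Call the tool with a hypothetical user ID and a smaller page size.
    const result = await client.callTool({
      name: 'read_omi_conversations',
      arguments: { user_id: 'user_123', limit: 25 },
    });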
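
Because the handler serializes its result as JSON.stringify({ conversations }) inside a single text content block, the conversations array can be recovered from a tool result as sketched below; the content item shape is written out inline rather than imported from the SDK:

    // Extract the conversations array from a read_omi_conversations tool result.
    // Mirrors the { conversations: [...] } shape the handler above returns.
    function extractConversations(result: {
      content: Array<{ type: string; text?: string }>;
    }): unknown[] {
      const text = result.content.find((c) => c.type === 'text')?.text ?? '{"conversations":[]}';
      return (JSON.parse(text) as { conversations: unknown[] }).conversations;
    }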

MCP directory API

We provide all the information about MCP servers via our MCP directory API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/fourcolors/omi-mcp'
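
The same endpoint can be queried from code; a minimal TypeScript sketch (the response shape is whatever the directory API returns and is not typed here):

    // Query the Glama MCP directory API for this server's metadata.
    const res = await fetch('https://glama.ai/api/mcp/v1/servers/fourcolors/omi-mcp');
    if (!res.ok) {
      throw new Error(`Directory API request failed: ${res.status} ${res.statusText}`);
    }
    const serverInfo = await res.json();
    console.log(serverInfo);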

If you have feedback or need assistance with the MCP directory API, please join our Discord server.