ultra-continuation

Continue conversations using context from a previous session, maintaining continuity across interactions and enabling context revival with AI providers such as OpenAI and Gemini.

Instructions

Continue a conversation with context from a previous session, enabling context revival across interactions

Input Schema

| Name | Required | Description | Default |
| --- | --- | --- | --- |
| sessionId | Yes | Session ID to continue from | — |
| prompt | Yes | New prompt or question to continue the conversation | — |
| provider | No | AI provider to use | best available |
| model | No | Specific model to use | provider default |
| includeFiles | No | Whether to include file context from the session | true |
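
As an illustration of the schema above, a typical tool call might pass arguments like these (the session ID and prompt values are made up for the example):

```typescript
// Example arguments for an ultra-continuation call.
// The sessionId here is illustrative, not a real session.
const args = {
  sessionId: "session-1234",
  prompt: "Summarize the decisions we made so far.",
  provider: "openai", // optional; omit to use the best available provider
  includeFiles: true, // optional; defaults to true
};
```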

Implementation Reference

  • Main handler function that executes the ultra-continuation tool: it retrieves the session context, builds the conversation history, generates the AI continuation, saves the new messages, and returns a formatted response.

```typescript
export async function handleContinuation(args: any, providerManager: ProviderManager) {
  const { sessionId, prompt, provider, model, includeFiles = true } = args;

  // Get conversation context
  const context = await conversationMemory.getConversationContext(sessionId, 8000, includeFiles);
  if (context.messages.length === 0) {
    throw new Error(`No conversation found for session ${sessionId}`);
  }

  // Build conversation history for the model (excluding 'tool' role)
  const messages = context.messages
    .filter(msg => msg.role !== 'tool')
    .map(msg => ({ role: msg.role as 'user' | 'assistant' | 'system', content: msg.content }));

  // Add file context if available
  let fileContext = '';
  if (context.files.length > 0) {
    fileContext = '\n\nRelevant files from conversation:\n' +
      context.files.map(f => `**${f.filePath}**:\n${f.fileContent}`).join('\n\n');
  }

  // Add the new prompt
  messages.push({ role: 'user', content: prompt + fileContext });

  // Build a single prompt from the conversation history
  const conversationHistory = messages.map(msg => `${msg.role}: ${msg.content}`).join('\n\n');

  // Use provider manager to get response
  const aiProvider = await providerManager.getProvider(provider);
  const result = await aiProvider.generateText({
    prompt: conversationHistory,
    model: model || aiProvider.getDefaultModel(),
    temperature: 0.7,
    useSearchGrounding: false
  });

  // Save new messages to conversation
  await conversationMemory.addMessage(sessionId, 'user', prompt, 'continuation');
  await conversationMemory.addMessage(
    sessionId,
    'assistant',
    result.text,
    'continuation',
    undefined,
    { provider, model: result.model, continuedFromSession: true }
  );

  return {
    content: [
      {
        type: 'text',
        text: `## Continued Conversation\n\nSession: ${sessionId}\nContext: ${context.messages.length} messages, ${context.files.length} files\n\n${result.text}`
      }
    ]
  };
}
```
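The history-flattening step in the handler can be sketched in isolation. The `ChatMessage` type below is a simplified assumption, not the project's actual message type:

```typescript
// Simplified message shape, assumed for illustration.
type ChatMessage = { role: "user" | "assistant" | "system" | "tool"; content: string };

// Drop 'tool' messages and flatten the rest into a single "role: content"
// prompt, mirroring the handler's conversationHistory construction.
function buildConversationHistory(messages: ChatMessage[]): string {
  return messages
    .filter((msg) => msg.role !== "tool")
    .map((msg) => `${msg.role}: ${msg.content}`)
    .join("\n\n");
}

const history = buildConversationHistory([
  { role: "user", content: "What is MCP?" },
  { role: "tool", content: "(internal tool output)" },
  { role: "assistant", content: "MCP is the Model Context Protocol." },
]);
// history === "user: What is MCP?\n\nassistant: MCP is the Model Context Protocol."
```

This flattening trades structured chat turns for a single text prompt, which is what lets the same history work across providers with different chat APIs.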
  • Zod schema defining the input parameters for the ultra-continuation tool.

```typescript
const ZenContinuationSchema = z.object({
  sessionId: z.string().describe("Session ID to continue from"),
  prompt: z.string().describe("New prompt or question to continue the conversation"),
  provider: z.enum(["openai", "gemini", "azure", "grok", "openai-compatible"])
    .optional()
    .describe("AI provider to use (optional, defaults to best available)"),
  model: z.string().optional().describe("Specific model to use (optional)"),
  includeFiles: z.boolean().optional()
    .describe("Whether to include file context from the session (default: true)"),
});
```
  • src/server.ts:977-984 (registration) — MCP server registration of the ultra-continuation tool, including metadata and handler reference.

```typescript
server.registerTool("ultra-continuation", {
  title: "Zen Continuation",
  description: "Continue a conversation with context from a previous session, enabling context revival across interactions",
  inputSchema: ZenContinuationSchema.shape,
}, async (args) => {
  const provider = await getProviderManager();
  return await handleContinuation(args, provider) as any;
});
```
  • MCP server registration of the prompt version of ultra-continuation.

```typescript
server.registerPrompt("ultra-continuation", {
  title: "Zen Continuation",
  description: "Continue conversations with session context revival",
  argsSchema: {
    sessionId: z.string(),
    prompt: z.string(),
    provider: z.string().optional(),
    model: z.string().optional(),
    includeFiles: z.string().optional(),
  },
}, (args) => ({
  messages: [{
    role: "user",
    content: {
      type: "text",
      text: `Continue conversation from session ${args.sessionId}: ${args.prompt}${args.includeFiles === 'false' ? ' (without files)' : ''}${args.provider ? ` (using ${args.provider} provider)` : ''}`
    }
  }]
}));
```
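Note that the prompt registration takes all arguments as strings (prompt argument schemas are string-based), so `includeFiles` is compared against the literal `'false'`. The text construction can be sketched as a standalone helper; `buildPromptText` is a hypothetical name introduced here for illustration:

```typescript
// Sketch of the prompt-text construction used by the prompt registration.
// Argument names mirror the argsSchema above; note includeFiles is a string.
function buildPromptText(args: {
  sessionId: string;
  prompt: string;
  provider?: string;
  includeFiles?: string;
}): string {
  return (
    `Continue conversation from session ${args.sessionId}: ${args.prompt}` +
    (args.includeFiles === "false" ? " (without files)" : "") +
    (args.provider ? ` (using ${args.provider} provider)` : "")
  );
}

const text = buildPromptText({ sessionId: "s1", prompt: "go on", provider: "gemini" });
// text === "Continue conversation from session s1: go on (using gemini provider)"
```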
  • Import statement bringing in the handleContinuation function and other ultra tools from the handlers module.

```typescript
import { ultraTools, handleChallenge, handleContinuation, handleSession, handleBudget } from './handlers/ultra-tools.js';
```

MCP directory API

We provide all the information about MCP servers via our MCP API.

```shell
curl -X GET 'https://glama.ai/api/mcp/v1/servers/RealMikeChong/ultra-mcp'
```

If you have feedback or need assistance with the MCP directory API, please join our Discord server.