
DriftOS MCP Server

Official, by DriftOS

driftos_build_prompt

Build ready-to-use LLM prompts with context and facts for API calls, enabling intelligent conversation routing without depending on the full conversation history.

Instructions

Build a ready-to-use prompt for LLM calls with context and facts.

Args:

  • branch_id (string): The branch ID to build prompt for

  • system_prompt (string, optional): Custom system prompt prefix

Returns:

  {
    "system": string,                                     // Full system prompt with topic and facts
    "messages": [{ "role": string, "content": string }]   // Conversation messages
  }

Use this to get a complete prompt ready for OpenAI/Anthropic/etc API calls.
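For illustration, a minimal sketch of how the tool's output could be wired into a chat-completions call, assuming an MCP client connected to this server; the branch ID, model name, and the answerFromBranch helper are hypothetical, not part of the server:

    import { Client } from '@modelcontextprotocol/sdk/client/index.js';
    import OpenAI from 'openai';

    // Hypothetical helper: fetch the assembled prompt for a branch and send it to an LLM.
    async function answerFromBranch(mcp: Client, openai: OpenAI, branchId: string) {
      // Ask the DriftOS server to build the prompt for this branch.
      const result = await mcp.callTool({
        name: 'driftos_build_prompt',
        arguments: { branch_id: branchId, system_prompt: 'You are a helpful assistant.' },
      });

      // The tool returns the built prompt as JSON text in the first content block.
      const blocks = result.content as { type: string; text: string }[];
      const prompt = JSON.parse(blocks[0].text);

      // Forward the system prompt and conversation messages to the model.
      const completion = await openai.chat.completions.create({
        model: 'gpt-4o-mini',
        messages: [{ role: 'system', content: prompt.system }, ...prompt.messages],
      });
      return completion.choices[0].message.content;
    }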

Input Schema

Name           Required  Description                     Default
branch_id      Yes       Branch ID to build prompt for   (none)
system_prompt  No        Custom system prompt prefix     (none)
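For example, a call with both fields populated might send arguments like the following (the branch ID is a made-up placeholder):

    {
      "branch_id": "branch_abc123",
      "system_prompt": "You are a concise support assistant."
    }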

Implementation Reference

  • Complete registration of the 'driftos_build_prompt' MCP tool, including schema, description, and inline handler function that delegates to driftClient.buildPrompt.
    server.registerTool(
      'driftos_build_prompt',
      {
        title: 'Build LLM Prompt',
        description: `Build a ready-to-use prompt for LLM calls with context and facts.

    Args:
    - branch_id (string): The branch ID to build prompt for
    - system_prompt (string, optional): Custom system prompt prefix

    Returns: {
      "system": string,   // Full system prompt with topic and facts
      "messages": [{ "role": string, "content": string }]  // Conversation messages
    }

    Use this to get a complete prompt ready for OpenAI/Anthropic/etc API calls.`,
        inputSchema: z.object({
          branch_id: z.string().min(1).describe('Branch ID to build prompt for'),
          system_prompt: z.string().optional().describe('Custom system prompt prefix'),
        }).strict(),
        annotations: {
          readOnlyHint: true,
          destructiveHint: false,
          idempotentHint: true,
          openWorldHint: false,
        },
      },
      async (params) => {
        try {
          const result = await driftClient.buildPrompt(
            params.branch_id,
            params.system_prompt
          );
          return {
            content: [
              {
                type: 'text' as const,
                text: JSON.stringify(result, null, 2),
              },
            ],
          };
        } catch (error) {
          const message = error instanceof Error ? error.message : 'Unknown error';
          return {
            content: [
              {
                type: 'text' as const,
                text: `Error building prompt: ${message}`,
              },
            ],
            isError: true,
          };
        }
      }
    );
  • Exports the driftClient instance, created from @driftos/client, which provides the buildPrompt method called by the tool handler.
    export const driftClient = createDriftClient(DRIFTOS_API_URL);
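    As a rough sketch, the client surface the handler relies on could be typed along these lines, assuming the @driftos/client return value mirrors the documented shape (the interface names below are illustrative, not taken from the package):

    // Illustrative types only: names are assumptions based on the documented return value.
    interface BuiltPrompt {
      system: string;                                  // full system prompt with topic and facts
      messages: { role: string; content: string }[];   // conversation messages
    }

    interface DriftClient {
      buildPrompt(branchId: string, systemPrompt?: string): Promise<BuiltPrompt>;
    }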

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/DriftOS/driftos-mcp-server'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.