driftos_build_prompt
Build ready-to-use LLM prompts with context and facts for API calls, enabling intelligent conversation routing without depending on the full conversation history.
Instructions
Build a ready-to-use prompt for LLM calls with context and facts.
Args:
- `branch_id` (string): The branch ID to build the prompt for
- `system_prompt` (string, optional): Custom system prompt prefix

Returns an object with:
- `system` (string): Full system prompt with topic and facts
- `messages` (array of `{ "role": string, "content": string }`): Conversation messages
Use this to get a complete prompt ready to send to OpenAI, Anthropic, or other LLM APIs.
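For example, the returned `system` and `messages` fields can be forwarded directly to a chat completion endpoint. The sketch below uses the OpenAI Node SDK; the model name and client setup are placeholders, and only the result shape documented above is assumed:

```typescript
// Illustrative only: assumes the { system, messages } result shape documented
// above; the model name and client configuration are placeholders.
import OpenAI from 'openai';

const openai = new OpenAI(); // reads OPENAI_API_KEY from the environment

async function chatFromBranchPrompt(result: {
  system: string;
  messages: { role: string; content: string }[];
}) {
  const completion = await openai.chat.completions.create({
    model: 'gpt-4o-mini',
    messages: [
      { role: 'system', content: result.system },
      // Conversation messages already come back as role/content pairs.
      ...result.messages.map((m) => ({
        role: m.role as 'user' | 'assistant',
        content: m.content,
      })),
    ],
  });
  return completion.choices[0].message.content;
}
```

An Anthropic Messages API call is analogous: pass `system` as the top-level system field and forward `messages` unchanged.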
Input Schema
| Name | Required | Description | Default |
|---|---|---|---|
| branch_id | Yes | Branch ID to build prompt for | |
| system_prompt | No | Custom system prompt prefix | |
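A typical invocation from an MCP client might look like the following sketch. The connected `Client` instance, the branch ID, and the system prompt are illustrative placeholders:

```typescript
// Illustrative only: assumes an already-connected Client from
// @modelcontextprotocol/sdk; the branch ID and system prompt are placeholders.
import { Client } from '@modelcontextprotocol/sdk/client/index.js';

async function buildPromptForBranch(client: Client) {
  const result = await client.callTool({
    name: 'driftos_build_prompt',
    arguments: {
      branch_id: 'branch_123',                       // required
      system_prompt: 'You are a helpful assistant.', // optional prefix
    },
  });

  // The tool returns the prompt payload as a single JSON-encoded text block.
  const blocks = result.content as unknown as Array<{ type: string; text?: string }>;
  const textBlock = blocks.find((b) => b.type === 'text');
  return textBlock?.text ? JSON.parse(textBlock.text) : null;
}
```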
Implementation Reference
- src/tools/context.ts:71-127 (registration) — Complete registration of the `driftos_build_prompt` MCP tool, including schema, description, and the inline handler that delegates to `driftClient.buildPrompt`:

  ```typescript
  server.registerTool(
    'driftos_build_prompt',
    {
      title: 'Build LLM Prompt',
      description: `Build a ready-to-use prompt for LLM calls with context and facts.

  Args:
  - branch_id (string): The branch ID to build prompt for
  - system_prompt (string, optional): Custom system prompt prefix

  Returns: {
    "system": string, // Full system prompt with topic and facts
    "messages": [{ "role": string, "content": string }] // Conversation messages
  }

  Use this to get a complete prompt ready for OpenAI/Anthropic/etc API calls.`,
      inputSchema: z.object({
        branch_id: z.string().min(1).describe('Branch ID to build prompt for'),
        system_prompt: z.string().optional().describe('Custom system prompt prefix'),
      }).strict(),
      annotations: {
        readOnlyHint: true,
        destructiveHint: false,
        idempotentHint: true,
        openWorldHint: false,
      },
    },
    async (params) => {
      try {
        const result = await driftClient.buildPrompt(
          params.branch_id,
          params.system_prompt
        );
        return {
          content: [
            {
              type: 'text' as const,
              text: JSON.stringify(result, null, 2),
            },
          ],
        };
      } catch (error) {
        const message = error instanceof Error ? error.message : 'Unknown error';
        return {
          content: [
            {
              type: 'text' as const,
              text: `Error building prompt: ${message}`,
            },
          ],
          isError: true,
        };
      }
    }
  );
  ```
- src/tools/context.ts:99-126 (handler) — Inline handler that invokes `driftClient.buildPrompt` and returns the JSON-formatted result as MCP text content, with error handling; see the `async (params) => { ... }` callback in the registration listing above.
- src/tools/context.ts:73-98 (schema) — Tool metadata and Zod input schema: required `branch_id` (string), optional `system_prompt` (string), plus title, description, and annotations marking the tool as read-only and idempotent; see the config object in the registration listing above.
- src/services/drift-client.ts:4 (helper) — Exports the `driftClient` instance, created from `@driftos/client`, which provides the `buildPrompt` method called by the tool handler: `export const driftClient = createDriftClient(DRIFTOS_API_URL);`
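For orientation, the handler only relies on `buildPrompt(branchId, systemPrompt?)` resolving to the `{ system, messages }` shape documented above. The actual `@driftos/client` implementation is not shown in this reference; the following is a hypothetical sketch of that contract, with the HTTP route, request body, and error handling all assumed:

```typescript
// Hypothetical sketch only: the real @driftos/client may differ.
// The /branches/:id/prompt route and request body are assumptions.
interface BuiltPrompt {
  system: string;
  messages: { role: string; content: string }[];
}

export function createDriftClient(baseUrl: string) {
  return {
    // Assumed contract used by the tool handler above.
    async buildPrompt(branchId: string, systemPrompt?: string): Promise<BuiltPrompt> {
      const res = await fetch(
        `${baseUrl}/branches/${encodeURIComponent(branchId)}/prompt`,
        {
          method: 'POST',
          headers: { 'Content-Type': 'application/json' },
          body: JSON.stringify({ system_prompt: systemPrompt }),
        }
      );
      if (!res.ok) {
        throw new Error(`DriftOS API error: ${res.status} ${res.statusText}`);
      }
      return (await res.json()) as BuiltPrompt;
    },
  };
}
```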