architect

Generate architectural design feedback from natural-language input and maintain conversation context with an optional conversation ID, via POST requests to the LLM Architect tool on the MCP Server Template platform.

Instructions

MCP server for the LLM Architect tool. Exposes resource "/llm-architect/chat" accepting POST requests with a prompt and optional conversationId, and interacts with the llm chat CLI to provide architectural design feedback while maintaining conversation context.
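
For illustration, a request to that resource might look like the following; the base URL is an assumption, not part of this listing:

    // Illustrative only: the base URL is assumed, and the body field names
    // simply mirror the description above ("prompt" and optional "conversationId").
    const response = await fetch('http://localhost:3000/llm-architect/chat', {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({
        prompt: 'Should this monolith be split into separate services?',
        conversationId: 'abc123' // optional; omit to start a new conversation
      })
    });
    const feedback = await response.json();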

Input Schema

Name           | Required | Description                           | Default
conversationId | No       | Optional conversation ID for context  | -
input          | Yes      | Input prompt to process               | -
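
For illustration, arguments for a call to this tool could be shaped as follows (the conversation ID value is made up):

    // Hypothetical tool-call arguments matching the schema above.
    const exampleArguments = {
      input: 'Review this design: an API gateway in front of three services that share one database.',
      conversationId: 'abc123' // optional; omit it to start a new conversation
    };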

Implementation Reference

  • The ARCHITECT_HANDLERS object defining the handler function for the 'architect' tool. This is the primary execution logic: it validates input, checks dependencies, processes the request via handleArchitectProcess, and returns the result (a sketch of the commandExists helper it relies on appears after this list).
    export const ARCHITECT_HANDLERS: ToolHandlers = {
      'architect': async (request): Promise<ToolResult> => {
        try {
          const { input, conversationId } = request.params.arguments as { input: string, conversationId?: string };

          // Validate input using zod
          architectInputSchema.parse({ input, conversationId });

          // Ensure the 'llm' command exists
          if (!(await commandExists('llm'))) {
            throw new Error('LLM command not found. Please ensure it is installed and in your PATH.');
          }

          const result = await handleArchitectProcess(input, conversationId);

          return {
            toolResult: {
              content: [{
                type: 'text',
                text: JSON.stringify({
                  conversationId: result.conversationId,
                  response: result.response
                })
              }],
            }
          };
        } catch (error) {
          const errorMessage = error instanceof Error ? error.message : String(error);
          throw new Error(`Failed to process input: ${errorMessage}`);
        }
      }
    };
  • Zod schema for validating the input parameters of the 'architect' tool (input string and optional conversationId). Used within the handler.
    const architectInputSchema = z.object({
      input: z.string().min(1, 'Input must not be empty'),
      conversationId: z.string().optional()
    });
  • Definition of the 'architect' tool including its name, description, and input schema, exported as part of the ARCHITECT_TOOLS array.
    const ARCHITECT_TOOL: Tool = {
      name: 'architect',
      description: 'MCP server for the LLM Architect tool. Exposes resource "/llm-architect/chat" accepting POST requests with a prompt and optional conversationId, and interacts with the llm chat CLI to provide architectural design feedback while maintaining conversation context.',
      inputSchema: {
        type: 'object',
        properties: {
          input: { type: 'string', description: 'Input prompt to process', minLength: 1 },
          conversationId: { type: 'string', description: 'Optional conversation ID for context', nullable: true }
        },
        required: ['input']
      }
    };

    // Export tools
    export const ARCHITECT_TOOLS = [ARCHITECT_TOOL];
  • src/index.ts:21-22 (registration): incorporates the architect tools and handlers into the global ALL_TOOLS and ALL_HANDLERS used by the MCP server for tool listing and execution (see the wiring sketch after this list).
    const ALL_TOOLS = [...ARCHITECT_TOOLS]
    const ALL_HANDLERS = { ...ARCHITECT_HANDLERS }
  • Helper function that performs the core processing: it cleans the input and delegates to the conversation-handling logic (a sketch of handleConversation appears after this list).
    async function handleArchitectProcess(input: string, conversationId?: string): Promise<{ conversationId: string, response: string }> {
      const cleanedInput = input.replace(/\n/g, ' ').trim();
      return await handleConversation(cleanedInput, conversationId);
    }
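
The handler above checks commandExists('llm') before doing any work, but that helper is not shown in this reference. A minimal sketch of what it might look like, assuming Node's built-in child_process module and a Unix-like shell (the use of `command -v` is an assumption, not code from the project):

    import { exec } from 'child_process';
    import { promisify } from 'util';

    const execAsync = promisify(exec);

    // Hypothetical helper: resolves to true when the given command is on PATH.
    // Relies on the POSIX `command -v` builtin, so it assumes a Unix-like shell.
    async function commandExists(command: string): Promise<boolean> {
      try {
        await execAsync(`command -v ${command}`);
        return true;
      } catch {
        return false;
      }
    }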
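
handleConversation, which handleArchitectProcess delegates to, is also not included in this reference. The sketch below shows one way it could drive the llm CLI; the --cid flag for continuing a conversation and the UUID fallback for new conversations are assumptions about the real helper, not code taken from the project:

    import { execFile } from 'child_process';
    import { promisify } from 'util';
    import { randomUUID } from 'crypto';

    const execFileAsync = promisify(execFile);

    // Hypothetical sketch: send the prompt to the llm CLI, passing --cid when a
    // conversation ID is supplied. Generating a UUID for new conversations is an
    // assumption; the real CLI tracks its own conversation IDs.
    async function handleConversation(
      input: string,
      conversationId?: string
    ): Promise<{ conversationId: string, response: string }> {
      const args = conversationId ? ['--cid', conversationId, input] : [input];
      const { stdout } = await execFileAsync('llm', args);
      return { conversationId: conversationId ?? randomUUID(), response: stdout.trim() };
    }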
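
The registration entry above only builds ALL_TOOLS and ALL_HANDLERS; the rest of the server wiring is outside this reference. A sketch of how they are typically connected with the MCP TypeScript SDK, with the server name and version as placeholders:

    import { Server } from '@modelcontextprotocol/sdk/server/index.js';
    import { ListToolsRequestSchema, CallToolRequestSchema } from '@modelcontextprotocol/sdk/types.js';

    // Hypothetical wiring: advertise the registered tools and dispatch each
    // tool call to its matching handler.
    const server = new Server(
      { name: 'architect-mcp-server', version: '0.1.0' },
      { capabilities: { tools: {} } }
    );

    server.setRequestHandler(ListToolsRequestSchema, async () => ({ tools: ALL_TOOLS }));

    server.setRequestHandler(CallToolRequestSchema, async (request) => {
      const handler = ALL_HANDLERS[request.params.name];
      if (!handler) {
        throw new Error(`Unknown tool: ${request.params.name}`);
      }
      return handler(request);
    });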

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/stevennevins/architect-mcp-server'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.