
Enhanced Architecture MCP

query_local_ai

Query a local AI model served by Ollama for reasoning assistance: send a prompt, optionally tune the model and temperature, and receive a contextual response for architecture-focused problem solving.

Instructions

Query local AI model via Ollama for reasoning assistance
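Under the hood, a tool like this presumably forwards the prompt and parameters to Ollama's local HTTP API. The TypeScript sketch below is an assumption about that mapping, not this server's actual code; the endpoint and request shape follow Ollama's documented /api/generate API:

    // Hedged sketch: how a server like this might forward a prompt to
    // Ollama's local HTTP API. The actual implementation may differ.
    async function queryLocalAi(
      prompt: string,
      model = "architecture-reasoning:latest",
      temperature = 0.6,
    ): Promise<string> {
      const res = await fetch("http://localhost:11434/api/generate", {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify({
          model,
          prompt,
          stream: false,             // return one complete response
          options: { temperature },  // maps the tool's temperature parameter
        }),
      });
      if (!res.ok) throw new Error(`Ollama request failed: ${res.status}`);
      const data = (await res.json()) as { response: string };
      return data.response;
    }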

Input Schema

| Name        | Required | Description                               | Default                       |
| ----------- | -------- | ----------------------------------------- | ----------------------------- |
| model       | No       | Model name                                | architecture-reasoning:latest |
| prompt      | Yes      | The reasoning prompt to send to local AI  | (none)                        |
| temperature | No       | Temperature for response (0.1-1.0)        | 0.6                           |

Input Schema (JSON Schema)

{ "properties": { "model": { "default": "architecture-reasoning:latest", "description": "Model name (default: architecture-reasoning:latest)", "type": "string" }, "prompt": { "description": "The reasoning prompt to send to local AI", "type": "string" }, "temperature": { "default": 0.6, "description": "Temperature for response (0.1-1.0)", "type": "number" } }, "required": [ "prompt" ], "type": "object" }

MCP directory API

We provide all the information about MCP servers via our MCP API.

    curl -X GET 'https://glama.ai/api/mcp/v1/servers/autoexecbatman/arch-mcp'
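The same endpoint can also be queried programmatically. A minimal TypeScript sketch, assuming the endpoint returns JSON (the exact response shape is not documented here):

    // Fetch this server's directory entry from the Glama MCP API.
    // The response is assumed to be JSON; its shape is illustrative only.
    async function fetchServerInfo(): Promise<unknown> {
      const res = await fetch(
        "https://glama.ai/api/mcp/v1/servers/autoexecbatman/arch-mcp",
      );
      if (!res.ok) {
        throw new Error(`Glama API request failed: ${res.status}`);
      }
      return res.json();
    }

    fetchServerInfo().then((info) => console.log(info));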

If you have feedback or need assistance with the MCP directory API, please join our Discord server.