
Enhanced Architecture MCP

query_local_ai

Query a local AI model via Ollama for reasoning assistance within the Enhanced Architecture MCP server. Send a prompt to generate a response; optionally choose the model and adjust the temperature to tailor the output.

Instructions

Query local AI model via Ollama for reasoning assistance
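A tool like this typically forwards the prompt to a locally running Ollama instance. The sketch below shows one plausible way to do that, assuming Ollama's standard `/api/generate` endpoint on its default port; the server's actual implementation is not shown on this page, and the function names here are illustrative.

```python
import json
import urllib.request

# Ollama's default non-streaming generate endpoint (assumption: local default port)
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_ollama_payload(prompt: str,
                         model: str = "architecture-reasoning:latest",
                         temperature: float = 0.6) -> dict:
    """Assemble a generate request mirroring the tool's input schema."""
    return {
        "model": model,
        "prompt": prompt,
        "stream": False,  # return one complete response instead of a token stream
        "options": {"temperature": temperature},
    }

def query_local_ai(prompt: str, **kwargs) -> str:
    """Send the prompt to Ollama and return the generated text."""
    payload = build_ollama_payload(prompt, **kwargs)
    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["response"]
```

Running `query_local_ai` requires Ollama to be up locally with the named model pulled; `build_ollama_payload` can be exercised on its own.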

Input Schema

| Name | Required | Description | Default |
| --- | --- | --- | --- |
| model | No | Model name | `architecture-reasoning:latest` |
| prompt | Yes | The reasoning prompt to send to local AI | — |
| temperature | No | Temperature for response (0.1–1.0) | 0.6 |

Input Schema (JSON Schema)

```json
{
  "type": "object",
  "required": ["prompt"],
  "properties": {
    "model": {
      "type": "string",
      "default": "architecture-reasoning:latest",
      "description": "Model name (default: architecture-reasoning:latest)"
    },
    "prompt": {
      "type": "string",
      "description": "The reasoning prompt to send to local AI"
    },
    "temperature": {
      "type": "number",
      "default": 0.6,
      "description": "Temperature for response (0.1-1.0)"
    }
  }
}
```
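Per the schema, only `prompt` is required; `model` and `temperature` fall back to their defaults, and `temperature` is constrained to 0.1–1.0. A small client-side sketch of applying those defaults and range-checking the temperature (a hypothetical helper, not part of the server):

```python
# Defaults taken from the tool's JSON schema above
SCHEMA_DEFAULTS = {"model": "architecture-reasoning:latest", "temperature": 0.6}

def prepare_arguments(args: dict) -> dict:
    """Validate a query_local_ai call's arguments and fill in schema defaults."""
    if not isinstance(args.get("prompt"), str) or not args["prompt"]:
        raise ValueError("'prompt' is required and must be a non-empty string")
    merged = {**SCHEMA_DEFAULTS, **args}
    if not 0.1 <= merged["temperature"] <= 1.0:
        raise ValueError("'temperature' must be within 0.1-1.0")
    return merged
```

For example, `prepare_arguments({"prompt": "Compare layered vs hexagonal architecture"})` returns the prompt plus the default model and a temperature of 0.6.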

MCP directory API

We provide all the information about MCP servers via our MCP API.

    curl -X GET 'https://glama.ai/api/mcp/v1/servers/autoexecbatman/arch-mcp'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.