Enhanced Architecture MCP

Server Configuration

Describes the environment variables required to run the server.

No arguments

Prompts

Interactive templates invoked by user choice

No prompts

Resources

Contextual data attached and managed by the client

No resources

Tools

Functions exposed to the LLM to take actions

| Name | Description |
| --- | --- |
| query_local_ai | Query local AI model via Ollama for reasoning assistance |
| reasoning_assist | Structured reasoning assistance for complex problems |
| model_list | List available local AI models |
| hybrid_analysis | Hybrid local+cloud analysis for complex data |
| token_efficient_reasoning | Delegate heavy reasoning to local AI to conserve cloud tokens |
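
As a sketch of how these tools can be invoked from a client, the snippet below uses the official MCP TypeScript SDK (`@modelcontextprotocol/sdk`) to connect to the server over stdio, list its tools, and call `query_local_ai`. The launch command (`node dist/index.js`) and the argument name `prompt` are assumptions for illustration, not taken from this listing; consult the tool's declared input schema for the actual parameters.

```typescript
// A minimal sketch, assuming the server is built to dist/index.js and
// launched with Node. The "prompt" argument is hypothetical; check the
// tool's input schema for the real parameter names.
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

async function main(): Promise<void> {
  // Hypothetical launch command for the arch-mcp server.
  const transport = new StdioClientTransport({
    command: "node",
    args: ["dist/index.js"],
  });

  const client = new Client(
    { name: "example-client", version: "1.0.0" },
    { capabilities: {} },
  );
  await client.connect(transport);

  // Discover the tools listed in the table above.
  const { tools } = await client.listTools();
  console.log(tools.map((t) => t.name));

  // Call query_local_ai; the "prompt" argument is illustrative only.
  const result = await client.callTool({
    name: "query_local_ai",
    arguments: { prompt: "Compare local vs. cloud inference trade-offs" },
  });
  console.log(result.content);

  await client.close();
}

main().catch(console.error);
```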

MCP directory API

We provide all the information about MCP servers via our MCP directory API.

```bash
curl -X GET 'https://glama.ai/api/mcp/v1/servers/autoexecbatman/arch-mcp'
```
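
For programmatic access, a minimal TypeScript sketch is shown below. It fetches this server's entry and prints the raw JSON; the response schema is not documented on this page, so the sketch assumes nothing about its fields.

```typescript
// Fetch this server's directory entry and print the raw JSON.
// The response schema is not shown on this page, so nothing beyond
// valid JSON is assumed about its shape.
const url = "https://glama.ai/api/mcp/v1/servers/autoexecbatman/arch-mcp";

async function fetchServerEntry(): Promise<unknown> {
  const res = await fetch(url); // global fetch (Node 18+)
  if (!res.ok) {
    throw new Error(`Directory API request failed: HTTP ${res.status}`);
  }
  return res.json();
}

fetchServerEntry()
  .then((entry) => console.log(JSON.stringify(entry, null, 2)))
  .catch(console.error);
```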

If you have feedback or need assistance with the MCP directory API, please join our Discord server.