
Enhanced Architecture MCP

package.json (501 B)

    {
      "name": "local-ai-mcp-server",
      "version": "1.0.0",
      "description": "MCP Server for Local AI Integration via Ollama",
      "main": "local-ai-server.js",
      "scripts": {
        "start": "node local-ai-server.js",
        "test": "node test-local-ai.js"
      },
      "dependencies": {
        "@modelcontextprotocol/sdk": "^0.6.0",
        "node-fetch": "^3.3.2"
      },
      "keywords": ["mcp", "ollama", "local-ai", "reasoning", "architecture"],
      "author": "Architecture v10.8",
      "license": "MIT"
    }
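The package.json names local-ai-server.js as the entry point, but the server code itself is not shown in this listing. A minimal sketch of how such a server might forward prompts to a local Ollama instance is below; the function names are assumptions, and only Ollama's standard `/api/generate` endpoint and request shape are relied on:

```typescript
// Hypothetical helper for an Ollama-backed MCP tool handler.
// The real local-ai-server.js is not shown in this listing.
const OLLAMA_URL = "http://localhost:11434/api/generate"; // Ollama's default local endpoint

interface OllamaRequest {
  model: string;
  prompt: string;
  stream: boolean;
}

// Build the JSON body for Ollama's /api/generate endpoint.
function buildOllamaRequest(model: string, prompt: string): OllamaRequest {
  // stream: false asks Ollama for a single JSON response instead of a stream
  return { model, prompt, stream: false };
}

// Forward a prompt to the local Ollama instance and return its reply text.
async function askLocalModel(model: string, prompt: string): Promise<string> {
  const res = await fetch(OLLAMA_URL, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(buildOllamaRequest(model, prompt)),
  });
  if (!res.ok) throw new Error(`Ollama returned HTTP ${res.status}`);
  const data = (await res.json()) as { response: string };
  return data.response;
}
```

An MCP server would typically expose `askLocalModel` behind a tool registered via the `@modelcontextprotocol/sdk` dependency listed above.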

MCP directory API

All information about MCP servers listed in the directory is available via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/autoexecbatman/arch-mcp'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.