Ontology MCP

by bigdata-coss

mcp_ollama_run

Executes an Ollama model with a specified model name and prompt and returns the generated response, letting Ontology MCP use locally hosted LLMs for AI-driven ontology queries and data manipulation.

Instructions

Runs an Ollama model and generates a response.

Input Schema

Name     Required  Description                                    Default
name     Yes       Name of the model to run
prompt   Yes       Prompt to send to the model
timeout  No        Timeout in milliseconds (default: 60000)       60000
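The tool presumably forwards these arguments to Ollama's local HTTP API (`POST /api/generate` on the default port 11434). A minimal sketch of how such a request could be assembled; the helper name, the `stream: false` choice, and the millisecond-to-second conversion are illustrative assumptions, not part of the tool's documented behavior:

```python
import json

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default local endpoint


def build_generate_request(name: str, prompt: str, timeout: int = 60000) -> tuple:
    """Build (url, json_body, timeout_seconds) for a non-streaming generate call.

    `name` and `prompt` mirror the tool's required inputs; `timeout` is given
    in milliseconds (the tool's unit) and converted to seconds for an HTTP client.
    """
    body = {
        "model": name,    # Ollama's API expects the model name under "model"
        "prompt": prompt,
        "stream": False,  # assumption: the tool waits for one complete response
    }
    return OLLAMA_URL, json.dumps(body), timeout / 1000


url, payload, secs = build_generate_request("llama3", "What is an ontology?")
```

An MCP client would then POST `payload` to `url` with a `secs`-second timeout and read the generated text from the response.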

Input Schema (JSON Schema)

{
  "properties": {
    "name": {
      "description": "Name of the model to run",
      "type": "string"
    },
    "prompt": {
      "description": "Prompt to send to the model",
      "type": "string"
    },
    "timeout": {
      "description": "Timeout in milliseconds (default: 60000)",
      "minimum": 1000,
      "type": "number"
    }
  },
  "required": ["name", "prompt"],
  "type": "object"
}
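A client can check arguments against this schema before dispatching the tool call. A minimal hand-rolled validator covering just the keywords this schema uses (`required`, `type`, `minimum`); the function is a hypothetical helper, not part of the server:

```python
# The tool's input schema, reduced to the keywords we validate.
SCHEMA = {
    "properties": {
        "name": {"type": "string"},
        "prompt": {"type": "string"},
        "timeout": {"type": "number", "minimum": 1000},
    },
    "required": ["name", "prompt"],
    "type": "object",
}

# JSON Schema type names mapped to Python types.
TYPE_MAP = {"string": str, "number": (int, float)}


def validate_args(args: dict) -> list:
    """Return a list of violation messages; empty if `args` satisfies the schema."""
    errors = []
    for key in SCHEMA["required"]:
        if key not in args:
            errors.append(f"missing required property: {key}")
    for key, value in args.items():
        spec = SCHEMA["properties"].get(key)
        if spec is None:
            continue  # the schema does not forbid extra properties
        # bool is a subclass of int in Python, so reject it explicitly for "number"
        if not isinstance(value, TYPE_MAP[spec["type"]]) or isinstance(value, bool):
            errors.append(f"{key}: expected {spec['type']}")
        elif "minimum" in spec and value < spec["minimum"]:
            errors.append(f"{key}: must be >= {spec['minimum']}")
    return errors
```

For example, `validate_args({"name": "llama3", "prompt": "hi"})` returns no errors, while omitting `name` or passing `timeout` below 1000 produces a violation message.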

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/bigdata-coss/agent_mcp'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.