mcp_ollama_run
Runs an Ollama model to generate a response, enabling ontology data to be queried and manipulated through the Ontology MCP server. Takes a model name, a prompt, and an optional timeout.
Instructions
Runs an Ollama model to generate a response.
Input Schema
Name | Required | Description | Default |
---|---|---|---|
name | Yes | Name of the model to run | |
prompt | Yes | Prompt to send to the model | |
timeout | No | Timeout in milliseconds | 60000 |
Input Schema (JSON Schema)
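The JSON Schema itself is not reproduced on this page. As a rough reconstruction from the parameter table above, expressed here as a TypeScript constant (the property types are assumptions inferred from the descriptions):

```typescript
// Sketch of the mcp_ollama_run input schema, inferred from the parameter table above.
// The string/number types are assumptions; consult the server's tool listing for the exact schema.
const mcpOllamaRunInputSchema = {
  type: "object",
  properties: {
    name: { type: "string", description: "Name of the model to run" },
    prompt: { type: "string", description: "Prompt to send to the model" },
    timeout: { type: "number", description: "Timeout in milliseconds", default: 60000 },
  },
  required: ["name", "prompt"],
} as const;
```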
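For illustration, a minimal sketch of calling this tool from a TypeScript MCP client using @modelcontextprotocol/sdk. The server launch command, model name, and prompt below are assumptions for the example, not part of this tool's definition:

```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// Connect to the Ontology MCP server over stdio.
// The launch command is a placeholder; substitute however you start the server locally.
const transport = new StdioClientTransport({
  command: "npx",
  args: ["-y", "@bigdata-coss/agent_mcp"],
});

const client = new Client({ name: "example-client", version: "1.0.0" });
await client.connect(transport);

// Call mcp_ollama_run with a model name, a prompt, and an optional timeout (ms).
const result = await client.callTool({
  name: "mcp_ollama_run",
  arguments: {
    name: "llama3", // assumed model name; must already be available to Ollama
    prompt: "Summarize the loaded ontology in one paragraph.",
    timeout: 60000, // default timeout
  },
});

console.log(result.content);
await client.close();
```

The returned content blocks carry the model's response text; mcp_ollama_list and mcp_ollama_pull (listed below) can be used beforehand to check that the requested model is available.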
Other Tools from Ontology MCP
- mcp_gemini_chat_completion
- mcp_gemini_create_image
- mcp_gemini_edit_image
- mcp_gemini_generate_image
- mcp_gemini_generate_images
- mcp_gemini_generate_multimodal_content
- mcp_gemini_generate_text
- mcp_gemini_generate_videos
- mcp_gemini_list_models
- mcp_http_request
- mcp_imagen_generate
- mcp_ollama_chat_completion
- mcp_ollama_list
- mcp_ollama_pull
- mcp_ollama_rm
- mcp_ollama_run
- mcp_ollama_show
- mcp_ollama_status
- mcp_openai_chat
- mcp_openai_embedding
- mcp_openai_image
- mcp_openai_transcribe
- mcp_openai_tts
- mcp_sparql_execute_query
- mcp_sparql_get_resource_info
- mcp_sparql_list_graphs
- mcp_sparql_list_repositories
- mcp_sparql_update
Related Tools
- @bigdata-coss/agent_mcp