mcp_ollama_show

Displays detailed information about a specific Ollama model, identified by name, so that clients of the Ontology MCP server can inspect the model's metadata before using it.

Instructions

Displays information about an Ollama model.

Input Schema

Name    Required    Description                     Default
name    Yes         Name of the model to query      -

Input Schema (JSON Schema)

{ "properties": { "name": { "description": "정보를 조회할 모델 이름", "type": "string" } }, "required": [ "name" ], "type": "object" }

Implementation Reference

  • MCP tool handler function that calls ollamaService.showModel(args) and formats the response as a ToolResponse (a hypothetical dispatch sketch showing how this handler is invoked follows this list).
    async handler(args: any): Promise<ToolResponse> {
      const result = await ollamaService.showModel(args);
      return {
        content: [
          { type: 'text' as const, text: result }
        ]
      };
    }
  • Input schema for the mcp_ollama_show tool, requiring a 'name' parameter.
    inputSchema: {
      type: 'object',
      properties: {
        name: {
          type: 'string',
          description: '정보를 조회할 모델 이름' // "Name of the model to query"
        }
      },
      required: ['name']
    },
  • Core implementation in OllamaService: makes a GET request to the Ollama /api/show endpoint and returns the model info as a JSON string.
    async showModel(args: { name: string }): Promise<string> {
      try {
        const response = await axios.get(
          this.getApiUrl(`show?name=${encodeURIComponent(args.name)}`)
        );
        return JSON.stringify(response.data, null, 2);
      } catch (error) {
        if (axios.isAxiosError(error)) {
          // "Ollama API 오류" = "Ollama API error"
          throw new McpError(
            ErrorCode.InternalError,
            `Ollama API 오류: ${error.response?.data?.error || error.message}`
          );
        }
        // "모델 정보를 가져오는데 실패했습니다" = "Failed to fetch model information"
        throw new McpError(
          ErrorCode.InternalError,
          `모델 정보를 가져오는데 실패했습니다: ${formatError(error)}`
        );
      }
    }
  • src/index.ts:25-54 (registration)
    MCP server capabilities registration declaring mcp_ollama_show: true
    tools: {
      mcp_sparql_execute_query: true, mcp_sparql_update: true, mcp_sparql_list_repositories: true,
      mcp_sparql_list_graphs: true, mcp_sparql_get_resource_info: true,
      mcp_ollama_run: true, mcp_ollama_show: true, mcp_ollama_pull: true, mcp_ollama_list: true,
      mcp_ollama_rm: true, mcp_ollama_chat_completion: true, mcp_ollama_status: true,
      mcp_http_request: true,
      mcp_openai_chat: true, mcp_openai_image: true, mcp_openai_tts: true,
      mcp_openai_transcribe: true, mcp_openai_embedding: true,
      mcp_gemini_generate_text: true, mcp_gemini_chat_completion: true, mcp_gemini_list_models: true,
      mcp_gemini_generate_images: false, mcp_gemini_generate_image: false,
      mcp_gemini_generate_videos: false, mcp_gemini_generate_multimodal_content: false,
      mcp_imagen_generate: false, mcp_gemini_create_image: false, mcp_gemini_edit_image: false
    },
  • src/index.ts:62-68 (registration)
    Handler for ListToolsRequestSchema that exposes all tools, including mcp_ollama_show.
    server.setRequestHandler(ListToolsRequestSchema, async () => ({
      tools: tools.map(tool => ({
        name: tool.name,
        description: tool.description,
        inputSchema: tool.inputSchema
      }))
    }));
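
The bullets above show the tool's handler and its registration, but not the request-dispatch step that connects them. The following is a hypothetical sketch (not verbatim from src/index.ts) of how a CallToolRequestSchema handler could route a tools/call request for mcp_ollama_show to the handler in the first bullet, assuming the same tools array used in the ListToolsRequestSchema handler:

    import { CallToolRequestSchema, McpError, ErrorCode } from '@modelcontextprotocol/sdk/types.js';

    // Hypothetical dispatch sketch: look up the requested tool by name and delegate
    // to its handler. For mcp_ollama_show this ends up calling ollamaService.showModel()
    // and returning its JSON output as a text content block.
    server.setRequestHandler(CallToolRequestSchema, async (request) => {
      const tool = tools.find(t => t.name === request.params.name);
      if (!tool) {
        throw new McpError(ErrorCode.MethodNotFound, `Unknown tool: ${request.params.name}`);
      }
      return tool.handler(request.params.arguments ?? {});
    });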

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/bigdata-coss/agent_mcp'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.