mcp_ollama_status

Check the status of Ollama servers connected to the Ontology MCP, enabling real-time monitoring of AI models for ontology data querying and manipulation.

Instructions

Check the status of the Ollama server

Input Schema

Name            Required   Description                               Default
random_string   Yes        Dummy parameter for no-parameter tools    (none)
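Because the tool takes no real input, a client must still supply the dummy field. A `tools/call` request for this tool might look like the following (JSON-RPC envelope per the MCP specification; the `id` and argument value are illustrative):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "mcp_ollama_status",
    "arguments": { "random_string": "ping" }
  }
}
```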

Implementation Reference

  • Complete tool definition for 'mcp_ollama_status' including input schema, description, and handler function. The handler delegates to ollamaService.getStatus() and formats the response.
    {
      name: 'mcp_ollama_status',
      description: 'Check the status of the Ollama server',
      inputSchema: {
        type: 'object',
        properties: {
          random_string: {
            type: 'string',
            description: 'Dummy parameter for no-parameter tools'
          }
        },
        required: ['random_string']
      },
      async handler(args: any): Promise<ToolResponse> {
        try {
          const result = await ollamaService.getStatus();
          return {
            content: [{
              type: 'text',
              text: result
            }]
          };
        } catch (error) {
          return {
            content: [{
              type: 'text',
              text: `Status check error: ${error instanceof Error ? error.message : String(error)}`
            }]
          };
        }
      }
    },
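The handler wraps the JSON string from `ollamaService.getStatus()` in a single text content item. A minimal sketch of how a client might unwrap it (`parseStatus` and the sample payload are illustrative, not part of the server):

```typescript
// Shape of the handler's return value, mirroring the definition above.
interface ToolResponse {
  content: { type: string; text: string }[];
}

// Hypothetical client-side helper: pull the status JSON back out of the
// text content item the handler produces.
function parseStatus(res: ToolResponse): { status: string } {
  return JSON.parse(res.content[0].text);
}

// Sample payload in the same shape getStatus() emits on success.
const sample: ToolResponse = {
  content: [{
    type: 'text',
    text: JSON.stringify({ status: 'online', localModels: '', apiModels: { models: [] } })
  }]
};

console.log(parseStatus(sample).status); // "online"
```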
  • Core implementation of the Ollama status check in OllamaService.getStatus(). It runs the 'ollama list' command and queries the /api/tags endpoint to determine server status and list installed models.
    /**
     * Check Ollama status
     */
    async getStatus(): Promise<string> {
      try {
        // Run the local 'ollama list' command
        const { stdout, stderr } = await execAsync('ollama list');
        if (stderr) {
          throw new Error(stderr);
        }
        
        // Fetch the installed models from the API
        const response = await axios.get(this.getApiUrl('tags'));
        
        return JSON.stringify({
          status: 'online',
          localModels: stdout.trim(),
          apiModels: response.data
        }, null, 2);
      } catch (error) {
        return JSON.stringify({
          status: 'offline',
          error: formatError(error)
        }, null, 2);
      }
    }
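The two return shapes above can be separated from the I/O for clarity. A sketch of just the formatting branch, with the command output and API response passed in so it runs without a live Ollama server (`formatStatus` is a hypothetical helper, not part of OllamaService):

```typescript
// Hypothetical pure helper mirroring getStatus(): build the "online" JSON
// from the 'ollama list' output and /api/tags payload, or the "offline"
// JSON when an error message is supplied.
function formatStatus(localList: string | null, apiModels: unknown, error?: string): string {
  if (error !== undefined) {
    return JSON.stringify({ status: 'offline', error }, null, 2);
  }
  return JSON.stringify({
    status: 'online',
    localModels: (localList ?? '').trim(),
    apiModels
  }, null, 2);
}

// Illustrative inputs: a trimmed 'ollama list' table and a /api/tags-style body.
const online = formatStatus('NAME     ID     SIZE\nllama3   abc    4.7 GB\n', { models: [{ name: 'llama3' }] });
const offline = formatStatus(null, null, 'connect ECONNREFUSED 127.0.0.1:11434');

console.log(JSON.parse(online).status);  // "online"
console.log(JSON.parse(offline).status); // "offline"
```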
  • src/index.ts:26-54 (registration)
    MCP Server capabilities registration enabling the 'mcp_ollama_status' tool (line 37).
      mcp_sparql_execute_query: true,
      mcp_sparql_update: true,
      mcp_sparql_list_repositories: true,
      mcp_sparql_list_graphs: true,
      mcp_sparql_get_resource_info: true,
      mcp_ollama_run: true,
      mcp_ollama_show: true,
      mcp_ollama_pull: true,
      mcp_ollama_list: true,
      mcp_ollama_rm: true,
      mcp_ollama_chat_completion: true,
      mcp_ollama_status: true,
      mcp_http_request: true,
      mcp_openai_chat: true,
      mcp_openai_image: true,
      mcp_openai_tts: true,
      mcp_openai_transcribe: true,
      mcp_openai_embedding: true,
      mcp_gemini_generate_text: true,
      mcp_gemini_chat_completion: true,
      mcp_gemini_list_models: true,
      mcp_gemini_generate_images: false,
      mcp_gemini_generate_image: false,
      mcp_gemini_generate_videos: false,
      mcp_gemini_generate_multimodal_content: false,
      mcp_imagen_generate: false,
      mcp_gemini_create_image: false,
      mcp_gemini_edit_image: false
    },

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/bigdata-coss/agent_mcp'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.