
mcp_openai_chat

Generate text completions using OpenAI's ChatGPT API, enabling context-aware responses for queries and conversations within the Ontology MCP server.

Instructions

Generates text completions using the OpenAI ChatGPT API.

Input Schema

Name         Required  Description                                 Default
max_tokens   No        Maximum number of tokens to generate
messages     Yes       Array of conversation messages
model        Yes       Model to use (e.g., gpt-4, gpt-3.5-turbo)
temperature  No        Sampling temperature (0-2)
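A request matching this schema might look like the following sketch. The model name and message contents are illustrative only, not values taken from the server.

```typescript
// Hypothetical arguments for an mcp_openai_chat call.
// model and messages are required; temperature and max_tokens are optional.
const exampleArgs = {
  model: "gpt-4",
  messages: [
    { role: "system", content: "You are a helpful assistant." },
    { role: "user", content: "Summarize the Ontology MCP server in one sentence." }
  ],
  temperature: 0.7, // optional, must be between 0 and 2
  max_tokens: 256   // optional cap on generated tokens
};
```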

Implementation Reference

  • The handler function for the mcp_openai_chat tool, which calls openaiService.chatCompletion and formats the response.
    async handler(args: any): Promise<ToolResponse> {
      try {
        const result = await openaiService.chatCompletion(args);
        return {
          content: [{
            type: 'text',
            text: result
          }]
        };
      } catch (error) {
        return {
          content: [{
            type: 'text',
            text: `OpenAI chat error: ${error instanceof Error ? error.message : String(error)}`
          }]
        };
      }
    }
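Note that the handler never throws: both success and failure are reported as a `ToolResponse`. The error branch can be sketched in isolation like this (the `ToolResponse` type is assumed to match the server's definition):

```typescript
// Sketch of the ToolResponse the handler produces on failure.
// Mirrors the catch branch above: the error is formatted into the text field.
type ToolResponse = { content: Array<{ type: "text"; text: string }> };

function errorResponse(error: unknown): ToolResponse {
  return {
    content: [{
      type: "text",
      text: `OpenAI chat error: ${error instanceof Error ? error.message : String(error)}`
    }]
  };
}

const resp = errorResponse(new Error("rate limit exceeded"));
// resp.content[0].text === "OpenAI chat error: rate limit exceeded"
```

Returning errors in-band this way lets an MCP client surface the failure to the model as ordinary tool output rather than a protocol-level fault.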
  • Input schema validating the parameters for the OpenAI chat completion tool.
    inputSchema: {
      type: 'object',
      properties: {
        model: {
          type: 'string',
          description: 'Model to use (e.g., gpt-4, gpt-3.5-turbo)'
        },
        messages: {
          type: 'array',
          items: {
            type: 'object',
            properties: {
              role: {
                type: 'string',
                enum: ['system', 'user', 'assistant']
              },
              content: {
                type: 'string'
              }
            },
            required: ['role', 'content']
          },
          description: 'Array of conversation messages'
        },
        temperature: {
          type: 'number',
          description: 'Sampling temperature (0-2)',
          minimum: 0,
          maximum: 2
        },
        max_tokens: {
          type: 'number',
          description: 'Maximum number of tokens to generate'
        }
      },
      required: ['model', 'messages']
    },
  • src/index.ts:39-39 (registration)
    Registration of mcp_openai_chat tool in MCP server capabilities.
    mcp_openai_chat: true,
  • The supporting service method that performs the actual OpenAI Chat Completions API request using axios.
    async chatCompletion(args: {
      model: string;
      messages: Array<{ role: string; content: string }>;
      temperature?: number;
      max_tokens?: number;
      stream?: boolean;
    }): Promise<string> {
      try {
        if (!OPENAI_API_KEY) {
          throw new McpError(
            ErrorCode.InternalError,
            'OPENAI_API_KEY is not set.'
          );
        }
    
        const response = await axios.post(
          `${OPENAI_API_BASE}/chat/completions`,
          {
            model: args.model,
            messages: args.messages,
            temperature: args.temperature ?? 0.7,
            max_tokens: args.max_tokens,
            stream: args.stream ?? false
          },
          {
            headers: {
              'Content-Type': 'application/json',
              'Authorization': `Bearer ${OPENAI_API_KEY}`
            }
          }
        );
        
        return JSON.stringify(response.data, null, 2);
      } catch (error) {
        if (axios.isAxiosError(error)) {
          const statusCode = error.response?.status;
          const responseData = error.response?.data;
          
          throw new McpError(
            ErrorCode.InternalError,
            `OpenAI API error (${statusCode}): ${
              typeof responseData === 'object' 
                ? JSON.stringify(responseData, null, 2) 
                : responseData || error.message
            }`
          );
        }
        
        throw new McpError(ErrorCode.InternalError, `Chat completion request failed: ${formatError(error)}`);
      }
    }
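Because `chatCompletion` returns the entire API response serialized as a JSON string, a caller that only needs the assistant's reply must parse it back out. A minimal sketch, assuming the standard OpenAI Chat Completions response shape (the sample `raw` string here is a stand-in for a real API response):

```typescript
// Stand-in for the JSON string chatCompletion returns from the API.
const raw = JSON.stringify({
  choices: [{
    index: 0,
    message: { role: "assistant", content: "Hi there!" },
    finish_reason: "stop"
  }]
});

// Parse the string and pull out the first choice's message content.
const parsed = JSON.parse(raw) as {
  choices: Array<{ message: { role: string; content: string } }>;
};
const reply = parsed.choices[0]?.message.content ?? "";
```

An alternative design would be for the service to extract `choices[0].message.content` itself and return only the text, but returning the full payload preserves metadata such as `finish_reason` and token usage for the client.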
