Reviewer MCP

by jaggederest
openai.ts (915 B)
import OpenAI from 'openai';
import { AIProvider } from './types.js';
import { ProviderError } from './errors.js';

export class OpenAIProvider implements AIProvider {
  public readonly name = 'OpenAI';
  private client: OpenAI;
  private model: string;

  constructor(apiKey: string, model = 'o1-preview') {
    this.client = new OpenAI({ apiKey });
    this.model = model;
  }

  async chat(systemPrompt: string, userPrompt: string): Promise<string> {
    try {
      const response = await this.client.chat.completions.create({
        model: this.model,
        messages: [
          { role: 'system', content: systemPrompt },
          { role: 'user', content: userPrompt },
        ],
        temperature: 0.7,
        max_tokens: 4000,
      });
      return response.choices[0]?.message?.content ?? 'No response generated';
    } catch (error) {
      throw new ProviderError(this.name, error);
    }
  }
}
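For reference, a provider like this is constructed with an API key and queried through its chat method. The snippet below is a minimal usage sketch, not part of the repository: the import path, the OPENAI_API_KEY environment variable, and the review prompts are all assumptions chosen for illustration.

import { OpenAIProvider } from './openai.js'; // assumed relative path to the file above

// Minimal sketch: build a provider and request a single review response.
async function main(): Promise<void> {
  const provider = new OpenAIProvider(process.env.OPENAI_API_KEY ?? '', 'o1-preview');
  const review = await provider.chat(
    'You are a careful code reviewer.',        // hypothetical system prompt
    'Review this function for error handling.' // hypothetical user prompt
  );
  console.log(review);
}

main().catch(console.error);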

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/jaggederest/mcp_reviewer'
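The same endpoint can also be called from code. The sketch below assumes Node 18+ (built-in fetch) and simply logs the JSON payload; the response schema is not documented in this section, so the result is treated as unknown.

// Fetch the Reviewer MCP server entry from the Glama MCP directory API.
// Sketch only: the response shape is not specified here, so it is logged as-is.
async function fetchServerInfo(): Promise<unknown> {
  const res = await fetch('https://glama.ai/api/mcp/v1/servers/jaggederest/mcp_reviewer');
  if (!res.ok) {
    throw new Error(`Directory API request failed: ${res.status}`);
  }
  return res.json();
}

fetchServerInfo().then((info) => console.log(info)).catch(console.error);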

If you have feedback or need assistance with the MCP directory API, please join our Discord server.