Document QA MCP Server

by parikshith49
llmService.ts
import { openai } from '../config/openai';
import { getRelevantChunks } from '../utils/retrieveChunks';

export async function askQuestion(query: string): Promise<string> {
  // Retrieve the document chunks most relevant to the query.
  const context = await getRelevantChunks(query);
  console.log('📄 Retrieved context:', context);

  if (!context || context.length === 0) {
    return 'No relevant context found.';
  }

  // Ground the model's answer in the retrieved chunks.
  const prompt = `Use the following context to answer the question:\n\n${context.join(
    '\n'
  )}\n\nQuestion: ${query}`;

  const response = await openai.chat.completions.create({
    messages: [{ role: 'user', content: prompt }],
    model: 'gpt-3.5-turbo',
  });

  return response.choices[0]?.message?.content?.trim() || 'No answer generated.';
}
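The `getRelevantChunks` helper is not shown on this page. A common way to implement it is to embed the query and rank precomputed chunk embeddings by cosine similarity. The sketch below illustrates that ranking step only; the `EmbeddedChunk` type and `topKChunks` function are illustrative assumptions, not the server's actual API, and embedding generation (e.g. via an embeddings endpoint) is assumed to have happened elsewhere.

```typescript
// Illustrative sketch: rank precomputed chunk embeddings against a
// query embedding by cosine similarity. Names are hypothetical.

interface EmbeddedChunk {
  text: string;
  embedding: number[];
}

function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0;
  let normA = 0;
  let normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

function topKChunks(
  queryEmbedding: number[],
  chunks: EmbeddedChunk[],
  k = 3
): string[] {
  return chunks
    .map((c) => ({
      text: c.text,
      score: cosineSimilarity(queryEmbedding, c.embedding),
    }))
    .sort((x, y) => y.score - x.score) // highest similarity first
    .slice(0, k)
    .map((c) => c.text);
}
```

With chunks ranked this way, the top-k texts can be joined into the prompt exactly as `askQuestion` does above.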

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/parikshith49/document-qa-mcp12'
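The same endpoint can be called from code. A minimal TypeScript sketch using the global `fetch` available in Node 18+; the `buildServerUrl` and `fetchServerInfo` names are illustrative, and since the response shape is not documented here, the JSON is returned untyped:

```typescript
// Build the directory API URL for a given owner/server slug,
// mirroring the curl example above.
function buildServerUrl(owner: string, server: string): string {
  return `https://glama.ai/api/mcp/v1/servers/${owner}/${server}`;
}

// Fetch server metadata. The response shape is not documented here,
// so the parsed JSON is returned as `unknown`.
async function fetchServerInfo(owner: string, server: string): Promise<unknown> {
  const res = await fetch(buildServerUrl(owner, server));
  if (!res.ok) {
    throw new Error(`Request failed with status ${res.status}`);
  }
  return res.json();
}
```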

If you have feedback or need assistance with the MCP directory API, please join our Discord server.