
Deep Research MCP Server

by Ozamatash
feedback.ts
import { generateObject } from 'ai';
import { z } from 'zod';
import { getDefaultModel } from './ai/providers.js';
import type { LanguageModelV2 } from '@ai-sdk/provider';
import { systemPrompt } from './prompt.js';

// Ask the model for follow-up questions that clarify the research direction.
export async function generateFeedback({
  query,
  numQuestions = 3,
  model,
}: {
  query: string;
  numQuestions?: number;
  model?: LanguageModelV2;
}) {
  // Fall back to the configured default model when none is supplied.
  const selectedModel = model ?? getDefaultModel();

  const userFeedback = await generateObject({
    model: selectedModel,
    system: systemPrompt(),
    prompt: `Given the following query from the user, ask some follow up questions to clarify the research direction. Return a maximum of ${numQuestions} questions, but feel free to return less if the original query is clear: <query>${query}</query>`,
    schema: z.object({
      questions: z
        .array(z.string())
        .describe(
          `Follow up questions to clarify the research direction, max of ${numQuestions}`,
        ),
    }),
  });

  // Enforce the cap even if the model returns more questions than requested.
  return userFeedback.object.questions.slice(0, numQuestions);
}

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/Ozamatash/deep-research-mcp'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.