Convex MCP server

Official
by get-convex
langchain.ts
import { OpenAI } from "@langchain/openai";
import { action } from "./_generated/server";

// A Convex action that forwards a prompt to OpenAI via LangChain and
// returns the completion. Actions may call external services, unlike
// queries and mutations.
export default action(async (_, { prompt }: { prompt: string }) => {
  const model = new OpenAI({
    modelName: "gpt-4",
    temperature: 0.7,
    maxTokens: 1000,
    maxRetries: 5,
    // Set OPENAI_API_KEY in your Convex deployment's environment variables.
    openAIApiKey: process.env.OPENAI_API_KEY,
  });
  const res = await model.call(`Question: ${prompt} \nAnswer: `);
  console.log({ res });
  return res;
});
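The `maxRetries: 5` option above tells the LangChain client to retry transient failures before giving up. As an illustration of what that amounts to, here is a minimal sketch of retry with exponential backoff; the names and delay values are illustrative, not LangChain internals.

```typescript
// Retry an async call up to `maxRetries` extra times, doubling the delay
// between attempts. On persistent failure, rethrow the last error.
async function withRetries<T>(
  fn: () => Promise<T>,
  maxRetries = 5,
  baseDelayMs = 100,
): Promise<T> {
  let lastErr: unknown;
  for (let attempt = 0; attempt <= maxRetries; attempt++) {
    try {
      return await fn();
    } catch (err) {
      lastErr = err;
      if (attempt < maxRetries) {
        // Exponential backoff: 100ms, 200ms, 400ms, ...
        await new Promise((r) => setTimeout(r, baseDelayMs * 2 ** attempt));
      }
    }
  }
  throw lastErr;
}
```

A call that fails twice and then succeeds would complete on the third attempt without surfacing an error to the caller.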

MCP directory API

We provide all the information about MCP servers via our MCP directory API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/get-convex/convex-backend'
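The same endpoint can be called programmatically. A minimal TypeScript sketch, assuming the API takes a `{owner}/{server}` slug as shown in the curl command (the response shape is not documented here, so it is typed as `unknown`):

```typescript
// Base URL taken from the curl example above.
const MCP_API_BASE = "https://glama.ai/api/mcp/v1/servers";

// Build the directory API URL for a given "owner/server" slug.
function serverApiUrl(slug: string): string {
  return `${MCP_API_BASE}/${slug}`;
}

// Fetch a server's metadata (requires Node 18+ for global fetch).
async function fetchServerInfo(slug: string): Promise<unknown> {
  const res = await fetch(serverApiUrl(slug));
  if (!res.ok) throw new Error(`HTTP ${res.status}`);
  return res.json();
}
```

For example, `fetchServerInfo("get-convex/convex-backend")` requests the same URL as the curl command above.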

If you have feedback or need assistance with the MCP directory API, please join our Discord server.