
Convex MCP server

Official server by get-convex
vectorSearch2.ts

import { v } from "convex/values";
import { action } from "./_generated/server";
import { internal } from "./_generated/api";
import { Doc } from "./_generated/dataModel";

// @snippet start fetchResults
export const similarFoods = action({
  args: {
    descriptionQuery: v.string(),
  },
  handler: async (ctx, args) => {
    // 1. Generate an embedding from your favorite third party API:
    const embedding = await embed(args.descriptionQuery);
    // 2. Then search for similar foods!
    const results = await ctx.vectorSearch("foods", "by_embedding", {
      vector: embedding,
      limit: 16,
      filter: (q) => q.eq("cuisine", "French"),
    });
    // highlight-start
    // 3. Fetch the results
    const foods: Array<Doc<"foods">> = await ctx.runQuery(
      internal.foods.fetchResults,
      { ids: results.map((result) => result._id) },
    );
    return foods;
    // highlight-end
  },
});
// @snippet end fetchResults

const embed = (...args: any[]): number[] => {
  return [];
};
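The action above calls an internal query, internal.foods.fetchResults, that is not shown on this page. The following is a minimal sketch of what such a query could look like, assuming a convex/foods.ts module; the document loading strategy is inferred from the call site, not taken from the actual project.

import { v } from "convex/values";
import { internalQuery } from "./_generated/server";
import { Doc } from "./_generated/dataModel";

// Hypothetical convex/foods.ts: loads the full documents for the IDs
// returned by the vector search, skipping any that have been deleted.
export const fetchResults = internalQuery({
  args: { ids: v.array(v.id("foods")) },
  handler: async (ctx, args) => {
    const foods: Array<Doc<"foods">> = [];
    for (const id of args.ids) {
      const doc = await ctx.db.get(id);
      if (doc !== null) {
        foods.push(doc);
      }
    }
    return foods;
  },
});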

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/get-convex/convex-backend'
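
As a rough illustration, the same endpoint can also be called from TypeScript with the built-in fetch API. This is only a sketch: the response shape is not documented in this listing, so it is left as unknown.

async function getServerInfo(): Promise<unknown> {
  const response = await fetch(
    "https://glama.ai/api/mcp/v1/servers/get-convex/convex-backend",
  );
  if (!response.ok) {
    throw new Error(`Request failed with status ${response.status}`);
  }
  // Parse the JSON body; consult the MCP directory API docs for the exact schema.
  return response.json();
}

getServerInfo().then((info) => console.log(info));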

If you have feedback or need assistance with the MCP directory API, please join our Discord server.