Convex MCP server

Official, by get-convex

search.ts (974 B)
"use node"; import { action } from "./_generated/server"; import { v } from "convex/values"; import { SearchResult, embed } from "./vectorDemo"; import { internal } from "./_generated/api"; export const vectorSearch = action({ args: { query: v.string(), cuisines: v.optional(v.array(v.string())) }, handler: async (ctx, args) => { const embedding = await embed(args.query); let results; const cuisines = args.cuisines; if (cuisines !== undefined) { results = await ctx.vectorSearch("foods", "by_embedding", { vector: embedding, limit: 16, filter: (q) => q.or(...cuisines.map((cuisine) => q.eq("cuisine", cuisine))), }); } else { results = await ctx.vectorSearch("foods", "by_embedding", { vector: embedding, limit: 16, }); } const rows: SearchResult[] = await ctx.runQuery( internal.vectorDemo.fetchResults, { results }, ); return rows; }, });

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/get-convex/convex-backend'
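For programmatic access from TypeScript, the same endpoint can be queried with the Fetch API. This is a minimal sketch; the shape of the JSON payload is not documented on this page, so it is treated as unknown.

// Minimal sketch: query the MCP directory API for this server's metadata.
// Assumes a runtime with a global Fetch API (Node 18+, Deno, browsers).
async function getConvexServerInfo(): Promise<unknown> {
  const response = await fetch(
    "https://glama.ai/api/mcp/v1/servers/get-convex/convex-backend",
  );
  if (!response.ok) {
    throw new Error(`MCP directory API request failed: ${response.status}`);
  }
  // The response schema is not shown here; keep it as unknown JSON.
  return response.json();
}

getConvexServerInfo().then((info) => console.log(info));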

If you have feedback or need assistance with the MCP directory API, please join our Discord server.