
Convex MCP server

Official, by get-convex
vectorActionV8.ts (886 B)
import { v } from "convex/values";
import { ActionCtx, action } from "./_generated/server";
import { Doc } from "./_generated/dataModel";
import { api } from "./_generated/api";

export const vectorSearchHandler = async (
  ctx: ActionCtx,
  args: { embedding: number[]; cuisine: string },
): Promise<Doc<"foods">[]> => {
  const result = await ctx.vectorSearch("foods", "by_embedding", {
    vector: args.embedding,
    limit: 1,
    filter: (q) => q.eq("cuisine", args.cuisine),
  });
  return await ctx.runQuery(api.foods.queryDocs, {
    ids: result.map((value) => value._id),
  });
};

export const vectorSearch = action({
  args: { embedding: v.array(v.float64()), cuisine: v.string() },
  // Avoid a method reference so that this action and the node action do not
  // register exactly the same function twice.
  handler: async (ctx, args) => vectorSearchHandler(ctx, args),
});
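The action above delegates to api.foods.queryDocs because ctx.vectorSearch only returns document ids and scores, and Convex actions cannot read the database directly; loading the full documents has to happen in a query invoked via ctx.runQuery. That query lives in a file not shown here. A minimal sketch of what it could look like, assuming a "foods" table and a convex/foods.ts module (both hypothetical names inferred from the action):

// convex/foods.ts -- hypothetical sketch; the real file is not part of this listing.
import { v } from "convex/values";
import { query } from "./_generated/server";
import { Doc } from "./_generated/dataModel";

// Hydrates the ids returned by ctx.vectorSearch into full documents.
export const queryDocs = query({
  args: { ids: v.array(v.id("foods")) },
  handler: async (ctx, args): Promise<Doc<"foods">[]> => {
    const docs: Doc<"foods">[] = [];
    for (const id of args.ids) {
      const doc = await ctx.db.get(id);
      // Skip ids whose documents were deleted between search and lookup.
      if (doc !== null) {
        docs.push(doc);
      }
    }
    return docs;
  },
});

Splitting the work this way keeps the vector search in the action (where vectorSearch is available) while the document reads stay inside a query, which is the standard Convex pattern for vector search results.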

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/get-convex/convex-backend'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.