supabase-security-mcp

by Perufitlife

audit_project

Detect security vulnerabilities in a Supabase project: RLS gaps, exposed SECURITY DEFINER functions, public storage buckets, default-privilege leaks, and unsafe authentication settings.

Instructions

Scan a Supabase project for security issues: RLS gaps, exposed SECURITY DEFINER functions, public buckets, default-privilege leaks, and unsafe auth config. Returns findings JSON. Caches result for use by apply_fix tools.

Input Schema

  • project_ref (required): Supabase project ref, e.g. 'abcdefghijklmnopqrst'
  • token (optional): Personal Access Token (sbp_...). Optional if SUPABASE_ACCESS_TOKEN env var is set.
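A hypothetical invocation with placeholder values (both values below are illustrative, not real credentials):

```javascript
// Example arguments an agent might pass to the audit_project tool.
// The ref is the 20-character identifier from the project's dashboard URL.
const args = {
  project_ref: "abcdefghijklmnopqrst", // required
  token: "sbp_0000000000000000",       // optional if SUPABASE_ACCESS_TOKEN is set
};
```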

Implementation Reference

  • src/server.js:36-63 (registration)
    Registration of the 'audit_project' tool on the MCP server, including its input schema (project_ref, token) and the handler function that calls the audit() function from ./audit.js, caches the result, and returns a summary + JSON findings.
    server.registerTool(
      "audit_project",
      {
        description: "Scan a Supabase project for security issues: RLS gaps, exposed SECURITY DEFINER functions, public buckets, default-privilege leaks, and unsafe auth config. Returns findings JSON. Caches result for use by apply_fix tools.",
        inputSchema: {
          project_ref: z.string().describe("Supabase project ref, e.g. 'abcdefghijklmnopqrst'"),
          token: z.string().optional().describe("Personal Access Token (sbp_...). Optional if SUPABASE_ACCESS_TOKEN env var is set."),
        },
      },
      async ({ project_ref, token }) => {
        const t = getToken(token);
        if (!t) {
          return { content: [{ type: "text", text: "Error: no token. Set SUPABASE_ACCESS_TOKEN env var or pass `token` parameter." }], isError: true };
        }
        try {
          const result = await audit(t, project_ref);
          cache.set(project_ref, { result, ts: Date.now(), token: t });
          return {
            content: [
              { type: "text", text: shortSummary(result) },
              { type: "text", text: "```json\n" + JSON.stringify(result, null, 2) + "\n```" },
            ],
          };
        } catch (e) {
          return { content: [{ type: "text", text: `Audit failed: ${e.message}` }], isError: true };
        }
      }
    );
  • The core audit() function that performs all security scanning logic: fetches tables with RLS status, SECURITY DEFINER functions, storage buckets, auth config, default privileges, and realtime publications. Returns findings JSON with project metadata and summary counts.
    async function audit(token, ref, opts = {}) {
      const { activeProbe = true } = opts;
      const findings = [];
      const meta = await getProjectMeta(token, ref);
      const supabaseUrl = `https://${ref}.supabase.co`;
      const anonKey = activeProbe ? await getAnonKey(token, ref) : null;
      const probeAvailable = !!anonKey;
    
      // 1. Tables: RLS status + policy count + anon grants
      const tables = await sql(
        token,
        ref,
        `SELECT
           c.relname AS table_name,
           c.relrowsecurity AS rls_enabled,
           (SELECT COUNT(*) FROM pg_policies p WHERE p.schemaname='public' AND p.tablename=c.relname) AS n_policies,
           has_table_privilege('anon', 'public.'||quote_ident(c.relname), 'SELECT') AS anon_select,
           has_table_privilege('anon', 'public.'||quote_ident(c.relname), 'INSERT') AS anon_insert,
           has_table_privilege('anon', 'public.'||quote_ident(c.relname), 'DELETE') AS anon_delete,
           has_table_privilege('authenticated', 'public.'||quote_ident(c.relname), 'SELECT') AS auth_select
         FROM pg_class c
         JOIN pg_namespace n ON n.oid = c.relnamespace
         WHERE n.nspname = 'public' AND c.relkind = 'r'
         ORDER BY c.relname;`
      );
    
      for (const t of tables) {
        if (!t.rls_enabled && (t.anon_select || t.anon_insert || t.anon_delete)) {
          const finding = {
            check: "rls_disabled",
            ...CHECKS.rls_disabled,
            target: t.table_name,
            details: {
              anon_select: t.anon_select,
              anon_insert: t.anon_insert,
              anon_delete: t.anon_delete,
            },
            fix_sql: `ALTER TABLE public.${t.table_name} ENABLE ROW LEVEL SECURITY;`,
          };
          if (probeAvailable && t.anon_select) {
            finding.probe = await probeAnonAccess(supabaseUrl, anonKey, t.table_name);
          }
          findings.push(finding);
        } else if (t.rls_enabled && t.n_policies === 0 && (t.anon_select || t.auth_select)) {
          findings.push({
            check: "rls_no_policies_with_anon_grants",
            ...CHECKS.rls_no_policies_with_anon_grants,
            target: t.table_name,
            details: { policies: 0, anon_select: t.anon_select, auth_select: t.auth_select },
            fix_sql: `-- Optional hardening: revoke direct grants to make leak impossible even if RLS is disabled.\nREVOKE ALL ON public.${t.table_name} FROM anon, authenticated;`,
          });
        }
      }
    
      // 2. SECURITY DEFINER functions: executable-by-anon AND missing search_path
      const funcs = await sql(
        token,
        ref,
        `SELECT
           p.proname AS function_name,
           p.prosecdef AS security_definer,
           pg_get_function_result(p.oid) AS return_type,
           has_function_privilege('anon', p.oid, 'EXECUTE') AS anon_execute,
           p.proconfig AS config
         FROM pg_proc p
         JOIN pg_namespace n ON n.oid = p.pronamespace
         WHERE n.nspname = 'public' AND p.prosecdef = true;`
      );
    
      for (const f of funcs) {
        // Trigger functions (return trigger) are not exploitable via PostgREST
        if (f.return_type === "trigger") continue;
        if (f.anon_execute) {
          findings.push({
            check: "function_security_definer_anon_executable",
            ...CHECKS.function_security_definer_anon_executable,
            target: f.function_name,
            details: { returns: f.return_type },
            fix_sql: `REVOKE EXECUTE ON FUNCTION public.${f.function_name} FROM anon;`,
          });
        }
        // SECURITY DEFINER without SET search_path = path-injection vector
        const hasSearchPath = Array.isArray(f.config) && f.config.some((c) => typeof c === "string" && c.toLowerCase().startsWith("search_path="));
        if (!hasSearchPath) {
          findings.push({
            check: "function_no_search_path",
            ...CHECKS.function_no_search_path,
            target: f.function_name,
            details: { returns: f.return_type, current_config: f.config },
            fix_sql: `ALTER FUNCTION public.${f.function_name} SET search_path = public, pg_temp;`,
          });
        }
      }
    
      // 2b. Realtime publication: tables exposed via supabase_realtime WebSocket
      let realtimeTables = [];
      try {
        realtimeTables = await sql(
          token,
          ref,
          `SELECT tablename FROM pg_publication_tables WHERE pubname = 'supabase_realtime' AND schemaname = 'public';`
        );
      } catch { /* publication may not exist */ }
      const tableRlsMap = new Map(tables.map((t) => [t.table_name, t.rls_enabled]));
      for (const rt of realtimeTables) {
        const rls = tableRlsMap.get(rt.tablename);
        if (rls === false) {
          const finding = {
            check: "realtime_publication_no_rls",
            ...CHECKS.realtime_publication_no_rls,
            target: rt.tablename,
            details: { in_publication: "supabase_realtime", rls_enabled: false },
            fix_sql: `ALTER TABLE public.${rt.tablename} ENABLE ROW LEVEL SECURITY;\n-- Or remove from publication: ALTER PUBLICATION supabase_realtime DROP TABLE public.${rt.tablename};`,
          };
          if (probeAvailable) {
            finding.probe = await probeAnonAccess(supabaseUrl, anonKey, rt.tablename);
          }
          findings.push(finding);
        }
      }
    
      // 3. Default privileges still grant CRUD on future tables?
      // Supabase tables can be owned by either `postgres` (SQL/CLI migrations) or `supabase_admin` (Dashboard).
      // Check both. ACL string format: anon=arwdDxtm/owner — letters: a=insert r=select w=update d=delete D=truncate x=refs t=trigger m=maintain.
      const defaults = await sql(
        token,
        ref,
        `SELECT defaclrole::regrole::text AS owner_role, defaclacl::text AS acl
         FROM pg_default_acl d
         JOIN pg_namespace n ON n.oid = d.defaclnamespace
         WHERE n.nspname = 'public' AND d.defaclobjtype = 'r';`
      );
    
      const ownersWithLeak = [];
      for (const ownerRole of ["postgres", "supabase_admin"]) {
        const row = defaults.find((d) => d.owner_role === ownerRole);
        if (!row) {
          // No custom default ACL = system grants apply (Supabase auto-grants CRUD to anon)
          ownersWithLeak.push(ownerRole);
          continue;
        }
        // Look for anon or authenticated still having CRUD letters (a=insert r=select w=update d=delete)
        const m = row.acl.match(/anon=([a-zA-Z]*)/);
        const auth = row.acl.match(/authenticated=([a-zA-Z]*)/);
        const hasCrud = (s) => s && /[arwd]/.test(s.replace(/[DxtmU]/g, ""));
        if (hasCrud(m && m[1]) || hasCrud(auth && auth[1])) {
          ownersWithLeak.push(ownerRole);
        }
      }
      if (ownersWithLeak.length > 0) {
        const fixes = [];
        if (ownersWithLeak.includes("postgres")) {
          fixes.push(
            `-- Tables created via SQL editor / migrations / CLI (owner = postgres):`,
            `ALTER DEFAULT PRIVILEGES FOR ROLE postgres IN SCHEMA public REVOKE SELECT, INSERT, UPDATE, DELETE ON TABLES FROM anon, authenticated, service_role;`,
            `ALTER DEFAULT PRIVILEGES FOR ROLE postgres IN SCHEMA public REVOKE USAGE, SELECT ON SEQUENCES FROM anon, authenticated, service_role;`,
          );
        }
        if (ownersWithLeak.includes("supabase_admin")) {
          fixes.push(
            ``,
            `-- Tables created via Supabase Dashboard (owner = supabase_admin) cannot be revoked from postgres role.`,
            `-- Toggle this in: Dashboard -> Project Settings -> Data API -> "Automatically expose new tables" = OFF`,
            `-- Or via Management API: PATCH /v1/projects/${ref}/postgrest with {"db_extra_search_path":"public", ...} (see docs).`,
          );
        }
        findings.push({
          check: "default_privileges_not_revoked",
          ...CHECKS.default_privileges_not_revoked,
          target: `schema:public (leaky owners: ${ownersWithLeak.join(", ")})`,
          details: { leaky_owner_roles: ownersWithLeak, note: "Supabase enforces revoke for all projects by Oct 30, 2026." },
          fix_sql: fixes.join("\n"),
        });
      }
    
      // 4. Public storage buckets
      const buckets = await getStorageBuckets(token, ref);
      for (const b of buckets) {
        if (b.public) {
          findings.push({
            check: "storage_bucket_public",
            ...CHECKS.storage_bucket_public,
            target: `bucket:${b.name}`,
            details: { id: b.id },
            fix_sql: `UPDATE storage.buckets SET public = false WHERE id = '${b.id}'; -- only if you don't need public CDN-style access`,
          });
        }
      }
    
      // 5. Auth config — multiple checks
      const authCfg = await getAuthConfig(token, ref);
      if (authCfg) {
        if (authCfg.disable_signup === false && authCfg.mailer_autoconfirm === true) {
          findings.push({
            check: "auth_signups_enabled_no_confirm",
            ...CHECKS.auth_signups_enabled_no_confirm,
            target: "auth:signups",
            details: { signups_enabled: true, autoconfirm: true },
            fix_sql: `-- Dashboard: Auth -> Providers -> Email -> "Confirm email" = ON\n-- API: PATCH /v1/projects/${ref}/config/auth body {"mailer_autoconfirm": false}`,
          });
        }
        if (authCfg.external_anonymous_users_enabled === true) {
          findings.push({
            check: "anonymous_signins_enabled",
            ...CHECKS.anonymous_signins_enabled,
            target: "auth:anonymous",
            details: { external_anonymous_users_enabled: true },
            fix_sql: `-- Dashboard: Auth -> Providers -> Anonymous Sign-Ins = OFF\n-- API: PATCH /v1/projects/${ref}/config/auth body {"external_anonymous_users_enabled": false}`,
          });
        }
        if (typeof authCfg.password_min_length === "number" && authCfg.password_min_length < 8) {
          findings.push({
            check: "weak_password_policy",
            ...CHECKS.weak_password_policy,
            target: "auth:password",
            details: { password_min_length: authCfg.password_min_length, password_required_characters: authCfg.password_required_characters },
            fix_sql: `-- Dashboard: Auth -> Providers -> Email -> "Minimum password length" >= 8\n-- API: PATCH /v1/projects/${ref}/config/auth body {"password_min_length": 12, "password_required_characters": "abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!@#$%^&*()"}`,
          });
        }
        if (authCfg.security_captcha_enabled === false && authCfg.disable_signup === false) {
          findings.push({
            check: "no_captcha_on_auth",
            ...CHECKS.no_captcha_on_auth,
            target: "auth:captcha",
            details: { security_captcha_enabled: false },
            fix_sql: `-- Dashboard: Auth -> Settings -> Enable CAPTCHA (hCaptcha or Cloudflare Turnstile)\n-- API: PATCH /v1/projects/${ref}/config/auth body {"security_captcha_enabled": true, "security_captcha_provider": "hcaptcha", "security_captcha_secret": "<your_secret>"}`,
          });
        }
      }
    
      // Sort findings by severity
      findings.sort((a, b) => SEVERITY_ORDER[a.severity] - SEVERITY_ORDER[b.severity]);
    
      const summary = findings.reduce(
        (acc, f) => ({ ...acc, [f.severity]: (acc[f.severity] || 0) + 1 }),
        { critical: 0, high: 0, medium: 0, low: 0, info: 0 }
      );
    
      const probed = findings.filter((f) => f.probe).length;
      const confirmed = findings.filter((f) => f.probe?.confirmed).length;
    
      return {
        project_ref: ref,
        project_name: meta.name || ref,
        region: meta.region || "unknown",
        scanned_at: new Date().toISOString(),
        scanned_by: "supabase-security v0.3",
        active_probe: { enabled: probeAvailable, probed, confirmed },
        summary,
        n_tables_scanned: tables.length,
        n_functions_scanned: funcs.length,
        n_buckets_scanned: buckets.length,
        findings,
      };
    }
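The default-privilege check in step 3 above matches role entries inside Postgres aclitem text (e.g. `anon=arwdDxt/postgres`). A minimal standalone sketch of that matching logic, with the inline checks pulled out into a named helper (the function name is mine; the real code inlines the regexes):

```javascript
// Does `role` still hold any CRUD privilege in an ACL string?
// Privilege letters: a=INSERT r=SELECT w=UPDATE d=DELETE
// D=TRUNCATE x=REFERENCES t=TRIGGER m=MAINTAIN (non-CRUD, stripped first).
function roleHasCrud(acl, role) {
  const m = acl.match(new RegExp(`${role}=([a-zA-Z]*)`));
  const letters = m && m[1];
  return !!(letters && /[arwd]/.test(letters.replace(/[DxtmU]/g, "")));
}
```

So `roleHasCrud("{anon=arwdDxt/postgres}", "anon")` is true, while an ACL granting only TRIGGER/REFERENCES is not flagged.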
  • CHECKS object mapping each security check ID to its severity, title, and explanation — defines the schema/classification for all finding types the audit can produce.
    const CHECKS = {
      rls_disabled: {
        severity: "critical",
        title: "RLS disabled on table accessible via anon",
        explain: "Without RLS, anon role with default CRUD grants can read/insert/delete any row.",
      },
      rls_no_policies_with_anon_grants: {
        severity: "low",
        title: "RLS-locked table still has direct anon grants (defense-in-depth)",
        explain: "Currently safe — RLS blocks all access. But if RLS is ever disabled by mistake, data leaks instantly. Best practice: revoke grants too.",
      },
      function_security_definer_anon_executable: {
        severity: "high",
        title: "SECURITY DEFINER function executable by anon",
        explain: "Function runs with creator privileges. If buggy, escalates to admin.",
      },
      default_privileges_not_revoked: {
        severity: "medium",
        title: "Default privileges not revoked from anon/authenticated",
        explain: "New tables you create will be auto-exposed. Supabase enforces this by Oct 30, 2026.",
      },
      storage_bucket_public: {
        severity: "high",
        title: "Storage bucket is public",
        explain: "Anyone can list and download all files in the bucket.",
      },
      auth_signups_enabled_no_confirm: {
        severity: "medium",
        title: "Signups enabled without email confirmation",
        explain: "Anyone can create accounts and bypass email-gated logic.",
      },
      realtime_publication_no_rls: {
        severity: "critical",
        title: "Table in supabase_realtime publication WITHOUT RLS",
        explain: "Realtime sends row changes over WebSockets to anyone subscribed with the anon key. RLS policies are checked, but with RLS disabled there's nothing to check. Every INSERT/UPDATE is broadcast.",
      },
      anonymous_signins_enabled: {
        severity: "high",
        title: "Anonymous sign-ins enabled",
        explain: "Anyone can become an 'authenticated' user without email verification. Defeats `auth.uid() IS NOT NULL` style RLS policies.",
      },
      weak_password_policy: {
        severity: "medium",
        title: "Weak password policy",
        explain: "Minimum length below 8 characters. Use at least 8 + complexity requirements (digits/symbols).",
      },
      no_captcha_on_auth: {
        severity: "medium",
        title: "No CAPTCHA on auth endpoints",
        explain: "Without CAPTCHA, signup/login forms can be brute-forced or spammed by bots.",
      },
      function_no_search_path: {
        severity: "medium",
        title: "SECURITY DEFINER function without SET search_path",
        explain: "Function with mutable search_path can be hijacked: an attacker with CREATE on any schema in the path can shadow built-in functions and run arbitrary code as the function owner.",
      },
    };
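The audit() function sorts findings with a SEVERITY_ORDER constant that is not shown in this excerpt. Assuming the conventional critical-first ranking over the five severities used by CHECKS, the sort behaves like:

```javascript
// Assumed ranking (the real constant lives elsewhere in src/audit.js).
const SEVERITY_ORDER = { critical: 0, high: 1, medium: 2, low: 3, info: 4 };

const findings = [
  { check: "weak_password_policy", severity: "medium" },
  { check: "rls_disabled", severity: "critical" },
  { check: "storage_bucket_public", severity: "high" },
];

// Same comparator as in audit(): lower rank sorts first.
findings.sort((a, b) => SEVERITY_ORDER[a.severity] - SEVERITY_ORDER[b.severity]);
// findings[0] is now the critical rls_disabled finding
```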
Behavior: 2/5

Does the description disclose side effects, auth requirements, rate limits, or destructive behavior?

No annotations are provided, so the description carries the full burden. It discloses caching and the output type, but does not say whether the operation is read-only, what auth is required beyond the token, or whether calling it has side effects.

Agents need to know what a tool does to the world before calling it. Descriptions should go beyond structured annotations to explain consequences.

Conciseness: 5/5

Is the description appropriately sized, front-loaded, and free of redundancy?

Two efficient sentences: first defines purpose with concrete examples, second notes output and caching. No redundant words.

Shorter descriptions cost fewer tokens and are easier for agents to parse. Every sentence should earn its place.

Completeness: 3/5

Given the tool's complexity, does the description cover enough for an agent to succeed on first attempt?

Describes output format vaguely ('findings JSON') and mentions caching for apply_fix tools, but lacks detail on return structure (no output schema) and does not explain how list_findings relates to cached results.

Complex tools with many parameters or behaviors need more documentation. Simple tools need less. This dimension scales expectations accordingly.

Parameters: 3/5

Does the description clarify parameter syntax, constraints, interactions, or defaults beyond what the schema provides?

Schema coverage is 100%, so the baseline is 3. The description adds little beyond the schema's own field descriptions (e.g., that token is optional when the env var is set is already stated in the schema). No additional parameter guidance is given.

Input schemas describe structure but not intent. Descriptions should explain non-obvious parameter relationships and valid value ranges.

Purpose: 5/5

Does the description clearly state what the tool does and how it differs from similar tools?

Explicitly states it scans a Supabase project for security issues and lists specific categories (RLS gaps, etc.), clearly distinguishing from sibling tools that apply fixes or list findings.

Agents choose between tools based on descriptions. A clear purpose with a specific verb and resource helps agents select the right tool.

Usage Guidelines: 3/5

Does the description explain when to use this tool, when not to, or what alternatives exist?

Mentions caching for use by apply_fix tools, hinting at a workflow, but does not explicitly state when to use versus siblings like list_findings, nor when not to use.

Agents often have multiple tools that could apply. Explicit usage guidance like "use X instead of Y when Z" prevents misuse.
