Sequa MCP

Official, by sequa-ai

This repository is the entry point for using Sequa via the Model Context Protocol (MCP). If you arrived here looking to "add Sequa as an MCP server" to Cursor, Claude, Windsurf, VSCode, Cline, Highlight, Augment, or any other MCP‑capable client — you are in the right place.

It gives you a single drop‑in command that bridges STDIO/command MCP transports used by many IDEs today with Sequa’s native streamable HTTP MCP endpoint.
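To make the bridging idea concrete, here is a minimal sketch of what such a proxy does, in TypeScript. It is illustrative only: it assumes the endpoint accepts plain JSON-RPC POSTs, while the real streamable HTTP transport also manages sessions and server-sent event streams, and all function names here are hypothetical, not Sequa's actual API.

```typescript
// A JSON-RPC message as it arrives, one per line, on the IDE's stdio transport.
type JsonRpcMessage = {
  jsonrpc: "2.0";
  id?: number | string;
  method?: string;
  params?: unknown;
};

type HttpInit = {
  method: string;
  headers: Record<string, string>;
  body: string;
};

// Turn one stdin line into an HTTP request against the remote MCP endpoint.
function buildHttpRequest(
  line: string,
  endpoint: string
): { url: string; init: HttpInit } {
  const message: JsonRpcMessage = JSON.parse(line);
  return {
    url: endpoint,
    init: {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify(message),
    },
  };
}

// Forward newline-delimited stdin messages to the endpoint; echo replies to stdout.
async function bridge(endpoint: string): Promise<void> {
  process.stdin.setEncoding("utf8");
  let buffered = "";
  for await (const chunk of process.stdin) {
    buffered += chunk;
    let newline: number;
    while ((newline = buffered.indexOf("\n")) >= 0) {
      const line = buffered.slice(0, newline);
      buffered = buffered.slice(newline + 1);
      if (!line.trim()) continue;
      const { url, init } = buildHttpRequest(line, endpoint);
      const response = await fetch(url, init);
      process.stdout.write((await response.text()) + "\n");
    }
  }
}
```

The design point is simply that the IDE only ever sees a local command with stdin/stdout, while the proxy handles the network hop.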


🧱 Prerequisites (Read First!)

Before you configure any AI agent:

  1. Create / sign in to your Sequa account at https://app.sequa.ai/login.
  2. Set up a project inside the Sequa app.
  3. Inside that project, locate the MCP Setup URLs and select the transport your AI agent supports.
  4. Copy the URL or configuration and install it in your client.

If you skip project creation, the proxy will still launch, but the MCP server will refuse connections and you will receive authentication or project errors.


🤔 What is Sequa?

Sequa is a Contextual Knowledge Engine that unifies code, documentation and more across multiple repositories and continuously streams that context to any LLM‑powered agent. By injecting deep, current project knowledge, Sequa enables assistants to:

  • Execute architecture-aware, cross-repo tasks
  • Understand project goals and state
  • Generate more accurate, production-ready code
  • Centralize AI coding rules and best practices

🚀 Quick Start (Proxy Launch)

NPX (most common)

npx -y @sequa-ai/sequa-mcp@latest https://mcp.sequa.ai/v1/setup-code-assistant

Replace the URL with your project-specific endpoint if the MCP Setup page in your project shows a different one.


🔌 IDE / Tool Configuration

Cursor (~/.cursor/mcp.json)

{
  "mcpServers": {
    "sequa": {
      "url": "https://mcp.sequa.ai/v1/setup-code-assistant"
    }
  }
}

Claude Desktop (Settings → Developer → Edit Config)

{
  "mcpServers": {
    "sequa": {
      "command": "npx",
      "args": [
        "-y",
        "@sequa-ai/sequa-mcp@latest",
        "https://mcp.sequa.ai/v1/setup-code-assistant"
      ]
    }
  }
}

Windsurf (~/.codeium/windsurf/mcp_config.json)

{
  "mcpServers": {
    "sequa": {
      "command": "npx",
      "args": [
        "-y",
        "@sequa-ai/sequa-mcp@latest",
        "https://mcp.sequa.ai/v1/setup-code-assistant"
      ]
    }
  }
}

VS Code (.vscode/mcp.json)

{
  "servers": {
    "sequa": {
      "command": "npx",
      "args": [
        "-y",
        "@sequa-ai/sequa-mcp@latest",
        "https://mcp.sequa.ai/v1/setup-code-assistant"
      ]
    }
  }
}

Cline / Claude Dev Tools (cline_mcp_settings.json)

{
  "mcpServers": {
    "sequa": {
      "command": "npx",
      "args": [
        "-y",
        "@sequa-ai/sequa-mcp@latest",
        "https://mcp.sequa.ai/v1/setup-code-assistant"
      ],
      "disabled": false,
      "autoApprove": []
    }
  }
}

Highlight AI (GUI → Plugins → Custom Plugin → Add using a command)

npx -y @sequa-ai/sequa-mcp@latest https://mcp.sequa.ai/v1/setup-code-assistant

Augment Code

npx -y @sequa-ai/sequa-mcp@latest https://mcp.sequa.ai/v1/setup-code-assistant

Or augment_config.json:

{
  "mcpServers": {
    "sequa": {
      "command": "npx",
      "args": [
        "-y",
        "@sequa-ai/sequa-mcp@latest",
        "https://mcp.sequa.ai/v1/setup-code-assistant"
      ]
    }
  }
}
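All of the configurations above share one shape: each server entry carries either a `url` (for clients that speak HTTP directly) or a `command` plus `args` (for clients that launch the stdio proxy). A small sketch like the following, with a hypothetical `validateMcpConfig` helper, can catch a malformed entry before the IDE silently ignores it:

```typescript
// Hypothetical sanity check for the config shapes shown above.
// Assumes entries live under "mcpServers" (most clients) or "servers" (VS Code).
type ServerEntry = { url?: string; command?: string; args?: string[] };

function validateMcpConfig(raw: string): string[] {
  const errors: string[] = [];
  const config = JSON.parse(raw);
  const servers: Record<string, ServerEntry> =
    config.mcpServers ?? config.servers ?? {};
  for (const [name, entry] of Object.entries(servers)) {
    const hasUrl = typeof entry.url === "string";
    const hasCommand =
      typeof entry.command === "string" && Array.isArray(entry.args);
    // Every entry needs one transport or the other.
    if (!hasUrl && !hasCommand) {
      errors.push(`${name}: needs either "url" or "command" + "args"`);
    }
  }
  return errors;
}
```

For example, a Cursor-style entry with only a `url` passes, while an entry missing both fields is reported.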

