Tecton MCP Server & Cursor Rules

Tecton's Co-Pilot consists of an MCP server and Cursor rules. Read this blog post to learn more.

ℹ️ Info: This guide walks you through setting up the Tecton MCP server from this repository and configuring your feature repository to use it when developing features with Tecton.

Table of Contents

  • Quick Start
  • Tecton MCP Tools
  • Architecture
  • Setup Tecton with Cursor
    • Configure the Tecton MCP Server in Cursor
    • Add Cursor rules
    • Tecton Login
    • Recommended LLM
    • Verify that the Cursor <> Tecton MCP Integration is working as expected
    • Start AI-Assisted Feature Engineering :-)
  • How to Use Specific Tecton SDK Version
  • Troubleshooting
    • Cursor <-> Tecton MCP Server integration
    • Run MCP in Diagnostics Mode
  • Resources
  • License

Quick Start

  1. Clone this repository to your local machine:
    git clone https://github.com/tecton-ai/tecton-mcp.git
    cd tecton-mcp
    pwd
    Note: The path to the directory where you just cloned the repository will be referred to as <path-to-your-local-clone> in the following steps. The pwd command at the end prints that full path.
  2. Install the uv package manager:
    brew install uv
  3. Verify your installation by running the following command. Replace <path-to-your-local-clone> with the path where you cloned the repository in step 1:
    MCP_SMOKE_TEST=1 uv --directory <path-to-your-local-clone> run mcp run src/tecton_mcp/mcp_server/server.py
    The command should exit without any errors and print a message similar to "MCP_SMOKE_TEST is set. Exiting after initialization." This confirms that your local setup works correctly; Cursor will automatically spawn the MCP server as a subprocess when needed.
  4. Configure Cursor (or any other MCP client) with the MCP server (see below).
  5. Log into your Tecton cluster:
    tecton login yourcluster.tecton.ai
  6. Launch Cursor and start developing features with Tecton's Co-Pilot in Cursor!

Tecton MCP Tools

The Tecton MCP server exposes the following tools that can be used by an MCP client (like Cursor):

Tool Name | Description
query_example_code_snippet_index_tool | Finds relevant Tecton code examples using a vector database. Helpful for finding usage patterns before writing new Tecton code.
query_documentation_index_tool | Retrieves Tecton documentation snippets based on a query. Provides context directly from Tecton's official documentation.
get_full_tecton_sdk_reference_tool | Fetches the complete Tecton SDK reference, including all available classes and functions. Use when a broad overview of the SDK is needed.
query_tecton_sdk_reference_tool | Fetches the Tecton SDK reference for a specified list of classes or functions. Ideal for targeted information on specific SDK components.
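
These tools are normally invoked by Cursor on your behalf, but you can also exercise them directly from a script when experimenting or debugging. The following is a minimal, hypothetical sketch (not part of this repository) that uses the official mcp Python SDK to spawn the server over stdio, list its tools, and call query_documentation_index_tool. The "query" argument name is an assumption; check the schema returned by list_tools() for the actual parameter names.

    # mcp_client_check.py - hypothetical helper, not part of this repository.
    # Spawns the Tecton MCP server the same way Cursor does and calls one tool.
    import asyncio

    from mcp import ClientSession, StdioServerParameters
    from mcp.client.stdio import stdio_client

    # Same command Cursor uses; replace <path-to-your-local-clone> with your path.
    server = StdioServerParameters(
        command="uv",
        args=[
            "--directory", "<path-to-your-local-clone>",
            "run", "mcp", "run", "src/tecton_mcp/mcp_server/server.py",
        ],
    )

    async def main():
        async with stdio_client(server) as (read, write):
            async with ClientSession(read, write) as session:
                await session.initialize()

                # Should list the four tools described above.
                tools = await session.list_tools()
                print([tool.name for tool in tools.tools])

                # The "query" argument name is an assumption - inspect the tool
                # schema returned by list_tools() for the real parameter names.
                result = await session.call_tool(
                    "query_documentation_index_tool",
                    arguments={"query": "What is a BatchFeatureView?"},
                )
                print(result.content)

    asyncio.run(main())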

Architecture

The Tecton MCP server integrates with LLM-powered editors like Cursor to provide tool-based context and assistance for feature engineering:

Tecton MCP Architecture

The overall flow for building features with Tecton MCP looks like:

Tecton MCP Flow Chart

Setup Tecton with Cursor

The following has been tested with Cursor 0.48 and above.

Configure the Tecton MCP Server in Cursor

Navigate to Cursor Settings -> MCP and click the "Add new global MCP server" button, which will edit Cursor's mcp.json file. Add Tecton as an MCP server. You can use the following config as a starting point; make sure you change <path-to-your-local-clone> to the directory where you cloned the repository:

{ "mcpServers": { "tecton": { "command": "uv", "args": [ "--directory", "<path-to-your-local-clone>", "run", "mcp", "run", "src/tecton_mcp/mcp_server/server.py" ] } } }

Add Cursor rules

Copy the Cursor rules from this repository's .cursor/rules folder into the .cursor/rules folder of your feature repository:

# Create the .cursor/rules directory structure in your feature repository
mkdir -p <path-to-your-feature-repo>/.cursor/rules

# Then copy the rules
cp -r <path-to-your-local-clone>/.cursor/rules/* <path-to-your-feature-repo>/.cursor/rules/

Tecton Login

Log into your Tecton cluster:

tecton login yourcluster.tecton.ai

Recommended LLM

As of April 17th, the following is the stack-ranked list of the best-performing LLMs for Tecton feature engineering in Cursor:

  • OpenAI o3
  • Gemini 2.5 pro exp (03-25)
  • Sonnet 3.7

Verify that the Cursor <> Tecton MCP Integration is working as expected

To make sure that your integration works as expected, ask the Cursor Agent a question like the following and make sure it's properly invoking your Tecton MCP tools:

Query Tecton's Examples Index and tell me something about BatchFeatureViews and how they differ from StreamFeatureViews. Also look at the SDK Reference.

Start AI-Assisted Feature Engineering :-)

Now you can open your feature repository in Cursor and start using Tecton's Co-Pilot, directly integrated into your editor.

View this Loom to see how you can use the integration to build new features: https://www.loom.com/share/3658f665668a41d2b0ea2355b433c616
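
For orientation, the snippet below sketches the kind of code the Co-Pilot helps you produce: a small Tecton batch feature view. It is illustrative only; the data source, entity, and column names are invented, and exact class names and parameters can vary between SDK versions, so ask the Co-Pilot to confirm them against the SDK reference tools before relying on this.

    # Illustrative sketch only - the source, entity, and columns are invented, and
    # signatures may differ across Tecton SDK versions; verify with the
    # query_tecton_sdk_reference_tool.
    from datetime import timedelta

    from tecton import Attribute, BatchSource, Entity, FileConfig, batch_feature_view
    from tecton.types import Field, Float64, String

    user = Entity(name="user", join_keys=[Field("user_id", String)])

    transactions = BatchSource(
        name="transactions",
        batch_config=FileConfig(
            uri="s3://example-bucket/transactions.parquet",  # hypothetical location
            file_format="parquet",
            timestamp_field="timestamp",
        ),
    )

    @batch_feature_view(
        sources=[transactions],
        entities=[user],
        mode="pandas",
        batch_schedule=timedelta(days=1),
        timestamp_field="timestamp",
        features=[Attribute(name="amount", dtype=Float64)],
    )
    def user_transaction_amount(transactions):
        # Pass the raw transaction amount through as a feature value.
        return transactions[["user_id", "timestamp", "amount"]]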

How to Use Specific Tecton SDK Version

By default, this tool provides guidance for the latest pre-release of the Tecton SDK. If you need the tools to align with a specific released version of Tecton (for example 1.0.34 or 1.1.10), follow these steps:

  1. Pin the version in pyproject.toml. Open pyproject.toml and replace the existing dependency line
    dependencies = [
      # ... other dependencies ...
      "tecton>=0.8.0a0"
    ]
    with the exact version you want, e.g.
    dependencies = [
      # ... other dependencies ...
      "tecton==1.1.10"
    ]
  2. Remove the existing lock file. Because uv.lock records the dependency graph, delete it so that uv can resolve the new Tecton version:
    cd <path-to-your-local-clone>
    rm uv.lock
  3. Re-generate the lock file by re-running Step 3 (the MCP_SMOKE_TEST=1 uv --directory command) of the Quick Start section. This will download the pinned version into an isolated environment for MCP and re-create uv.lock.
  4. Restart Cursor so that the new Tecton version is loaded into the MCP virtual environment.

Supported versions: The tools currently support Tecton ≥ 1.0.0. Code examples are not versioned yet and always use the latest stable SDK, but the documentation and SDK reference indices will match the version you've pinned.
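
If you want to confirm which Tecton SDK version uv actually resolved after re-locking, one option is a throwaway helper script run inside the project's environment (for example with uv --directory <path-to-your-local-clone> run python check_version.py). The file name and approach are just a suggestion, not something shipped with this repository:

    # check_version.py - hypothetical throwaway helper; run it with
    #   uv --directory <path-to-your-local-clone> run python check_version.py
    # so it executes inside the same environment the MCP server uses.
    from importlib.metadata import version

    # Prints the Tecton SDK version resolved from pyproject.toml / uv.lock.
    print(version("tecton"))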

Troubleshooting

Cursor <-> Tecton MCP Server integration

Make sure that Cursor shows "tecton" as an "Enabled" MCP server under "Cursor Settings -> MCP". If you don't see a green dot, run the MCP server in Diagnostics mode (see below).

Run MCP in Diagnostics Mode

To debug the Tecton MCP Server, run the following command. Replace <path-to-your-local-clone> with the actual path where you cloned the repository:

uv --directory <path-to-your-local-clone> run mcp dev src/tecton_mcp/mcp_server/server.py

Note: Launching Tecton's MCP Server will take a few seconds because it's loading an embedding model into memory that it uses to search for relevant code snippets.

Wait a few seconds until stdout reports that the MCP Inspector is up and running, then access it at the printed URL (something like http://localhost:5173).

Click "Connect" and then list tools. You should see the Tecton MCP Server tools and be able to query them.

Resources

License

This project is licensed under the MIT License.


