Linear MCP Server (HTTP / OAuth / Remote)
Warning: You connect this server to your MCP client at your own responsibility. Language models can make mistakes, misinterpret instructions, or perform unintended actions. Review tool outputs, verify changes (e.g., with `list_issues`), and prefer small, incremental writes. In production, enforce least‑privilege credentials, audit logs, and approval workflows.
A streamable HTTP MCP server for Linear that lets you manage issues, projects, teams, users, comments, and cycles — locally or remotely.
Below is a comparison between the official Linear MCP (top) and this MCP (bottom).
Notice
This repo works in two ways:
- As a Node/Hono server for local workflows
- As a Cloudflare Worker for remote interactions
The HTTP/OAuth setup here is designed for convenience during development, not for production‑grade security. If you’re deploying to Cloudflare, see Remote Model Context Protocol servers (MCP) for details.
Motivation
I’m a big fan of Linear and use it daily — for both personal projects and professional workflows, including automations. At the time of writing, the official MCP server isn’t fully optimized for language models (this may change soon, as Linear is actively improving it).
This server is built with a few key goals in mind:
- Let LLMs easily find things like Team IDs, Project IDs, Status IDs, or User IDs in a single action (`workspace_metadata`) instead of calling multiple tools just to gather required data.
- Include clear MCP instructions and schema descriptions that cut API jargon, making it more likely the model uses the right tool in the right order.
- Map API responses into human‑readable feedback — useful for both the LLM and the user.
- Provide hints and suggestions for next steps, plus tips on using available data or recovering from errors.
- Support batch actions (e.g., `add_issues` instead of `add_issue`) so the LLM can perform multiple steps in one go.
- Prefetch related values — for example, return both a status ID and the actual status name for an issue.
- Hide tools not enabled in a given team’s settings (like `cycles_list`) to reduce noise.
- Adjust schemas to match workspace preferences, such as issue priority formats.
In short, it’s not a direct mirror of Linear’s API — it’s tailored so AI agents and chat clients know exactly how to use it effectively.
Installation & development
Prerequisites: Bun, Node.js 24+, Linear account. For remote: a Cloudflare account and the Wrangler package.
You also need an MCP client such as Alice, Claude Desktop, Cursor, or the MCP Inspector.
Ways to run (pick one)
- Local (API key)
- Local + OAuth
- Cloudflare Worker — wrangler dev (local Worker)
- Cloudflare Worker — remote deploy
1) Quick start (local workflow with API key)
This is the easiest way to start. Run the server with your Linear Personal Access Token (Settings → Security).
https://linear.app/[your-account-name]/settings/account/security
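A minimal sketch of the quick start, assuming the repo uses Bun scripts and reads the token from a `LINEAR_API_KEY` environment variable (both names are assumptions; check `env.local-api.example` for the exact ones):

```bash
# Install dependencies and start the local server (script name is an assumption)
bun install
LINEAR_API_KEY="lin_api_..." PORT=3040 bun run dev
# The MCP endpoint is then available at http://localhost:3040/mcp
```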
Now connect this server to Alice (Settings → MCP) and set it up as follows:
Or use Claude Desktop with the following settings:
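For example, a Claude Desktop entry that proxies the local endpoint through `mcp-remote` (the server name `linear` is illustrative; use whatever `PORT` you started the server with):

```json
{
  "mcpServers": {
    "linear": {
      "command": "npx",
      "args": ["-y", "mcp-remote", "http://localhost:3040/mcp"]
    }
  }
}
```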
2) Alternative: Local + OAuth
This is a more advanced workflow because it requires creating an OAuth application in Linear. Example:
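A sketch of the extra environment the OAuth flow needs, loosely following the repo's `env.local-oauth.example` (the client ID/secret variable names are assumptions; `OAUTH_SCOPES` and `OAUTH_REDIRECT_URI` are referenced later in this README):

```bash
# Values come from the OAuth application you create in Linear
PORT=3040
OAUTH_CLIENT_ID="..."        # assumption: exact variable name may differ
OAUTH_CLIENT_SECRET="..."    # assumption: exact variable name may differ
OAUTH_REDIRECT_URI="http://localhost:3041/callback"   # callback path is illustrative
OAUTH_SCOPES="read write"
```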
Tip: the local Authorization Server (for OAuth) runs on PORT+1. If `PORT=3040`, auth is on `http://localhost:3041`.
When the server is up, connect to Alice:
Alternatively, connect with Claude Desktop:
RS‑only mode (recommended for remote clients)
Enable these flags to require RS‑minted bearer tokens. When enabled, requests without `Authorization` or with a non‑mapped `Bearer <opaque>` will receive `401` with `WWW-Authenticate` so OAuth can start (works with `mcp-remote`).
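The flag names below are hypothetical placeholders to illustrate the shape of the configuration; check the env example files in the repo for the real names:

```bash
# Hypothetical flag names; consult env.example for the actual ones
AUTH_RS_ONLY=true                 # require RS-minted bearer tokens
AUTH_ALLOW_LINEAR_BEARER=false    # reject raw Linear tokens passed as Bearer (see Worker notes below)
```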
3) Cloudflare Worker — wrangler dev (local Worker)
Fast way to test the Worker locally.
If you want to pass a PAT directly in dev:
With OAuth, also set:
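A sketch covering both cases via Wrangler's standard `.dev.vars` file (apart from `OAUTH_SCOPES` and `OAUTH_REDIRECT_URI`, which appear elsewhere in this README, the variable names are assumptions):

```bash
# linear/.dev.vars (read automatically by `wrangler dev`)
LINEAR_API_KEY="lin_api_..."                          # PAT for dev; variable name is an assumption
OAUTH_SCOPES="read write"                             # OAuth case
OAUTH_REDIRECT_URI="http://127.0.0.1:8787/callback"   # OAuth case; callback path is illustrative

# Run the Worker locally from the directory containing wrangler.toml
npx wrangler dev
```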
Endpoint (dev): `http://127.0.0.1:8787/mcp` (Wrangler prints the exact port).
4) Cloudflare Worker — remote deploy
Wrangler reference (already included as `linear/wrangler.toml`):
Deploy with API key:
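A sketch of the deploy commands, assuming the secret is named `LINEAR_API_KEY` (the name is an assumption; `wrangler secret put` and `wrangler deploy` are standard Wrangler commands):

```bash
# From the directory containing wrangler.toml
npx wrangler secret put LINEAR_API_KEY   # paste your Personal Access Token when prompted
npx wrangler deploy
```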
Endpoint: `https://<worker-name>.<account>.workers.dev/mcp`.
Environment examples
- Local (API key): `env.local-api.example`
- Local + OAuth: `env.local-oauth.example`
- Generic defaults: `env.example`
Remote (Cloudflare Worker) with OAuth
- Create KV for token mapping and add it to `wrangler.toml` (see the combined sketch after this list):
- Set secrets and vars:
- Ensure `OAUTH_SCOPES = "read write"` and include your Worker callback in the Linear app and allowlist:
- Deploy:
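A combined sketch of these steps (the KV binding, secret, and variable names are assumptions; the Wrangler commands themselves are standard):

```bash
# 1) Create the KV namespace for token mapping, then copy the printed id into
#    wrangler.toml under [[kv_namespaces]] with your chosen binding name
npx wrangler kv namespace create OAUTH_KV

# 2) Secrets and vars (names are assumptions; use the ones the repo expects)
npx wrangler secret put OAUTH_CLIENT_ID
npx wrangler secret put OAUTH_CLIENT_SECRET

# 3) Scopes and redirect allowlist can live as [vars] in wrangler.toml, e.g.:
#    OAUTH_SCOPES = "read write"
#    OAUTH_REDIRECT_URI = "https://<worker-name>.<account>.workers.dev/callback"   # path is illustrative

# 4) Deploy
npx wrangler deploy
```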
The Worker advertises OAuth discovery and maps Resource‑Server tokens to Linear tokens using KV. It reuses the same tool handlers as the local server. In RS‑only mode, it will:
- 401‑challenge when Authorization is missing
- 401‑challenge when a non‑mapped Bearer is presented (unless `AUTH_ALLOW_LINEAR_BEARER=true`)
- Rewrite a mapped RS Bearer to a Linear access token before invoking tools
Troubleshooting (Worker)
- If OAuth doesn’t start: `curl -i -X POST https://<worker>/mcp ...` should return `401` with `WWW-Authenticate` and `Mcp-Session-Id`.
- If tools appear empty in Claude: ensure the Worker returns JSON Schema for `tools/list` (this repo does), and configure Claude with `mcp-remote` (not Research connectors).
- If redirect is blocked: set a valid `OAUTH_REDIRECT_URI` and allowlist; for dev you can set `NODE_ENV=development` and keep loopback hosts.
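For reference, a fuller version of that first check (the request body is a minimal MCP `initialize` call; the expected headers come from the RS‑only behavior described above):

```bash
curl -i -X POST "https://<worker>/mcp" \
  -H "Content-Type: application/json" \
  -H "Accept: application/json, text/event-stream" \
  -d '{"jsonrpc":"2.0","id":1,"method":"initialize","params":{"protocolVersion":"2025-03-26","capabilities":{},"clientInfo":{"name":"curl","version":"0.0.0"}}}'
# Expect: 401 with WWW-Authenticate and Mcp-Session-Id headers
```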
Client configuration
MCP Inspector (quick test):
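For example, pointing the Inspector at the local endpoint (the URL assumes the defaults used earlier in this README):

```bash
npx @modelcontextprotocol/inspector
# then connect to http://localhost:3040/mcp as a Streamable HTTP server in the UI
```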
Claude Desktop / Cursor via mcp‑remote:
For Cloudflare, replace the URL with `https://<worker-name>.<account>.workers.dev/mcp`.
Examples
1) List my issues due today
Request (get viewer timezone/id for context):
Request (issues assigned to me, due today):
Response (example):
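As a rough sketch, the "due today" request could be a `tools/call` like the one below, using the viewer id returned by the first call; `list_issues` is named earlier in this README, but the argument names are hypothetical:

```json
{
  "jsonrpc": "2.0",
  "id": 2,
  "method": "tools/call",
  "params": {
    "name": "list_issues",
    "arguments": {
      "assigneeId": "<viewer-id-from-previous-call>",
      "dueDate": "2025-01-15"
    }
  }
}
```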
2) Create an issue for Alice v3.8 and add it to the project
Request (discover team/project ids):
Response (example):
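A sketch of the follow-up creation call using the batch tool mentioned earlier; the argument shape is hypothetical, with IDs taken from the `workspace_metadata` response:

```json
{
  "jsonrpc": "2.0",
  "id": 3,
  "method": "tools/call",
  "params": {
    "name": "add_issues",
    "arguments": {
      "issues": [
        {
          "teamId": "<team-id-from-workspace_metadata>",
          "projectId": "<project-id-from-workspace_metadata>",
          "title": "Alice v3.8 release checklist"
        }
      ]
    }
  }
}
```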
3) Reschedule a release and mark a meeting as Done
Find the release issue:
Find the meeting issue:
Resolve workflow states (Done) for the team:
Update both:
Response (example):
License
MIT