TechLead MCP
@gpriday/techlead-mcp is a read-only MCP server that gives coding agents two structured tools:
techlead.plan
techlead.review
It supports local stdio mode, uses provider-side structured outputs, and can route across OpenAI, Anthropic Claude, and Gemini.
Each tool response includes the structured plan/review, provider token usage when returned by the API, and an estimated USD provider cost derived from the configured per-model pricing catalog. Provider billing dashboards remain authoritative.
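For orientation, a response might carry those three pieces of information in a shape like the following. The field names here are illustrative assumptions, not documented output; only the categories (structured plan/review, token usage, cost estimate) come from the description above.

```json
{
  "plan": { "steps": ["..."] },
  "usage": { "inputTokens": 1200, "outputTokens": 800 },
  "estimatedCostUsd": 0.0123,
  "provider": "openai"
}
```

Because the cost figure is derived from a local pricing catalog, treat it as an estimate and reconcile against the provider's billing dashboard.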
Install
npm install -g @gpriday/techlead-mcp

Configure
Create a .env with at least one provider key:
OPENAI_API_KEY=
ANTHROPIC_API_KEY=
GEMINI_API_KEY=

Optional config:
techlead-mcp init

Optional GitHub issue task source:
TECHLEAD_GITHUB_TOKEN=github_pat_or_fine_grained_token

With this set, callers may omit task and instead provide:
{
"cwd": "/repo",
"githubIssue": {
"repository": "owner/repo",
"issueNumber": 123
},
"files": []
}

The server reads the issue body and issue comments through the GitHub REST API and uses that text as the task.
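A minimal sketch of how issue text could be assembled into a task, assuming the two standard GitHub REST endpoints (`GET /repos/{owner}/{repo}/issues/{issue_number}` and its `/comments` sibling). `composeTaskText` is a hypothetical helper name, not part of the package's API.

```typescript
// Shapes matching the relevant fields of the GitHub REST API responses.
interface Issue {
  title: string;
  body: string | null; // GitHub returns null for empty issue bodies
}
interface IssueComment {
  body: string;
}

// Hypothetical helper: join the issue title, body, and comments into one
// task string, skipping empty segments.
function composeTaskText(issue: Issue, comments: IssueComment[]): string {
  const parts = [issue.title, issue.body ?? "", ...comments.map((c) => c.body)];
  return parts.filter((p) => p.trim().length > 0).join("\n\n");
}
```

The real server may order or trim these segments differently; the point is that both the issue body and its comments feed the task text.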
Optional cheap-model router:
TECHLEAD_OPTIONAL_LLM_ROUTER=true
TECHLEAD_ROUTER_PROVIDER=openai
TECHLEAD_DEFAULT_MODEL_TIER=balanced

When enabled, the router model receives the resolved task text, including GitHub issue text when githubIssue is used, plus context size and risk signals. It may choose OpenAI, Anthropic, or Gemini for the actual plan/review call, but deterministic safety rules still force max-tier routing for auth, security, migrations, data loss, failing tests, and similar high-risk work.
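The deterministic safety rule described above can be sketched as a keyword check that overrides the router's choice. The pattern list and function names here are illustrative assumptions, not the package's actual rules.

```typescript
// Hypothetical risk patterns mirroring the categories named in the docs:
// auth, security, migrations, data loss, failing tests.
const HIGH_RISK_PATTERNS: RegExp[] = [
  /\bauth(entication|orization)?\b/i,
  /\bsecurity\b/i,
  /\bmigrations?\b/i,
  /\bdata loss\b/i,
  /\bfailing tests?\b/i,
];

type Tier = "fast" | "balanced" | "max";

// Deterministic override: high-risk tasks always get the max tier,
// regardless of what the cheap router model picked.
function resolveTier(routerChoice: Tier, taskText: string): Tier {
  const highRisk = HIGH_RISK_PATTERNS.some((re) => re.test(taskText));
  return highRisk ? "max" : routerChoice;
}
```

Keeping this rule outside the LLM router means a misjudged routing decision can never downgrade safety-critical work.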
Run
techlead-mcp serve
techlead-mcp models

TechLead MCP is stdio-only because it is designed to run next to the repository and read safe files under configured local roots. By default, local reads are restricted to the process working directory. Do not run the server from a sensitive parent directory.
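The "reads restricted to configured local roots" behavior amounts to a path-containment check: resolve the requested path against the root and reject anything that escapes it. This is a sketch of the general technique; the function name is an assumption and the server's actual implementation may differ.

```typescript
import * as path from "node:path";

// Return true only if `requested`, resolved against `root`, stays inside
// `root`. Relative paths that climb out (e.g. "../etc/passwd") and absolute
// paths outside the root are rejected.
function isInsideRoot(root: string, requested: string): boolean {
  const resolvedRoot = path.resolve(root);
  const resolved = path.resolve(resolvedRoot, requested);
  const rel = path.relative(resolvedRoot, resolved);
  return rel === "" || (!rel.startsWith("..") && !path.isAbsolute(rel));
}
```

This is also why the working-directory default matters: whatever directory you launch the server from becomes the containment root, so launching from a sensitive parent directory widens what the tools can read.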