What you get

| Type | What for | MCP URI / Tool id |
|------|----------|-------------------|
| Resources | Browse blueprints, hierarchies, and user info (read-only) | `lynxprompt://blueprints`, `lynxprompt://blueprint/{id}`, `lynxprompt://hierarchies`, `lynxprompt://hierarchy/{id}`, `lynxprompt://user` |
| Tools | Create, update, and delete blueprints; manage hierarchies | `search_blueprints`, `create_blueprint`, `update_blueprint`, `delete_blueprint`, `create_hierarchy`, `delete_hierarchy` |

Everything is exposed over a single JSON-RPC endpoint (`/mcp`). LLMs and agents call `initialize`, then `readResource`, `listTools`, `callTool`, and so on.
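As a sketch of the handshake, an `initialize` request posted to `/mcp` looks roughly like the following (the exact `protocolVersion` string depends on the MCP revision the server implements; `example-client` is a placeholder):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "initialize",
  "params": {
    "protocolVersion": "2024-11-05",
    "capabilities": {},
    "clientInfo": { "name": "example-client", "version": "0.1.0" }
  }
}
```

The response carries the session id (in the `Mcp-Session-Id` header for the HTTP transport), which the client reuses on every subsequent call.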


Quick-start (Docker Compose)

```yaml
services:
  lynxprompt-mcp:
    image: drumsergio/lynxprompt-mcp:latest
    ports:
      - "127.0.0.1:8080:8080"
    environment:
      - LYNXPROMPT_URL=https://lynxprompt.com
      - LYNXPROMPT_TOKEN=lp_xxx
```

Security note: The HTTP transport listens on 127.0.0.1:8080 by default. If you need to expose it on a network, place it behind a reverse proxy with authentication.
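As one option, a minimal nginx sketch that fronts the endpoint with HTTP Basic auth could look like this (the hostname, certificate paths, and htpasswd file are placeholders you would supply yourself):

```nginx
server {
    listen 443 ssl;
    server_name mcp.example.com;                    # placeholder hostname
    ssl_certificate     /etc/ssl/certs/mcp.pem;     # your certificate
    ssl_certificate_key /etc/ssl/private/mcp.key;   # your private key

    location /mcp {
        auth_basic           "LynxPrompt MCP";
        auth_basic_user_file /etc/nginx/.htpasswd;  # created with htpasswd
        proxy_pass http://127.0.0.1:8080;
    }
}
```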

Install via npm (stdio transport)

```shell
npx lynxprompt-mcp
```

Or install globally:

```shell
npm install -g lynxprompt-mcp
lynxprompt-mcp
```

This downloads the pre-built Go binary for your platform from GitHub Releases and runs it with the stdio transport. It requires at least one published release.
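Clients that launch stdio MCP servers from a JSON config (for example, Claude Desktop's `claude_desktop_config.json`) can be wired up roughly like this — the server name `lynxprompt` and the token value are placeholders:

```json
{
  "mcpServers": {
    "lynxprompt": {
      "command": "npx",
      "args": ["lynxprompt-mcp"],
      "env": {
        "LYNXPROMPT_TOKEN": "lp_xxx",
        "TRANSPORT": "stdio"
      }
    }
  }
}
```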

Local build

```shell
git clone https://github.com/GeiserX/lynxprompt-mcp
cd lynxprompt-mcp

# (optional) create .env from the sample
cp .env.example .env && $EDITOR .env

go run ./cmd/server
```

Configuration

| Variable | Default | Description |
|----------|---------|-------------|
| `LYNXPROMPT_URL` | `https://lynxprompt.com` | LynxPrompt instance URL (without trailing `/`) |
| `LYNXPROMPT_TOKEN` | (required) | API token in `lp_xxx` format |
| `LISTEN_ADDR` | `127.0.0.1:8080` | HTTP listen address (Docker sets `0.0.0.0:8080`) |
| `TRANSPORT` | (empty = HTTP) | Set to `stdio` for stdio transport |

Put them in a .env file (from .env.example) or set them in the environment.
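A `.env` along these lines is enough to run the HTTP transport locally (values are illustrative; substitute your own token):

```shell
LYNXPROMPT_URL=https://lynxprompt.com
# Your API token (required)
LYNXPROMPT_TOKEN=lp_xxx
LISTEN_ADDR=127.0.0.1:8080
# Uncomment for stdio transport:
# TRANSPORT=stdio
```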

Testing

Tested with the MCP Inspector and currently fully working. Before opening a PR, make sure the server still behaves correctly when exercised through the Inspector.

Example configuration for client LLMs

```json
{
  "schema_version": "v1",
  "name_for_human": "LynxPrompt-MCP",
  "name_for_model": "lynxprompt_mcp",
  "description_for_human": "Browse, search, and manage AI configuration blueprints from LynxPrompt.",
  "description_for_model": "Interact with a LynxPrompt instance that stores AI configuration blueprints. First call initialize, then reuse the returned session id in header \"Mcp-Session-Id\" for every other call. Use readResource to fetch URIs that begin with lynxprompt://. Use listTools to discover available actions and callTool to execute them.",
  "auth": { "type": "none" },
  "api": {
    "type": "jsonrpc-mcp",
    "url": "http://localhost:8080/mcp",
    "init_method": "initialize",
    "session_header": "Mcp-Session-Id"
  },
  "logo_url": "https://lynxprompt.com/logo.png",
  "contact_email": "acsdesk@protonmail.com",
  "legal_info_url": "https://github.com/GeiserX/lynxprompt-mcp/blob/main/LICENSE"
}
```
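Once a session is established, tools are invoked with the MCP `tools/call` method (the camelCase `callTool` above is shorthand for the same operation). A sketch of a `search_blueprints` call — the `query` argument name is an assumption here; use `listTools` to discover the tool's actual input schema:

```json
{
  "jsonrpc": "2.0",
  "id": 2,
  "method": "tools/call",
  "params": {
    "name": "search_blueprints",
    "arguments": { "query": "code review" }
  }
}
```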

Credits

- LynxPrompt -- AI configuration blueprint management
- MCP-GO -- modern MCP implementation
- GoReleaser -- painless multi-arch releases

Maintainers

@GeiserX.

Contributing

Feel free to dive in! Open an issue or submit PRs.

LynxPrompt-MCP follows the Contributor Covenant Code of Conduct.
