
lightdash-mcp-server

An MCP (Model Context Protocol) server for accessing Lightdash.

This server provides MCP-compatible access to Lightdash's API, allowing AI assistants to interact with your Lightdash data through a standardized interface.

Features

Available tools:

  • list_projects - List all projects in the Lightdash organization
  • get_project - Get details of a specific project
  • list_spaces - List all spaces in a project
  • list_charts - List all charts in a project
  • list_dashboards - List all dashboards in a project
  • get_custom_metrics - Get custom metrics for a project
  • get_catalog - Get catalog for a project
  • get_metrics_catalog - Get metrics catalog for a project
  • get_charts_as_code - Get charts as code for a project
  • get_dashboards_as_code - Get dashboards as code for a project
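
Under the hood, each of these tools is invoked through MCP's standard tools/call request. For illustration, a client asking for the project list would send a JSON-RPC payload along these lines (the id is arbitrary):

{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "list_projects",
    "arguments": {}
  }
}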

Quick Start

Installation

Installing via Smithery

To install Lightdash MCP Server for Claude Desktop automatically via Smithery:

npx -y @smithery/cli install lightdash-mcp-server --client claude

Manual Installation

npm install lightdash-mcp-server

Configuration

  • LIGHTDASH_API_KEY: Your Lightdash personal access token (PAT)
  • LIGHTDASH_API_URL: The base URL of your Lightdash API
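
Both variables must be set in the environment where the server runs. For example, in a Unix shell (values are placeholders):

export LIGHTDASH_API_KEY='<your PAT>'
export LIGHTDASH_API_URL='https://<your base url>'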

Usage

The lightdash-mcp-server supports two transport modes: Stdio (default) and HTTP.

Stdio Transport (Default)
  1. Start the MCP server:

npx lightdash-mcp-server

  2. Edit your MCP configuration JSON:

...
"lightdash": {
  "command": "npx",
  "args": [
    "-y",
    "lightdash-mcp-server"
  ],
  "env": {
    "LIGHTDASH_API_KEY": "<your PAT>",
    "LIGHTDASH_API_URL": "https://<your base url>"
  }
},
...
HTTP Transport (Streamable HTTP)
  1. Start the MCP server in HTTP mode:
npx lightdash-mcp-server -port 8080

This starts the server using StreamableHTTPServerTransport, making it accessible via HTTP at http://localhost:8080/mcp.

  2. Configure your MCP client to connect via HTTP:

For Claude Desktop and other MCP clients:

Edit your MCP configuration JSON to use the url field instead of command and args:

... "lightdash": { "url": "http://localhost:8080/mcp" }, ...

For programmatic access:

Use the streamable HTTP client transport:

import { Client } from '@modelcontextprotocol/sdk/client/index.js';
import { StreamableHTTPClientTransport } from '@modelcontextprotocol/sdk/client/streamableHttp.js';

const client = new Client(
  { name: 'my-client', version: '1.0.0' },
  { capabilities: {} }
);

const transport = new StreamableHTTPClientTransport(
  new URL('http://localhost:8080/mcp')
);

await client.connect(transport);
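
Once connected (over either transport), the available tools can be enumerated and invoked through the same client API; a brief sketch:

// List the tools exposed by the server (names as in the Features section)
const { tools } = await client.listTools();
console.log(tools.map((t) => t.name));

// Invoke a tool, e.g. list_projects, which takes no arguments
const result = await client.callTool({ name: 'list_projects', arguments: {} });
console.log(result);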

Note: When using HTTP mode, ensure the environment variables LIGHTDASH_API_KEY and LIGHTDASH_API_URL are set in the environment where the server is running, as they cannot be passed through MCP client configuration.
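
For example, the variables can be supplied inline when launching the server (Unix shell, placeholder values):

LIGHTDASH_API_KEY='<your PAT>' LIGHTDASH_API_URL='https://<your base url>' npx lightdash-mcp-server -port 8080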

See examples/list_spaces_http.ts for a complete example of connecting to the HTTP server programmatically.

Development

Available Scripts

  • npm run dev - Start the server in development mode with hot reloading (stdio transport)
  • npm run dev:http - Start the server in development mode with HTTP transport on port 8080
  • npm run build - Build the project for production
  • npm run start - Start the production server
  • npm run lint - Run linting checks (ESLint and Prettier)
  • npm run fix - Automatically fix linting issues
  • npm run examples - Run the example scripts

Contributing

  1. Fork the repository
  2. Create your feature branch
  3. Run tests and linting: npm run lint
  4. Commit your changes
  5. Push to the branch
  6. Create a Pull Request