
Observe Community MCP Server

by rustomax

Python FastAPI Pinecone Model Context Protocol Observe

A Model Context Protocol (MCP) server that provides LLMs with access to Observe platform functionality through semantic search and LLM-powered dataset intelligence.

Purpose

This MCP server enables LLMs to interact with Observe platform data through a set of 7 tools. It includes dataset discovery capabilities powered by LLM reasoning, OPAL query execution, semantic search for documentation, and AI-powered observability investigations.

Key capabilities:

  • Dataset discovery using LLM-powered semantic analysis
  • OPAL query execution against Observe datasets
  • Vector-based search for OPAL documentation and runbooks
  • AI-powered observability investigations with GPT-5 reasoning
  • JWT-based authentication with role-based access control

⚠️ DISCLAIMER: This is an experimental MCP server for testing and collaboration. Use at your own risk. A production-ready version, based on a completely different code base, is available to Observe customers.

Available MCP Tools

This MCP server provides 7 core tools for Observe platform access:

Dataset Intelligence

  • query_semantic_graph: Find relevant datasets using LLM-powered analysis of query intent and dataset metadata. Uses a cached, semantically enriched store of dataset metadata.
  • list_datasets: List available datasets with filtering options (a direct API call to Observe)
  • get_dataset_info: Get detailed schema information about specific datasets (a direct API call to Observe)

Query Execution

  • execute_opal_query: Execute OPAL queries against Observe datasets with error handling and multi-dataset support (a direct API call to Observe)

Knowledge & Documentation

  • get_relevant_docs: Search Observe documentation, including the OPAL language reference (semantic vector search)
  • get_system_prompt: Retrieve the system prompt that configures LLMs as an Observe expert

AI-Powered Investigations

  • o11y_scout: Autonomous observability investigation agent powered by GPT-5 with reasoning capabilities. Takes a natural language query and executes multi-step investigations using real data from Observe platform tools to provide data-driven analysis and recommendations.

Each tool includes error handling, authentication validation, and structured result formatting.
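At the protocol level, each of these tools is invoked with a standard MCP JSON-RPC `tools/call` request. The sketch below shows what such a request might look like for `execute_opal_query`; the argument names (`dataset_id`, `query`) and the OPAL snippet are illustrative assumptions, not this server's confirmed parameter names.

```python
import json

# Hypothetical MCP "tools/call" request for execute_opal_query.
# dataset_id/query are assumed argument names for illustration only.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "execute_opal_query",
        "arguments": {
            "dataset_id": "41234567",
            "query": "filter error = true | statsby count(), group_by(service)",
        },
    },
}

print(json.dumps(request, indent=2))
```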

[Screenshot: Claude Desktop using the Observe MCP Server]

Quick Start

Prerequisites

  • Python 3.11+
  • Docker and Docker Compose (recommended)
  • Pinecone account with API key
  • Observe API credentials (customer ID and token)
  • OpenAI API key (for LLM-powered dataset intelligence)

Installation

git clone https://github.com/your-repo/observe-community-mcp.git
cd observe-community-mcp

Environment Setup

Copy the example environment file and configure your credentials:

cp .env.template .env
# Edit .env with your API keys and configuration

Required variables:

# Observe Platform
OBSERVE_CUSTOMER_ID=your_customer_id
OBSERVE_TOKEN=your_api_token
OBSERVE_DOMAIN=observeinc.com

# Vector Search (Pinecone)
PINECONE_API_KEY=your_pinecone_key
PINECONE_DOCS_INDEX=observe-docs

# LLM Intelligence (OpenAI)
OPENAI_API_KEY=your_openai_key

# Database (Dataset Intelligence)
POSTGRES_PASSWORD=secure_password

# Security
PUBLIC_KEY_PEM="-----BEGIN PUBLIC KEY-----
your_public_key_here
-----END PUBLIC KEY-----"
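A fail-fast check for these variables at startup avoids confusing runtime errors later. This is a minimal sketch (the variable list mirrors the block above; the server's actual validation may differ):

```python
import os

# Required variables from the .env template above.
REQUIRED_VARS = [
    "OBSERVE_CUSTOMER_ID", "OBSERVE_TOKEN", "OBSERVE_DOMAIN",
    "PINECONE_API_KEY", "PINECONE_DOCS_INDEX",
    "OPENAI_API_KEY", "POSTGRES_PASSWORD", "PUBLIC_KEY_PEM",
]

def missing_vars(env=os.environ):
    """Return names of required variables that are unset or empty."""
    return [name for name in REQUIRED_VARS if not env.get(name)]

# Example with a partial configuration:
partial = {"OBSERVE_TOKEN": "abc", "OPENAI_API_KEY": "sk-..."}
print(missing_vars(partial))
```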

Database Setup

The server requires a PostgreSQL database with pgvector extension for dataset intelligence functionality. This is automatically configured when using Docker Compose.

Important: The dataset intelligence system requires populating the database with dataset metadata before first use.

Initialize Vector Database

Populate the Pinecone indices with documentation:

# Populate documentation index
python scripts/populate_docs_index.py

# Initialize dataset intelligence database (REQUIRED)
# This populates PostgreSQL with dataset metadata and embeddings
python scripts/populate_dataset_intelligence.py

Note: If you don't have access to Observe documentation files, contact your Observe representative.

Options for all scripts:

  • --force: Recreate the index from scratch
  • --verbose: Enable detailed logging

Running the Server

# Start with Docker Compose
docker-compose up --build

The server will be available at http://localhost:8000 with automatic health checks and PostgreSQL database.

Manual Python Execution

For development:

python3 -m venv .venv
source .venv/bin/activate
pip install -r requirements.txt

# Initialize database (required before first run)
python scripts/populate_dataset_intelligence.py

# Start server
python observe_server.py

Authentication Setup

⚠️ CRITICAL: READ THIS SECTION COMPLETELY

There are two types of authentication mechanisms used in this server:

1. Observe API authentication (Observe API bearer token) - Uses your Observe API token to access platform data. This token inherits the permissions of the user who created it.

   ⚠️ IMPORTANT: Once a user is authenticated to the MCP server, they assume the identity of the user who generated the Observe token, not their own identity. Use RBAC and limit the Observe API token to specific roles and permissions you want available to MCP server users.

2. MCP authentication (MCP bearer token) - Controls access to the MCP server itself. This is necessary because the server exposes resource-intensive APIs (Pinecone, OpenAI).

Authentication

The MCP server includes basic RBAC with predefined roles: admin, read, write. These do not map to Observe roles and only control MCP server tool access.
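Scope-based authorization of this kind can be sketched as follows. The tool-to-scope mapping and the "admin overrides everything" rule below are illustrative assumptions, not the server's actual policy:

```python
# Illustrative mapping of MCP tools to required scopes; the real
# server's policy may differ.
TOOL_SCOPES = {
    "list_datasets": {"read"},
    "execute_opal_query": {"read"},
    "get_system_prompt": {"read"},
    "o11y_scout": {"read", "write"},
}

def is_authorized(tool, token_scopes):
    """Allow a call if the token holds 'admin' or all required scopes."""
    required = TOOL_SCOPES.get(tool, {"admin"})  # unknown tools: admin only
    return "admin" in token_scopes or required.issubset(token_scopes)

print(is_authorized("execute_opal_query", {"read"}))  # True
print(is_authorized("o11y_scout", {"read"}))          # False
```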

Setting up MCP Authentication

Create private and public key files:

openssl genrsa -out private_key.pem 2048
openssl rsa -in private_key.pem -pubout -out public_key.pem

This creates:

  • private_key.pem - Keep this secret. Used to sign MCP bearer tokens.
  • public_key.pem - Add to the server configuration for token verification.

Copy the public key to your .env file:

cat public_key.pem
# Copy output to .env as PUBLIC_KEY_PEM

Generate user tokens:

cd ./scripts
./generate_mcp_token.sh 'user@example.com' 'admin,read,write' '4H'

Security: Keep token expiration times short (hours rather than days).
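To sanity-check a generated token's claims and expiry locally, you can decode its payload without verifying the signature (stdlib only). The claim names `sub` and `scopes` are assumptions about what the token script emits, used here only to build a demo token:

```python
import base64
import json
import time

def decode_jwt_payload(token):
    """Decode a JWT payload WITHOUT signature verification.
    Use only for local inspection of claims such as expiry."""
    payload_b64 = token.split(".")[1]
    padded = payload_b64 + "=" * (-len(payload_b64) % 4)  # restore base64 padding
    return json.loads(base64.urlsafe_b64decode(padded))

# Toy token for demonstration (header and signature are fake):
claims = {"sub": "user@example.com", "scopes": "admin,read,write",
          "exp": int(time.time()) + 4 * 3600}  # 4 hours, matching the '4H' example
fake_token = ".".join([
    base64.urlsafe_b64encode(b'{"alg":"RS256"}').decode().rstrip("="),
    base64.urlsafe_b64encode(json.dumps(claims).encode()).decode().rstrip("="),
    "signature",
])
print(decode_jwt_payload(fake_token)["sub"])
```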

Local-only deployment: If running locally without public access, you can disable MCP authentication by modifying the server configuration.

Using with Claude Desktop

Add the following to your claude_desktop_config.json:

{
  "mcpServers": {
    "observe-community": {
      "command": "npx",
      "args": [
        "mcp-remote@latest",
        "http://localhost:8000/sse",
        "--header",
        "Authorization: Bearer your_mcp_token_here"
      ]
    }
  }
}

Network Configuration Note: MCP clients typically restrict HTTP access to localhost only. For internet-accessible deployments, implement an HTTPS reverse proxy with proper DNS configuration and SSL certificates.

The server will be available with 7 MCP tools for dataset discovery, query execution, documentation search, and AI-powered investigations.

[Screenshot: Claude Desktop using the Observe MCP Server]

Architecture Overview

The MCP server uses a modular architecture:

Component | Purpose
observe_server.py | Main MCP server with 7 tool definitions
src/observe/ | Observe API integration (queries, datasets, client)
src/dataset_intelligence/ | LLM-powered dataset discovery with PostgreSQL + pgvector
src/pinecone/ | Vector database operations for documentation search
src/auth/ | JWT authentication and scope-based authorization
scripts/ | Database population and maintenance scripts

Technology Stack:

  • MCP Server: FastAPI + MCP Protocol
  • Dataset Intelligence: PostgreSQL + pgvector + OpenAI GPT-4
  • Query Engine: Python asyncio + Observe API
  • Vector Search: Pinecone + OpenAI embeddings
  • Authentication: JWT + RSA keys
  • Caching: PostgreSQL-based dataset metadata caching

Dataset Semantic Search System

The dataset semantic search system uses LLM reasoning to understand user queries and match them with relevant Observe datasets.

How It Works

  1. Query Analysis: Analyzes user queries to detect explicit dataset mentions, domain keywords, and intent
  2. Candidate Selection: Retrieves relevant datasets from PostgreSQL cache with smart sampling
  3. LLM Ranking: Uses GPT-4 to rank datasets based on relevance with detailed explanations
  4. Result Enhancement: Applies quality filters and diversity balancing
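Step 2 of the pipeline (candidate selection from cached embeddings) can be sketched with cosine similarity. The 3-dimensional vectors below are toy values for illustration; real embeddings are high-dimensional, and the actual ranking also involves the LLM pass in step 3:

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b)))

dataset_embeddings = {  # illustrative toy embeddings
    "Kubernetes Explorer/Kubernetes Logs": [0.9, 0.1, 0.0],
    "OpenTelemetry/Span":                  [0.1, 0.9, 0.1],
    "Database Call":                       [0.0, 0.2, 0.9],
}
query_embedding = [0.2, 0.85, 0.1]  # e.g. an embedding of "trace analysis"

ranked = sorted(dataset_embeddings.items(),
                key=lambda kv: cosine(query_embedding, kv[1]), reverse=True)
print(ranked[0][0])
```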

Key Features

Explicit Dataset Detection: Recognizes when users mention specific datasets by name

"Give me k8s logs" → Kubernetes Explorer/Kubernetes Logs (prioritized)
"Show me span data" → OpenTelemetry/Span (prioritized)

Domain Intelligence: Maps query domains to appropriate dataset types

"database performance" → Database Call datasets
"trace analysis" → OpenTelemetry/Span datasets
"error investigation" → Log datasets + Error spans

Smart Prioritization: Applies observability expertise

  • OpenTelemetry/Span always ranks first for trace/performance queries
  • Log datasets prioritized for debugging/error queries
  • Database datasets top-ranked for SQL/performance queries
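Heuristic prioritization like this can be expressed as score boosts. The keyword sets and boost values below are illustrative assumptions, not the server's actual weights:

```python
# Illustrative domain keywords and boost factors.
DOMAIN_BOOSTS = [
    ({"trace", "latency", "performance"}, "OpenTelemetry/Span", 2.0),
    ({"error", "debug", "log"},           "Log",                1.5),
    ({"sql", "database", "query"},        "Database Call",      1.5),
]

def boosted_score(query, dataset_name, base_score):
    """Boost a dataset's base relevance score when query keywords
    match the dataset's domain."""
    words = set(query.lower().split())
    for keywords, name_hint, boost in DOMAIN_BOOSTS:
        if words & keywords and name_hint.lower() in dataset_name.lower():
            return base_score * boost
    return base_score

print(boosted_score("trace latency spike", "OpenTelemetry/Span", 0.8))
```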

Performance

The system provides dataset recommendations typically within 1-3 seconds, with high accuracy for domain-specific queries. It maintains a local cache of dataset metadata in PostgreSQL for performance.

Maintenance

Update Vector Databases

# Update documentation index
python scripts/populate_docs_index.py --force

# Update dataset intelligence cache
python scripts/populate_dataset_intelligence.py --force

Monitor Performance

# Check logs for performance metrics (-F treats [SEMANTIC_GRAPH] literally)
docker logs observe-mcp-server | grep -F "[SEMANTIC_GRAPH]"

# Check database status
docker exec observe-opal-memory psql -U opal -d opal_memory -c "\dt"

Common Issues

  1. No dataset recommendations: Verify OpenAI API key and database population
  2. Slow responses: Check PostgreSQL connection and dataset cache
  3. Authentication errors: Validate JWT token and public key configuration
  4. Missing documentation: Run populate scripts with --force flag

All scripts support --force to recreate indices and --verbose for detailed logging.
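The shared flag handling can be sketched with `argparse`; this is a minimal illustration of the two documented options, not the scripts' actual code:

```python
import argparse

def parse_args(argv=None):
    """Parse the two flags shared by the populate scripts."""
    parser = argparse.ArgumentParser(description="Populate index")
    parser.add_argument("--force", action="store_true",
                        help="Recreate the index from scratch")
    parser.add_argument("--verbose", action="store_true",
                        help="Enable detailed logging")
    return parser.parse_args(argv)

args = parse_args(["--force"])
print(args.force, args.verbose)  # True False
```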


Contributing: Issues, feature requests, and pull requests are welcome. This project demonstrates LLM-native observability tooling approaches.
