Observe Community MCP Server
A Model Context Protocol (MCP) server that provides LLMs with intelligent access to Observe platform data through semantic search, automated dataset discovery, and metrics intelligence.
What This Does
This MCP server transforms how LLMs interact with observability data by providing intelligent discovery and search capabilities for the Observe platform. Instead of requiring users to know specific dataset names or metric structures, it enables natural language queries that automatically find relevant data sources and provide contextual analysis.
Key Features:
Smart Dataset Discovery: Find relevant datasets using natural language descriptions
Metrics Intelligence: Discover and understand metrics with automated categorization and usage guidance
Documentation Search: Fast BM25-powered search through Observe documentation and OPAL reference
OPAL Query Execution: Run queries against any Observe dataset with multi-dataset join support
OpenTelemetry Integration: Built-in Observe agent for collecting application telemetry data
Zero External Dependencies: Self-contained with PostgreSQL BM25 search
⚠️ EXPERIMENTAL: This is a community-built MCP server for testing and collaboration. A production version is available to Observe customers through official channels.
Available Tools
The server provides five intelligent tools for interacting with the Observe platform:
🔍 Discovery & Search
discover_datasets: Find datasets using natural language queries, with intelligent categorization and usage examples
discover_metrics: Search through analyzed metrics with business/technical categorization and relevance scoring
get_relevant_docs: Search Observe documentation and the OPAL language reference using fast PostgreSQL BM25 search
⚡ Query Execution
execute_opal_query: Run OPAL queries against single or multiple Observe datasets with comprehensive error handling
🤖 System Integration
get_system_prompt: Retrieve the system prompt that configures LLMs as Observe platform experts
Each tool includes authentication validation, error handling, and structured result formatting optimized for LLM consumption.
Quick Start
Prerequisites
Docker & Docker Compose (recommended approach)
Python 3.11+ (for manual installation)
Observe API credentials (customer ID and token)
1. Clone and Configure
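The repository URL and the `.env.example` filename below are placeholders/assumptions; substitute the actual community repo location:

```shell
# Placeholder URL - substitute the actual community repository
git clone https://github.com/<org>/observe-community-mcp.git
cd observe-community-mcp

# Start from the example environment file (filename is an assumption)
cp .env.example .env
```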
2. Environment Configuration
Edit your .env file with these required values:
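The exact variable names depend on the project's own `.env.example`; the sketch below uses illustrative names for the values the doc calls out (Observe customer ID and token, JWT key material, and the local PostgreSQL instance):

```shell
# Illustrative names only - check the repo's .env.example for the real keys
OBSERVE_CUSTOMER_ID=123456789012                 # your Observe customer ID
OBSERVE_TOKEN=<observe-api-token>                # scope this via Observe RBAC
PUBLIC_KEY_PEM="-----BEGIN PUBLIC KEY-----..."   # RSA public key for JWT verification
POSTGRES_PASSWORD=change-me                      # local PostgreSQL (BM25 search)
```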
3. Start with Docker (Recommended)
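Assuming the repo ships a `docker-compose.yml` covering the MCP server, PostgreSQL, and the OpenTelemetry Collector, startup typically looks like:

```shell
# Build images and start all services in the background
docker compose up -d --build

# Tail logs to confirm the server, database, and collector came up
docker compose logs -f
```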
4. Initialize Intelligence Systems
Run these commands locally to populate the intelligence databases:
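The script names below are placeholders; match them against the repo's actual scripts (their purposes and runtimes are listed under Available Scripts):

```shell
# Placeholder script names - use the actual files in the repository
python populate_docs_index.py    # build the BM25 documentation index (~30 seconds)
python analyze_datasets.py       # analyze and categorize all datasets (~5-10 minutes)
python analyze_metrics.py        # analyze and categorize metrics (~5-10 minutes)
```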
5. Connect with Claude Desktop
Add to your claude_desktop_config.json:
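One common pattern for connecting Claude Desktop to an HTTP-based MCP server is bridging through `mcp-remote`; the server URL, port, and header below are assumptions, so adapt them to your deployment:

```json
{
  "mcpServers": {
    "observe-community": {
      "command": "npx",
      "args": [
        "mcp-remote",
        "http://localhost:8000/mcp",
        "--header",
        "Authorization: Bearer <your-jwt>"
      ]
    }
  }
}
```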
Architecture
The MCP server uses a modern, self-contained architecture built for performance and reliability:
System Overview
Core Components
Component | Technology | Purpose |
MCP Server | FastAPI + MCP Protocol | Tool definitions and request handling |
Observe Integration | Python asyncio + Observe API | Dataset queries and metadata access |
Search Engine | PostgreSQL + ParadeDB BM25 | Fast documentation and content search |
Intelligence Systems | PostgreSQL + Rule-based Analysis | Dataset and metrics discovery with categorization |
OpenTelemetry Collector | OTEL Collector Contrib | Application telemetry collection and forwarding |
Authentication | JWT + RSA signatures | Secure access control |
Database Schema
PostgreSQL with Extensions:
pg_search (ParadeDB BM25) - Fast full-text search
Standard PostgreSQL - Metadata storage and analysis
Key Tables:
datasets_intelligence - Analyzed dataset metadata with categories and usage patterns
metrics_intelligence - Analyzed metrics with business/technical categorization
documentation_chunks - Searchable documentation content with BM25 indexing
Intelligence Systems
Dataset Intelligence
Automatically categorizes and analyzes all Observe datasets to enable natural language discovery:
Categories:
Business: Application, Infrastructure, Database, User, Security, Network
Technical: Logs, Metrics, Traces, Events, Resources
Usage Patterns: Common query examples, grouping suggestions, typical use cases
Example Query: "Find kubernetes error logs" → Automatically discovers and ranks Kubernetes log datasets
Metrics Intelligence
Analyzes metrics from Observe with comprehensive metadata:
Analysis Includes:
Categorization: Business domain (Infrastructure/Application/Database) + Technical type (Error/Latency/Performance)
Dimensions: Common grouping fields with cardinality analysis
Usage Guidance: Typical aggregation functions, alerting patterns, troubleshooting approaches
Value Analysis: Data ranges, frequencies, and patterns
Example Query: "CPU memory utilization metrics" → Returns relevant infrastructure performance metrics with usage guidance
Documentation Search
Fast BM25 full-text search through:
Complete OPAL language reference
Observe platform documentation
Query examples and troubleshooting guides
Search Features:
Relevance scoring with BM25 algorithm
Context-aware chunk retrieval
No external API dependencies
OpenTelemetry Integration
The MCP server includes built-in OpenTelemetry collection via a standard OpenTelemetry Collector, enabling comprehensive application monitoring and observability.
OpenTelemetry Collector
The included OpenTelemetry Collector acts as a telemetry gateway that:
Receives telemetry data from instrumented applications via OTLP protocol
Forwards data to Observe using the standard OTLP HTTP exporter with proper authentication
Adds resource attributes for proper service identification and categorization
Handles retries and buffering for reliable data delivery
Provides debug output for development visibility
Available Endpoints
When the server is running, applications can send telemetry data to:
Protocol | Endpoint | Usage |
OTLP gRPC | port 4317 | Recommended for production (within Docker network) |
OTLP HTTP | port 4318 | Alternative for HTTP-based integrations |
Health Check | (see otel-collector-config.yaml) | Collector health monitoring |
Configuration
The OpenTelemetry Collector is configured via otel-collector-config.yaml.
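The repo's actual otel-collector-config.yaml may differ; a minimal collector config implementing the behavior described above (OTLP in on 4317/4318, batching, resource attributes, OTLP HTTP export with authentication, debug output) looks roughly like this, with the Observe endpoint and token as placeholders:

```yaml
receivers:
  otlp:
    protocols:
      grpc:
        endpoint: 0.0.0.0:4317
      http:
        endpoint: 0.0.0.0:4318

processors:
  batch: {}
  resource:
    attributes:
      - key: deployment.environment
        value: dev
        action: upsert

exporters:
  otlphttp:
    endpoint: https://<customer-id>.collect.observeinc.com/v2/otel
    headers:
      authorization: Bearer <observe-token>
  debug:
    verbosity: basic

service:
  pipelines:
    traces:
      receivers: [otlp]
      processors: [batch, resource]
      exporters: [otlphttp, debug]
```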
Instrumentation Example
To instrument your Python code with OpenTelemetry:
Data Flow
Application Code → Generates traces, metrics, logs via OTEL SDK
OpenTelemetry Collector → Receives OTLP data on ports 4317/4318
Collector Processing → Adds resource attributes, batches data, provides debug output
OTLP HTTP Export → Sends data to Observe platform with proper authentication
Observe Platform → Receives processed telemetry for analysis
The collector automatically handles authentication, retry logic, and reliable data delivery to the Observe platform.
Authentication
MCP Server Authentication
The server uses JWT-based authentication to control access:
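The claim names and key handling below are illustrative, not the server's actual scheme; this sketch just shows the RS256 sign/verify cycle that "JWT + RSA signatures" implies, using PyJWT and a throwaway keypair (in practice the server holds only the public key):

```python
from datetime import datetime, timedelta, timezone

import jwt  # PyJWT
from cryptography.hazmat.primitives.asymmetric import rsa

# Throwaway RSA keypair for illustration only
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

# Mint a token with subject and expiry claims (claim names are illustrative)
token = jwt.encode(
    {"sub": "analyst", "exp": datetime.now(timezone.utc) + timedelta(hours=1)},
    private_key,
    algorithm="RS256",
)

# The server-side check: verify signature and expiry against the public key
claims = jwt.decode(token, public_key, algorithms=["RS256"])
print(claims["sub"])  # → analyst
```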
Observe API Access
Important Security Note: Once authenticated to the MCP server, users assume the identity and permissions of the Observe API token configured in the environment. Use Observe RBAC to limit the token's permissions appropriately.
Maintenance
Update Intelligence Data
Monitor Performance
Troubleshooting
Common Issues:
Empty search results: Run intelligence scripts to populate data
Slow performance: Check PostgreSQL connection and restart if needed
Authentication failures: Verify JWT token and public key configuration
Missing datasets: Confirm Observe API credentials and network access
Performance Expectations:
The system is designed for fast response times:
Dataset discovery: < 2 seconds
Metrics discovery: < 1 second
Documentation search: < 500ms
Intelligence updates: Run when data changes
Development
Manual Setup
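File and entry-point names here are assumptions; a typical manual setup for a Python 3.11+ FastAPI server looks like:

```shell
# Create an isolated environment (Python 3.11+ per the prerequisites)
python3.11 -m venv .venv
source .venv/bin/activate

# Install dependencies (requirements.txt name is an assumption)
pip install -r requirements.txt

# Start the server (module name is an assumption)
python observe_server.py
```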
Available Scripts
Script | Purpose | Runtime |
| Initialize documentation search index | ~30 seconds |
| Analyze and categorize all datasets | ~5-10 minutes |
| Analyze and categorize metrics | ~5-10 minutes |
| Generate JWT tokens for authentication | Instant |
Contributing
This project demonstrates modern approaches to LLM-native observability tooling. Issues, feature requests, and pull requests are welcome.
Architecture Principles:
Self-contained (minimal external dependencies)
Fast (< 2 second response times)
Intelligent (automated categorization and discovery)
Reliable (comprehensive error handling)