mcp-server-llmling
LLMling Server Manual
Overview
mcp-server-llmling is a server for the Model Context Protocol (MCP) that provides a YAML-based configuration system for LLM applications.
LLMling, the backend library, provides the YAML-based configuration system: it lets you set up custom MCP servers that serve content defined in YAML files.
Static Declaration: Define your LLM's environment in YAML - no code required
MCP Protocol: Built on the Model Context Protocol (MCP) for standardized LLM interaction
Component Types:
Resources: Content providers (files, text, CLI output, etc.)
Prompts: Message templates with arguments
Tools: Python functions callable by the LLM
The YAML configuration creates a complete environment that provides the LLM with:
Access to content via resources
Structured prompts for consistent interaction
Tools for extending capabilities
Key Features
1. Resource Management
Load and manage different types of resources:
Text files (PathResource)
Raw text content (TextResource)
CLI command output (CLIResource)
Python source code (SourceResource)
Python callable results (CallableResource)
Images (ImageResource)
Support for resource watching/hot-reload
Resource processing pipelines
URI-based resource access
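As a rough sketch, a resource section exercising a few of these types might look like the following. The type names and keys are assumptions inferred from the resource classes listed above, not verified against the actual LLMling schema.

```yaml
resources:
  project_source:
    type: path                   # assumed key for a PathResource
    path: "./src/**/*.py"
  style_guide:
    type: text                   # assumed key for a TextResource
    content: |
      All public functions must have docstrings.
  git_status:
    type: cli                    # assumed key for a CLIResource
    command: "git status --short"
```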
2. Tool System
Register and execute Python functions as LLM tools
Support for OpenAPI-based tools
Entry point-based tool discovery
Tool validation and parameter checking
Structured tool responses
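For illustration only, registering a plain Python function as a tool could look roughly like this; the import_path and description keys are assumed names rather than confirmed schema fields.

```yaml
tools:
  analyze_complexity:
    # Hypothetical dotted path to an ordinary Python function that the
    # server exposes to the LLM as a callable tool.
    import_path: "my_project.analysis.cyclomatic_complexity"
    description: "Compute cyclomatic complexity for a Python module"
```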
3. Prompt Management
Static prompts with template support
Dynamic prompts from Python functions
File-based prompts
Prompt argument validation
Completion suggestions for prompt arguments
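A sketch of a static prompt with one validated argument; the message and argument key names are assumptions and may differ from the real configuration format.

```yaml
prompts:
  code_review:
    description: "Ask for a structured review of a diff"
    messages:
      - role: system
        content: "You are a meticulous code reviewer."
      - role: user
        content: "Please review the following changes: {diff}"
    arguments:
      - name: diff
        description: "Unified diff to review"
        required: true
```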
4. Multiple Transport Options
Stdio-based communication (default)
Server-Sent Events (SSE) / Streamable HTTP for web clients
Support for custom transport implementations
Usage
With Zed Editor
Add LLMling as a context server in your Zed settings.json.
With Claude Desktop
Configure LLMling in your claude_desktop_config.json.
Manual Server Start
Start the server directly from the command line.
1. Programmatic usage
2. Using Custom Transport
3. Resource Configuration
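A minimal sketch of a resource declaration with watching enabled; the watch keys shown here are assumptions and may not match the actual schema.

```yaml
resources:
  docs:
    type: path
    path: "./docs"
    watch:
      enabled: true              # assumed flag: hot-reload on file changes
      patterns:
        - "*.md"
```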
4. Tool Configuration
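A sketch combining a function tool with an OpenAPI-based toolset; the toolsets and spec keys are assumptions, and the URL is a placeholder (see the note on Redocly below for bundling the referenced spec).

```yaml
tools:
  summarize:
    import_path: "my_project.tools.summarize"   # hypothetical function

toolsets:
  my_api:
    type: openapi
    spec: "https://example.com/api/openapi.json"  # placeholder URL
```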
For OpenAPI schemas, you can install the Redocly CLI to bundle and resolve OpenAPI specifications before using them with LLMling. This helps ensure that schema references are properly resolved and the specification is correctly formatted. If Redocly is installed, it is used automatically.
Server Configuration
The server is configured through a single YAML file, organized into sections for resources, prompts, tools, and global settings.
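A skeleton of the overall file layout, assuming the section names described above; exact top-level keys may differ from the real schema.

```yaml
# Top-level layout of a server configuration file (sketch only).
global_settings:            # assumed name for global options
  log_level: "INFO"
resources: {}               # content providers (see Resource Management)
prompts: {}                 # message templates (see Prompt Management)
tools: {}                   # Python functions exposed to the LLM
```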
MCP Protocol
The server implements the Model Context Protocol, which supports:
Resource Operations
List available resources
Read resource content
Watch for resource changes
Tool Operations
List available tools
Execute tools with parameters
Get tool schemas
Prompt Operations
List available prompts
Get formatted prompts
Get completions for prompt arguments
Notifications
Resource changes
Tool/prompt list updates
Progress updates
Log messages