Server Configuration

Describes the environment variables used to configure the server. All variables are optional.

| Name | Required | Description | Default |
| --- | --- | --- | --- |
| GITHUB_TOKEN | No | GitHub token used for content fetching or other integrations within the Horizon pipeline. | |
| HORIZON_PATH | No | Path to the Horizon repository. The server uses this to locate the Horizon implementation and its data/config.json. If not provided, the server attempts to discover it in default locations such as ./Horizon or ../Horizon. | |
| OPENAI_API_KEY | No | OpenAI API key used for AI-driven tools such as scoring (hz_score_items), enrichment (hz_enrich_items), and summary generation (hz_generate_summary). | |
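The HORIZON_PATH fallback described above can be sketched in shell. This is a minimal illustration only: the function name `resolve_horizon_path` is hypothetical, and the real server implements this discovery internally.

```shell
# Illustrative sketch of the HORIZON_PATH discovery described above.
# resolve_horizon_path is a hypothetical name, not part of the server's API.
resolve_horizon_path() {
  if [ -n "$HORIZON_PATH" ]; then
    echo "$HORIZON_PATH"         # explicit override wins
  elif [ -d "./Horizon" ]; then
    echo "./Horizon"             # default location next to the server
  elif [ -d "../Horizon" ]; then
    echo "../Horizon"            # sibling-directory fallback
  else
    echo "Horizon repository not found" >&2
    return 1
  fi
}
```

With HORIZON_PATH set, the sketch simply echoes it; otherwise it probes the two default locations listed in the table.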

Capabilities

Features and capabilities supported by this server

| Capability | Details |
| --- | --- |
| tools | `{ "listChanged": false }` |
| prompts | `{ "listChanged": false }` |
| resources | `{ "subscribe": false, "listChanged": false }` |
| experimental | `{}` |

Tools

Functions exposed to the LLM to take actions

| Name | Description |
| --- | --- |
| hz_validate_config | Validates the Horizon configuration and key environment variables. |
| hz_fetch_items | Fetches and deduplicates content, writing it to the run's raw stage. |
| hz_score_items | Runs AI scoring on a given stage's content, writing to the scored stage. |
| hz_filter_items | Filters items by threshold and deduplicates by topic, writing to the filtered stage. |
| hz_enrich_items | Enriches high-scoring content with background context, writing to the enriched stage. |
| hz_generate_summary | Generates a Markdown summary from a stage's content. |
| hz_run_pipeline | Runs fetch -> score -> filter -> enrich -> summarize in a single step. |
| hz_list_runs | Lists recent runs and their stage status. |
| hz_get_run_meta | Reads the metadata of a given run. |
| hz_get_run_stage | Reads the content of one stage of a given run. |
| hz_get_run_summary | Reads the summary of a given run. |
| hz_get_metrics | Reads the server's in-memory metrics. |
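hz_run_pipeline chains the individual tools above in order. The following sketch only enumerates that order; the shell loop is illustrative, and only the tool names come from the table.

```shell
# Stage order executed by hz_run_pipeline, per the tool table above.
# The loop itself is illustrative; the server runs these tools internally.
pipeline_stages() {
  for tool in hz_fetch_items hz_score_items hz_filter_items \
              hz_enrich_items hz_generate_summary; do
    echo "$tool"
  done
}
```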

Prompts

Interactive templates invoked by user choice

No prompts.

Resources

Contextual data attached and managed by the client

| Name | Description |
| --- | --- |
| r_server_info | Server metadata resource. |
| r_metrics | In-memory metrics snapshot. |
| r_runs | Recent run list. |
| r_effective_config | Effective default config resolved from the local Horizon path. |
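The r_effective_config resource resolves defaults from data/config.json under the Horizon path (see the environment table above). A minimal way to inspect that file directly, assuming the default ./Horizon location when HORIZON_PATH is unset; the helper name is hypothetical:

```shell
# Print the config file the server resolves r_effective_config from.
# horizon_config is an illustrative helper, not part of the server.
horizon_config() {
  config="${HORIZON_PATH:-./Horizon}/data/config.json"
  if [ -f "$config" ]; then
    cat "$config"
  else
    echo "no config found at $config" >&2
    return 1
  fi
}
```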


MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/henry-insomniac/Horizon-mcp'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.