Glama

Server Configuration

Describes the environment variables used to configure the server. All of them are optional.

Name            Required  Description                 Default
EMBED_MODEL     No        Model name
EMBED_API_KEY   No        API key
EMBED_API_URL   No        API endpoint                http://127.0.0.1:1234/v1
OBSIDIAN_ROOT   No        Vault path                  ./obsidian
EMBED_PROVIDER  No        openai / gigachat / ollama  openai
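
As a sketch, the table above maps directly onto environment lookups with fallbacks. The variable names and defaults come from the table; the dictionary layout itself is only illustrative, not the server's actual code.

```python
import os

# Read the server's configuration from the environment,
# falling back to the documented defaults where one exists.
config = {
    "EMBED_MODEL": os.environ.get("EMBED_MODEL"),      # no documented default
    "EMBED_API_KEY": os.environ.get("EMBED_API_KEY"),  # no documented default
    "EMBED_API_URL": os.environ.get("EMBED_API_URL", "http://127.0.0.1:1234/v1"),
    "OBSIDIAN_ROOT": os.environ.get("OBSIDIAN_ROOT", "./obsidian"),
    "EMBED_PROVIDER": os.environ.get("EMBED_PROVIDER", "openai"),
}
```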

Capabilities

Features and capabilities supported by this server

Capability    Details
tools         { "listChanged": false }
experimental  {}
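
In MCP, a server advertises this capabilities object during initialization, and a client can gate its behavior on it. A minimal sketch of how a client might read the values above (the dictionary mirrors the table; the variable names are illustrative):

```python
# The capabilities this server advertises, per the table above.
capabilities = {
    "tools": {"listChanged": False},
    "experimental": {},
}

# listChanged = False means the client should not expect
# tools/list_changed notifications from this server.
supports_tool_change_notifications = capabilities["tools"]["listChanged"]
```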

Tools

Functions exposed to the LLM to take actions

Name                   Description
read_file              Read an Obsidian file with metadata (YAML frontmatter + content)
write_file             Write a file with YAML frontmatter
list_files             List files in the base with filters
get_children           Get child files
get_parents            Get parent links from a file
suggest_metadata       Suggest metadata for a file based on its content
embed                  Get an embedding for text
index_all              Index all files in the database
suggest_parents        Suggest parents based on embeddings
calibrate_cores        Recalculate core etalon embeddings
recalc_core_mix        Recalculate core_mix bottom-up: quants (L4) -> modules (L3) -> patterns (L2). Run after index_all with embeddings or after recalc_signs.
recalc_signs           Recalculate sign_auto for all files based on content embeddings. Does not modify YAML.
format_entity_compact  Format a compact entity structure formula
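
An MCP client invokes any of the tools above with a JSON-RPC 2.0 `tools/call` request. A hedged sketch for `read_file` follows; the `path` argument name is an assumption, so consult the server's `tools/list` schema for the real parameter names.

```python
import json

# Hypothetical JSON-RPC 2.0 "tools/call" request invoking read_file.
# The "path" argument name is an assumption, not documented above.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "read_file",
        "arguments": {"path": "notes/example.md"},
    },
}
payload = json.dumps(request)
```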

Prompts

Interactive templates invoked by user choice


No prompts

Resources

Contextual data attached and managed by the client


No resources

MCP directory API

We provide all the information about MCP servers via our MCP directory API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/KVANTRA-dev/NOUZ-MCP'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.