Server Configuration

Describes the environment variables required to run the server.

Name | Required | Description | Default
TAVILY_API_KEY | No | API key for Tavily web search |
FLOW_LLM_API_KEY | Yes | API key for OpenAI-compatible LLM service |
DASHSCOPE_API_KEY | No | API key for DashScope search and entity extraction |
FLOW_LLM_BASE_URL | Yes | Base URL for OpenAI-compatible LLM service |
TUSHARE_API_TOKEN | No | API token for Tushare historical data analysis |
BAILIAN_MCP_API_KEY | No | API key for external MCP services |
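
A minimal sketch of supplying these variables when launching the server as a local process. The entry-point module name agentskills_mcp and all placeholder values are assumptions, not part of this listing; omit the optional keys for integrations you do not use.

import os
import subprocess

# Copy the current environment and add the server's configuration.
env = dict(os.environ)

# Required: credentials for the OpenAI-compatible LLM service.
env["FLOW_LLM_API_KEY"] = "sk-..."                       # placeholder value
env["FLOW_LLM_BASE_URL"] = "https://llm.example.com/v1"  # placeholder endpoint

# Optional integrations.
env["TAVILY_API_KEY"] = "tvly-..."    # Tavily web search
env["DASHSCOPE_API_KEY"] = "sk-..."   # DashScope search and entity extraction
env["TUSHARE_API_TOKEN"] = "..."      # Tushare historical data analysis
env["BAILIAN_MCP_API_KEY"] = "..."    # external MCP services

# Hypothetical launch command; substitute the server's actual entry point.
subprocess.run(["python", "-m", "agentskills_mcp"], env=env, check=True)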

Tools

Functions exposed to the LLM to take actions

load_skill_metadata

Load metadata (name and description) for all available skills from the skills directory.

load_skill

Load one skill's instructions from the SKILL.md.

read_reference_file

Read a reference file from a skill (e.g., forms.md, reference.md, ooxml.md).

run_shell_command

Run a shell command in a subprocess. The skill_name parameter must be provided: it switches directly to the folder for that skill, which makes it easier to run the scripts inside that folder. To find the folder's absolute path, run pwd. A typical call sequence is sketched below.
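
A minimal sketch of driving these tools from the official MCP Python SDK over stdio. The launch command, the skill name "pdf", and the argument names for load_skill and read_reference_file are assumptions based on the descriptions above; only skill_name is documented for run_shell_command.

import asyncio
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def main() -> None:
    # Hypothetical launch command for the server; adjust to your installation.
    params = StdioServerParameters(command="python", args=["-m", "agentskills_mcp"])

    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # Discover which skills are available (names and descriptions).
            metadata = await session.call_tool("load_skill_metadata", {})
            print(metadata.content)

            # Load one skill's instructions; the argument name and "pdf" are assumed.
            await session.call_tool("load_skill", {"skill_name": "pdf"})

            # Read a reference file bundled with the skill (argument names assumed).
            await session.call_tool(
                "read_reference_file", {"skill_name": "pdf", "file_name": "reference.md"}
            )

            # Run a script from the skill's folder; skill_name selects the working directory.
            result = await session.call_tool(
                "run_shell_command", {"skill_name": "pdf", "command": "pwd"}
            )
            print(result.content)

asyncio.run(main())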

Prompts

Interactive templates invoked by user choice

No prompts

Resources

Contextual data attached and managed by the client

No resources

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/zouyingcao/agentskills-mcp'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.