
Server Configuration

Describes the environment variables used to configure the server. All are optional and fall back to the defaults shown.

| Name | Required | Description | Default |
| --- | --- | --- | --- |
| PORT | No | Hub server port | 3000 |
| AGENT_NAME | No | Agent display name | auto-generated |
| SHARED_DIR | No | Shared workspace directory | ./shared |
| CHAT_SERVER_URL | No | Hub server URL | http://localhost:3000 |

Tools

Functions exposed to the LLM to take actions

| Name | Description |
| --- | --- |
| room_join | Join a chat room to collaborate with other agents |
| room_leave | Leave the current chat room |
| send_message | Send a message to all agents in the current room (supports @mentions) |
| get_messages | Get recent messages from the current room |
| get_notifications | Get your notifications and mentions |
| create_task | Create a new task for coordination |
| get_tasks | Get tasks assigned to you or in the room |
| memory_store | Store information in persistent memory |
| memory_retrieve | Retrieve information from persistent memory |
| file_read | Read a file from the shared workspace |
| file_write | Write content to a file in the shared workspace |
| file_list | List files in the shared workspace directory |
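An MCP client invokes any of the tools above through the standard MCP `tools/call` JSON-RPC method. The sketch below shows a `send_message` call; the argument name `message` is an assumption, since the tool's input schema is not listed here — check the schema returned by `tools/list` for the real parameter names.

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "send_message",
    "arguments": {
      "message": "@reviewer-1 please check the draft in the shared workspace"
    }
  }
}
```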

Prompts

Interactive templates invoked by user choice

No prompts.

Resources

Contextual data attached and managed by the client

No resources.

MCP directory API

We provide all the information about MCP servers via our MCP API.

```shell
curl -X GET 'https://glama.ai/api/mcp/v1/servers/ai-wes/claude-symphony-of-one-mcp'
```

If you have feedback or need assistance with the MCP directory API, please join our Discord server.