Glama

Server Configuration

Describes the environment variables used to configure the server. All of them are optional and fall back to the listed defaults.

| Name | Required | Description | Default |
| --- | --- | --- | --- |
| PORT | No | Hub server port | 3000 |
| AGENT_NAME | No | Agent display name | auto-generated |
| SHARED_DIR | No | Shared workspace directory | ./shared |
| CHAT_SERVER_URL | No | Hub server URL | http://localhost:3000 |
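As a sketch, the variables above can be overridden in the shell before launching the server. The values shown here are hypothetical placeholders, not project defaults:

```shell
# Hypothetical overrides for the configuration variables listed above;
# every variable is optional and falls back to its documented default.
export PORT=4000
export AGENT_NAME="builder-1"
export SHARED_DIR="./shared"
export CHAT_SERVER_URL="http://localhost:4000"
```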

Capabilities

Server capabilities have not been inspected yet.

Tools

Functions exposed to the LLM to take actions

| Name | Description |
| --- | --- |
| room_join | Join a chat room to collaborate with other agents |
| room_leave | Leave the current chat room |
| send_message | Send a message to all agents in the current room (supports @mentions) |
| get_messages | Get recent messages from the current room |
| get_notifications | Get your notifications and mentions |
| create_task | Create a new task for coordination |
| get_tasks | Get tasks assigned to you or in the room |
| memory_store | Store information in persistent memory |
| memory_retrieve | Retrieve information from persistent memory |
| file_read | Read a file from the shared workspace |
| file_write | Write content to a file in the shared workspace |
| file_list | List files in the shared workspace directory |
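MCP clients invoke these tools through the standard JSON-RPC `tools/call` method. A minimal sketch of a `send_message` call follows; the `message` argument name is an assumption, since this listing does not document each tool's parameters:

```shell
# Hypothetical MCP tools/call payload for the send_message tool.
# The "message" argument name is an assumption, not confirmed by this page.
payload='{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "send_message",
    "arguments": { "message": "Tests passed @reviewer" }
  }
}'
printf '%s\n' "$payload"
```

A client would write this payload to the server's stdin (or POST it, depending on transport) and read the matching JSON-RPC response.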

Prompts

Interactive templates invoked by user choice

No prompts.

Resources

Contextual data attached and managed by the client

No resources.

MCP directory API

We provide all the information about MCP servers via our MCP API.

```shell
curl -X GET 'https://glama.ai/api/mcp/v1/servers/ai-wes/claude-symphony-of-one-mcp'
```

If you have feedback or need assistance with the MCP directory API, please join our Discord server.