
Server Configuration

Describes the environment variables required to run the server.

Name | Required | Description | Default

No arguments

Capabilities

Server capabilities have not been inspected yet.

Tools

Functions exposed to the LLM to take actions

Name | Description

create_instruction

Create a new VS Code .instructions.md file with the specified description and content.
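As an illustration, a minimal .instructions.md file of the kind this tool manages might look like the sketch below. The applyTo front-matter field is an assumption based on VS Code's custom-instructions format; this listing does not document the exact fields the server writes.

```python
# Hypothetical sketch of a .instructions.md file, assuming VS Code's
# conventional YAML front matter (description plus an applyTo glob).
description = "Python style rules"
body = "Always use type hints and write docstrings for public functions."

content = (
    "---\n"
    f"description: {description}\n"
    "applyTo: '**/*.py'\n"  # assumed field: which files the rules apply to
    "---\n\n"
    f"{body}\n"
)
print(content)
```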

list_instructions

List all VS Code .instructions.md files in the prompts directory.

get_instruction

Get the raw content of a VS Code .instructions.md file.

update_instruction

Update an existing VS Code .instructions.md file with new description or content.

delete_instruction

Delete a VS Code .instructions.md file from the prompts directory.

create_chatmode

Create a new VS Code .chatmode.md file with the specified description, content, and tools.
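A chat mode file differs from an instructions file mainly in carrying a tools list. The sketch below assumes VS Code's chat-mode front-matter shape (description plus tools); the exact schema this server emits is not documented in this listing.

```python
# Hypothetical sketch of a .chatmode.md file such as create_chatmode
# might write: YAML front matter with a description and a tools list,
# followed by the mode's prompt body.
tools = ["codebase", "terminal"]
content = (
    "---\n"
    "description: Reviewer mode that focuses on code quality\n"
    f"tools: [{', '.join(repr(t) for t in tools)}]\n"
    "---\n\n"
    "Act as a strict code reviewer. Point out bugs before style issues.\n"
)
print(content)
```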

list_chatmodes

List all VS Code .chatmode.md files in the prompts directory.

get_chatmode

Get the raw content of a VS Code .chatmode.md file.

update_chatmode

Update an existing VS Code .chatmode.md file with new description, content, or tools.

delete_chatmode

Delete a VS Code .chatmode.md file from the prompts directory.

update_chatmode_from_source

Update a .chatmode.md file from its source definition.

refresh_library

Refresh the Mode Manager MCP Library from its source URL.

browse_mode_library

Browse the Mode Manager MCP Library and filter by category or search term.

install_from_library

Install a chatmode or instruction from the Mode Manager MCP Library.

optimize_memory

Manually optimize a memory file using AI to reorganize and consolidate entries while preserving all information.

memory_stats

Get detailed statistics and optimization status for a memory file.

configure_memory_optimization

Configure memory optimization settings for auto-optimization behavior.

remember

Store user information persistently for future conversations. When users share preferences, coding standards, project details, or any context they want remembered, use this tool. Extract the key information from natural language and store it appropriately. The system automatically detects scope (user/workspace) and language specificity from context. For ambiguous cases, you will receive clarification prompts to ask the user. Examples of what to remember: coding preferences ('I like detailed docstrings'), project specifics ('This app uses PostgreSQL'), language standards ('For Python, use type hints'), workflow preferences ('Always run tests before committing'). Use only the memory_item parameter with natural language - the system handles scope detection.
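Since remember takes only the memory_item parameter, a client invocation reduces to a standard MCP tools/call request. The sketch below uses the JSON-RPC 2.0 framing MCP is built on; the memory_item text is an illustrative example, not from this listing.

```python
import json

# Sketch of the MCP tools/call request a client would send to invoke
# the remember tool. Only memory_item is passed; scope and language
# detection happen server-side, per the description above.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "remember",
        "arguments": {
            "memory_item": "For Python, always use type hints.",
        },
    },
}
print(json.dumps(request, indent=2))
```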

Prompts

Interactive templates invoked by user choice

Name | Description

onboarding

Direct onboarding instructions for Copilot, including memory file structure.

Resources

Contextual data attached and managed by the client

Name | Description

No resources


MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/NiclasOlofsson/mode-manager-mcp'
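The same request can be made from Python with the standard library; the equivalent of the curl command above, shown here without actually opening the connection (the response schema is not documented in this listing):

```python
import urllib.request

# Python equivalent of the curl command above. Only the request is
# built here; uncomment the last lines to fetch the server record.
url = "https://glama.ai/api/mcp/v1/servers/NiclasOlofsson/mode-manager-mcp"
req = urllib.request.Request(url, headers={"Accept": "application/json"})
print(req.full_url)

# import json
# with urllib.request.urlopen(req) as resp:
#     server = json.load(resp)
```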

If you have feedback or need assistance with the MCP directory API, please join our Discord server.