
Mode Manager MCP

Server Configuration

Describes the environment variables required to run the server.

None; the server requires no environment variables.

Schema

Prompts

Interactive templates invoked by user choice

onboarding

Direct onboarding instructions for Copilot, including memory file structure.

Resources

Contextual data attached and managed by the client

No resources

Tools

Functions exposed to the LLM to take actions

create_instruction

Create a new VS Code .instructions.md file with the specified description and content.
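An .instructions.md file pairs a YAML front-matter block with Markdown content. As a rough sketch of the layout such a call produces (the `description` and `applyTo` fields follow VS Code's custom-instructions convention and are assumptions here, not this tool's verified schema):

```python
# Sketch: assemble the text of a VS Code .instructions.md file.
# Field names (description, applyTo) are assumptions based on VS Code's
# custom-instructions convention, not the tool's confirmed output.
def build_instruction_file(description: str, content: str, apply_to: str = "**") -> str:
    front_matter = "\n".join(
        ["---", f"description: {description}", f"applyTo: '{apply_to}'", "---"]
    )
    return f"{front_matter}\n\n{content}\n"

text = build_instruction_file(
    "Python docstring style",
    "Use Google-style docstrings for all public functions.",
)
print(text)
```

The same front-matter-plus-body shape applies to the .chatmode.md files managed by the chatmode tools below, with a tools list added to the front matter.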

list_instructions

List all VS Code .instructions.md files in the prompts directory.

get_instruction

Get the raw content of a VS Code .instructions.md file.

update_instruction

Update an existing VS Code .instructions.md file with new description or content.

delete_instruction

Delete a VS Code .instructions.md file from the prompts directory.

create_chatmode

Create a new VS Code .chatmode.md file with the specified description, content, and tools.

list_chatmodes

List all VS Code .chatmode.md files in the prompts directory.

get_chatmode

Get the raw content of a VS Code .chatmode.md file.

update_chatmode

Update an existing VS Code .chatmode.md file with new description, content, or tools.

delete_chatmode

Delete a VS Code .chatmode.md file from the prompts directory.

update_chatmode_from_source

Update a .chatmode.md file from its source definition.

refresh_library

Refresh the Mode Manager MCP Library from its source URL.

browse_mode_library

Browse the Mode Manager MCP Library and filter by category or search term.

install_from_library

Install a chatmode or instruction from the Mode Manager MCP Library.

optimize_memory

Manually optimize a memory file using AI to reorganize and consolidate entries while preserving all information.

memory_stats

Get detailed statistics and optimization status for a memory file.

configure_memory_optimization

Configure memory optimization settings for auto-optimization behavior.

remember

Store user information persistently for future conversations. When users share preferences, coding standards, project details, or any other context they want remembered, use this tool: extract the key information from the natural language and store it appropriately. The system automatically detects scope (user/workspace) and language specificity from context; for ambiguous cases, you will receive clarification prompts to ask the user.

Examples of what to remember: coding preferences ("I like detailed docstrings"), project specifics ("This app uses PostgreSQL"), language standards ("For Python, use type hints"), and workflow preferences ("Always run tests before committing"). Use only the memory_item parameter with natural language; the system handles scope detection.
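Since these tools are invoked over MCP, a client reaches them through a standard JSON-RPC `tools/call` request. A minimal sketch for remember (the single `memory_item` argument matches the description above; the framing is standard MCP):

```python
import json

# Sketch of the JSON-RPC 2.0 "tools/call" request an MCP client sends to
# invoke the remember tool. The jsonrpc/method/params framing is the
# standard MCP shape; only memory_item is passed, per the tool description.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "remember",
        "arguments": {
            "memory_item": "For Python, use type hints and Google-style docstrings."
        },
    },
}
print(json.dumps(request, indent=2))
```

The other tools in the list are called the same way, with `name` and `arguments` swapped for the tool in question.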

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/NiclasOlofsson/mode-manager-mcp'
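The same GET request can be issued from Python with the standard library; a sketch (building the `Request` object shows the shape, and `urlopen` would perform the actual call):

```python
import json
import urllib.request

# Python equivalent of the curl call above, using only the standard library.
url = "https://glama.ai/api/mcp/v1/servers/NiclasOlofsson/mode-manager-mcp"
req = urllib.request.Request(url, method="GET")
# response = urllib.request.urlopen(req)   # uncomment to send the request
# data = json.loads(response.read())       # parse the JSON response body
print(req.full_url)
```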

If you have feedback or need assistance with the MCP directory API, please join our Discord server.