Glama

Server Configuration

Describes the environment variables required to run the server.

Name | Required | Description | Default

No arguments

Capabilities

Server capabilities have not been inspected yet.

Tools

Functions exposed to the LLM so it can take actions

Name | Description
get_project_settings
Get the project settings for the current working directory or a proposed path.

Returns configuration settings including project path, type, and metadata.
If proposed_path is not provided or invalid, uses the current directory.
think

Record a thought for later reference and analysis.

This tool allows you to record thoughts during development or analysis processes. Thoughts can be organized by category and depth to create a hierarchical structure of analysis.
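As an MCP tool, think would be invoked via a JSON-RPC `tools/call` request. The sketch below builds such a payload; the `thought`, `category`, and `depth` argument names are assumptions inferred from the description above, not a confirmed schema.

```python
import json

# Hypothetical JSON-RPC payload an MCP client would send to invoke `think`.
# Argument names ("thought", "category", "depth") are assumed, not verified.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "think",
        "arguments": {
            "thought": "The settings loader should fall back to cwd.",
            "category": "architecture",
            "depth": 1,
        },
    },
}

payload = json.dumps(request)
print(payload)
```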

get_thoughts

Retrieve recorded thoughts.

This tool retrieves all previously recorded thoughts, optionally filtered by category. You can also choose to organize them hierarchically by depth.
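The category filter and depth grouping could look something like the following sketch; the field names (`category`, `depth`, `text`) are illustrative assumptions, not the server's actual storage schema.

```python
from collections import defaultdict

# Illustrative sketch: filter thoughts by category, then group by depth.
# Field names are assumed for the example, not taken from the server.
def organize_thoughts(thoughts, category=None):
    if category is not None:
        thoughts = [t for t in thoughts if t["category"] == category]
    by_depth = defaultdict(list)
    for t in thoughts:
        by_depth[t["depth"]].append(t["text"])
    return dict(by_depth)

thoughts = [
    {"category": "design", "depth": 1, "text": "Split the loader"},
    {"category": "design", "depth": 2, "text": "Cache parsed settings"},
    {"category": "testing", "depth": 1, "text": "Add a cwd fallback test"},
]
print(organize_thoughts(thoughts, category="design"))
```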

clear_thoughts

Clear recorded thoughts.

This tool removes previously recorded thoughts, optionally filtered by category. If no category is specified, all thoughts will be cleared.

get_thought_stats

Get statistics about recorded thoughts.

This tool provides statistics about recorded thoughts, such as count and depth distribution. Results can be filtered by category.
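A count plus a per-depth distribution might be computed roughly as follows; this is a sketch of the described output shape, not the server's implementation.

```python
from collections import Counter

# Sketch of the statistics described above: total count and a
# distribution of thoughts per depth, with an optional category filter.
def thought_stats(thoughts, category=None):
    if category is not None:
        thoughts = [t for t in thoughts if t["category"] == category]
    return {
        "count": len(thoughts),
        "depth_distribution": dict(Counter(t["depth"] for t in thoughts)),
    }

sample = [
    {"category": "design", "depth": 1},
    {"category": "design", "depth": 1},
    {"category": "design", "depth": 2},
]
print(thought_stats(sample))
```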

detect_thinking_directive

Detect thinking directives.

This tool analyzes text to detect directives suggesting deeper thinking, such as "think harder", "think deeper", "think again", etc.
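Directive detection of this kind can be sketched with a simple pattern match; the exact phrase list the server recognizes is unknown, so the regex below only covers the examples quoted above.

```python
import re

# Illustrative pattern for "think harder"-style directives. The server's
# real phrase list may be broader; this covers only the quoted examples.
DIRECTIVES = re.compile(r"\bthink\s+(harder|deeper|again|more)\b", re.IGNORECASE)

def detect_directive(text):
    match = DIRECTIVES.search(text)
    return match.group(0).lower() if match else None

print(detect_directive("Hmm, think harder about the edge cases."))
```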

should_think

Assess whether deeper thinking is needed for a query.

This tool analyzes a query to determine if it requires deeper thinking, based on complexity indicators and context.

think_more

Get guidance for thinking more deeply.

This tool provides suggestions and guidance for thinking more deeply about a specific query or thought.

initialize_ide
Initialize IDE project structure with appropriate directories and config files.

This tool sets up the necessary directories and configuration files for IDE
integration, including .ai-templates directory and IDE-specific rules.

Note: If project_path is omitted, not a string, invalid, or the directory doesn't exist,
the current working directory will be used automatically.
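The fallback contract in the note (used by several tools below) can be sketched like this; it mirrors the documented behavior, not the server's actual code.

```python
import os

# Sketch of the documented fallback: any project_path that is missing,
# not a string, or not an existing directory resolves to the cwd.
def resolve_project_path(project_path=None):
    if isinstance(project_path, str) and os.path.isdir(project_path):
        return os.path.abspath(project_path)
    return os.getcwd()

print(resolve_project_path(12345))  # not a string -> current directory
```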
initialize_ide_rules
Initialize IDE rules for a project.

This tool sets up IDE-specific rules for a project, creating the necessary
files and directories for AI assistants to understand project conventions.

Note: If project_path is omitted, not a string, or invalid, the current working
directory will be used automatically.
prime_context
Prime project context by analyzing documentation and structure.

This tool analyzes the project structure and documentation to provide
context information for AI assistants working with the project.

Note: If project_path is omitted, not a string, or invalid, the current working
directory will be used automatically.
migrate_mcp_config
Migrate MCP configuration between different IDEs.

This tool helps migrate configuration and rules between different IDEs,
ensuring consistent AI assistance across different environments.

Note: If project_path is omitted, not a string, or invalid, the current working
directory will be used automatically.
process_natural_language

Process natural language command and route to appropriate tool.

This tool takes a natural language query and determines which tool to call with what parameters, providing a way to interact with the MCP Agile Flow tools using natural language.
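The routing idea can be illustrated with a toy keyword matcher; the server's actual routing rules and parameter extraction are certainly more involved than this sketch.

```python
# Toy keyword-to-tool routing sketch. Keywords and their order are
# illustrative assumptions; the real server's routing is more involved.
ROUTES = [
    ("clear", "clear_thoughts"),
    ("stats", "get_thought_stats"),
    ("thought", "get_thoughts"),
    ("settings", "get_project_settings"),
]

def route_query(query):
    lowered = query.lower()
    for keyword, tool in ROUTES:
        if keyword in lowered:
            return tool
    return None

print(route_query("Show me the stats for my recorded thoughts"))
```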

Prompts

Interactive templates invoked by user choice

Name | Description

No prompts

Resources

Contextual data attached and managed by the client

Name | Description

No resources


MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/smian0/mcp-agile-flow'

If you have feedback or need assistance with the MCP directory API, please join our Discord server