Linux Bash MCP Server

by gunjanjp

Server Configuration

Describes the environment variables used to configure the server. All are optional and have defaults.

debugMode (optional): Enable or disable debug mode for detailed logging. Default: false
maxBufferSize (optional): The maximum buffer size for command output, in bytes. Default: 10485760
scriptTimeout (optional): The timeout for script execution, in milliseconds. Default: 60000
defaultTimeout (optional): The default timeout for commands, in milliseconds. Default: 30000
wslDistribution (optional): The WSL distribution to use for running commands. Default: auto-detect
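
As a sketch, these variables can be supplied through an MCP client configuration entry. The server name, launch command, script path, and the distribution value "Ubuntu" below are assumptions for illustration; only the variable names and defaults come from the table above.

{
  "mcpServers": {
    "linux-bash": {
      "command": "node",
      "args": ["/path/to/linuxshell-mcp/index.js"],
      "env": {
        "debugMode": "false",
        "maxBufferSize": "10485760",
        "scriptTimeout": "60000",
        "defaultTimeout": "30000",
        "wslDistribution": "Ubuntu"
      }
    }
  }
}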

Schema

Prompts

Interactive templates invoked by user choice

No prompts

Resources

Contextual data attached and managed by the client

No resources

Tools

Functions exposed to the LLM to take actions

execute_bash_command: Execute a bash command in the WSL2 Linux environment
execute_bash_script: Execute a bash script file in the WSL2 Linux environment
create_bash_script: Create a bash script file with the specified content
list_directory: List the contents of a directory in the WSL2 Linux environment
get_system_info: Get system information about the WSL2 Linux environment
check_wsl_status: Check WSL2 status and get distribution information
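
For illustration, an MCP client invokes one of these tools with a standard JSON-RPC tools/call request. The request shape follows the MCP specification; the argument key "command" is an assumption about this server's input schema.

{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "execute_bash_command",
    "arguments": {
      "command": "uname -a"
    }
  }
}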

MCP directory API

We provide all information about MCP servers through our MCP directory API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/gunjanjp/linuxshell-mcp'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.