
@profullstack/mcp-server

by profullstack

Server Configuration

Describes the environment variables required to run the server.

Name                 | Required | Description                                              | Default
HOST                 | No       | Server host                                              | localhost
PORT                 | No       | Server port                                              | 3000
NODE_ENV             | No       | Environment (development/production)                     | development
OPENAI_API_KEY       | No       | OpenAI API key (required for OpenAI models)              | (none)
ANTHROPIC_API_KEY    | No       | Anthropic API key (required for Claude models)           | (none)
STABILITY_API_KEY    | No       | Stability AI API key (required for Stable Diffusion)     | (none)
HUGGINGFACE_API_KEY  | No       | Hugging Face API key (required for Hugging Face models)  | (none)
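
As a sketch, these can be supplied as ordinary environment variables before starting the server. The values below are placeholders, and only the API keys for the providers you actually intend to use need to be set; the exact start command depends on how the package is installed, so it is omitted here.

export HOST=localhost
export PORT=3000
export NODE_ENV=development
export OPENAI_API_KEY=your-openai-key            # only needed for OpenAI models
export ANTHROPIC_API_KEY=your-anthropic-key      # only needed for Claude models
export STABILITY_API_KEY=your-stability-key      # only needed for Stable Diffusion
export HUGGINGFACE_API_KEY=your-huggingface-key  # only needed for Hugging Face models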

Schema

Prompts

Interactive templates invoked by user choice

No prompts

Resources

Contextual data attached and managed by the client

No resources

Tools

Functions exposed to the LLM to take actions

No tools

MCP directory API

We provide all of the information about listed MCP servers via the Glama MCP directory API. For example, this server's entry can be fetched with:

curl -X GET 'https://glama.ai/api/mcp/v1/servers/profullstack/mcp-server'
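
If jq is installed, the same request can be piped through it to pretty-print the JSON response for inspection; nothing here depends on particular fields in the response.

curl -s 'https://glama.ai/api/mcp/v1/servers/profullstack/mcp-server' | jq .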

If you have feedback or need assistance with the MCP directory API, please join our Discord server.