
AI Collaboration MCP Server

by hurryupmitch

Server Configuration

Describes the environment variables used to configure the server. All variables are optional.

| Name | Required | Description | Default |
| --- | --- | --- | --- |
| OLLAMA_MODEL | No | Ollama model to use | llama3.2:latest |
| GEMINI_API_KEY | No | Your Gemini API key | |
| OPENAI_API_KEY | No | Your OpenAI API key | |
| OLLAMA_BASE_URL | No | Base URL for Ollama configuration (for local AI) | http://localhost:11434 |
| ANTHROPIC_API_KEY | No | Your Claude API key | |
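The variables above can be exported before launching the server. A minimal sketch, assuming a local Ollama setup; the key values shown are placeholders, and you only need to set the keys for the providers you actually use:

```shell
# Defaults from the configuration table above
export OLLAMA_BASE_URL="http://localhost:11434"
export OLLAMA_MODEL="llama3.2:latest"

# Optional provider keys (placeholder values -- substitute your own)
# export ANTHROPIC_API_KEY="sk-ant-..."   # Claude
# export OPENAI_API_KEY="sk-..."          # OpenAI
# export GEMINI_API_KEY="..."             # Gemini
```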

Schema

Prompts

Interactive templates invoked by user choice

No prompts

Resources

Contextual data attached and managed by the client

No resources

Tools

Functions exposed to the LLM to take actions

No tools

MCP directory API

We provide all the information about MCP servers via our MCP directory API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/hurryupmitch/ai-collaboration-mcp-server'
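The URL above appears to follow the pattern `servers/<owner>/<server-slug>`; this pattern is inferred from the single example and is not documented here. A small helper that builds such a URL, assuming that pattern holds:

```shell
# Build a Glama MCP directory API URL from an owner and a server slug.
# The URL pattern is an assumption based on the example request above.
glama_mcp_url() {
  printf 'https://glama.ai/api/mcp/v1/servers/%s/%s\n' "$1" "$2"
}

# Usage: reproduces the URL from the curl example
glama_mcp_url hurryupmitch ai-collaboration-mcp-server
```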

If you have feedback or need assistance with the MCP directory API, please join our Discord server.