AutoGen MCP Server

.env.example
# Enhanced AutoGen MCP Server Configuration

# OpenAI API Key for AutoGen (Required)
OPENAI_API_KEY=your-openai-api-key-here

# Path to AutoGen MCP configuration file
AUTOGEN_MCP_CONFIG=config.json

# Model Configuration (Optional - overrides config.json)
OPENAI_MODEL=gpt-4o
OPENAI_TEMPERATURE=0.7
OPENAI_MAX_TOKENS=4000
OPENAI_TIMEOUT=60

# Code Execution Settings
CODE_EXECUTION_MODE=local # local or docker
CODE_EXECUTION_TIMEOUT=60
CODE_EXECUTION_WORK_DIR=coding

# Python Path (Optional - for custom Python installations)
PYTHON_PATH=python

# Enhanced Features Configuration
ENABLE_PROMPTS=true
ENABLE_RESOURCES=true
ENABLE_WORKFLOWS=true
ENABLE_TEACHABILITY=true
ENABLE_MEMORY_PERSISTENCE=true

# Advanced Settings
SPEAKER_SELECTION_METHOD=auto # auto, manual, random, round_robin
SUMMARY_METHOD=reflection_with_llm # last_msg, reflection_with_llm
MAX_CHAT_TURNS=10
MAX_GROUP_CHAT_ROUNDS=15

# Memory and Learning
AGENT_MEMORY_PATH=./agent_memory
LEARNING_RATE=0.1
MEMORY_CLEANUP_INTERVAL=3600

# Performance and Debugging
LOG_LEVEL=INFO # DEBUG, INFO, WARNING, ERROR
CACHE_DURATION=300
AUTO_REFRESH_RESOURCES=true

# Workflow Quality Checks
DEFAULT_QUALITY_CHECKS=true
DEFAULT_OUTPUT_FORMAT=json # json, markdown, text, structured
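As a rough illustration of how a Python-based server might consume this file, the sketch below reads a few of these variables into typed settings with python-dotenv. The Settings class, helper names, and chosen subset of variables are illustrative assumptions, not part of the AutoGen MCP Server codebase.

# Illustrative sketch only: load .env values into typed settings.
# Assumes the python-dotenv package; Settings and _get_bool are hypothetical names.
import os
from dataclasses import dataclass

from dotenv import load_dotenv  # pip install python-dotenv


def _get_bool(name: str, default: bool) -> bool:
    # Treat "true", "1", or "yes" (case-insensitive) as True.
    return os.getenv(name, str(default)).strip().lower() in {"true", "1", "yes"}


@dataclass
class Settings:
    openai_api_key: str
    config_path: str
    model: str
    temperature: float
    max_tokens: int
    code_execution_mode: str
    enable_workflows: bool
    log_level: str


def load_settings(env_file: str = ".env") -> Settings:
    load_dotenv(env_file)  # silently skipped if the file is missing
    api_key = os.getenv("OPENAI_API_KEY", "")
    if not api_key:
        raise RuntimeError("OPENAI_API_KEY is required")
    return Settings(
        openai_api_key=api_key,
        config_path=os.getenv("AUTOGEN_MCP_CONFIG", "config.json"),
        model=os.getenv("OPENAI_MODEL", "gpt-4o"),
        temperature=float(os.getenv("OPENAI_TEMPERATURE", "0.7")),
        max_tokens=int(os.getenv("OPENAI_MAX_TOKENS", "4000")),
        code_execution_mode=os.getenv("CODE_EXECUTION_MODE", "local"),
        enable_workflows=_get_bool("ENABLE_WORKFLOWS", True),
        log_level=os.getenv("LOG_LEVEL", "INFO"),
    )


if __name__ == "__main__":
    print(load_settings())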

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/DynamicEndpoints/Autogen_MCP'
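The same request can be made from Python. The sketch below uses the requests package and simply prints the JSON payload, since the response schema is not documented here.

# Illustrative: fetch this server's directory entry via the Glama MCP API.
# Assumes the requests package; the response structure is not assumed.
import requests

url = "https://glama.ai/api/mcp/v1/servers/DynamicEndpoints/Autogen_MCP"
response = requests.get(url, timeout=30)
response.raise_for_status()
print(response.json())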

If you have feedback or need assistance with the MCP directory API, please join our Discord server.