OmniMCP

by OpenAdaptAI
```
# .env.example
# Copy this file to .env and fill in your AWS credentials

# --- Project ---
PROJECT_NAME=omnimcp

# --- Core LLM Configuration ---
# Required for planning/reasoning steps (only one provider needed usually)
ANTHROPIC_API_KEY=your_anthropic_api_key
# Not yet supported:
# OPENAI_API_KEY=your_openai_api_key
# GOOGLE_API_KEY=your_google_api_key
# Optional: Specify exact Anthropic model (defaults to "claude-3-7-sonnet-20250219" in config.py)
# ANTHROPIC_DEFAULT_MODEL=claude-3-sonnet-20240229
# ANTHROPIC_DEFAULT_MODEL=claude-3-haiku-20240307

# --- OmniParser Service Configuration ---
# Option 1: Leave blank/commented to use auto-deployment features (requires AWS keys below)
# OMNIPARSER_URL=
# Option 2: Specify URL if running OmniParser manually or elsewhere
# OMNIPARSER_URL=http://<your_parser_ip>:8000
# Optional: Factor (0.1-1.0) to resize screenshot before parsing (lower = faster, less accurate)
# Default is 1.0 (no downsampling). Set to e.g. 0.5 for 50% scaling.
# OMNIPARSER_DOWNSAMPLE_FACTOR=1.0

# --- AWS Credentials (Required ONLY for OmniParser auto-deployment) ---
# AWS_ACCESS_KEY_ID=YOUR_AWS_ACCESS_KEY
# AWS_SECRET_ACCESS_KEY=YOUR_SECRET_KEY
# AWS_REGION=us-east-1  # Optional: Defaults to us-east-1 if not set

# --- Optional AWS EC2 Configuration (Overrides defaults in config.py for auto-deploy) ---
# See config.py for default AMI/Instance Type (currently G6/DLAMI)
# AWS_EC2_INSTANCE_TYPE=g6.xlarge
# AWS_EC2_AMI=ami-xxxxxxxxxxxxxxxxx

# --- Logging ---
# Log level: DEBUG, INFO, WARNING, ERROR, CRITICAL (Default: INFO)
# LOG_LEVEL=INFO
#
# Optional: view full prompts in DEBUG mode
# DEBUG_FULL_PROMPTS=False
```
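
For reference, here is a minimal sketch of how these variables could be read at runtime using python-dotenv. The actual loader in OmniMCP's config.py may work differently; the variable names come from the .env.example above, while the loading code itself is an assumption for illustration.

```python
# Sketch only: load .env values with python-dotenv and apply the documented defaults.
# OmniMCP's real config.py may use a different mechanism; treat this as illustrative.
import os
from dotenv import load_dotenv

load_dotenv()  # reads .env from the current working directory

ANTHROPIC_API_KEY = os.getenv("ANTHROPIC_API_KEY")
OMNIPARSER_URL = os.getenv("OMNIPARSER_URL")  # unset -> auto-deployment path
AWS_REGION = os.getenv("AWS_REGION", "us-east-1")
LOG_LEVEL = os.getenv("LOG_LEVEL", "INFO")
DOWNSAMPLE_FACTOR = float(os.getenv("OMNIPARSER_DOWNSAMPLE_FACTOR", "1.0"))

if not ANTHROPIC_API_KEY:
    raise RuntimeError("ANTHROPIC_API_KEY is required for planning/reasoning steps")
```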

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/OpenAdaptAI/OmniMCP'
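
The same request can be made from Python with the standard library; this is a small illustrative sketch using the endpoint shown above (the response schema is not documented here, so the JSON is simply pretty-printed).

```python
# Fetch this server's entry from the Glama MCP directory API and print the JSON.
import json
import urllib.request

url = "https://glama.ai/api/mcp/v1/servers/OpenAdaptAI/OmniMCP"
with urllib.request.urlopen(url) as resp:
    data = json.load(resp)

print(json.dumps(data, indent=2))
```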

If you have feedback or need assistance with the MCP directory API, please join our Discord server.