
Okta-mcp-server

.env.sample (1.99 kB)
# Okta tenant details
OKTA_CLIENT_ORGURL=https://dev-1602.okta.com

# Replace with your Okta API token
# You can generate an API token in Okta by going to Admin > Security > API > Tokens
# Read-only permissions are fine currently
OKTA_API_TOKEN=

# Additional settings
LOG_LEVEL=INFO
### Do NOT change this for now ####
OKTA_CONCURRENT_LIMIT=15

# AI Provider Selection. It can be one of the following values:
# vertex_ai, openai, azure_openai, openai_compatible, anthropic
AI_PROVIDER=openai_compatible

# Vertex AI Models (if using Vertex AI)
## or set GOOGLE_APPLICATION_CREDENTIALS environment variable and comment this line
VERTEX_AI_SERVICE_ACCOUNT_FILE=path/to/service-account.json
VERTEX_AI_REASONING_MODEL=gemini-2.5-pro-exp-03-25

# OpenAI Models (if using OpenAI)
OPENAI_API_KEY=
OPENAI_CODING_MODEL=gpt-4o

# Azure OpenAI Models (if using Azure)
AZURE_OPENAI_KEY=your-api-key
AZURE_OPENAI_ENDPOINT=your-endpoint
AZURE_OPENAI_VERSION=2024-07-01-preview
AZURE_OPENAI_REASONING_MODEL=gpt-4

# OpenAI Compatible Configuration
OPENAI_COMPATIBLE_REASONING_MODEL=accounts/fireworks/models/deepseek-v3
OPENAI_COMPATIBLE_BASE_URL=https://api.fireworks.ai/inference/v1
OPENAI_COMPATIBLE_TOKEN=

# Custom LLM / HTTP proxy headers for openai_compatible model (if needed)
CUSTOM_HTTP_HEADERS={"x-ai-organization": "org-0a1b2c3d49j"}

# OpenAI Compatible Configuration - Ollama
#OPENAI_COMPATIBLE_REASONING_MODEL=qwen2.5:latest
#OPENAI_COMPATIBLE_BASE_URL=http://localhost:11434/v1/
#OPENAI_COMPATIBLE_TOKEN=xxxxxx

# Authentication Settings (Optional - only for HTTP transport)
# Set to 'true' to enable JWT Bearer token authentication
ENABLE_AUTH=false

# Choose ONE of these two options:
# Option A: Static public key (for development/testing)
#AUTH_PUBLIC_KEY=

# Option B: JWKS URI (recommended for production)
AUTH_JWKS_URI=https://your-identity-provider.com/.well-known/jwks.json

# Optional JWT validation settings
AUTH_ISSUER=
AUTH_AUDIENCE=okta-mcp-server
AUTH_REQUIRED_SCOPES=
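
If it helps to sanity-check a filled-in copy of this file before starting the server, the short Python sketch below loads it with python-dotenv and confirms that the Okta settings and the variables for the selected AI_PROVIDER are present. This is illustrative only and not part of the okta-mcp-server codebase; the variable names come from the sample above, everything else (the script itself, the check logic) is assumed.

# sanity-check .env -- illustrative sketch, not part of okta-mcp-server
# requires: pip install python-dotenv
import os
from dotenv import load_dotenv

# Variables each provider needs, taken from the sample above
# (the sample does not list separate variables for the "anthropic" provider).
REQUIRED = {
    "vertex_ai": ["VERTEX_AI_REASONING_MODEL"],
    "openai": ["OPENAI_API_KEY", "OPENAI_CODING_MODEL"],
    "azure_openai": ["AZURE_OPENAI_KEY", "AZURE_OPENAI_ENDPOINT",
                     "AZURE_OPENAI_VERSION", "AZURE_OPENAI_REASONING_MODEL"],
    "openai_compatible": ["OPENAI_COMPATIBLE_REASONING_MODEL",
                          "OPENAI_COMPATIBLE_BASE_URL"],
}

load_dotenv(".env")  # copy .env.sample to .env and fill it in first

missing = [v for v in ("OKTA_CLIENT_ORGURL", "OKTA_API_TOKEN") if not os.getenv(v)]
provider = os.getenv("AI_PROVIDER", "")
missing += [v for v in REQUIRED.get(provider, []) if not os.getenv(v)]

if missing:
    raise SystemExit(f"Missing required settings: {', '.join(missing)}")
print(f"Configuration looks complete for AI_PROVIDER={provider}")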
