MCP Prompt Cleaner

by Da-Colon
config.py (779 B)
from typing import Optional

from pydantic_settings import BaseSettings, SettingsConfigDict


class Settings(BaseSettings):
    """Application settings loaded from environment variables."""

    # LLM API Configuration
    llm_api_endpoint: str = "https://api.openai.com/v1/chat/completions"
    llm_api_key: Optional[str] = None  # Set via LLM_API_KEY env var
    llm_model: str = "gpt-4"           # Set via LLM_MODEL env var
    llm_timeout: float = 60.0          # Request timeout in seconds
    llm_max_tokens: int = 600          # Maximum tokens to generate

    # Retry Configuration
    content_max_retries: int = 2  # Max retries for content extraction/validation

    # In pydantic-settings v2, SettingsConfigDict (not pydantic's ConfigDict)
    # is the config type that recognizes env_file and case_sensitive.
    model_config = SettingsConfigDict(env_file=".env", case_sensitive=False)


# Global settings instance
settings = Settings()
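With `case_sensitive=False`, each field can be overridden by an environment variable of the same name in any casing (e.g. `LLM_MODEL` overrides the `gpt-4` default). The sketch below illustrates that precedence for a single field using only the standard library; the helper `resolve_llm_model` is a hypothetical stand-in for what pydantic-settings does internally, not part of the project's code.

```python
import os

def resolve_llm_model(default: str = "gpt-4") -> str:
    # Mimics the field-resolution order pydantic-settings applies:
    # an environment variable, if present, wins over the declared default.
    # case_sensitive=False means LLM_MODEL and llm_model both match,
    # so we check both spellings here.
    return os.environ.get("LLM_MODEL", os.environ.get("llm_model", default))

print(resolve_llm_model())  # default when no env var is set
os.environ["LLM_MODEL"] = "gpt-4o"
print(resolve_llm_model())  # now reflects the override
```

Values in a local `.env` file are read the same way, with real environment variables taking precedence over file entries.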

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/Da-Colon/mcp-py-prompt-cleaner'
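The same request can be made from Python with the standard library. This is a minimal sketch; the JSON shape of the response is not documented here, so inspect it before relying on specific fields.

```python
from urllib.request import Request, urlopen
import json

url = "https://glama.ai/api/mcp/v1/servers/Da-Colon/mcp-py-prompt-cleaner"
req = Request(url, method="GET")

# Uncomment to perform the request over the network:
# with urlopen(req, timeout=10) as resp:
#     data = json.load(resp)
#     print(data)
```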

If you have feedback or need assistance with the MCP directory API, please join our Discord server.