# Configuring the LLM

LLM Evaluators require an LLM in order to score an evaluation input. Given the wide range of providers and SDKs, `phoenix-evals` provides an `LLM` abstraction that delegates LLM calls to an appropriate SDK/API that is already available in your Python environment. The configuration arguments of the SDK client and the LLM call invocation parameters are the same as those of the target SDK, so you won't have to learn another API.

To see the currently supported LLM providers, use the `show_provider_availability` function:

```python
from phoenix.evals.llm import show_provider_availability

show_provider_availability()
# 📦 AVAILABLE PROVIDERS (sorted by client priority)
# --------------------------------------------------------------------
# Provider  | Status      | Client    | Dependencies
# --------------------------------------------------------------------
# azure     | ✓ Available | openai    | openai
# openai    | ✓ Available | openai    | openai
# openai    | ✓ Available | langchain | langchain, langchain_openai
# openai    | ✓ Available | litellm   | litellm
# anthropic | ✓ Available | langchain | langchain, langchain_anthropic
# anthropic | ✓ Available | litellm   | litellm
```

The `provider` column shows the supported providers, and the `status` column reads "Available" if the required dependencies are installed in the active Python environment. Note that multiple client SDKs can be used to make LLM requests to a provider; the desired client SDK can be specified when constructing the `LLM` wrapper:

```python
from phoenix.evals.llm import LLM

LLM(provider="openai", model="gpt-5")  # uses the first available provider SDK
LLM(provider="openai", model="gpt-5", client="litellm")  # uses LiteLLM to make requests
```

## Client Configuration

The `LLM` wrapper can be configured the same way you'd configure the underlying client SDK. For example, when using the OpenAI Python client:

```python
from phoenix.evals.llm import LLM

LLM(provider="openai", model="gpt-5", client="openai", api_key="my-openai-api-key")
```

Similarly, for OpenAI's Azure Python client:

```python
from phoenix.evals.llm import LLM

llm = LLM(
    provider="azure",
    model="gpt-5",
    api_key="your-api-key",
    api_version="api-version",
    base_url="base-url",
)
```

## Unified Interface

The `LLM` wrapper provides a unified interface to common LLM operations: generating text and generating structured output. For more information, refer to the [API Documentation](https://arize-phoenix.readthedocs.io/projects/evals/en/latest/api/evals.html#llm).
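As a minimal sketch of that interface, the example below calls a text-generation method and a structured-output method on the wrapper. The method names (`generate_text`, `generate_object`) and the JSON-schema argument follow the linked API documentation, but the prompt contents and schema shape here are illustrative assumptions; verify the exact signatures against the reference before relying on them.

```python
from phoenix.evals.llm import LLM

llm = LLM(provider="openai", model="gpt-5")

# Plain text generation (method name per the API docs; prompt is illustrative).
text = llm.generate_text(prompt="Summarize the evaluation criteria in one sentence.")
print(text)

# Structured output: pass a JSON schema describing the desired object shape.
# This schema is a hypothetical example, not one defined by phoenix-evals.
schema = {
    "type": "object",
    "properties": {
        "label": {"type": "string"},
        "explanation": {"type": "string"},
    },
    "required": ["label", "explanation"],
}
result = llm.generate_object(
    prompt="Does the response answer the question? Label it and explain briefly.",
    schema=schema,
)
print(result["label"], "-", result["explanation"])
```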
