@arizeai/phoenix-mcp

Official, by Arize-ai

phoenix-support

Get expert guidance on using Phoenix for tracing AI applications, managing datasets and prompts, and conducting evaluations with OpenInference.

Instructions

Get help with Phoenix and OpenInference.

  • Tracing AI applications via OpenInference and OpenTelemetry

  • Phoenix datasets, experiments, and prompt management

  • Phoenix evals and annotations

Use this tool when you need assistance with Phoenix features, troubleshooting, or best practices.

Expected return: expert guidance on how to use and integrate Phoenix
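
For illustration, a client-side call might look like the following. This is a minimal sketch using the TypeScript MCP SDK; the launch command and the example question are assumptions, not taken from this page.

    import { Client } from "@modelcontextprotocol/sdk/client/index.js";
    import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

    // Spawn the published server over stdio (assumed launch command).
    const transport = new StdioClientTransport({
      command: "npx",
      args: ["-y", "@arizeai/phoenix-mcp"],
    });

    const client = new Client({ name: "phoenix-support-example", version: "1.0.0" });
    await client.connect(transport);

    // The tool takes a single 'query' string and returns MCP text content.
    const result = await client.callTool({
      name: "phoenix-support",
      arguments: {
        query: "How do I trace a LangChain app with OpenInference?",
      },
    });

    console.log(result.content);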

Input Schema

  • query (string, required, no default): Your question about Arize Phoenix, OpenInference, or related topics
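
For reference, the Zod definition shown under Implementation Reference compiles to roughly the following JSON Schema (the exact output depends on the SDK's Zod-to-JSON-Schema conversion):

    {
      "type": "object",
      "properties": {
        "query": {
          "type": "string",
          "description": "Your question about Arize Phoenix, OpenInference, or related topics"
        }
      },
      "required": ["query"]
    }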

Implementation Reference

  • Handler for the 'phoenix-support' tool. It calls the 'callRunLLMQuery' helper with the user query and returns the result formatted as MCP text content.
    async ({ query }) => {
      const result = await callRunLLMQuery({ query });
      return {
        content: [
          {
            type: "text",
            text: result,
          },
        ],
      };
    }

  • Input schema for the 'phoenix-support' tool, defined with Zod: a single string field 'query' with a description.
    {
      query: z
        .string()
        .describe(
          "Your question about Arize Phoenix, OpenInference, or related topics"
        ),
    },

  • Registration of the 'phoenix-support' tool on the MCP server within the initializeSupportTools function, including the name, the description constant, the input schema, and the inline handler.
    server.tool(
      "phoenix-support",
      PHOENIX_SUPPORT_DESCRIPTION,
      {
        query: z
          .string()
          .describe(
            "Your question about Arize Phoenix, OpenInference, or related topics"
          ),
      },
      async ({ query }) => {
        const result = await callRunLLMQuery({ query });
        return {
          content: [
            {
              type: "text",
              text: result,
            },
          ],
        };
      }
    );

  • Core helper used by the phoenix-support handler. It creates an MCP client for the RunLLM server, calls its 'search' tool with the query, and extracts and returns the text response. The createRunLLMClient helper it depends on is not shown here; see the sketch after this list.
    export async function callRunLLMQuery({
      query,
    }: {
      query: string;
    }): Promise<string> {
      const client = await createRunLLMClient();

      // Call the chat tool with the user's question
      const result = await client.callTool({
        name: "search",
        arguments: {
          query: query,
        },
      });

      // There's usually only one content item, but we'll handle multiple for safety
      if (result.content && Array.isArray(result.content)) {
        const textContent = result.content
          .filter((item) => item.type === "text")
          .map((item) => item.text)
          .join("\n");
        if (textContent) {
          return textContent;
        }
      }

      return "No response received from support";
    }

  • Invocation of initializeSupportTools in the main server setup, which registers the phoenix-support tool among others; a sketch of the surrounding setup follows below.
    initializeSupportTools({ server });
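
The callRunLLMQuery helper above depends on createRunLLMClient, which is not shown on this page. Below is a minimal sketch of what it could look like, assuming RunLLM exposes an MCP endpoint over streamable HTTP; the URL and client name are hypothetical.

    import { Client } from "@modelcontextprotocol/sdk/client/index.js";
    import { StreamableHTTPClientTransport } from "@modelcontextprotocol/sdk/client/streamableHttp.js";

    // Hypothetical endpoint; the real RunLLM MCP URL is not shown on this page.
    const RUNLLM_MCP_URL = "https://mcp.runllm.example/mcp";

    export async function createRunLLMClient(): Promise<Client> {
      const client = new Client({ name: "phoenix-mcp-runllm", version: "1.0.0" });
      const transport = new StreamableHTTPClientTransport(new URL(RUNLLM_MCP_URL));
      await client.connect(transport);
      return client;
    }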
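
For context, the surrounding entry point might look like the sketch below. Everything other than the initializeSupportTools({ server }) call is an assumption; the real server likely registers additional tool groups.

    import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
    import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
    import { initializeSupportTools } from "./supportTools.js"; // hypothetical path

    const server = new McpServer({ name: "@arizeai/phoenix-mcp", version: "1.0.0" });

    // Registers phoenix-support (and any sibling support tools) on the server.
    initializeSupportTools({ server });

    // Serve over stdio so MCP clients can spawn this process directly.
    const transport = new StdioServerTransport();
    await server.connect(transport);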

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/Arize-ai/phoenix'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.