@arizeai/phoenix-mcp

Official
by Arize-ai

phoenix-support

Access expert assistance with Phoenix and OpenInference, including AI tracing, datasets, experiments, prompt management, and evals. Ideal for troubleshooting, integration questions, and best-practice guidance.

Instructions

Get help with Phoenix and OpenInference.

  • Tracing AI applications via OpenInference and OpenTelemetry

  • Phoenix datasets, experiments, and prompt management

  • Phoenix evals and annotations

Use this tool when you need assistance with Phoenix features, troubleshooting, or best practices.

Expected return: Expert guidance about how to use and integrate Phoenix
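
An MCP client can invoke this tool like any other. Below is a minimal sketch using the @modelcontextprotocol/sdk client; the launch command, client name, and question are illustrative, not part of this server's documentation.

    import { Client } from "@modelcontextprotocol/sdk/client/index.js";
    import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

    // Launch the phoenix-mcp server over stdio (command and args are illustrative).
    const transport = new StdioClientTransport({
      command: "npx",
      args: ["-y", "@arizeai/phoenix-mcp"],
    });

    const client = new Client({ name: "example-client", version: "1.0.0" });
    await client.connect(transport);

    // Call the phoenix-support tool with a question.
    const result = await client.callTool({
      name: "phoenix-support",
      arguments: { query: "How do I trace a LangChain app with OpenInference?" },
    });
    console.log(result.content);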

Input Schema

Name | Required | Description | Default
query | Yes | Your question about Arize Phoenix, OpenInference, or related topics | (none)
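
A conforming request supplies the single required query string; the question below is illustrative:

    {
      "query": "How do I run evals against a Phoenix dataset?"
    }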

Implementation Reference

  • Exports the initializeSupportTools function, which registers the 'phoenix-support' tool on the MCP server, including its input schema and inline handler.
    export const initializeSupportTools = async ({
      server,
    }: {
      server: McpServer;
    }) => {
      server.tool(
        "phoenix-support",
        PHOENIX_SUPPORT_DESCRIPTION,
        {
          query: z
            .string()
            .describe(
              "Your question about Arize Phoenix, OpenInference, or related topics"
            ),
        },
        async ({ query }) => {
          const result = await callRunLLMQuery({ query });
          return {
            content: [
              {
                type: "text",
                text: result,
              },
            ],
          };
        }
      );
    };

  • Inline handler for the phoenix-support tool that invokes callRunLLMQuery and formats the response as MCP content.
    async ({ query }) => {
      const result = await callRunLLMQuery({ query });
      return {
        content: [
          {
            type: "text",
            text: result,
          },
        ],
      };
    }

  • Core helper that creates an MCP client connected to the RunLLM server, calls its 'search' tool with the query, and extracts the text response.
    export async function callRunLLMQuery({
      query,
    }: {
      query: string;
    }): Promise<string> {
      const client = await createRunLLMClient();
      // Call the chat tool with the user's question
      const result = await client.callTool({
        name: "search",
        arguments: {
          query: query,
        },
      });
      // There's usually only one content item, but we'll handle multiple for safety
      if (result.content && Array.isArray(result.content)) {
        const textContent = result.content
          .filter((item) => item.type === "text")
          .map((item) => item.text)
          .join("\n");
        if (textContent) {
          return textContent;
        }
      }
      return "No response received from support";
    }

  • Helper function to create and connect an MCP client to the RunLLM support server.
    async function createRunLLMClient(): Promise<Client> {
      const transport = new StreamableHTTPClientTransport(
        new URL("https://mcp.runllm.com/mcp/"),
        {
          requestInit: {
            headers: {
              "assistant-name": "arize-phoenix",
            },
          },
        }
      );
      const client = new Client({
        name: "runllm-client",
        version: "1.0.0",
      });
      await client.connect(transport);
      return client;
    }

  • Calls initializeSupportTools during server setup, triggering registration of the phoenix-support tool (a setup sketch follows this list).
    initializeSpanTools({ client, server });
    initializeSupportTools({ server });
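
For context, a minimal server setup around that call might look like the sketch below. It assumes the stdio transport; everything other than initializeSupportTools is illustrative.

    import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
    import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";

    // Create the server, register the support tool, and serve over stdio.
    const server = new McpServer({ name: "phoenix-mcp", version: "1.0.0" });
    await initializeSupportTools({ server });

    const transport = new StdioServerTransport();
    await server.connect(transport);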

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/Arize-ai/phoenix'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.