# phoenix-support
Access expert assistance for using Phoenix and OpenInference, including AI tracing, datasets, experiments, prompt management, and evals. Ideal for troubleshooting, integration, and best practices.
## Instructions

Get help with Phoenix and OpenInference, covering:

- Tracing AI applications via OpenInference and OpenTelemetry
- Phoenix datasets, experiments, and prompt management
- Phoenix evals and annotations

Use this tool when you need assistance with Phoenix features, troubleshooting, or best practices.

Expected return: expert guidance on how to use and integrate Phoenix.
## Input Schema
| Name | Required | Description | Default |
|---|---|---|---|
| query | Yes | Your question about Arize Phoenix, OpenInference, or related topics | — |
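As an illustration, an MCP `tools/call` request for this tool might look like the following (the query text here is a made-up example):

```json
{
  "method": "tools/call",
  "params": {
    "name": "phoenix-support",
    "arguments": {
      "query": "How do I export traces from Phoenix to a dataset?"
    }
  }
}
```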
## Implementation Reference
- **js/packages/phoenix-mcp/src/supportTools.ts:75-102 (registration)** — exports the `initializeSupportTools` function, which registers the `phoenix-support` tool on the MCP server, including its schema and inline handler.

  ```typescript
  export const initializeSupportTools = async ({
    server,
  }: {
    server: McpServer;
  }) => {
    server.tool(
      "phoenix-support",
      PHOENIX_SUPPORT_DESCRIPTION,
      {
        query: z
          .string()
          .describe(
            "Your question about Arize Phoenix, OpenInference, or related topics"
          ),
      },
      async ({ query }) => {
        const result = await callRunLLMQuery({ query });
        return {
          content: [
            {
              type: "text",
              text: result,
            },
          ],
        };
      }
    );
  };
  ```
- **Inline handler** (within the registration above) — invokes `callRunLLMQuery` and formats the response as MCP content.

  ```typescript
  async ({ query }) => {
    const result = await callRunLLMQuery({ query });
    return {
      content: [
        {
          type: "text",
          text: result,
        },
      ],
    };
  }
  ```
- **`callRunLLMQuery`** — core helper that creates an MCP client to the RunLLM server, calls its `search` tool with the query, and extracts the text response.

  ```typescript
  export async function callRunLLMQuery({
    query,
  }: {
    query: string;
  }): Promise<string> {
    const client = await createRunLLMClient();

    // Call the chat tool with the user's question
    const result = await client.callTool({
      name: "search",
      arguments: {
        query: query,
      },
    });

    // There's usually only one content item, but we'll handle multiple for safety
    if (result.content && Array.isArray(result.content)) {
      const textContent = result.content
        .filter((item) => item.type === "text")
        .map((item) => item.text)
        .join("\n");
      if (textContent) {
        return textContent;
      }
    }

    return "No response received from support";
  }
  ```
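The text-extraction step above can be sketched as a standalone function; the `ContentItem` type and the `extractText` name below are illustrative, not part of the package. Note the fallback fires both when the result is not an array and when no text items are present.

```typescript
// Mirrors the shape of MCP tool-result content items (assumed for this sketch).
type ContentItem = { type: string; text?: string };

// Join the text of all "text" content items; fall back to a fixed message
// when the result contains no usable text.
function extractText(content: unknown): string {
  if (Array.isArray(content)) {
    const textContent = (content as ContentItem[])
      .filter((item) => item.type === "text")
      .map((item) => item.text)
      .join("\n");
    if (textContent) {
      return textContent;
    }
  }
  return "No response received from support";
}
```

For example, two text items are joined with a newline, while an empty array or a non-array value yields the fallback string.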
- **`createRunLLMClient`** — helper that creates and connects an MCP client to the RunLLM support server.

  ```typescript
  async function createRunLLMClient(): Promise<Client> {
    const transport = new StreamableHTTPClientTransport(
      new URL("https://mcp.runllm.com/mcp/"),
      {
        requestInit: {
          headers: {
            "assistant-name": "arize-phoenix",
          },
        },
      }
    );

    const client = new Client({
      name: "runllm-client",
      version: "1.0.0",
    });
    await client.connect(transport);
    return client;
  }
  ```
- **js/packages/phoenix-mcp/src/index.ts:48-49 (registration)** — calls `initializeSupportTools` during server setup, triggering registration of the `phoenix-support` tool.

  ```typescript
  initializeSpanTools({ client, server });
  initializeSupportTools({ server });
  ```