
@arizeai/phoenix-mcp (Official, by Arize-ai)

tracing_openai_node_tutorial.ipynb (2.81 kB)
{ "cells": [ { "cell_type": "markdown", "metadata": {}, "source": [ "<center>\n", " <p style=\"text-align:center\">\n", " <img alt=\"phoenix logo\" src=\"https://raw.githubusercontent.com/Arize-ai/phoenix-assets/9e6101d95936f4bd4d390efc9ce646dc6937fb2d/images/socal/github-large-banner-phoenix.jpg\" width=\"1000\"/>\n", " <br>\n", " <br>\n", " <a href=\"https://arize.com/docs/phoenix/\">Docs</a>\n", " |\n", " <a href=\"https://github.com/Arize-ai/phoenix\">GitHub</a>\n", " |\n", " <a href=\"https://arize-ai.slack.com/join/shared_invite/zt-2w57bhem8-hq24MB6u7yE_ZF_ilOYSBw#/shared-invite/email\">Community</a>\n", " </p>\n", "</center>\n", "<h1 align=\"center\">OpenAI Node SDK Tracing</h1>\n", "\n", "Let's see how to get started with using the OpenAI Node SDK to trace your LLM calls using Deno. \n", "\n", "> Note: that this example requires the OPENAI_API_KEY environment variable to be set and assumes you are running the Phoenix server on localhost:6006." ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "import { register } from \"npm:@arizeai/phoenix-otel\";\n", "\n", "// Setup OpenTelemetry and point it to your Phoenix\n", "const provider = register({\n", " url: \"http://localhost:6006\",\n", " apiKey: \"your-api-key\",\n", " projectName: \"openai-deno-example\",\n", " batch: false, // turn off batching so we can see results immediately\n", "})" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "import OpenAI from 'npm:openai';\n", "import { OpenAIInstrumentation } from \"npm:@arizeai/openinference-instrumentation-openai\";\n", "\n", "const oaiInstrumentor = new OpenAIInstrumentation();\n", "oaiInstrumentor.manuallyInstrument(OpenAI);" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "const client = new OpenAI({\n", " apiKey: process.env['OPENAI_API_KEY'], // This is the default and can be omitted\n", "});\n", "\n", "async function main() {\n", " try {\n", " const chatCompletion = await client.chat.completions.create({\n", " messages: [{ role: 'user', content: 'Say this is a test' }],\n", " model: 'gpt-3.5-turbo',\n", " });\n", " console.dir(chatCompletion.choices[0].message);\n", " } catch (e) {\n", " console.error(e);\n", " }\n", "}\n", "\n", "await main();" ] } ], "metadata": { "language_info": { "name": "typescript" } }, "nbformat": 4, "nbformat_minor": 2 }
