
@arizeai/phoenix-mcp (official, by Arize-ai)

tracing_openai_sessions_tutorial.ipynb (5.94 kB)
<center>
    <p style="text-align:center">
        <img alt="phoenix logo" src="https://raw.githubusercontent.com/Arize-ai/phoenix-assets/9e6101d95936f4bd4d390efc9ce646dc6937fb2d/images/socal/github-large-banner-phoenix.jpg" width="1000"/>
        <br>
        <br>
        <a href="https://arize.com/docs/phoenix/">Docs</a>
        |
        <a href="https://github.com/Arize-ai/phoenix">GitHub</a>
        |
        <a href="https://arize-ai.slack.com/join/shared_invite/zt-2w57bhem8-hq24MB6u7yE_ZF_ilOYSBw#/shared-invite/email">Community</a>
    </p>
</center>
<h1 align="center">Setting Up Sessions</h1>

A session is a sequence of traces representing a single conversation (e.g. a chat thread). Each response is captured as its own trace, but these traces are linked together by belonging to the same session.
To associate traces with one another, you pass a special metadata key whose value is the unique identifier for that thread.

In this tutorial we will set up sessions using OpenAI and OpenInference instrumentation.

> Note: this example requires the `OPENAI_API_KEY` environment variable to be set and assumes you are running the Phoenix server on `localhost:6006`.

First, configure an OpenTelemetry tracer provider that exports spans to Phoenix:

```typescript
import {
  NodeTracerProvider,
  SimpleSpanProcessor,
} from "npm:@opentelemetry/sdk-trace-node";
import { Resource } from "npm:@opentelemetry/resources";
import { OTLPTraceExporter } from "npm:@opentelemetry/exporter-trace-otlp-proto";
import { SEMRESATTRS_PROJECT_NAME } from "npm:@arizeai/openinference-semantic-conventions";
import { diag, DiagConsoleLogger, DiagLogLevel } from "npm:@opentelemetry/api";

// For troubleshooting, set the log level to DiagLogLevel.DEBUG
diag.setLogger(new DiagConsoleLogger(), DiagLogLevel.INFO);

const provider = new NodeTracerProvider({
  resource: new Resource({
    [SEMRESATTRS_PROJECT_NAME]: "openai-node-sessions-example",
  }),
});

provider.addSpanProcessor(
  new SimpleSpanProcessor(
    new OTLPTraceExporter({
      url: "http://localhost:6006/v1/traces",
    }),
  ),
);

provider.register();

console.log("👀 OpenInference initialized");
```

Next, instrument the OpenAI client:

```typescript
import OpenAI from "npm:openai";
import { OpenAIInstrumentation } from "npm:@arizeai/openinference-instrumentation-openai";

const oaiInstrumentor = new OpenAIInstrumentation();
oaiInstrumentor.manuallyInstrument(OpenAI);
```

Finally, wrap each call in an agent span that carries the session ID, and propagate the session through the active context so child spans pick it up too:

```typescript
import { context, trace, type Span } from "npm:@opentelemetry/api";
import { SemanticConventions } from "npm:@arizeai/openinference-semantic-conventions";
import { setSession } from "npm:@arizeai/openinference-core";

const tracer = trace.getTracer("agent");

const client = new OpenAI({
  apiKey: process.env["OPENAI_API_KEY"], // This is the default and can be omitted
});

type Message = { role: "user" | "assistant" | "system"; content: string };

async function assistant(params: {
  messages: Message[];
  sessionId: string;
}) {
  return tracer.startActiveSpan("agent", async (span: Span) => {
    span.setAttribute(SemanticConventions.OPENINFERENCE_SPAN_KIND, "agent");
    span.setAttribute(SemanticConventions.SESSION_ID, params.sessionId);
    span.setAttribute(
      SemanticConventions.INPUT_VALUE,
      params.messages[params.messages.length - 1].content,
    );
    try {
      // This is not strictly necessary but it helps propagate the session ID
      // to all child spans
      return await context.with(
        setSession(context.active(), { sessionId: params.sessionId }),
        async () => {
          // Calls within this block will generate spans with the session ID set
          const chatCompletion = await client.chat.completions.create({
            messages: params.messages,
            model: "gpt-3.5-turbo",
          });
          const response = chatCompletion.choices[0].message;
          span.setAttribute(
            SemanticConventions.OUTPUT_VALUE,
            response.content ?? "",
          );
          span.end();
          return response;
        },
      );
    } catch (e) {
      span.recordException(e as Error);
      span.end();
      throw e;
    }
  });
}

const sessionId = crypto.randomUUID();

let messages: Message[] = [{ role: "user", content: "hi! im Tim" }];

const res = await assistant({
  messages,
  sessionId,
});

// Carry the full history forward; the follow-up question is a *user* message
messages = [
  ...messages,
  { role: "assistant", content: res.content ?? "" },
  { role: "user", content: "What is my name?" },
];

await assistant({
  messages,
  sessionId,
});
```

Both calls share the same `sessionId`, so Phoenix groups their traces into a single session.
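The multi-turn flow above depends on carrying the message history forward between calls with the correct roles, since only the last user message is recorded as the span's input value. A minimal sketch of that bookkeeping in isolation (the `appendTurn` helper and `Message` type are illustrative, not part of the tutorial's API):

```typescript
// Hypothetical helper: maintains the message history a multi-turn
// assistant expects, keeping the role fields consistent per turn.
type Message = { role: "user" | "assistant"; content: string };

function appendTurn(
  history: Message[],
  userText: string,
  assistantReply: string,
): Message[] {
  // Each turn appends the user's message followed by the model's reply,
  // so the next call sees the full conversation so far.
  return [
    ...history,
    { role: "user", content: userText },
    { role: "assistant", content: assistantReply },
  ];
}

let history: Message[] = [];
history = appendTurn(history, "hi! im Tim", "Hello Tim!");
history = appendTurn(history, "What is my name?", "Your name is Tim.");
console.log(history.length); // 4
```

Because the history grows with every turn, the model can answer "What is my name?" from the earlier turn; the session ID only links the traces, it does not carry conversation state.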

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/Arize-ai/phoenix'
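The same endpoint can be queried from code instead of curl. A sketch using the global `fetch` available in Node 18+ and Deno (the `getServer` helper name is ours; the response shape is whatever JSON the API returns):

```typescript
// Sketch: querying the MCP directory API from TypeScript.
const url = "https://glama.ai/api/mcp/v1/servers/Arize-ai/phoenix";

async function getServer(): Promise<unknown> {
  const res = await fetch(url);
  if (!res.ok) throw new Error(`HTTP ${res.status}`);
  return res.json(); // JSON description of the server
}

console.log(url);
```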

If you have feedback or need assistance with the MCP directory API, please join our Discord server.