
@arizeai/phoenix-mcp

Official
by Arize-ai
llamaindex-workflows-tracing.md (1.86 kB)
---
description: How to use the Python LlamaIndexInstrumentor to trace LlamaIndex Workflows
---

# LlamaIndex Workflows Tracing

[LlamaIndex Workflows](https://www.llamaindex.ai/blog/introducing-workflows-beta-a-new-way-to-create-complex-ai-applications-with-llamaindex) are a subset of the LlamaIndex package designed specifically to support agent development.

{% hint style="success" %}
Our LlamaIndexInstrumentor automatically captures traces for LlamaIndex Workflows agents. If you've already enabled that instrumentor, you do not need to complete the steps below.
{% endhint %}

{% hint style="info" %}
We recommend using `llama_index >= 0.11.0`.
{% endhint %}

## Launch Phoenix

{% include "../../../../phoenix-integrations/.gitbook/includes/sign-up-for-phoenix-sign-up....md" %}

## Install

```bash
pip install openinference-instrumentation-llama_index
```

## Setup

Initialize the LlamaIndexInstrumentor before your application code. This instrumentor traces both LlamaIndex Workflows calls and calls to the general LlamaIndex package.

```python
from openinference.instrumentation.llama_index import LlamaIndexInstrumentor
from phoenix.otel import register

tracer_provider = register()
LlamaIndexInstrumentor().instrument(tracer_provider=tracer_provider)
```

## Run LlamaIndex Workflows

Once LlamaIndex is instrumented, spans are created whenever a workflow or agent is invoked and are sent to the Phoenix server for collection.

## Observe

Now that tracing is set up, all workflow invocations are streamed to your running Phoenix instance for observability and evaluation.

## Resources

* [Example project](https://github.com/Arize-ai/phoenix/tree/main/examples/llamaindex-workflows-research-agent)
* [OpenInference package](https://github.com/Arize-ai/openinference/tree/main/python/instrumentation/openinference-instrumentation-llama-index)
