
@arizeai/phoenix-mcp by Arize-ai (cookbooks.md)
# More Cookbooks

Trace through the execution of your LLM application to understand its internal structure and to troubleshoot issues with retrieval, tool execution, LLM calls, and more. A minimal tracing setup is sketched after the list of use cases below.

## Use Cases

* [LlamaIndex + OpenAI RAG Application](https://colab.research.google.com/github/Arize-ai/phoenix/blob/main/tutorials/tracing/llama_index_tracing_tutorial.ipynb)
* [LangChain + OpenAI RAG Application](https://colab.research.google.com/github/Arize-ai/phoenix/blob/main/tutorials/tracing/langchain_tracing_tutorial.ipynb)
* [LangChain OpenAI Agent](https://colab.research.google.com/github/Arize-ai/phoenix/blob/main/tutorials/tracing/langchain_agent_tracing_tutorial.ipynb)
* [LlamaIndex OpenAI Agent](https://colab.research.google.com/github/Arize-ai/phoenix/blob/main/tutorials/tracing/llama_index_openai_agent_tracing_tutorial.ipynb)
* [Multilingual Text2Cypher with Custom Evaluation Tracing](https://colab.research.google.com/github/Arize-ai/phoenix/blob/docs/tutorials/tracing/multilingual_text2cypher_evals.ipynb)
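The cookbooks above walk through framework-specific setups in detail. As a quick orientation, the sketch below shows the general pattern they follow: launch Phoenix, register a tracer provider, and instrument your framework so spans for retrieval, tool execution, and LLM calls flow into the Phoenix UI. It assumes the `arize-phoenix` and `openinference-instrumentation-langchain` packages are installed; the project name is a placeholder, and you would swap in the instrumentor for your framework (e.g. LlamaIndex) as the notebooks do.

```python
# Minimal Phoenix tracing sketch (assumes arize-phoenix and
# openinference-instrumentation-langchain are installed).
import phoenix as px
from phoenix.otel import register
from openinference.instrumentation.langchain import LangChainInstrumentor

# Launch a local Phoenix instance to receive and visualize traces.
session = px.launch_app()

# Register an OpenTelemetry tracer provider pointed at Phoenix.
# "rag-demo" is a placeholder project name.
tracer_provider = register(project_name="rag-demo")

# Instrument LangChain so retrieval, tool, and LLM spans are captured.
LangChainInstrumentor().instrument(tracer_provider=tracer_provider)

# Run your application as usual; traces appear in the Phoenix UI.
print(f"Phoenix UI: {session.url}")
```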

## MCP directory API

We provide all the information about MCP servers via our MCP API.

```
curl -X GET 'https://glama.ai/api/mcp/v1/servers/Arize-ai/phoenix'
```
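If you prefer to query the API programmatically, a minimal Python equivalent of the curl call is sketched below. The endpoint URL is taken from the command above; the shape of the JSON response is an assumption and not documented here.

```python
# Sketch of fetching this server's MCP directory entry (assumes `requests` is installed).
import requests

response = requests.get("https://glama.ai/api/mcp/v1/servers/Arize-ai/phoenix")
response.raise_for_status()

# The body is assumed to be a JSON document describing the server.
server_info = response.json()
print(server_info)
```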

If you have feedback or need assistance with the MCP directory API, please join our Discord server.