
Arize-ai/phoenix

Built by Arize-ai β€’ 9,063 stars

What is Arize-ai/phoenix?

An open-source AI observability & evaluation platform for tracing, evaluating, and troubleshooting LLM applications.

How to use Arize-ai/phoenix?

1. Install a compatible MCP client (such as Claude Desktop).
2. Open your configuration settings.
3. Add Arize-ai/phoenix using the following command: npx @modelcontextprotocol/arize-ai-phoenix
4. Restart the client and verify the new tools are active.
πŸ›‘οΈ Scoped (Restricted)
npx @modelcontextprotocol/arize-ai-phoenix --scope restricted
πŸ”“ Unrestricted Access
npx @modelcontextprotocol/arize-ai-phoenix

Key Features

Native MCP Protocol Support
Real-time Tool Activation & Execution
Verified High-performance Implementation
Secure Resource & Context Handling

Optimized Use Cases

Extending AI models with custom local capabilities
Automating system workflows via natural language
Connecting external data sources to LLM context windows

Arize-ai/phoenix FAQ

Q: Is Arize-ai/phoenix safe?

A: Yes. Arize-ai/phoenix follows the standardized Model Context Protocol security patterns and only executes tools with explicit user-granted permissions.

Q: Is Arize-ai/phoenix up to date?

A: Arize-ai/phoenix is currently active in the registry, and its 9,063 stars on GitHub reflect strong community adoption and ongoing support.

Q: Are there any limits for Arize-ai/phoenix?

A: Usage limits depend on the specific MCP server implementation and your system resources. Refer to the official documentation below for technical details.

Official Documentation

View on GitHub
<p align="center"> <a target="_blank" href="https://phoenix.arize.com" style="background:none"> <img alt="phoenix banner" src="https://github.com/Arize-ai/phoenix-assets/blob/main/images/socal/github-large-banner-phoenix-v2.jpg?raw=true" width="auto" height="auto"></img> </a> <br/> <br/> <a href="https://arize.com/docs/phoenix/"> <img src="https://img.shields.io/static/v1?message=Docs&logo=data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAIAAAACACAYAAADDPmHLAAAG4ElEQVR4nO2d4XHjNhCFcTf+b3ZgdWCmgmMqOKUC0xXYrsBOBVEqsFRB7ApCVRCygrMriFQBM7h5mNlwKBECARLg7jeDscamSQj7sFgsQfBL27ZK4MtXsT1vRADMEQEwRwTAHBEAc0QAzBEBMEcEwBwRAHNEAMwRATBnjAByFGE+MqVUMcYOY24GVUqpb/h8VErVKAf87QNFcEcbd4WSw+D6803njHscO5sATmGEURGBiCj6yUlv1uX2gv91FsDViArbcA2RUKF8QhAV8RQc0b15DcOt0VaTE1oAfWj3dYdCBfGGsmSM0XX5HsP3nEMAXbqCeCdiOERQPx9og5exGJ0S4zRQN9KrUupfpdQWjZciure/YIj7K0bjqwTyAHdovA805iqCOg2xgnB1nZ97IvaoSCURdIPG/IHGjTH/YAz/A8KdJai7lBQzgbpx/0Hg6DT18UzWMXxSjMkDrElPNEmKfAbl6znwI3IMU/OCa0/1nfckwWaSbvWYYDnEsvCMJDNckhqu7GCMKWYOBXp9yPGd5kvqUAKf6rkAk7M2SY9QDXdEr9wEOr9x96EiejMFnixBNteDISsyNw7hHRqc22evWcP4vt39O85bzZH30AKg4+eo8cQRI4bHAJ7hyYM3CNHrG9RrimSXuZmUkZjN/O6nAPpcwCcJNmipAle2QM/1GU3vITCXhvY91u9geN/jOY27VuTnYL1PCeAcRhwh7/Bl8Ai+IuxPiOCShtfX/sPDtY8w+sZjby86dw6dBeoigD7obd/Ko6fI4BF8DA9HnGdrcU0fLt+n4dfE6H5jpjYcVdu2L23b5lpjHoo+18FDbcszddF1rUee/4C6ZiO+80rHZmjDoIQUQLdRtm3brkcKIUPjjqVPBIUHgW1GGN4YfawAL2IqAVB8iEE31tvIelARlCPPVaFOLoIupzY6xVcM4MoRUyHXyHhslH6PaPl5RP1Lh4UsOeKR2e8dzC0Aiuvc2Nx3fwhfxf/hknouUYbWUk5GTAIwmOh5e+H0cor8vEL91hfOdEqINLq1AV+RKImJ6869f9tFIBVc6y7gd3lHfWyNX0LEr7EuDElhRdAlQjig0e/RU31xxDltM4pF7IY3pLIgxAhhgzF/iC2M0Hi4dkOGlyGMd/g7dsMbUlsR9ICe9WhxbA3DjRkSdjiHzQzlBSKNJsCzIcUlYdfI0dcWS8LMkPDkcJ0n/O+Qyy/IAtDkSPnp4Fu4WpthQR/zm2VcoI/51fI28iYld9/HEh4Pf7D0Bm845pwIPnHMUJSf45pT5x68s5T9AW6INzhHDeP1BYcNMew5SghkinWOwVnaBhHGG5ybMn70zBDe8buh8X6DqV0Sa/5tWOIOIbcWQ8KBiGBnMb/P0OuTd/lddCrY5jn/VLm3nL+fY4X4YREuv8vS9wh6HSkAExMs0viKySZRd44iyOH2FzPe98Fll7A7GNMmjay4GF9BAKGXesfCN0sRsDG+YrhP4O2ACFgZXzHdKPL2RMJoxc34ivFOod3AMMNUj5XxFfOtYrUIXvB5M
andS+G+V/AzZ+MrEcBPlpoFtUIEwBwRAG+OIgDe1CIA5ogAmCMCYI4IgDkiAOaIAJgjAmCOCIA5IgDmiACYIwJgjgiAOSIA5ogAmCMCYI4IgDkiAOaIAJgjAmCOCIA5IgDmiACYIwJgjgiAOSIA5ogAmCMCYI4IgDkiAOaIAJgjAmDOVYBXvwvxQV8NWJOd0esvJ94babZaz7B5ovldxnlDpYhp0JFr/KTlLKcEMMQKpcDPXIQxGXsYmhZnXAXQh/EWBQrr3bc80mATyyrEvs4+BdBHgbdxFOIhrDkSg1/6Iu2LCS0AyoqI4ftUF00EY/Q3h1fRj2JKAVCMGErmnsH1lfnemEsAlByvgl0z2qx5B8OPCuB8EIMADBlEEOV79j1whNE3c/X2PmISAGUNr7CEmUSUhjfEKgBDAY+QohCiNrwhdgEYzPv7UxkadvBg0RrekMrNoAozh3vLN4DPhc7S/WL52vkoSO1u4BZC+DOCulC0KJ/gqWaP7C8hlSGgjxyCmDuPsEePT/KuasrrAcyr4H+f6fq01yd7Sz1lD0CZ2hs06PVJufs+lrIiyLwufjfBtXYpjvWnWIoHoJSYe4dIK/t4HX1ULFEACkPCm8e8wXFJvZ6y1EWhJkDcWxw7RINzLc74auGrgg8e4oIm9Sh/CA7LwkvHqaIJ9pLI6Lmy1BigDy2EV8tjdzh+8XB6MGSLKH4INsZXDJ8MGhIBK+Mrpo+GnRIBO+MrZjFAFxoTNBwCvj6u4qvSZJiM3iNX4yvmHoA9Sh4PF0QAzBEBMEcEwBwRAHNEAMwRAXBGKfUfr5hKvglRfO4AAAAASUVORK5CYII=&labelColor=grey&color=blue&logoColor=white&label=%20"/> </a> <a target="_blank" href="https://join.slack.com/t/arize-ai/shared_invite/zt-3r07iavnk-ammtATWSlF0pSrd1DsMW7g"> <img src="https://img.shields.io/static/v1?message=Community&logo=slack&labelColor=grey&color=blue&logoColor=white&label=%20"/> </a> <a target="_blank" href="https://bsky.app/profile/arize-phoenix.bsky.social"> <img src="https://img.shields.io/badge/-phoenix-blue.svg?color=blue&labelColor=gray&logo=bluesky"> </a> <a target="_blank" href="https://x.com/ArizePhoenix"> <img src="https://img.shields.io/badge/-ArizePhoenix-blue.svg?color=blue&labelColor=gray&logo=x"> </a> <a target="_blank" href="https://pypi.org/project/arize-phoenix/"> <img src="https://img.shields.io/pypi/v/arize-phoenix?color=blue"> </a> <a target="_blank" href="https://anaconda.org/conda-forge/arize-phoenix"> <img src="https://img.shields.io/conda/vn/conda-forge/arize-phoenix.svg?color=blue"> </a> <a target="_blank" href="https://pypi.org/project/arize-phoenix/"> <img src="https://img.shields.io/pypi/pyversions/arize-phoenix"> </a> <a target="_blank" 
href="https://hub.docker.com/r/arizephoenix/phoenix/tags"> <img src="https://img.shields.io/docker/v/arizephoenix/phoenix?sort=semver&logo=docker&label=image&color=blue"> </a> <a target="_blank" href="https://hub.docker.com/r/arizephoenix/phoenix-helm"> <img src="https://img.shields.io/badge/Helm-blue?style=flat&logo=helm&labelColor=grey"/> </a> <a target="_blank" href="https://github.com/Arize-ai/phoenix/tree/main/js/packages/phoenix-mcp"> <img src="https://badge.mcpx.dev?status=on" title="MCP Enabled"/> </a> <a href="cursor://anysphere.cursor-deeplink/mcp/install?name=phoenix&config=eyJjb21tYW5kIjoibnB4IC15IEBhcml6ZWFpL3Bob2VuaXgtbWNwQGxhdGVzdCAtLWJhc2VVcmwgaHR0cHM6Ly9teS1waG9lbml4LmNvbSAtLWFwaUtleSB5b3VyLWFwaS1rZXkifQ%3D%3D"><img src="https://cursor.com/deeplink/mcp-install-dark.svg" alt="Add Arize Phoenix MCP server to Cursor" height=20 /></a> <img referrerpolicy="no-referrer-when-downgrade" src="https://static.scarf.sh/a.png?x-pxid=8e8e8b34-7900-43fa-a38f-1f070bd48c64&page=README.md" /> </p>

Phoenix is an open-source AI observability platform designed for experimentation, evaluation, and troubleshooting. It provides:

  • Tracing - Trace your LLM application's runtime using OpenTelemetry-based instrumentation.
  • Evaluation - Leverage LLMs to benchmark your application's performance using response and retrieval evals.
  • Datasets - Create versioned datasets of examples for experimentation, evaluation, and fine-tuning.
  • Experiments - Track and evaluate changes to prompts, LLMs, and retrieval.
  • Playground - Optimize prompts, compare models, adjust parameters, and replay traced LLM calls.
  • Prompt Management - Manage and test prompt changes systematically using version control, tagging, and experimentation.

Phoenix is vendor and language agnostic with out-of-the-box support for popular frameworks (OpenAI Agents SDK, Claude Agent SDK, LangGraph, Vercel AI SDK, Mastra, CrewAI, LlamaIndex, DSPy) and LLM providers (OpenAI, Anthropic, Google GenAI, Google ADK, AWS Bedrock, OpenRouter, LiteLLM, and more). For details on auto-instrumentation, check out the OpenInference project.

Phoenix runs practically anywhere, including your local machine, a Jupyter notebook, a containerized deployment, or in the cloud.

Installation

Install Phoenix via pip or conda

pip install arize-phoenix

Phoenix container images are available via Docker Hub and can be deployed using Docker or Kubernetes. Arize AI also provides cloud instances at app.phoenix.arize.com.
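For the container route, a deployment can be sketched with Docker Compose. This is a minimal sketch, not an official manifest: it assumes the public arizephoenix/phoenix image referenced above and Phoenix's default ports (6006 for the UI and OTLP-over-HTTP collector, 4317 for OTLP-over-gRPC); verify both against the Phoenix docs for your version.

```yaml
# docker-compose.yml (sketch; assumed default ports)
services:
  phoenix:
    image: arizephoenix/phoenix:latest
    ports:
      - "6006:6006"   # web UI and OTLP/HTTP collector
      - "4317:4317"   # OTLP/gRPC collector
```

Run it with `docker compose up` and open http://localhost:6006 to reach the Phoenix UI.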

Packages

The arize-phoenix package includes the entire Phoenix platform. However, if you have deployed the Phoenix platform, there are lightweight Python sub-packages and TypeScript packages that can be used in conjunction with the platform.

Python Subpackages

| Package | Version & Docs | Description |
|---|---|---|
| arize-phoenix-otel | PyPI Version, Docs | Provides a lightweight wrapper around OpenTelemetry primitives with Phoenix-aware defaults |
| arize-phoenix-client | PyPI Version, Docs | Lightweight client for interacting with the Phoenix server via its OpenAPI REST interface |
| arize-phoenix-evals | PyPI Version, Docs | Tooling to evaluate LLM applications including RAG relevance, answer relevance, and more |

TypeScript Subpackages

| Package | Version & Docs | Description |
|---|---|---|
| @arizeai/phoenix-otel | NPM Version, Docs | Provides a lightweight wrapper around OpenTelemetry primitives with Phoenix-aware defaults |
| @arizeai/phoenix-client | NPM Version, Docs | Client for the Arize Phoenix API |
| @arizeai/phoenix-evals | NPM Version, Docs | TypeScript evaluation library for LLM applications (alpha release) |
| @arizeai/phoenix-mcp | NPM Version, Docs | MCP server implementation for Arize Phoenix providing a unified interface to Phoenix's capabilities |
| @arizeai/phoenix-cli | NPM Version, Docs | CLI for fetching traces, datasets, and experiments for use with Claude Code, Cursor, and other coding agents |

Tracing Integrations

Phoenix is built on top of OpenTelemetry and is vendor, language, and framework agnostic. For details about tracing integrations and example applications, see the OpenInference project.

Python Integrations

| Integration | Package | Version |
|---|---|---|
| <picture><source media="(prefers-color-scheme: dark)" srcset="https://unpkg.com/@lobehub/icons-static-png@latest/dark/openai.png"><img height="14" src="https://unpkg.com/@lobehub/icons-static-png@latest/light/openai.png"></picture> OpenAI | openinference-instrumentation-openai | PyPI Version |
| <picture><source media="(prefers-color-scheme: dark)" srcset="https://unpkg.com/@lobehub/icons-static-png@latest/dark/openai.png"><img height="14" src="https://unpkg.com/@lobehub/icons-static-png@latest/light/openai.png"></picture> OpenAI Agents | openinference-instrumentation-openai-agents | PyPI Version |
| <img src="https://unpkg.com/@lobehub/icons-static-png@latest/dark/llamaindex-color.png" height="14"> LlamaIndex | openinference-instrumentation-llama-index | PyPI Version |
| DSPy | openinference-instrumentation-dspy | PyPI Version |
| <img src="https://unpkg.com/@lobehub/icons-static-png@latest/dark/bedrock-color.png" height="14"> AWS Bedrock | openinference-instrumentation-bedrock | PyPI Version |
| <img src="https://unpkg.com/@lobehub/icons-static-png@latest/dark/langchain-color.png" height="14"> LangChain | openinference-instrumentation-langchain | PyPI Version |
| <img src="https://unpkg.com/@lobehub/icons-static-png@latest/dark/mistral-color.png" height="14"> MistralAI | openinference-instrumentation-mistralai | PyPI Version |
| <img src="https://unpkg.com/@lobehub/icons-static-png@latest/dark/google-color.png" height="14"> Google GenAI | openinference-instrumentation-google-genai | PyPI Version |
| <img src="https://unpkg.com/@lobehub/icons-static-png@latest/dark/google-color.png" height="14"> Google ADK | openinference-instrumentation-google-adk | PyPI Version |
| Guardrails | openinference-instrumentation-guardrails | PyPI Version |
| <img src="https://unpkg.com/@lobehub/icons-static-png@latest/dark/vertexai-color.png" height="14"> VertexAI | openinference-instrumentation-vertexai | PyPI Version |
| <img src="https://unpkg.com/@lobehub/icons-static-png@latest/dark/crewai-color.png" height="14"> CrewAI | openinference-instrumentation-crewai | PyPI Version |
| Haystack | openinference-instrumentation-haystack | PyPI Version |
| LiteLLM | openinference-instrumentation-litellm | PyPI Version |
| <picture><source media="(prefers-color-scheme: dark)" srcset="https://unpkg.com/@lobehub/icons-static-png@latest/dark/groq.png"><img height="14" src="https://unpkg.com/@lobehub/icons-static-png@latest/light/groq.png"></picture> Groq | openinference-instrumentation-groq | PyPI Version |
| Instructor | openinference-instrumentation-instructor | PyPI Version |
| <picture><source media="(prefers-color-scheme: dark)" srcset="https://unpkg.com/@lobehub/icons-static-png@latest/dark/anthropic.png"><img height="14" src="https://unpkg.com/@lobehub/icons-static-png@latest/light/anthropic.png"></picture> Anthropic | openinference-instrumentation-anthropic | PyPI Version |
| <img src="https://unpkg.com/@lobehub/icons-static-png@latest/dark/huggingface-color.png" height="14"> Smolagents | openinference-instrumentation-smolagents | PyPI Version |
| Agno | openinference-instrumentation-agno | PyPI Version |
| <picture><source media="(prefers-color-scheme: dark)" srcset="https://unpkg.com/@lobehub/icons-static-png@latest/dark/mcp.png"><img height="14" src="https://unpkg.com/@lobehub/icons-static-png@latest/light/mcp.png"></picture> MCP | openinference-instrumentation-mcp | PyPI Version |
| <img src="https://unpkg.com/@lobehub/icons-static-png@latest/dark/pydanticai-color.png" height="14"> Pydantic AI | openinference-instrumentation-pydantic-ai | PyPI Version |
| Autogen AgentChat | openinference-instrumentation-autogen-agentchat | PyPI Version |
| Portkey | openinference-instrumentation-portkey | PyPI Version |
| Agent Spec | openinference-instrumentation-agentspec | PyPI Version |
| <img src="https://unpkg.com/@lobehub/icons-static-png@latest/dark/claude-color.png" height="14"> Claude Agent SDK | openinference-instrumentation-claude-agent-sdk | PyPI Version |

Span Processors

Normalize and convert data across other instrumentation libraries by adding span processors that unify data.

| Package | Description | Version |
|---|---|---|
| openinference-instrumentation-openlit | OpenInference Span Processor for OpenLIT traces | PyPI Version |
| openinference-instrumentation-openllmetry | OpenInference Span Processor for OpenLLMetry (Traceloop) traces | PyPI Version |

JavaScript Integrations

| Integration | Package | Version |
|---|---|---|
| <picture><source media="(prefers-color-scheme: dark)" srcset="https://unpkg.com/@lobehub/icons-static-png@latest/dark/openai.png"><img height="14" src="https://unpkg.com/@lobehub/icons-static-png@latest/light/openai.png"></picture> OpenAI | @arizeai/openinference-instrumentation-openai | NPM Version |
| <img src="https://unpkg.com/@lobehub/icons-static-png@latest/dark/langchain-color.png" height="14"> LangChain.js | @arizeai/openinference-instrumentation-langchain | NPM Version |
| <picture><source media="(prefers-color-scheme: dark)" srcset="https://unpkg.com/@lobehub/icons-static-png@latest/dark/vercel.png"><img height="14" src="https://unpkg.com/@lobehub/icons-static-png@latest/light/vercel.png"></picture> Vercel AI SDK | @arizeai/openinference-vercel | NPM Version |
| BeeAI | @arizeai/openinference-instrumentation-beeai | NPM Version |
| <img src="https://unpkg.com/@lobehub/icons-static-png@latest/dark/claude-color.png" height="14"> Claude Agent SDK | @arizeai/openinference-instrumentation-claude-agent-sdk | NPM Version |
| <picture><source media="(prefers-color-scheme: dark)" srcset="https://unpkg.com/@lobehub/icons-static-png@latest/dark/mastra.png"><img height="14" src="https://unpkg.com/@lobehub/icons-static-png@latest/light/mastra.png"></picture> Mastra | @mastra/arize | NPM Version |
| <picture><source media="(prefers-color-scheme: dark)" srcset="https://unpkg.com/@lobehub/icons-static-png@latest/dark/mcp.png"><img height="14" src="https://unpkg.com/@lobehub/icons-static-png@latest/light/mcp.png"></picture> MCP | @arizeai/openinference-instrumentation-mcp | NPM Version |

Java Integrations

| Integration | Package | Version |
|---|---|---|
| <img src="https://unpkg.com/@lobehub/icons-static-png@latest/dark/langchain-color.png" height="14"> LangChain4j | openinference-instrumentation-langchain4j | Maven Central |
| SpringAI | openinference-instrumentation-springAI | Maven Central |
| Arconia | openinference-instrumentation-springAI | Maven Central |

Platforms

| Platform | Description | Docs |
|---|---|---|
| BeeAI | AI agent framework with built-in observability | Integration Guide |
| <img src="https://unpkg.com/@lobehub/icons-static-png@latest/dark/dify-color.png" height="14"> Dify | Open-source LLM app development platform | Integration Guide |
| Envoy AI Gateway | AI Gateway built on Envoy Proxy for AI workloads | Integration Guide |
| LangFlow | Visual framework for building multi-agent and RAG applications | Integration Guide |
| LiteLLM Proxy | Proxy server for LLMs | Integration Guide |
| Flowise | Visual framework for building LLM applications | Integration Guide |
| Prompt Flow | Microsoft's prompt flow orchestration tool | Integration Guide |
| <img src="https://unpkg.com/@lobehub/icons-static-png@latest/dark/nvidia-color.png" height="14"> NVIDIA NeMo | NVIDIA NeMo Agent Toolkit for enterprise agents | Integration Guide |
| Graphite | Multi-agent LLM workflow framework with visual builder | Integration Guide |

Security & Privacy

We take data security and privacy very seriously. For more details, see our Security and Privacy documentation.

Telemetry

By default, Phoenix collects basic web analytics (e.g., page views, UI interactions) to help us understand how Phoenix is used and improve the product. None of your trace data, evaluation results, or any sensitive information is ever collected.

You can opt out of telemetry by setting the environment variable PHOENIX_TELEMETRY_ENABLED=false.
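The same opt-out can be applied in-process before Phoenix is launched or imported, which is convenient in notebooks and scripts. A minimal sketch using only the environment variable named above:

```python
import os

# Disable Phoenix's anonymous usage analytics for this process and any
# subprocesses it spawns. Set this BEFORE importing or launching Phoenix.
os.environ["PHOENIX_TELEMETRY_ENABLED"] = "false"

print(os.environ["PHOENIX_TELEMETRY_ENABLED"])  # → false
```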

Community

Join our community to connect with thousands of AI builders.

  • 🌍 Join our Slack community.
  • πŸ“š Read our documentation.
  • πŸ’‘ Ask questions and provide feedback in the #phoenix-support channel.
  • 🌟 Leave a star on our GitHub.
  • 🐞 Report bugs with GitHub Issues.
  • 𝕏 Follow us on 𝕏.
  • πŸ—ΊοΈ Check out our roadmap to see where we're heading next.
  • πŸ§‘β€πŸ« Deep dive into everything Agents and LLM Evaluations on Arize's Learning Hubs.

Breaking Changes

See the migration guide for a list of breaking changes.

Copyright, Patent, and License

Copyright 2025 Arize AI, Inc. All Rights Reserved.

Portions of this code are patent protected by one or more U.S. Patents. See the IP_NOTICE.

This software is licensed under the terms of the Elastic License 2.0 (ELv2). See LICENSE.

Global Ranking

Trust Score: 8.5 (MCPHub Index), based on codebase health & activity.

Manual Config

```json
{
  "mcpServers": {
    "arize-ai-phoenix": {
      "command": "npx",
      "args": ["arize-ai-phoenix"]
    }
  }
}
```
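Rather than hand-editing, the server entry above can be merged into an existing client config programmatically. A sketch with the Python standard library; the file name `claude_desktop_config.json` and its location are assumptions that vary by MCP client, so adjust the path for yours:

```python
import json

# Hypothetical config path; Claude Desktop, for example, keeps its MCP
# settings in a file conventionally named claude_desktop_config.json.
config_path = "claude_desktop_config.json"

# The server entry from the manual config above.
entry = {"command": "npx", "args": ["arize-ai-phoenix"]}

try:
    with open(config_path) as f:
        config = json.load(f)
except FileNotFoundError:
    config = {}

# Merge without clobbering any other registered MCP servers.
config.setdefault("mcpServers", {})["arize-ai-phoenix"] = entry

with open(config_path, "w") as f:
    json.dump(config, f, indent=2)

print(config["mcpServers"]["arize-ai-phoenix"]["command"])  # → npx
```

Restart the client afterwards so the new server is picked up.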