@arizeai/openinference-instrumentation-mcp
This package provides OpenInference instrumentation for the Model Context Protocol (MCP). Built on OpenTelemetry, it propagates trace context across the MCP client-server boundary so that a single trace can span both sides of an MCP connection. This lets developers observe and debug MCP-based AI applications in OpenInference-compatible tools such as Arize Phoenix.
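A minimal setup sketch, assuming the package follows the usual OpenInference pattern of exporting an instrumentation class (here called `MCPInstrumentation`) that is registered alongside an OpenTelemetry tracer provider; names and exact exports should be checked against the package's own documentation.

```typescript
// Hypothetical setup: the MCPInstrumentation export is assumed based on
// the naming convention used by other @arizeai/openinference packages.
import { NodeTracerProvider } from "@opentelemetry/sdk-trace-node";
import { registerInstrumentations } from "@opentelemetry/instrumentation";
import { MCPInstrumentation } from "@arizeai/openinference-instrumentation-mcp";

// Set up a tracer provider; in practice you would also configure an
// exporter pointing at your OpenInference-compatible backend.
const provider = new NodeTracerProvider();
provider.register();

// Register the MCP instrumentation so trace context is propagated
// across MCP client/server calls.
registerInstrumentations({
  instrumentations: [new MCPInstrumentation()],
});
```

Registration should happen before the MCP client or server modules are loaded, so that the instrumentation can patch them.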
Installation Instructions
Get started with @arizeai/openinference instrumentation in your project
Install via npm
npm install @arizeai/openinference-instrumentation-mcp

Or with yarn

yarn add @arizeai/openinference-instrumentation-mcp

Related Servers
@heroku/mcp-server
The @heroku/mcp-server is a Model Context Protocol (MCP) server that enables AI assistants like Claude to interact with the Heroku platform. It gives AI applications a standardized way to access and manage Heroku-hosted applications and platform resources, helping developers build more capable, data-driven AI workflows.
@cap-js/mcp-server
Model Context Protocol (MCP) server for AI-assisted development of CAP applications.
@dynatrace-oss/dynatrace
Model Context Protocol (MCP) server for Dynatrace