2024

LLM Observability Integration

MLOps
Observability · LLM Monitoring · Hugging Face Spaces · Phoenix · Langfuse · Serverless Inference · Multi-agent Systems · Production Monitoring

Initiated and built the first LLM observability integrations on the Hugging Face Hub, leading planning, integration, documentation, release, and communications for both Arize AI's Phoenix and Langfuse.

Deployed Phoenix as a Hugging Face Space, enabling users to trace LLM calls with Arize AI's observability dashboards. Created comprehensive documentation, including a new recipe in the Open-Source AI Cookbook that demonstrates deployment on HF Spaces with persistent storage, LLM tracing via the Serverless Inference API, and multi-agent application monitoring with CrewAI.
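Once a Phoenix Space is running, tracing a Serverless Inference call comes down to pointing an OpenTelemetry tracer at the Space. A minimal sketch, assuming the `arize-phoenix-otel` and `huggingface_hub` packages; the Space URL and model name below are placeholders, and the full walkthrough is the Cookbook recipe:

```python
import os

from huggingface_hub import InferenceClient
from phoenix.otel import register  # from the arize-phoenix-otel package

# Point the OTLP exporter at the Phoenix Space (hypothetical URL).
tracer_provider = register(
    project_name="hf-serverless-demo",
    endpoint="https://your-username-phoenix.hf.space/v1/traces",
)
tracer = tracer_provider.get_tracer(__name__)

client = InferenceClient(token=os.environ["HF_TOKEN"])

# Wrap the Serverless Inference API call in a span so it appears
# in the Phoenix dashboard with model and output attributes.
with tracer.start_as_current_span("chat_completion") as span:
    response = client.chat_completion(
        model="meta-llama/Llama-3.1-8B-Instruct",
        messages=[{"role": "user", "content": "Hello!"}],
        max_tokens=64,
    )
    span.set_attribute("llm.model_name", "meta-llama/Llama-3.1-8B-Instruct")
    span.set_attribute("output.value", response.choices[0].message.content)
```

Instrumenting the client manually like this is the lowest-dependency route; OpenInference auto-instrumentors can replace the explicit span for supported frameworks.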

Integrated Langfuse as a Docker Space on the HF Hub, bringing end-to-end observability and tooling that accelerates development workflows from experimentation through production. The integration includes:

- One-click deployment with persistent storage and integrated OAuth
- Comprehensive tracing of LLM calls, retrieval steps, and agent actions
- Prompt management with versioning
- Evaluation tools for collecting user feedback and running model/prompt evaluations
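As a sketch of what tracing against a Langfuse Space looks like, assuming the v2-style Python SDK client; the Space URL is a placeholder, and the API keys come from the project settings after logging in via OAuth:

```python
import os

from langfuse import Langfuse  # Langfuse Python SDK (v2-style client assumed)

# Host points at the Langfuse Docker Space (hypothetical URL).
langfuse = Langfuse(
    public_key=os.environ["LANGFUSE_PUBLIC_KEY"],
    secret_key=os.environ["LANGFUSE_SECRET_KEY"],
    host="https://your-username-langfuse.hf.space",
)

# One trace per user request; LLM generations, retrieval steps, and
# agent actions hang off it as observations.
trace = langfuse.trace(name="rag-query", user_id="demo-user")
generation = trace.generation(
    model="meta-llama/Llama-3.1-8B-Instruct",
    input=[{"role": "user", "content": "Hello!"}],
)
generation.end(output="Hi there!")

# Attach user feedback as a score, which feeds the evaluation views.
trace.score(name="user-feedback", value=1)

langfuse.flush()  # send buffered events to the Space
```

Because the SDK buffers events, the final `flush()` matters in short-lived scripts; long-running servers can rely on the background exporter instead.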

These integrations address a critical gap in the LLM development ecosystem by making observability tools accessible directly on Hugging Face Spaces, enabling developers to build more robust and reliable LLM applications with proper monitoring and evaluation capabilities.