Overview

Introduction

LLM Observability in Middleware provides comprehensive monitoring and troubleshooting capabilities for your LLM-powered applications. It offers deep insights through metrics, dashboards, and traces, enabling you to optimize performance and quickly resolve issues.

Key Features

  1. Traces: Get detailed, end-to-end tracing of your LLM requests and workflows.

  2. Metrics: Capture essential LLM-specific metrics, such as token usage, request latency, and cost.

  3. Dashboards: Utilize pre-built dashboards for quick insights into your LLM application's performance.

Benefits

  • Enhanced Visibility: Gain deep insights into your LLM application's performance and behavior.

  • Quick Troubleshooting: Identify and resolve issues faster with comprehensive tracing and metrics.

  • Performance Optimization: Use collected data to optimize your LLM applications for better efficiency and cost-effectiveness.

  • Seamless Integration: Easy integration with popular LLM providers and frameworks.

Supported SDKs

Middleware supports two OpenTelemetry-compatible SDKs for LLM Observability: Traceloop and OpenLIT.

Both SDKs extend OpenTelemetry's capabilities to capture LLM-specific data.
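
For illustration, each SDK is enabled with a single initialization call. The Python sketch below shows both options; the endpoint, port, and app name are assumptions for the example, so substitute your own Middleware agent's OTLP endpoint.

```python
# Minimal sketch: enabling either SDK in a Python application.
# The endpoint below is a placeholder; point it at your
# Middleware agent's OTLP endpoint.

# Option 1: Traceloop
from traceloop.sdk import Traceloop

Traceloop.init(app_name="my-llm-app", api_endpoint="http://localhost:9320")

# Option 2: OpenLIT
import openlit

openlit.init(otlp_endpoint="http://localhost:9320")
```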

Feature Comparison: Traceloop vs OpenLIT

Support for individual integrations varies between the two SDKs; consult each SDK's documentation for its current coverage. Between them, the following integrations are covered:

  • LLM Providers: OpenAI, Azure OpenAI, Anthropic, Cohere, Ollama, Mistral AI, HuggingFace, AWS Bedrock, Vertex AI (GCP), Google Generative AI (Gemini), IBM Watsonx AI, Together AI, Aleph Alpha, GPT4All, Groq, ElevenLabs

  • Vector Databases: Chroma, Pinecone, Qdrant, Weaviate, Milvus, Marqo

  • Frameworks: LangChain, LlamaIndex, Haystack, LiteLLM, Embedchain

  • Hardware Support: NVIDIA GPUs

Getting Started

To start using LLM Observability with Middleware:

  1. Choose either Traceloop or OpenLIT SDK based on your requirements.
  2. Instrument your LLM application following the SDK's documentation.
  3. Configure the SDK to send data to your Middleware instance, as shown in the sketch after this list.
  4. Use Middleware's dashboards to monitor and analyze your LLM application's performance.
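
The sketch below illustrates steps 2 and 3 with OpenLIT and an OpenAI chat completion; the Traceloop flow is analogous. The endpoint, model name, and prompt are assumptions for the example, so use your own Middleware agent address and provider credentials.

```python
# Hedged sketch: an OpenAI chat call auto-instrumented by OpenLIT.
# Assumes a Middleware agent is reachable at the endpoint below.
import openlit
from openai import OpenAI

# Step 3: point the SDK at your Middleware agent's OTLP endpoint.
openlit.init(otlp_endpoint="http://localhost:9320")

# Step 2: any OpenAI call made after init is traced automatically.
client = OpenAI()  # reads OPENAI_API_KEY from the environment
response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Summarize observability in one sentence."}],
)
print(response.choices[0].message.content)
# The resulting OpenTelemetry span carries model, token-usage, and
# latency attributes that populate Middleware's LLM dashboards.
```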

For detailed setup instructions, refer to our SDK-specific LLM Observability documentation.

Need assistance or want to learn more about LLM Observability in Middleware? Contact our support team on Slack.