Traceloop for Ruby - Setup Guide

This guide walks you through wiring the Traceloop SDK to Middleware for LLM Observability in a Ruby app. The core idea: initialize Traceloop in your app, then (because Ruby doesn't have auto-instrumentation yet) log prompts and completions around your LLM calls.

Before you begin

  • Credentials: Have your Middleware UID (for the ingest base_url) and Middleware API key (for the Authorization header).
  • Rails placement: If you use Rails, put the initializer in config/initializers/traceloop.rb.
  • Manual logging: Ruby libraries are not yet auto-instrumented, so you'll need to log prompts and completions explicitly (see the example below).
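
The steps below show placeholders inline; in practice you'll likely want to read the credentials from the environment instead of hard-coding them. A minimal sketch (the MW_UID and MW_API_KEY variable names are assumptions, not required by the SDK):

```ruby
# Read Middleware credentials from the environment, falling back to the
# placeholders used in this guide. The env var names are assumptions.
MW_UID      = ENV.fetch("MW_UID", "<MW_UID>")
MW_API_KEY  = ENV.fetch("MW_API_KEY", "<MW_API_KEY>")
MW_BASE_URL = "https://#{MW_UID}.middleware.io:443"
```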

1. Install the SDK

Run this in your terminal to add the Traceloop SDK to your project:

gem install traceloop-sdk
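
If you manage dependencies with Bundler, you can instead declare the gem in your Gemfile and run bundle install:

```ruby
# Gemfile
gem "traceloop-sdk"
```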

2. Initialize the SDK

Add this to your LLM application to set up the Traceloop tracer (fill in your tenant UID and API key):

require "traceloop/sdk"

traceloop = Traceloop::SDK::Traceloop.new(
  base_url: "https://<MW_UID>.middleware.io:443",
  api_key: "<MW_API_KEY>"
)

# Add headers
traceloop.add_headers({
  "Authorization" => "<MW_API_KEY>",
  "X-Trace-Source" => "traceloop"
})

Note: If you’re on Rails, place this initialization in config/initializers/traceloop.rb.

3. Log Your Prompts

Since Ruby doesn’t auto-instrument LLM libraries yet, wrap your calls and log the prompt/completion (example with OpenAI):

require "openai"

client = OpenAI::Client.new

# This tracks the latency of the call and the response
traceloop.llm_call(provider: "openai", model: "gpt-3.5-turbo") do |tracer|
  # Log the prompt
  tracer.log_prompt(user_prompt: "Tell me a joke about OpenTelemetry")

  # Call OpenAI like you normally would
  response = client.chat(
    parameters: {
      model: "gpt-3.5-turbo",
      messages: [{ role: "user", content: "Tell me a joke about OpenTelemetry" }]
    }
  )

  # Pass the response from OpenAI as-is to log the completion and token usage
  tracer.log_response(response)
end
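
If you make many LLM calls, you can factor this pattern into a small helper so each call site stays tidy. A sketch under stated assumptions: the traced_chat name and its signature are hypothetical, and it deliberately does not rely on the return value of llm_call (which this guide does not document):

```ruby
# Hypothetical helper: logs the prompt, calls the provider, logs the raw
# response, and returns it. `traceloop` and `client` are the objects set
# up in the earlier steps.
def traced_chat(traceloop, client, prompt, model: "gpt-3.5-turbo")
  response = nil
  traceloop.llm_call(provider: "openai", model: model) do |tracer|
    tracer.log_prompt(user_prompt: prompt)
    response = client.chat(
      parameters: {
        model: model,
        messages: [{ role: "user", content: prompt }]
      }
    )
    tracer.log_response(response)
  end
  response
end
```

Call sites then shrink to a single line, e.g. `traced_chat(traceloop, client, "Tell me a joke about OpenTelemetry")`.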

Viewing Your Traces

After setup and at least one LLM request, open the LLM Observability section in Middleware to view spans and traces across your Ruby service and any external calls.

For deeper options, see the Traceloop Ruby documentation.

Troubleshooting

  • No traces appearing: Make sure you executed at least one LLM request after initialization and that the service can reach https://<MW_UID>.middleware.io:443.
  • Auth errors: Re-check the Authorization header (Middleware API key) and your tenant base_url.
  • Nothing around prompts/completions: Ruby requires manual logging; use the "Log Your Prompts" pattern above around every LLM call.
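
To rule out network problems quickly, you can check that the ingest host resolves and accepts TCP connections from your service. A minimal sketch (the example-uid fallback is a placeholder; substitute your real tenant UID):

```ruby
require "uri"
require "socket"

# Build the ingest URL from the tenant UID (placeholder fallback for illustration).
uid    = ENV.fetch("MW_UID", "example-uid")
ingest = URI.parse("https://#{uid}.middleware.io:443")

begin
  # Attempt a plain TCP connection with a short timeout.
  Socket.tcp(ingest.host, ingest.port, connect_timeout: 3) { }
  puts "reachable: #{ingest.host}:#{ingest.port}"
rescue SocketError, SystemCallError => e
  puts "unreachable: #{e.class}: #{e.message}"
end
```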

Need assistance or want to learn more about using Traceloop with Middleware? Contact our support team at [email protected].