Traceloop for Node.js - Setup Guide

Wire up the Traceloop SDK to Middleware to get end-to-end visibility into your Node.js LLM workflows. This guide walks you through installing and initialising the SDK, adding optional workflow annotations, and confirming that data is flowing into the LLM Observability UI.

Before you begin

You’ll need:

  • A Node.js LLM app you can modify (with administrative privileges).
  • Your Middleware UID (for the tenant ingest endpoint) and Middleware API key (for the Authorization header).
  • Important: In Node.js, import and initialise Traceloop before importing any LLM module (e.g., OpenAI), or you won’t see proper traces.

Quick tip (local/dev): To see traces immediately, you can temporarily disable batching. Use this only for quick verification during development.

1. Install the SDK

Run the following in your project directory to add the Traceloop SDK dependency:

npm install @traceloop/node-server-sdk
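To confirm the dependency was added, you can list it from the same directory (optional):

npm ls @traceloop/node-server-sdk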

2. Initialise the SDK in your app

Place this initialisation at the very start of your app’s entry file—before any LLM provider imports:

import * as traceloop from "@traceloop/node-server-sdk";

traceloop.initialize({
  appName: "YOUR_APPLICATION_NAME",
  apiEndpoint: "https://<MW_UID>.middleware.io:443",
  headers: {
    Authorization: "<MW_API_KEY>",
    "X-Trace-Source": "traceloop",
  },
  resourceAttributes: { "key": "value" },
});
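Because the instrumentation hooks LLM modules as they load, one way to guarantee the ordering is to initialise Traceloop first and only then load the rest of the app. The sketch below assumes an ESM entry file and a hypothetical ./app.js module exporting startApp; adapt the names to your project.

// entry point (ESM): initialise Traceloop before anything that imports an LLM SDK
import * as traceloop from "@traceloop/node-server-sdk";

traceloop.initialize({
  appName: "YOUR_APPLICATION_NAME",
  apiEndpoint: "https://<MW_UID>.middleware.io:443",
  headers: {
    Authorization: "<MW_API_KEY>",
    "X-Trace-Source": "traceloop",
  },
});

// Load the application (and, transitively, the OpenAI/LLM client) only after
// initialisation, so the SDK can instrument those modules as they are imported.
const { startApp } = await import("./app.js"); // hypothetical module
await startApp();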

What these fields mean (at a glance)

  • appName: the service name shown in Middleware.
  • apiEndpoint: your tenant ingest URL (keep :443).
  • headers.Authorization: your Middleware API key.
  • headers["X-Trace-Source"]: identifies Traceloop as the source.
  • resourceAttributes: optional labels (env, team, etc.); see the example below.
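For example, resourceAttributes can carry whatever labels help you slice traces later; the keys below (deployment.environment, team) are illustrative choices, not required names:

traceloop.initialize({
  // ... other parameters ...
  resourceAttributes: {
    "deployment.environment": "staging",
    "team": "ml-platform",
  },
});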

Disable batch sending if you're testing locally and want to see traces immediately:

traceloop.initialize({
  // ... other parameters ...
  disableBatch: true,
});
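To avoid shipping that setting to production by accident, one option is to derive it from the environment. The NODE_ENV check below is the usual Node.js convention, not something the SDK requires:

traceloop.initialize({
  // ... other parameters ...
  // Flush spans immediately everywhere except production builds.
  disableBatch: process.env.NODE_ENV !== "production",
});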

3. (Optional) Annotate workflows for richer traces

Wrap multi-step functions with a workflow helper so traces read clearly and are easy to navigate:

import { withWorkflow } from "@traceloop/node-server-sdk";

async function suggestAnswers(question: string) {
  return await withWorkflow({ name: "suggestAnswers" }, async () => {
    // Your function logic here
  });
}
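To make the wrapper concrete, here is a sketch of the same function calling the OpenAI Node SDK inside the workflow, so the LLM call appears as a child span of the suggestAnswers workflow. The openai dependency, model name, and prompt handling are assumptions for illustration; traceloop.initialize(...) is expected to have already run in your entry file.

import OpenAI from "openai";
import { withWorkflow } from "@traceloop/node-server-sdk";

const openai = new OpenAI(); // reads OPENAI_API_KEY from the environment

async function suggestAnswers(question: string) {
  return await withWorkflow({ name: "suggestAnswers" }, async () => {
    // Auto-instrumented by Traceloop: this call shows up nested under
    // the "suggestAnswers" workflow span.
    const completion = await openai.chat.completions.create({
      model: "gpt-4o-mini", // illustrative model name
      messages: [{ role: "user", content: question }],
    });
    return completion.choices[0].message.content;
  });
}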

If you use Haystack, LangChain, or LlamaIndex, Traceloop can auto-instrument those frameworks, so you may not need manual annotations.
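For instance, assuming the @langchain/openai package is installed and Traceloop was initialised before this import, a plain LangChain call like the one below would already appear in Middleware without any withWorkflow wrapper (the model name is illustrative):

import { ChatOpenAI } from "@langchain/openai";

// Auto-instrumented by Traceloop; no manual annotation needed.
const model = new ChatOpenAI({ model: "gpt-4o-mini" });
const reply = await model.invoke("Summarise our returns policy in one sentence.");
console.log(reply.content);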

View your traces in Middleware

Run your app and trigger at least one LLM request. Open the LLM Observability section in Middleware to see spans and traces. You’ll get instant visibility into LLM calls and dependencies (e.g., vector DBs, external services).

Troubleshooting & Common Pitfalls

  • Nothing shows up in the UI → Ensure the initialisation runs before any LLM SDK import; then actually execute a request after start-up.
  • Auth/tenant issues → Double-check <MW_UID> in apiEndpoint and the exact Authorization header value.
  • Slow/no spans during local tests → Use disableBatch: true briefly to flush spans quickly (dev only).

Need assistance or want to learn more about using Traceloop with Middleware? Contact our support team at [email protected].