Setting up Traceloop SDK for Middleware (Next.js)
This guide walks you through setting up the Traceloop SDK with Middleware for LLM Observability in a Next.js environment.
1. Install the SDK
Run one of the following commands in your terminal:
npm
npm install @traceloop/node-server-sdk
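yarn

yarn add @traceloop/node-server-sdk

pnpm

pnpm add @traceloop/node-server-sdk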
2. Initialize the SDK
Step 1: Create instrumentation.ts
Create a file named instrumentation.ts in the root of your project (i.e., outside of the app or pages directory) and add the following code:
export async function register() {
  if (process.env.NEXT_RUNTIME === "nodejs") {
    await import("./instrumentation.node.ts");
  }
}
Step 2: Create instrumentation.node.ts
Create a file named instrumentation.node.ts in the root of your project and add the following code:
import * as traceloop from "@traceloop/node-server-sdk";
import OpenAI from "openai";
// Make sure to import the entire module you want to instrument, like this:
// import * as LlamaIndex from "llamaindex";

traceloop.initialize({
  appName: "YOUR_APPLICATION_NAME",
  apiEndpoint: "https://<MW_UID>.middleware.io:443",
  headers: {
    Authorization: "<MW_API_KEY>",
  },
  disableBatch: true,
  // Pass each imported module you want instrumented:
  instrumentModules: {
    openAI: OpenAI,
    // llamaIndex: LlamaIndex,
  },
});
On Next.js v12 and below, you’ll also need to add the following to your next.config.js:
/** @type {import('next').NextConfig} */
const nextConfig = {
  experimental: {
    instrumentationHook: true,
  },
};

module.exports = nextConfig;
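With the instrumentation hook in place, supported LLM calls made from your server code are traced automatically, and your route files need no Traceloop imports. As a minimal sketch, assuming OPENAI_API_KEY is set in your environment and using a hypothetical route path and model:

// app/api/chat/route.ts (hypothetical path)
import OpenAI from "openai";

const openai = new OpenAI(); // reads OPENAI_API_KEY from the environment

export async function POST(request: Request) {
  const { question } = await request.json();

  // This call is traced automatically because the OpenAI module was passed
  // to instrumentModules in instrumentation.node.ts.
  const completion = await openai.chat.completions.create({
    model: "gpt-4o-mini",
    messages: [{ role: "user", content: question }],
  });

  return Response.json({ answer: completion.choices[0].message.content });
}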
If you're using the Pages Router, instead install the following packages and configure webpack as described in the steps below:

npm

npm install --save-dev node-loader
npm i [email protected]
Step 1: Edit your next.config.js file and add the following webpack configuration:
const nextConfig = {
  webpack: (config, { isServer }) => {
    config.module.rules.push({
      test: /\.node$/,
      loader: "node-loader",
    });
    if (isServer) {
      config.ignoreWarnings = [{ module: /opentelemetry/ }];
    }
    return config;
  },
};

module.exports = nextConfig;
Step 2: On every API route you want to instrument, add the following code at the top of the file:
import * as traceloop from "@traceloop/node-server-sdk";
import OpenAI from "openai";
// Make sure to import the entire module you want to instrument, like this:
// import * as LlamaIndex from "llamaindex";

traceloop.initialize({
  appName: "YOUR_APPLICATION_NAME",
  apiEndpoint: "https://<MW_UID>.middleware.io:443",
  headers: {
    Authorization: "<MW_API_KEY>",
  },
  disableBatch: true,
  // Pass each imported module you want instrumented:
  instrumentModules: {
    openAI: OpenAI,
    // llamaIndex: LlamaIndex,
  },
});
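For illustration, a complete Pages Router route might look like the following minimal sketch (the file name, model, and request shape are hypothetical, and OPENAI_API_KEY is assumed to be set):

// pages/api/completion.ts (hypothetical file name)
import type { NextApiRequest, NextApiResponse } from "next";
import * as traceloop from "@traceloop/node-server-sdk";
import OpenAI from "openai";

// Initialize Traceloop at the top of the file, before the handler runs.
traceloop.initialize({
  appName: "YOUR_APPLICATION_NAME",
  apiEndpoint: "https://<MW_UID>.middleware.io:443",
  headers: { Authorization: "<MW_API_KEY>" },
  disableBatch: true,
  instrumentModules: { openAI: OpenAI },
});

const openai = new OpenAI(); // reads OPENAI_API_KEY from the environment

export default async function handler(
  req: NextApiRequest,
  res: NextApiResponse
) {
  // Traced because the OpenAI module was passed to instrumentModules.
  const completion = await openai.chat.completions.create({
    model: "gpt-4o-mini",
    messages: [{ role: "user", content: req.body.question }],
  });
  res.status(200).json({ answer: completion.choices[0].message.content });
}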
3. Annotate your workflows (Optional)
For complex workflows or chains, you can use Traceloop's methods or decorators to get a better understanding of what's happening:
Functions (async / sync)
import { withWorkflow } from "@traceloop/node-server-sdk";

async function suggestAnswers(question: string) {
  return await withWorkflow({ name: "suggestAnswers" }, () => {
    // ...
  });
}
If you're using an LLM framework like Haystack, LangChain, or LlamaIndex, Traceloop will automatically instrument your code, so there's no need to add annotations manually.
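Alongside withWorkflow, the SDK also provides withTask for tracing sub-steps nested inside a workflow. A minimal sketch, with hypothetical helper functions standing in for your own logic:

import { withWorkflow, withTask } from "@traceloop/node-server-sdk";

// Hypothetical helpers standing in for your own retrieval and generation code.
declare function retrieveDocs(question: string): Promise<string[]>;
declare function generateAnswer(question: string, docs: string[]): Promise<string>;

// Each task appears as a nested span under the workflow in your traces.
async function answerQuestion(question: string) {
  return await withWorkflow({ name: "answerQuestion" }, async () => {
    const docs = await withTask({ name: "retrieveDocs" }, () =>
      retrieveDocs(question)
    );
    return await withTask({ name: "generateAnswer" }, () =>
      generateAnswer(question, docs)
    );
  });
}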
Viewing Your Traces
After setting up the Traceloop SDK with Middleware, you'll be able to view your LLM application traces in the LLM Observability section of Middleware.
This integration provides instant visibility into everything happening within your LLM application, including calls to vector databases and other external services.
For more detailed information on setting up Traceloop with Next.js, please refer to the Traceloop Next.js documentation.
Need assistance or want to learn more about using Traceloop with Middleware? Contact our support team in Slack.