Traceloop for Python - Setup Guide
Get end-to-end visibility into your Python LLM workflows by wiring the Traceloop SDK to Middleware. This page shows the exact install, initialisation, and (optional) workflow annotations you need, plus how to confirm data is flowing into the LLM Observability UI.
Before you Begin
You’ll need:
- A Python LLM app you can modify (with admin privileges)
- Your Middleware UID (for the endpoint) and Middleware API key (for the `Authorization` header)
- Network egress from your app to `https://<MW_UID>.middleware.io:443` (if traces don't appear, connectivity is a common culprit)
Quick tip: When testing locally, you can see traces sooner by disabling batching (shown below). Use this only for local/dev checks.
1. Install the SDK
Run in your terminal:

```bash
pip install traceloop-sdk
```

2. Initialise the SDK in your app
Add the tracer initialisation to your LLM application. Replace placeholders with your app name, Middleware UID, and API key.
```python
from traceloop.sdk import Traceloop

Traceloop.init(
    app_name="YOUR_APPLICATION_NAME",
    api_endpoint="https://<MW_UID>.middleware.io:443",
    headers={
        "Authorization": "<MW_API_KEY>",
        "X-Trace-Source": "traceloop",
    },
    resource_attributes={"key": "value"},
)
```

Local testing: see traces immediately
Batching is efficient in production, but for quick local verification, you can disable it:
```python
Traceloop.init(
    # ... other parameters ...
    disable_batch=True,
)
```

3. (Optional) Annotate workflows for richer traces
For multi-step chains and complex flows, annotate functions as workflows to make traces easier to read.
```python
from traceloop.sdk.decorators import workflow

@workflow(name="suggest_answers")
def suggest_answers(question: str):
    # Your function logic here
    pass
```

- For async functions, use `@aworkflow`.
- If you already use Haystack, LangChain, or LlamaIndex, Traceloop can auto-instrument them; you may not need manual annotations.
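The async variant works the same way as the synchronous example above. Here is a minimal sketch using the SDK's `@aworkflow` decorator; the function name and body are our own illustration, and the try/except fallback to a no-op decorator exists only so the sketch runs even where traceloop-sdk is not installed:

```python
import asyncio

try:
    from traceloop.sdk.decorators import aworkflow
except ImportError:
    # Fallback no-op decorator so this sketch runs without traceloop-sdk;
    # in a real app, the import above should succeed.
    def aworkflow(name=None):
        def wrap(fn):
            return fn
        return wrap

@aworkflow(name="suggest_answers_async")
async def suggest_answers_async(question: str):
    # Awaited LLM calls made here are traced under this workflow span.
    await asyncio.sleep(0)  # placeholder for real async work
    return f"drafted answer for: {question}"

result = asyncio.run(suggest_answers_async("What is tracing?"))
```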
View your traces in Middleware
Once your app is running and making LLM calls, open the LLM Observability section in the Middleware UI to see traces. You’ll get instant visibility into the LLM path, including calls to vector databases and other external services.
More detail: See the official Traceloop Python SDK docs if you want to dive into SDK-specific options and patterns.
Troubleshooting & Common Pitfalls
- No traces appearing:
  - Ensure your app is actually executing LLM calls after initialisation.
  - Check connectivity to `https://<MW_UID>.middleware.io:443`.
  - For local runs, try `disable_batch=True` to flush spans quickly.
- Wrong tenant/endpoint: Double-check the `<MW_UID>` value in `api_endpoint`. A typo sends data nowhere useful.
- Auth issues: Verify the `Authorization` header is set to your Middleware API key exactly as shown.
- Too little detail in traces: Add `@workflow` (or `@aworkflow`) to key steps, or rely on auto-instrumentation if you use Haystack / LangChain / LlamaIndex.
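Several of these pitfalls stem from hand-typed endpoint and header strings. One way to reduce them is to assemble the init parameters from environment variables in a single place; a minimal sketch, where the variable names `MW_UID` and `MW_API_KEY` are our own convention rather than anything the SDK requires:

```python
import os

# Read tenant UID and API key from the environment; the fallback values
# here are placeholders for illustration only.
mw_uid = os.environ.get("MW_UID", "<MW_UID>")
mw_api_key = os.environ.get("MW_API_KEY", "<MW_API_KEY>")

init_kwargs = {
    "app_name": "YOUR_APPLICATION_NAME",
    "api_endpoint": f"https://{mw_uid}.middleware.io:443",
    "headers": {
        "Authorization": mw_api_key,
        "X-Trace-Source": "traceloop",
    },
}

# Then initialise exactly as in step 2:
# Traceloop.init(**init_kwargs)
```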
Need assistance or want to learn more about using Traceloop with Middleware? Contact our support team at [email protected].