Traceloop for Go - Setup Guide
This guide shows how to integrate the Traceloop SDK with Middleware for LLM Observability in a Go application. It walks you through installing and initialising the SDK and then, because Go has no auto-instrumentation yet, logging prompts and completions manually.
Before you begin
- Credentials: Have your Middleware UID (for the ingest BaseURL) and Middleware API key (for the Authorization header).
- Manual logging in Go: Traceloop does not auto-instrument Go LLM libraries yet, so you’ll explicitly log prompts and completions (example below).
- Run a real call: After initialising, make at least one LLM request to generate traces you can see in Middleware.
1. Install the SDK
Run this in your terminal to add the Traceloop SDK to your Go project:
go get github.com/traceloop/go-openllmetry/traceloop-sdk
2. Initialise the SDK
In your LLM service, import the SDK and create the client with your tenant URL and headers:
import (
	"context"

	sdk "github.com/traceloop/go-openllmetry/traceloop-sdk"
	"github.com/traceloop/go-openllmetry/traceloop-sdk/config"
)

func main() {
	ctx := context.Background()

	traceloop := sdk.NewClient(config.Config{
		BaseURL: "https://<MW_UID>.middleware.io:443",
		Headers: map[string]string{
			"Authorization":  "<MW_API_KEY>",
			"X-Trace-Source": "traceloop",
		},
	})
	defer func() { traceloop.Shutdown(ctx) }()

	traceloop.Initialize(ctx)
}
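Hardcoding the tenant URL and API key is fine for a first test, but you will likely want to read them from the environment instead. A minimal sketch, assuming you export MW_UID and MW_API_KEY (illustrative variable names, not something the SDK requires) before starting the service:

import (
	"context"
	"fmt"
	"os"

	sdk "github.com/traceloop/go-openllmetry/traceloop-sdk"
	"github.com/traceloop/go-openllmetry/traceloop-sdk/config"
)

func main() {
	ctx := context.Background()

	// MW_UID and MW_API_KEY are assumed to be exported in the environment.
	traceloop := sdk.NewClient(config.Config{
		BaseURL: fmt.Sprintf("https://%s.middleware.io:443", os.Getenv("MW_UID")),
		Headers: map[string]string{
			"Authorization":  os.Getenv("MW_API_KEY"),
			"X-Trace-Source": "traceloop",
		},
	})
	defer func() { traceloop.Shutdown(ctx) }()

	traceloop.Initialize(ctx)
}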
3. Log your prompts
Because Go libraries aren’t auto-instrumented yet, log prompts and completions explicitly around your LLM calls (example with OpenAI):
import (
	"context"
	"fmt"
	"os"

	openai "github.com/sashabaranov/go-openai"
	"github.com/traceloop/go-openllmetry/traceloop-sdk/dto"
)

// callLLM calls OpenAI as usual, then logs the prompt/completion pair.
func callLLM(ctx context.Context) {
	client := openai.NewClient(os.Getenv("OPENAI_API_KEY"))

	// Call OpenAI like you normally would
	request := openai.ChatCompletionRequest{
		Model: openai.GPT3Dot5Turbo,
		Messages: []openai.ChatCompletionMessage{
			{
				Role:    openai.ChatMessageRoleUser,
				Content: "Tell me a joke about OpenTelemetry!",
			},
		},
	}
	resp, err := client.CreateChatCompletion(ctx, request)
	if err != nil {
		fmt.Printf("ChatCompletion error: %v\n", err)
		return
	}

	// Log the request and the response
	log := dto.PromptLogAttributes{
		Prompt: dto.Prompt{
			Vendor: "openai",
			Mode:   "chat",
			Model:  request.Model,
		},
		Completion: dto.Completion{
			Model: resp.Model,
		},
		Usage: dto.Usage{
			TotalTokens:      resp.Usage.TotalTokens,
			CompletionTokens: resp.Usage.CompletionTokens,
			PromptTokens:     resp.Usage.PromptTokens,
		},
	}

	// Record every prompt message...
	for i, message := range request.Messages {
		log.Prompt.Messages = append(log.Prompt.Messages, dto.Message{
			Index:   i,
			Content: message.Content,
			Role:    message.Role,
		})
	}

	// ...and every completion choice
	for _, choice := range resp.Choices {
		log.Completion.Messages = append(log.Completion.Messages, dto.Message{
			Index:   choice.Index,
			Content: choice.Message.Content,
			Role:    choice.Message.Role,
		})
	}

	// traceloop is the SDK client from step 2 (see the wiring sketch below).
	traceloop.LogPrompt(ctx, log)
}
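To tie steps 2 and 3 together, the client needs to be visible to the helper. One way, sketched below under the same assumptions as above, is a package-level client that callLLM logs through; one real request after Initialize is enough to produce traces:

// Package-level client, built exactly as in step 2, so callLLM can use it.
var traceloop = sdk.NewClient(config.Config{
	BaseURL: "https://<MW_UID>.middleware.io:443",
	Headers: map[string]string{
		"Authorization":  "<MW_API_KEY>",
		"X-Trace-Source": "traceloop",
	},
})

func main() {
	ctx := context.Background()
	defer func() { traceloop.Shutdown(ctx) }()

	traceloop.Initialize(ctx)

	// A single real request generates traces you can inspect in Middleware.
	callLLM(ctx)
}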
Viewing your traces
Once initialised and after at least one LLM request, open the LLM Observability section in Middleware to see spans/traces for your Go service.
You’ll get visibility into LLM calls and any external services involved. For more in-depth SDK options, refer to the Traceloop Go documentation.
Troubleshooting
- No traces appearing: Ensure you executed at least one LLM request after initialisation and that your service can reach https://<MW_UID>.middleware.io:443.
- Auth errors: Re-check the Authorization header value (your Middleware API key) and confirm the BaseURL uses your correct tenant UID.
- Nothing logged around prompts/completions: Go needs manual logging, so wrap your LLM calls as shown in the “Log your prompts” example.
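If you suspect a connectivity problem, a quick way to test it is to open a plain TCP connection to the ingest endpoint from the host running your service. This sketch (substitute your tenant UID) only checks network reachability, not authentication:

package main

import (
	"log"
	"net"
	"time"
)

func main() {
	// Substitute your Middleware tenant UID; verifies TCP reachability only.
	conn, err := net.DialTimeout("tcp", "<MW_UID>.middleware.io:443", 5*time.Second)
	if err != nil {
		log.Fatalf("cannot reach the Middleware ingest endpoint: %v", err)
	}
	conn.Close()
	log.Println("ingest endpoint reachable")
}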
Need assistance or want to learn more about using Traceloop with Middleware? Contact our support team at [email protected].