Trace with OpenTelemetry
LangSmith can accept traces from OpenTelemetry-based clients. This guide walks through examples of how to log traces to LangSmith this way.
Logging Traces with a basic OpenTelemetry client
This first section covers how to use a standard OpenTelemetry client to log traces to LangSmith.
0. Installation
Install the OpenTelemetry SDK, OpenTelemetry exporter packages, as well as the OpenAI package:
pip install openai opentelemetry-sdk opentelemetry-exporter-otlp
1. Configure your environment
Set up environment variables for the endpoint, substituting your specific values:
export OTEL_EXPORTER_OTLP_ENDPOINT=https://api.smith.langchain.com/otel
export OTEL_EXPORTER_OTLP_HEADERS="x-api-key=<your langsmith api key>"
Optional: Specify a custom project name other than "default"
export OTEL_EXPORTER_OTLP_ENDPOINT=https://api.smith.langchain.com/otel
export OTEL_EXPORTER_OTLP_HEADERS="x-api-key=<your langsmith api key>,LANGSMITH_PROJECT=<project name>"
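If you prefer to configure this in code, you can set the same variables programmatically. A minimal sketch, assuming you set them before the exporter is constructed (the OTLP exporter reads them at creation time; the placeholder values are yours to substitute):
import os

# Must run before OTLPSpanExporter is created, since the exporter
# reads these environment variables at construction time.
os.environ["OTEL_EXPORTER_OTLP_ENDPOINT"] = "https://api.smith.langchain.com/otel"
os.environ["OTEL_EXPORTER_OTLP_HEADERS"] = "x-api-key=<your langsmith api key>,LANGSMITH_PROJECT=<project name>"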
2. Log a trace
This code sets up an OpenTelemetry tracer and exporter that send traces to LangSmith. It then calls OpenAI and sets the required OpenTelemetry attributes on the span.
import os

from openai import OpenAI
from opentelemetry import trace
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter

client = OpenAI(api_key=os.getenv("OPENAI_API_KEY"))

# Exports spans over OTLP/HTTP to the endpoint configured above.
otlp_exporter = OTLPSpanExporter(
    timeout=10,
)

# Register a tracer provider that batches spans before export.
trace.set_tracer_provider(TracerProvider())
trace.get_tracer_provider().add_span_processor(
    BatchSpanProcessor(otlp_exporter)
)
tracer = trace.get_tracer(__name__)

def call_openai():
    model = "gpt-4o-mini"
    with tracer.start_as_current_span("call_open_ai") as span:
        # LangSmith-specific attributes: span kind and custom metadata.
        span.set_attribute("langsmith.span.kind", "LLM")
        span.set_attribute("langsmith.metadata.user_id", "user_123")

        # GenAI attributes describing the request.
        span.set_attribute("gen_ai.system", "OpenAI")
        span.set_attribute("gen_ai.request.model", model)
        span.set_attribute("llm.request.type", "chat")

        messages = [
            {"role": "system", "content": "You are a helpful assistant."},
            {
                "role": "user",
                "content": "Write a haiku about recursion in programming."
            }
        ]

        # Record each prompt message as an indexed attribute pair.
        for i, message in enumerate(messages):
            span.set_attribute(f"gen_ai.prompt.{i}.content", str(message["content"]))
            span.set_attribute(f"gen_ai.prompt.{i}.role", str(message["role"]))

        completion = client.chat.completions.create(
            model=model,
            messages=messages
        )

        # Record the response and token usage on the span.
        span.set_attribute("gen_ai.response.model", completion.model)
        span.set_attribute("gen_ai.completion.0.content", str(completion.choices[0].message.content))
        span.set_attribute("gen_ai.completion.0.role", "assistant")
        span.set_attribute("gen_ai.usage.prompt_tokens", completion.usage.prompt_tokens)
        span.set_attribute("gen_ai.usage.completion_tokens", completion.usage.completion_tokens)
        span.set_attribute("gen_ai.usage.total_tokens", completion.usage.total_tokens)

        return completion.choices[0].message

if __name__ == "__main__":
    call_openai()
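Because the BatchSpanProcessor exports spans asynchronously, a short-lived script can exit before the batch is sent. The SDK normally flushes at interpreter exit, but if traces are missing you can flush explicitly. A minimal sketch using the SDK's force_flush:
# Block until all queued spans have been exported.
trace.get_tracer_provider().force_flush()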
Once the script runs, you should see the resulting trace in your LangSmith project.
Logging Traces with the Traceloop SDK
The Traceloop SDK is an OpenTelemetry-compatible SDK that covers a range of models, vector databases, and frameworks. If it covers an integration you want to instrument, you can use this SDK with OpenTelemetry to log traces to LangSmith.
To get started, follow these steps:
0. Installation
pip install traceloop-sdk openai
1. Configure your environment
Set up environment variables:
export TRACELOOP_BASE_URL=https://api.smith.langchain.com/otel
export TRACELOOP_HEADERS="x-api-key=<your_langsmith_api_key>"
Optional: Specify a custom project name other than "default"
export TRACELOOP_HEADERS="x-api-key=<your_langsmith_api_key>,LANGSMITH_PROJECT=<langsmith_project_name>"
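Depending on your traceloop-sdk version, you may also be able to pass the endpoint and headers directly to Traceloop.init instead of using environment variables. A sketch under that assumption (parameter names worth verifying against your installed version):
from traceloop.sdk import Traceloop

# Assumed init parameters; check the documentation for your
# traceloop-sdk version before relying on these names.
Traceloop.init(
    api_endpoint="https://api.smith.langchain.com/otel",
    headers={"x-api-key": "<your_langsmith_api_key>"},
)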
2. Initialize the SDK
To use the SDK, you need to initialize it before logging traces:
from traceloop.sdk import Traceloop
Traceloop.init()
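By default the SDK batches spans before exporting them. In notebooks or short-lived scripts you may want each span exported immediately; a minimal sketch, assuming your traceloop-sdk version supports the disable_batch flag:
from traceloop.sdk import Traceloop

# disable_batch sends each span as it finishes instead of batching.
Traceloop.init(disable_batch=True)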
3. Log a trace
Here is a complete example using an OpenAI chat completion:
import os

from openai import OpenAI
from traceloop.sdk import Traceloop

client = OpenAI(api_key=os.getenv("OPENAI_API_KEY"))

# Initialize Traceloop before making the calls you want traced.
Traceloop.init()

completion = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {
            "role": "user",
            "content": "Write a haiku about recursion in programming."
        }
    ]
)

print(completion.choices[0].message)
Once the script runs, you should see the resulting trace in your LangSmith project.
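To group multiple calls under a single parent trace, the Traceloop SDK also provides workflow and task decorators. A minimal sketch (the workflow name here is hypothetical):
from traceloop.sdk.decorators import workflow

@workflow(name="haiku_pipeline")  # hypothetical name for the parent trace
def generate_haiku():
    # Instrumented calls made inside the decorated function are nested
    # under the "haiku_pipeline" trace in LangSmith.
    return client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": "Write a haiku about recursion."}],
    )

generate_haiku()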