
Tracing Quick Start

You can get started with LangSmith tracing using LangChain, the Python SDK, the TypeScript SDK, or the API. The following sections provide a quick start guide for each of these options.

First, create an API key by navigating to the settings page, then follow the instructions below:

1. Install the LangSmith library

Start by installing the Python library.

pip install langsmith

2. Configure your environment

export LANGCHAIN_TRACING_V2=true
export LANGCHAIN_API_KEY=<your-api-key>

# The example below uses the OpenAI API, though LangSmith tracing does not require it
export OPENAI_API_KEY=<your-openai-api-key>
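
If you prefer to configure these from within Python rather than in your shell (for example, in a notebook), a minimal sketch using os.environ is shown below; set the variables before the first traced call, and avoid hard-coding real keys outside local experiments:

import os

# Set these before any traced code runs; the values shown are placeholders.
os.environ["LANGCHAIN_TRACING_V2"] = "true"
os.environ["LANGCHAIN_API_KEY"] = "<your-api-key>"
os.environ["OPENAI_API_KEY"] = "<your-openai-api-key>"  # only needed for the OpenAI example below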

3. Log a trace

We provide multiple ways to log traces to LangSmith. Below, we'll highlight how to use our simple @traceable decorator; see the Integrations section for more. Note that the example below also requires the openai package (pip install openai).

import openai
from langsmith.wrappers import wrap_openai
from langsmith import traceable

# Auto-trace LLM calls in-context
client = wrap_openai(openai.Client())

@traceable # Auto-trace this function
def pipeline(user_input: str):
    result = client.chat.completions.create(
        messages=[{"role": "user", "content": user_input}],
        model="gpt-4o-mini"
    )
    return result.choices[0].message.content

pipeline("Hello, world!")
# Out: Hello there! How can I assist you today?
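
The decorator is not limited to functions that call an LLM; any Python function can be traced, and nested @traceable calls are recorded as child runs within the same trace. Here is a minimal sketch, assuming the same environment variables as above (the function names are illustrative only):

from langsmith import traceable

@traceable  # child run
def format_prompt(subject: str) -> str:
    return f"Tell me a short fact about {subject}."

@traceable  # parent run; the call to format_prompt appears nested under it
def answer(subject: str) -> str:
    prompt = format_prompt(subject)
    return f"(pretend LLM answer to: {prompt})"

answer("otters")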

4. View the trace

By default, the trace will be logged to the project named default. You can change the project you log to by following the instructions here. An example of a trace logged using the above code has been made public and can be viewed here.
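
For instance, one common way to switch projects is the LANGCHAIN_PROJECT environment variable, which the SDK reads when logging traces (check the linked instructions for the full set of options); a minimal sketch, with an illustrative project name:

import os

# Traces logged after this point go to "my-quickstart-project" instead of "default".
os.environ["LANGCHAIN_PROJECT"] = "my-quickstart-project"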

