Add metadata and tags to traces

LangSmith supports sending arbitrary metadata and tags along with traces.

Tags are strings that can be used to categorize or label a trace. Metadata is a dictionary of key-value pairs that can be used to store additional information about a trace.

Both are useful for associating additional information with a trace, such as the environment in which it was executed, the user who initiated it, or an internal correlation ID. For more information on tags and metadata, see the Concepts page. For information on how to query traces and runs by metadata and tags, see the Filter traces in the application page.
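For example, tags and metadata for a production request might look like the following. The keys and values here are purely illustrative, not required names:

tags = ["production", "beta-feature"]
metadata = {
    "environment": "production",
    "user_id": "user-1234",
    "correlation_id": "req-5678",
}

The example below shows how to attach values like these to traces with the Python SDK.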

import openai
from langsmith import traceable
from langsmith.run_trees import RunTree

client = openai.Client()

messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Hello!"}
]

# Use the @traceable decorator with tags and metadata
# Ensure that the LANGCHAIN_TRACING_V2 environment variable is set for @traceable to work
@traceable(
    run_type="llm",
    name="OpenAI Call Decorator",
    tags=["my-tag"],
    metadata={"my-key": "my-value"}
)
def call_openai(
    messages: list[dict], model: str = "gpt-3.5-turbo"
) -> str:
    return client.chat.completions.create(
        model=model,
        messages=messages,
    ).choices[0].message.content

call_openai(
    messages,
    # You can also provide tags and metadata at invocation time
    # via the langsmith_extra parameter
    langsmith_extra={"tags": ["my-other-tag"], "metadata": {"my-other-key": "my-value"}}
)

# Alternatively, you can create a RunTree object with tags and metadata
rt = RunTree(
    run_type="llm",
    name="OpenAI Call RunTree",
    inputs={"messages": messages},
    tags=["my-tag"],
    extra={"metadata": {"my-key": "my-value"}}
)
chat_completion = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=messages,
)
# End and submit the run
rt.end(outputs=chat_completion)
rt.post()
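
In addition to filtering in the application, you can retrieve tagged runs programmatically with the LangSmith SDK. The snippet below is a minimal sketch that assumes a project named "default" and the "my-tag" tag used above; the exact filter grammar is covered on the Filter traces in the application page, so treat the filter string as an assumption to adapt to your setup.

from langsmith import Client

langsmith_client = Client()

# Fetch recent runs that carry the tag attached above.
# The filter string is an assumption based on the documented query grammar;
# adjust the project name and filter to match your project and keys.
runs = langsmith_client.list_runs(
    project_name="default",
    filter='has(tags, "my-tag")',
)
for run in runs:
    print(run.name, run.tags, (run.extra or {}).get("metadata"))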
