
Customizing Run Names


Every LangSmith run receives a name. This name is visible in the UI and can be used later to query for a particular run. In the context of tracing chains constructed with LangChain, the default run name is derived from the class name of the invoked object.

For runs categorized as "Chain", the name can be configured by calling the runnable object's with_config({"run_name": "My Run Name"}) method. This guide illustrates its application through several examples.

Note: Only chains and general runnables support custom naming; LLMs, chat models, prompts, and retrievers do not.
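Conceptually, with_config returns a copy of the runnable with the new configuration merged in rather than mutating the original, and the configured run_name takes precedence over the default name. A minimal pure-Python sketch of that pattern (illustrative only, not LangChain's actual implementation):

```python
class NamedRunnable:
    """Toy stand-in for a runnable that carries a config dict (illustrative names)."""

    def __init__(self, fn, config=None):
        self.fn = fn
        self.config = config or {}

    def with_config(self, config):
        # Return a copy with the merged config, leaving the original untouched.
        return NamedRunnable(self.fn, {**self.config, **config})

    @property
    def run_name(self):
        # Fall back to the wrapped callable's own name, as LangChain does by default.
        return self.config.get("run_name", self.fn.__name__)

    def invoke(self, value):
        return self.fn(value)


step = NamedRunnable(str.upper)
named = step.with_config({"run_name": "Shout"})
print(step.run_name)   # default: "upper"
print(named.run_name)  # configured: "Shout"
```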

Begin by installing the latest version of LangChain.

# %pip install -U langchain --quiet
import os

# Update with your API URL if using a hosted instance of LangSmith.
os.environ["LANGCHAIN_ENDPOINT"] = "https://api.smith.langchain.com"
os.environ["LANGCHAIN_API_KEY"] = "YOUR API KEY" # Update with your API key
os.environ["LANGCHAIN_TRACING_V2"] = "true"
project_name = "YOUR PROJECT NAME" # Update with your project name
os.environ["LANGCHAIN_PROJECT"] = project_name # Optional: "default" is used if not set
from langsmith import Client

client = Client()

Example 1: Simple Chain

from langchain import chat_models, prompts, callbacks, schema

chain = (
    prompts.ChatPromptTemplate.from_template("Reverse the following string: {text}")
    | chat_models.ChatOpenAI()
).with_config({"run_name": "StringReverse"})

with callbacks.collect_runs() as cb:
    for chunk in chain.stream({"text": "🌑🌒🌓🌔🌕"}):
        print(chunk.content, flush=True, end="")
    run = cb.traced_runs[0]


This will result in a trace that looks something like the following:


If you inspect the run object, you can see the run name is now "StringReverse". You can query within a project for runs with this name to see all the times this chain was used. Do so using the filter syntax eq(name, "MyRunName").
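If you query by run name in several places, it can help to build the filter expression in one spot. A small hypothetical helper (name_filter is not part of the LangSmith SDK, just an illustrative function):

```python
def name_filter(run_name: str) -> str:
    # Hypothetical helper: builds the LangSmith filter expression
    # for an exact run-name match, e.g. eq(name, "StringReverse").
    return f'eq(name, "{run_name}")'


print(name_filter("StringReverse"))  # eq(name, "StringReverse")
```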

print(f"Saved name {run.name}")
Saved name StringReverse
# List with the name filter to get runs with the assigned name
next(client.list_runs(project_name=project_name, filter='eq(name, "StringReverse")'))
Run(id=UUID('1960c94c-f860-4403-a1bd-3ebdd29099e0'), name='StringReverse', start_time=datetime.datetime(2024, 2, 13, 2, 33, 19, 580185), run_type='chain', end_time=datetime.datetime(2024, 2, 13, 2, 33, 20, 427903), extra={'runtime': {'sdk': 'langsmith-py', 'sdk_version': '0.0.87', 'library': 'langchain-core', 'platform': 'macOS-14.0-arm64-arm-64bit', 'runtime': 'python', 'py_implementation': 'CPython', 'runtime_version': '3.11.5', 'langchain_version': '0.1.6', 'thread_count': 15.0, 'mem': {'rss': 71892992.0}, 'cpu': {'time': {'sys': 2.74464128, 'user': 1.438423424}, 'ctx_switches': {'voluntary': 12209.0, 'involuntary': 0.0}, 'percent': 0.0}, 'library_version': '0.1.22'}, 'metadata': {}}, error=None, serialized={'id': ['langchain', 'schema', 'runnable', 'RunnableSequence'], 'kwargs': {'first': {'id': ['langchain', 'prompts', 'chat', 'ChatPromptTemplate'], 'kwargs': {'input_variables': ['text'], 'messages': [{'id': ['langchain', 'prompts', 'chat', 'HumanMessagePromptTemplate'], 'kwargs': {'prompt': {'id': ['langchain', 'prompts', 'prompt', 'PromptTemplate'], 'kwargs': {'input_variables': ['text'], 'partial_variables': {}, 'template': 'Reverse the following string: {text}', 'template_format': 'f-string'}, 'lc': 1, 'type': 'constructor'}}, 'lc': 1, 'type': 'constructor'}], 'partial_variables': {}}, 'lc': 1, 'type': 'constructor'}, 'last': {'id': ['langchain', 'chat_models', 'openai', 'ChatOpenAI'], 'kwargs': {'openai_api_key': {'id': ['OPENAI_API_KEY'], 'lc': 1, 'type': 'secret'}}, 'lc': 1, 'type': 'constructor'}, 'middle': [], 'name': None}, 'lc': 1, 'type': 'constructor'}, events=[{'name': 'start', 'time': '2024-02-13T02:33:19.580185+00:00'}, {'name': 'end', 'time': '2024-02-13T02:33:20.427903+00:00'}], inputs={'text': '🌑🌒🌓🌔🌕'}, outputs={'output': {'content': '🌕🌔🌓🌒🌑', 'additional_kwargs': {}, 'type': 'AIMessageChunk', 'example': False}}, reference_example_id=None, parent_run_id=None, tags=[], session_id=UUID('0c870ddb-53b3-4717-918f-8415aa308fe7'), 
child_run_ids=[UUID('5014b13d-b1a7-4ed7-8cb7-98e9a8929175'), UUID('7c549e33-bc49-4797-be96-4637383f2d19')], child_runs=None, feedback_stats=None, app_path='/o/ebbaf2eb-769b-4505-aca2-d11de10372a4/projects/p/0c870ddb-53b3-4717-918f-8415aa308fe7/r/1960c94c-f860-4403-a1bd-3ebdd29099e0?trace_id=1960c94c-f860-4403-a1bd-3ebdd29099e0&start_time=2024-02-13T02:33:19.580185', manifest_id=None, status='success', prompt_tokens=27, completion_tokens=15, total_tokens=42, first_token_time=datetime.datetime(2024, 2, 13, 2, 33, 20, 302698), parent_run_ids=[], trace_id=UUID('1960c94c-f860-4403-a1bd-3ebdd29099e0'), dotted_order='20240213T023319580185Z1960c94c-f860-4403-a1bd-3ebdd29099e0')

Example 2: Runnable Lambda

LangChain's RunnableLambdas are custom functions that can be invoked, batched, streamed, and/or transformed.

By default (in langchain versions >= 0.0.283), the name of the lambda is the function name. You can customize this by calling with_config({"run_name": "My Run Name"}) on the runnable lambda object.

from langchain.schema.output_parser import StrOutputParser


def reverse_and_concat(txt: str) -> str:
    return txt[::-1] + txt


lambda_chain = chain | StrOutputParser() | reverse_and_concat
with callbacks.collect_runs() as cb:
    print(lambda_chain.invoke({"text": "🌑🌒🌓🌔🌕"}))
    # We will fetch just the lambda run (which is the last child run in this root trace)
    run = cb.traced_runs[0].child_runs[-1]

# If you are using LangChain < 0.0.283, this will be "RunnableLambda"
print(f"Saved name: {run.name}")
Saved name: reverse_and_concat
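That default label comes from the function's own __name__ attribute, which Python exposes on every function and which is where the default run name originates:

```python
def reverse_and_concat(txt: str) -> str:
    return txt[::-1] + txt


# When no run_name is configured, the run is labeled after the function itself.
print(reverse_and_concat.__name__)  # reverse_and_concat
```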

The lambda function's trace will be given the lambda function's name, reverse_and_concat, as shown below:

<img src="static/reverse_and_concat.png" alt="reverse_and_concat" width="75%">

Customize Lambda Name

In the lambda_chain above, our function was automatically promoted to a "RunnableLambda" via the piping syntax. We can customize the run name using the with_config syntax once the object is created.
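That promotion is driven by the pipe operator: when the right-hand side of | is a plain function, LangChain wraps it in a RunnableLambda automatically. A toy sketch of the mechanism (Pipe is illustrative, not LangChain's actual RunnableSequence):

```python
class Pipe:
    """Toy pipeline that composes callables left to right (illustrative only)."""

    def __init__(self, fns):
        self.fns = fns

    def __or__(self, other):
        # A bare function on the right-hand side is absorbed into the pipeline,
        # analogous to LangChain promoting it to a RunnableLambda.
        fns = other.fns if isinstance(other, Pipe) else [other]
        return Pipe(self.fns + fns)

    def invoke(self, value):
        for fn in self.fns:
            value = fn(value)
        return value


pipeline = Pipe([str.strip]) | str.upper  # the bare function joins the pipe
print(pipeline.invoke("  hello "))  # HELLO
```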

from langchain.schema import runnable

configured_lambda_chain = (
    chain
    | StrOutputParser()
    | runnable.RunnableLambda(reverse_and_concat).with_config(
        {"run_name": "LambdaReverse"}
    )
)
with callbacks.collect_runs() as cb:
    print(configured_lambda_chain.invoke({"text": "🌑🌒🌓🌔🌕"}))
    run = cb.traced_runs[0].child_runs[-1]
print(f"Saved name: {run.name}")
Saved name: LambdaReverse


The lambda function's run name will now be LambdaReverse.

Example 3: Agents

Since LangChain agents and agent executors are types of chains, their run names can be customized in the same way.

from langchain import agents, tools

agent_executor = agents.initialize_agent(
    tools=[tools.ReadFileTool(), tools.WriteFileTool(), tools.ListDirectoryTool()],
    # An LLM is required here; ChatOpenAI is one illustrative choice.
    llm=chat_models.ChatOpenAI(temperature=0),
)
with callbacks.collect_runs() as cb:
    result = agent_executor.with_config({"run_name": "File Agent"}).invoke(
        "What files are in the current directory?"
    )
    run = cb.traced_runs[0]
The files in the current directory are:
1. run-naming.ipynb
2. img
3. .ipynb_checkpoints
print(f"Saved name: {run.name}")
Saved name: File Agent

The resulting agent trace will reflect the custom name you've assigned to it.

File Agent Trace


Conclusion

An easy way to customize run names is to use the with_config syntax on your LangChain chain or runnable lambda. This makes it easier to understand a trace at a glance.
