wrap_openai#

langsmith.wrappers._openai.wrap_openai(client: C, *, tracing_extra: TracingExtra | None = None, chat_name: str = 'ChatOpenAI', completions_name: str = 'OpenAI') → C#

Patch the OpenAI client to make it traceable.

Supports:
  • Chat and Responses APIs

  • Sync and async OpenAI clients

  • create() and parse() methods

  • with and without streaming

Parameters:
  • client (Union[OpenAI, AsyncOpenAI]) – The client to patch.

  • tracing_extra (Optional[TracingExtra], optional) – Extra tracing information. Defaults to None.

  • chat_name (str, optional) – The run name for the chat completions endpoint. Defaults to “ChatOpenAI”.

  • completions_name (str, optional) – The run name for the completions endpoint. Defaults to “OpenAI”.

Returns:

The patched client.

Return type:

Union[OpenAI, AsyncOpenAI]

Example

import openai
from langsmith import wrappers

# Use the wrapped client exactly as you would use the OpenAI client.
client = wrappers.wrap_openai(openai.OpenAI())

# Chat API:
messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {
        "role": "user",
        "content": "What physics breakthroughs do you predict will happen by 2300?",
    },
]
completion = client.chat.completions.create(
    model="gpt-4o-mini", messages=messages
)
print(completion.choices[0].message.content)

# Responses API (takes `input` rather than `messages`):
response = client.responses.create(
    model="gpt-4o-mini",
    input=messages,
)
print(response.output_text)

Changed in version 0.3.16: Support for Responses API added.