Configure LangSmith Proxy to talk to your Azure OpenAI Endpoint

Prerequisites

  1. Docker installed on your local machine
    • Instructions for installing Docker can be found in the Docker documentation: https://docs.docker.com/engine/install/
  2. An Azure OpenAI API Key
  3. An Azure OpenAI endpoint

1. Deploy the LangSmith Proxy

The LangSmith Proxy is available as a Docker container. You can run it in your environment with the following commands:

docker pull docker.io/langchain/langsmith-proxy:latest # Pull the latest version of the LangSmith Proxy
docker run -e AZURE_OPENAI_ENDPOINT=<YOUR_AZURE_OPENAI_ENDPOINT> -p 8080:8080 docker.io/langchain/langsmith-proxy:latest # Run the LangSmith Proxy and publish port 8080 to the host

You should see output similar to the following (timestamps and PIDs will differ):

2024-03-06 12:59:57,458 CRIT Supervisor is running as root.  Privileges were not dropped because no user is specified in the config file.  If you intend to run as root, you can set user=root in the config file to avoid this message.
2024-03-06 12:59:57,467 INFO supervisord started with pid 1
2024-03-06 12:59:58,503 INFO spawned: 'nginx' with pid 8
2024-03-06 12:59:58,552 INFO spawned: 'trace-processor' with pid 10
2024-03-06 12:59:59,562 INFO success: nginx entered RUNNING state, process has stayed up for > than 1 seconds (startsecs)
2024-03-06 12:59:59,563 INFO success: trace-processor entered RUNNING state, process has stayed up for > than 1 seconds (startsecs)
Couldn't create langsmith client: API key must be provided when using hosted LangSmith API, will skip creating runs
Listening for traces at 0.0.0.0:9999
Connection from ('127.0.0.1', 47370)

You can safely ignore the "Couldn't create langsmith client" message if you are not configuring tracing.
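
If you want to sanity-check the proxy before wiring up an application, you can send a raw chat-completions request through it. The snippet below is a minimal sketch, not part of the proxy's documented surface: it assumes the proxy forwards the standard Azure OpenAI REST path (/openai/deployments/<deployment>/chat/completions) under the /proxy/azure-openai prefix used later in this guide, and that gpt-35-turbo is a deployment name on your Azure endpoint.

import requests

# Minimal sketch: assumes the proxy exposes the standard Azure OpenAI REST
# path under the /proxy/azure-openai prefix.
PROXY_BASE = "http://localhost:8080/proxy/azure-openai"
DEPLOYMENT = "gpt-35-turbo"  # assumed deployment name; replace with yours
API_VERSION = "2023-06-01-preview"

resp = requests.post(
    f"{PROXY_BASE}/openai/deployments/{DEPLOYMENT}/chat/completions",
    params={"api-version": API_VERSION},
    headers={"api-key": "YOUR API KEY"},  # your Azure OpenAI API key
    json={"messages": [{"role": "user", "content": "Hello, world!"}]},
    timeout=30,
)
print(resp.status_code, resp.json())

If this returns a 200 response with a chat completion, the proxy is forwarding requests to your Azure endpoint correctly.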

2. Update your app to make requests to the LangSmith Proxy

For this example, we'll use the local proxy running on localhost:8080. If your proxy is running on a different machine, replace this with its address.

You may need to install some packages for this example:

pip install langchain-openai

Now you can use the LangSmith Proxy to make requests to Azure OpenAI. Here's how to do this in Python: create a file called azure_openai_test.py and add the following code:

from langchain_openai import AzureChatOpenAI
import os

os.environ["OPENAI_API_VERSION"] = "2023-06-01-preview"
os.environ["OPENAI_API_KEY"] = "YOUR API KEY"  # your Azure OpenAI API key
# Point the client at the LangSmith Proxy instead of Azure OpenAI directly
os.environ["AZURE_OPENAI_ENDPOINT"] = "http://localhost:8080/proxy/azure-openai"

llm = AzureChatOpenAI(deployment_name="gpt-35-turbo")  # your Azure deployment name
print(llm.invoke("Hello, world!"))

Run the script:

python azure_openai_test.py

You should see output like this:

content='Hello! How can I assist you today?'
response_metadata={'token_usage': {'completion_tokens': 9, 'prompt_tokens': 11, 'total_tokens': 20}, 'model_name': 'gpt-35-turbo', 'system_fingerprint': 'fp_2f57f81c11',
'prompt_filter_results': [{'prompt_index': 0, 'content_filter_results': {'hate': {'filtered': False, 'severity': 'safe'}, 'self_harm': {'filtered': False, 'severity': 'safe'},
'sexual': {'filtered': False, 'severity': 'safe'}, 'violence': {'filtered': False, 'severity': 'safe'}}}], 'finish_reason': 'stop', 'logprobs': None, 'content_filter_results': {'hate': {'filtered': False, 'severity': 'safe'},
'self_harm': {'filtered': False, 'severity': 'safe'}, 'sexual': {'filtered': False, 'severity': 'safe'}, 'violence': {'filtered': False, 'severity': 'safe'}}} id='run-0884458f-bccd-4444-8357-3f8becc7ea2c-0'
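
Alternatively, you can pass the same settings as constructor arguments instead of environment variables. This is a sketch that assumes langchain_openai's AzureChatOpenAI accepts the azure_endpoint, api_key, api_version, and deployment_name parameters:

from langchain_openai import AzureChatOpenAI

llm = AzureChatOpenAI(
    azure_endpoint="http://localhost:8080/proxy/azure-openai",  # the proxy, not Azure directly
    api_key="YOUR API KEY",  # your Azure OpenAI API key
    api_version="2023-06-01-preview",
    deployment_name="gpt-35-turbo",  # your Azure deployment name
)
print(llm.invoke("Hello, world!"))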

Nice! You have successfully configured the LangSmith Proxy to talk to your Azure OpenAI endpoint. You can now use your proxy endpoint as a drop-in replacement for the Azure OpenAI endpoint in your applications.
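
Because the proxy presents the same interface as Azure OpenAI, the official openai Python SDK should also work when pointed at it. A minimal sketch, assuming openai>=1.0 (which provides the AzureOpenAI client):

from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint="http://localhost:8080/proxy/azure-openai",  # proxy instead of Azure
    api_key="YOUR API KEY",  # your Azure OpenAI API key
    api_version="2023-06-01-preview",
)
response = client.chat.completions.create(
    model="gpt-35-turbo",  # your Azure deployment name
    messages=[{"role": "user", "content": "Hello, world!"}],
)
print(response.choices[0].message.content)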

