Azure ChatOpenAI
Azure OpenAI is a cloud service to help you quickly develop generative AI experiences with a diverse set of prebuilt and curated models from OpenAI, Meta and beyond.
LangChain.js supports integration with Azure OpenAI using either the dedicated Azure OpenAI SDK or the OpenAI SDK.
You can learn more about Azure OpenAI and how it differs from the OpenAI API on this page. If you don't have an Azure account, you can create a free account to get started.
Using the OpenAI SDK
You can use the ChatOpenAI class to access OpenAI instances hosted on Azure.
For example, if your Azure instance is hosted under https://{MY_INSTANCE_NAME}.openai.azure.com/openai/deployments/{DEPLOYMENT_NAME}, you could initialize your instance like this:
- npm
- Yarn
- pnpm
npm install @langchain/openai
yarn add @langchain/openai
pnpm add @langchain/openai
We're unifying model params across all packages. We now suggest using model instead of modelName, and apiKey for API keys.
import { ChatOpenAI } from "@langchain/openai";
const model = new ChatOpenAI({
  temperature: 0.9,
  azureOpenAIApiKey: "SOME_SECRET_VALUE", // In Node.js defaults to process.env.AZURE_OPENAI_API_KEY
  azureOpenAIApiVersion: "YOUR-API-VERSION", // In Node.js defaults to process.env.AZURE_OPENAI_API_VERSION
  azureOpenAIApiInstanceName: "{MY_INSTANCE_NAME}", // In Node.js defaults to process.env.AZURE_OPENAI_API_INSTANCE_NAME
  azureOpenAIApiDeploymentName: "{DEPLOYMENT_NAME}", // In Node.js defaults to process.env.AZURE_OPENAI_API_DEPLOYMENT_NAME
});
API Reference:
- ChatOpenAI from @langchain/openai
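Once the model is configured, you can call it like any other LangChain chat model. A minimal sketch, reusing the model instance created above (the prompt is just an illustration):

// The result is an AIMessage; its text is available on the `content` field.
const res = await model.invoke(
  "What would be a good company name for a company that makes colorful socks?"
);
console.log(res.content);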
If your instance is hosted under a domain other than the default openai.azure.com, you'll need to use the alternate AZURE_OPENAI_BASE_PATH environment variable.
For example, here's how you would connect to the domain https://westeurope.api.microsoft.com/openai/deployments/{DEPLOYMENT_NAME}:
import { ChatOpenAI } from "@langchain/openai";
const model = new ChatOpenAI({
  temperature: 0.9,
  azureOpenAIApiKey: "SOME_SECRET_VALUE", // In Node.js defaults to process.env.AZURE_OPENAI_API_KEY
  azureOpenAIApiVersion: "YOUR-API-VERSION", // In Node.js defaults to process.env.AZURE_OPENAI_API_VERSION
  azureOpenAIApiDeploymentName: "{DEPLOYMENT_NAME}", // In Node.js defaults to process.env.AZURE_OPENAI_API_DEPLOYMENT_NAME
  azureOpenAIBasePath: "https://westeurope.api.microsoft.com/openai/deployments", // In Node.js defaults to process.env.AZURE_OPENAI_BASE_PATH
});
API Reference:
- ChatOpenAI from @langchain/openai
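As the inline comments above indicate, each of these values can also be supplied through environment variables instead of constructor arguments. A minimal sketch, assuming AZURE_OPENAI_API_KEY, AZURE_OPENAI_API_VERSION, AZURE_OPENAI_API_INSTANCE_NAME, and AZURE_OPENAI_API_DEPLOYMENT_NAME are already set in your environment:

import { ChatOpenAI } from "@langchain/openai";

// The Azure-specific options are picked up from the environment variables
// listed in the comments above, so only non-Azure options are passed here.
const model = new ChatOpenAI({
  temperature: 0.9,
});

const res = await model.invoke("Tell me a joke about clouds.");
console.log(res.content);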
Using the Azure OpenAI SDK
You'll first need to install the @langchain/azure-openai package:
- npm
- Yarn
- pnpm
npm install -S @langchain/azure-openai
yarn add @langchain/azure-openai
pnpm add @langchain/azure-openai
You'll also need to have an Azure OpenAI instance deployed. You can deploy a version in the Azure portal by following this guide.
Once you have your instance running, make sure you have the endpoint and key. You can find them in the Azure Portal, under the "Keys and Endpoint" section of your instance.
You can then define the following environment variables to use the service:
AZURE_OPENAI_API_ENDPOINT=<YOUR_ENDPOINT>
AZURE_OPENAI_API_KEY=<YOUR_KEY>
AZURE_OPENAI_API_DEPLOYMENT_NAME=<YOUR_DEPLOYMENT_NAME>
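With those variables set, you could create the client without passing any credentials explicitly. A minimal sketch, assuming your chat model deployment is configured through the variables above:

import { AzureChatOpenAI } from "@langchain/azure-openai";

// Endpoint, API key, and deployment name are read from the environment variables above.
const model = new AzureChatOpenAI({
  model: "gpt-4",
});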
Alternatively, you can pass the values directly to the AzureChatOpenAI constructor:
import { AzureChatOpenAI } from "@langchain/azure-openai";
const model = new AzureChatOpenAI({
  azureOpenAIEndpoint: "<your_endpoint>",
  apiKey: "<your_key>",
  azureOpenAIApiDeploymentName: "<your_deployment_name>",
  model: "<your_model>",
});
If you're using Azure Managed Identity, you can also pass the credentials directly to the constructor:
import { DefaultAzureCredential } from "@azure/identity";
import { AzureChatOpenAI } from "@langchain/azure-openai";
const credentials = new DefaultAzureCredential();
const model = new AzureChatOpenAI({
  credentials,
  azureOpenAIEndpoint: "<your_endpoint>",
  azureOpenAIApiDeploymentName: "<your_deployment_name>",
  model: "<your_model>",
});
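Note that DefaultAzureCredential is provided by the separate @azure/identity package, so you may need to add it to your project first:
- npm
- Yarn
- pnpm
npm install @azure/identity
yarn add @azure/identity
pnpm add @azure/identity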
Usage example
import { AzureChatOpenAI } from "@langchain/azure-openai";
const model = new AzureChatOpenAI({
model: "gpt-4",
prefixMessages: [
{
role: "system",
content: "You are a helpful assistant that answers in pirate language",
},
],
maxTokens: 50,
});
const res = await model.invoke(
"What would be a good company name for a company that makes colorful socks?"
);
console.log({ res });
API Reference:
- AzureChatOpenAI from @langchain/azure-openai
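As with other LangChain chat models, you can also stream responses instead of waiting for the full completion. A minimal sketch, assuming the same kind of deployment as in the usage example above:

import { AzureChatOpenAI } from "@langchain/azure-openai";

const model = new AzureChatOpenAI({
  model: "gpt-4",
});

// `stream()` yields message chunks as they arrive rather than a single final message.
const stream = await model.stream(
  "What would be a good company name for a company that makes colorful socks?"
);
for await (const chunk of stream) {
  console.log(chunk.content);
}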