Google Vertex AI
LangChain.js supports two different authentication methods, depending on whether you're running in a Node.js environment or a web environment.
Setup
Node.js
To call Vertex AI models in Node.js, you'll need to install the @langchain/google-vertexai package:
- npm: npm install @langchain/google-vertexai
- Yarn: yarn add @langchain/google-vertexai
- pnpm: pnpm add @langchain/google-vertexai
You should make sure the Vertex AI API is enabled for the relevant project and that you've authenticated to Google Cloud using one of these methods:
- You are logged into an account permitted to that project (using gcloud auth application-default login).
- You are running on a machine using a service account that is permitted to the project.
- You have downloaded the credentials for a service account that is permitted to the project and set the GOOGLE_APPLICATION_CREDENTIALS environment variable to the path of this file.
- You have set the GOOGLE_API_KEY environment variable to the API key for the project.
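The two environment-variable options above can be set in a shell like this (the path and key below are placeholders, not real values):

```shell
# Point GOOGLE_APPLICATION_CREDENTIALS at a downloaded service account
# key file (placeholder path).
export GOOGLE_APPLICATION_CREDENTIALS="$HOME/keys/my-service-account.json"

# Or use an API key for the project instead (placeholder value).
export GOOGLE_API_KEY="your-api-key"
```

Either variable is picked up automatically at runtime, so no extra code is needed once it's set.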
Web
To call Vertex AI models in web environments (like Edge functions), you'll need to install the @langchain/google-vertexai-web package:
- npm: npm install @langchain/google-vertexai-web
- Yarn: yarn add @langchain/google-vertexai-web
- pnpm: pnpm add @langchain/google-vertexai-web
Then, you'll need to add your service account credentials directly as a GOOGLE_VERTEX_AI_WEB_CREDENTIALS environment variable:
GOOGLE_VERTEX_AI_WEB_CREDENTIALS={"type":"service_account","project_id":"YOUR_PROJECT-12345",...}
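Since the variable holds raw JSON, a quick parse check can catch copy/paste mistakes early. The helper below is our own illustration, not a LangChain API:

```typescript
// Hypothetical sanity check (not part of LangChain): confirm the env var
// parses as service-account JSON before handing it to the model.
function isValidWebCredentials(raw: string | undefined): boolean {
  if (!raw) return false;
  try {
    const creds = JSON.parse(raw);
    return (
      creds.type === "service_account" &&
      typeof creds.project_id === "string"
    );
  } catch {
    return false;
  }
}
```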
You can also pass your credentials directly in code like this:
import { VertexAI } from "@langchain/google-vertexai";
// Or uncomment this line if you're using the web version:
// import { VertexAI } from "@langchain/google-vertexai-web";

const model = new VertexAI({
  authOptions: {
    credentials: {"type":"service_account","project_id":"YOUR_PROJECT-12345",...},
  },
});
Usage
The entire family of Gemini models is available by specifying the modelName parameter.
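For instance, the constructor options are a plain object, so a specific model can be pinned like this (the model name below is illustrative; check the Vertex AI documentation for currently available models):

```typescript
// Options object for the VertexAI constructor; the model name is illustrative.
const options = {
  modelName: "gemini-1.5-pro",
  temperature: 0.7,
};

// Constructing the model requires Google Cloud credentials at runtime:
// const model = new VertexAI(options);
```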
import { VertexAI } from "@langchain/google-vertexai";
// Or, if using the web entrypoint:
// import { VertexAI } from "@langchain/google-vertexai-web";
const model = new VertexAI({
  temperature: 0.7,
});

const res = await model.invoke(
  "What would be a good company name for a company that makes colorful socks?"
);

console.log({ res });
/*
{
res: '* Hue Hues\n' +
'* Sock Spectrum\n' +
'* Kaleidosocks\n' +
'* Threads of Joy\n' +
'* Vibrant Threads\n' +
'* Rainbow Soles\n' +
'* Colorful Canvases\n' +
'* Prismatic Pedals\n' +
'* Sock Canvas\n' +
'* Color Collective'
}
*/
API Reference:
- VertexAI from @langchain/google-vertexai
Streaming
Streaming in multiple chunks is supported for faster responses:
import { VertexAI } from "@langchain/google-vertexai";
// Or, if using the web entrypoint:
// import { VertexAI } from "@langchain/google-vertexai-web";
const model = new VertexAI({
  temperature: 0.7,
});

const stream = await model.stream(
  "What would be a good company name for a company that makes colorful socks?"
);

for await (const chunk of stream) {
  console.log("\n---------\nChunk:\n---------\n", chunk);
}
/*
---------
Chunk:
---------
* Kaleidoscope Toes
* Huephoria
* Soleful Spectrum
*
---------
Chunk:
---------
Colorwave Hosiery
* Chromatic Threads
* Rainbow Rhapsody
* Vibrant Soles
* Toe-tally Colorful
* Socktacular Hues
*
---------
Chunk:
---------
Threads of Joy
---------
Chunk:
---------
*/
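If you also need the full completion, the chunks can be concatenated as they arrive. This generic helper is our own sketch over the async-iterable shape returned by model.stream(), not a LangChain API:

```typescript
// Gather string chunks from any async iterable -- the shape returned by
// model.stream() -- into a single string. Illustrative helper, not part of LangChain.
async function collectStream(stream: AsyncIterable<string>): Promise<string> {
  let full = "";
  for await (const chunk of stream) {
    full += chunk;
  }
  return full;
}
```

This works with any AsyncIterable&lt;string&gt;, so it can be unit-tested against a plain async generator without calling the API.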
API Reference:
- VertexAI from @langchain/google-vertexai