Neon Postgres
Neon is a fully managed serverless PostgreSQL database. It separates storage and compute to offer features such as instant branching and automatic scaling.
With the pgvector extension, Neon provides a vector store that can be used with LangChain.js to store and query embeddings.
Setup
Select a Neon project
If you do not have a Neon account, sign up for one at Neon. After logging into the Neon Console, proceed to the Projects section and select an existing project or create a new one.
Your Neon project comes with a ready-to-use Postgres database named neondb that you can use to store embeddings. Navigate to the Connection Details section to find your database connection string. It should look similar to this:
postgres://alex:AbC123dEf@ep-cool-darkness-123456.us-east-2.aws.neon.tech/dbname?sslmode=require
Keep your connection string handy for later use.
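If you plan to load the connection string from an environment variable (as the examples below do with DATABASE_URL), you can keep it in a .env file or export it in your shell. A minimal sketch, assuming the DATABASE_URL name used later and the placeholder credentials from the example string above:
DATABASE_URL=postgres://alex:AbC123dEf@ep-cool-darkness-123456.us-east-2.aws.neon.tech/dbname?sslmode=require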
Application code
To work with Neon Postgres, you need to install the @neondatabase/serverless package, which provides a JavaScript/TypeScript driver for connecting to the database:
- npm: npm install @neondatabase/serverless
- Yarn: yarn add @neondatabase/serverless
- pnpm: pnpm add @neondatabase/serverless
You also need the @langchain/community package, which contains the NeonPostgres vector store integration:
- npm: npm install @langchain/community
- Yarn: yarn add @langchain/community
- pnpm: pnpm add @langchain/community
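The usage example below relies on OpenAIEmbeddings, so if you follow it as written you will also need the @langchain/openai package (any other LangChain embeddings integration works too; this is just what the example assumes):
- npm: npm install @langchain/openai
- Yarn: yarn add @langchain/openai
- pnpm: pnpm add @langchain/openai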
To initialize a NeonPostgres vector store, you need to provide your Neon database connection string. You can use the connection string you copied above directly, or store it as an environment variable and reference it in your code.
const vectorStore = await NeonPostgres.initialize(embeddings, {
  connectionString: NEON_POSTGRES_CONNECTION_STRING,
});
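If you go the environment variable route, a minimal sketch of reading and validating the value might look like this (the DATABASE_URL name is an assumption; match it to whatever variable you actually set):
// Assumption: the connection string was stored in the DATABASE_URL environment variable
const NEON_POSTGRES_CONNECTION_STRING = process.env.DATABASE_URL;
if (!NEON_POSTGRES_CONNECTION_STRING) {
  throw new Error("DATABASE_URL environment variable is not set");
}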
Usage
import { OpenAIEmbeddings } from "@langchain/openai";
import { NeonPostgres } from "@langchain/community/vectorstores/neon";

// Initialize an embeddings instance
const embeddings = new OpenAIEmbeddings({
  apiKey: process.env.OPENAI_API_KEY,
  dimensions: 256,
  model: "text-embedding-3-small",
});

// Initialize a NeonPostgres instance to store embedding vectors
const vectorStore = await NeonPostgres.initialize(embeddings, {
  connectionString: process.env.DATABASE_URL as string,
});

// You can add documents to the store; strings in the `pageContent` field will be embedded
// and stored in the database
const documents = [
  { pageContent: "Hello world", metadata: { topic: "greeting" } },
  { pageContent: "Bye bye", metadata: { topic: "greeting" } },
  {
    pageContent: "Mitochondria is the powerhouse of the cell",
    metadata: { topic: "science" },
  },
];
const idsInserted = await vectorStore.addDocuments(documents);

// You can now query the store for documents similar to the input query
const resultOne = await vectorStore.similaritySearch("hola", 1);
console.log(resultOne);
/*
[
  Document {
    pageContent: 'Hello world',
    metadata: { topic: 'greeting' }
  }
]
*/

// You can also filter by metadata
const resultTwo = await vectorStore.similaritySearch("Irrelevant query", 2, {
  topic: "science",
});
console.log(resultTwo);
/*
[
  Document {
    pageContent: 'Mitochondria is the powerhouse of the cell',
    metadata: { topic: 'science' }
  }
]
*/

// Metadata filtering with IN-filters works as well
const resultsThree = await vectorStore.similaritySearch("Irrelevant query", 2, {
  topic: { in: ["greeting"] },
});
console.log(resultsThree);
/*
[
  Document { pageContent: 'Bye bye', metadata: { topic: 'greeting' } },
  Document {
    pageContent: 'Hello world',
    metadata: { topic: 'greeting' }
  }
]
*/

// Upserting is supported as well
await vectorStore.addDocuments(
  [
    {
      pageContent: "ATP is the powerhouse of the cell",
      metadata: { topic: "science" },
    },
  ],
  { ids: [idsInserted[2]] }
);

const resultsFour = await vectorStore.similaritySearch(
  "powerhouse of the cell",
  1
);
console.log(resultsFour);
/*
[
  Document {
    pageContent: 'ATP is the powerhouse of the cell',
    metadata: { topic: 'science' }
  }
]
*/
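Beyond the calls shown above, NeonPostgres implements the standard LangChain vector store interface, so generic methods such as similaritySearchWithScore and asRetriever should also be available. The snippet below is a sketch under that assumption; check the API reference below for the exact surface of the current release.
// Retrieve documents together with their similarity scores
// (assumes the base VectorStore interface)
const scored = await vectorStore.similaritySearchWithScore(
  "powerhouse of the cell",
  2
);
for (const [doc, score] of scored) {
  console.log(doc.pageContent, score);
}

// Wrap the store as a retriever for use in chains
const retriever = vectorStore.asRetriever(2);
const retrieved = await retriever.invoke("powerhouse of the cell");
console.log(retrieved.map((d) => d.pageContent));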
API Reference:
- OpenAIEmbeddings from @langchain/openai
- NeonPostgres from @langchain/community/vectorstores/neon