Momento Vector Index (MVI)
MVI: the most productive, easiest-to-use, serverless vector index for your data. To get started with MVI, simply sign up for an account; there is no infrastructure to handle, no servers to manage, and no scaling to worry about. MVI is a service that scales automatically to meet your needs. Whether you run in Node.js, the browser, or at the edge, Momento has you covered.
To sign up and access MVI, visit the Momento Console.
Setup
1. Sign up for an API key in the Momento Console.
2. Install the SDK for your environment.
   2.1. For Node.js:
   - npm: npm install @gomomento/sdk
   - Yarn: yarn add @gomomento/sdk
   - pnpm: pnpm add @gomomento/sdk
   2.2. For browser or edge environments:
   - npm: npm install @gomomento/sdk-web
   - Yarn: yarn add @gomomento/sdk-web
   - pnpm: pnpm add @gomomento/sdk-web
3. Set up environment variables for OpenAI and Momento before running the code; a quick check to confirm they are set follows this list.
   3.1. OpenAI:
   export OPENAI_API_KEY=YOUR_OPENAI_API_KEY_HERE
   3.2. Momento:
   export MOMENTO_API_KEY=YOUR_MOMENTO_API_KEY_HERE # https://console.gomomento.com
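Both example flows below read these variables at runtime, so it is worth confirming they are visible to your process first. The following is a minimal sketch, assuming Node.js; the file name and messages are illustrative and not part of the Momento or LangChain APIs.

// check-env.ts — a quick sanity check before running the examples below.
const requiredVars = ["OPENAI_API_KEY", "MOMENTO_API_KEY"];

for (const name of requiredVars) {
  if (!process.env[name]) {
    // Fail fast with a clear message instead of a confusing auth error later.
    throw new Error(`Missing required environment variable: ${name}`);
  }
}

console.log("All required environment variables are set.");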
Usage
- npm: npm install @langchain/openai @langchain/community
- Yarn: yarn add @langchain/openai @langchain/community
- pnpm: pnpm add @langchain/openai @langchain/community
Index documents using fromTexts and search
This example demonstrates using the fromTexts method to instantiate the vector store and index documents. If the index does not exist, it will be created; if it already exists, the documents will be added to the existing index. The ids are optional; if you omit them, Momento will generate UUIDs for you.
import { MomentoVectorIndex } from "@langchain/community/vectorstores/momento_vector_index";
// For browser/edge, adjust this to import from "@gomomento/sdk-web";
import {
  PreviewVectorIndexClient,
  VectorIndexConfigurations,
  CredentialProvider,
} from "@gomomento/sdk";
import { OpenAIEmbeddings } from "@langchain/openai";
import { sleep } from "langchain/util/time";
const vectorStore = await MomentoVectorIndex.fromTexts(
  ["hello world", "goodbye world", "salutations world", "farewell world"],
  {},
  new OpenAIEmbeddings(),
  {
    client: new PreviewVectorIndexClient({
      configuration: VectorIndexConfigurations.Laptop.latest(),
      credentialProvider: CredentialProvider.fromEnvironmentVariable({
        environmentVariableName: "MOMENTO_API_KEY",
      }),
    }),
    indexName: "langchain-example-index",
  },
  { ids: ["1", "2", "3", "4"] }
);
// because indexing is async, wait for it to finish to search directly after
await sleep();
const response = await vectorStore.similaritySearch("hello", 2);
console.log(response);
/*
[
  Document { pageContent: 'hello world', metadata: {} },
  Document { pageContent: 'salutations world', metadata: {} }
]
*/
API Reference:
- MomentoVectorIndex from @langchain/community/vectorstores/momento_vector_index
- OpenAIEmbeddings from @langchain/openai
- sleep from langchain/util/time
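Because fromTexts returns a live MomentoVectorIndex instance, you can keep adding to the same index after it has been created. The sketch below is an assumption-laden example rather than verbatim Momento documentation: it assumes the vectorStore and sleep from the example above are still in scope, that Document is exported from @langchain/core/documents, and that addDocuments accepts optional ids the same way fromTexts does.

import { Document } from "@langchain/core/documents";

// Append two more documents to the index created above.
// Assumes `vectorStore` and `sleep` from the previous example are in scope.
await vectorStore.addDocuments(
  [
    new Document({ pageContent: "greetings world", metadata: {} }),
    new Document({ pageContent: "adieu world", metadata: {} }),
  ],
  { ids: ["5", "6"] } // optional; omit to have Momento generate UUIDs
);

// Indexing is async, so wait before searching for the new documents.
await sleep();
console.log(await vectorStore.similaritySearch("greetings", 1));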
Index documents using fromDocuments and search
Similar to the above, this example demonstrates using the fromDocuments method to instantiate the vector store and index documents. If the index does not exist, it will be created; if it already exists, the documents will be added to the existing index. Using fromDocuments allows you to seamlessly chain the various document loaders with indexing.
import { MomentoVectorIndex } from "@langchain/community/vectorstores/momento_vector_index";
// For browser/edge, adjust this to import from "@gomomento/sdk-web";
import {
  PreviewVectorIndexClient,
  VectorIndexConfigurations,
  CredentialProvider,
} from "@gomomento/sdk";
import { OpenAIEmbeddings } from "@langchain/openai";
import { TextLoader } from "langchain/document_loaders/fs/text";
import { sleep } from "langchain/util/time";
// Create docs with a loader
const loader = new TextLoader("src/document_loaders/example_data/example.txt");
const docs = await loader.load();
const vectorStore = await MomentoVectorIndex.fromDocuments(
  docs,
  new OpenAIEmbeddings(),
  {
    client: new PreviewVectorIndexClient({
      configuration: VectorIndexConfigurations.Laptop.latest(),
      credentialProvider: CredentialProvider.fromEnvironmentVariable({
        environmentVariableName: "MOMENTO_API_KEY",
      }),
    }),
    indexName: "langchain-example-index",
  }
);
// because indexing is async, wait for it to finish to search directly after
await sleep();
// Search for the most similar document
const response = await vectorStore.similaritySearch("hello", 1);
console.log(response);
/*
[
  Document {
    pageContent: 'Foo\nBar\nBaz\n\n',
    metadata: { source: 'src/document_loaders/example_data/example.txt' }
  }
]
*/
API Reference:
- MomentoVectorIndex from @langchain/community/vectorstores/momento_vector_index
- OpenAIEmbeddings from @langchain/openai
- TextLoader from langchain/document_loaders/fs/text
- sleep from langchain/util/time
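Since fromDocuments accepts whatever a loader produces, you can also slot a text splitter in between to chunk larger files before indexing. The following is a minimal sketch, assuming the RecursiveCharacterTextSplitter export from langchain/text_splitter; the chunk sizes are illustrative, not recommendations from Momento.

import { MomentoVectorIndex } from "@langchain/community/vectorstores/momento_vector_index";
// For browser/edge, adjust this to import from "@gomomento/sdk-web";
import {
  PreviewVectorIndexClient,
  VectorIndexConfigurations,
  CredentialProvider,
} from "@gomomento/sdk";
import { OpenAIEmbeddings } from "@langchain/openai";
import { TextLoader } from "langchain/document_loaders/fs/text";
import { RecursiveCharacterTextSplitter } from "langchain/text_splitter";

// Load the file, then split it into overlapping chunks before indexing.
const loader = new TextLoader("src/document_loaders/example_data/example.txt");
const splitter = new RecursiveCharacterTextSplitter({
  chunkSize: 500,
  chunkOverlap: 50,
});
const splitDocs = await splitter.splitDocuments(await loader.load());

// Index the chunks with the same configuration as the example above.
await MomentoVectorIndex.fromDocuments(splitDocs, new OpenAIEmbeddings(), {
  client: new PreviewVectorIndexClient({
    configuration: VectorIndexConfigurations.Laptop.latest(),
    credentialProvider: CredentialProvider.fromEnvironmentVariable({
      environmentVariableName: "MOMENTO_API_KEY",
    }),
  }),
  indexName: "langchain-example-index",
});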
Search from an existing collection
import { MomentoVectorIndex } from "@langchain/community/vectorstores/momento_vector_index";
// For browser/edge, adjust this to import from "@gomomento/sdk-web";
import {
  PreviewVectorIndexClient,
  VectorIndexConfigurations,
  CredentialProvider,
} from "@gomomento/sdk";
import { OpenAIEmbeddings } from "@langchain/openai";
const vectorStore = new MomentoVectorIndex(new OpenAIEmbeddings(), {
  client: new PreviewVectorIndexClient({
    configuration: VectorIndexConfigurations.Laptop.latest(),
    credentialProvider: CredentialProvider.fromEnvironmentVariable({
      environmentVariableName: "MOMENTO_API_KEY",
    }),
  }),
  indexName: "langchain-example-index",
});
const response = await vectorStore.similaritySearch("hello", 1);
console.log(response);
/*
[
  Document {
    pageContent: 'Foo\nBar\nBaz\n\n',
    metadata: { source: 'src/document_loaders/example_data/example.txt' }
  }
]
*/
API Reference:
- MomentoVectorIndex from @langchain/community/vectorstores/momento_vector_index
- OpenAIEmbeddings from @langchain/openai
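Once connected to an existing index, you can also wrap the store as a standard LangChain retriever for use in chains. A minimal sketch, assuming the vectorStore instance from the example above is in scope and relying on the base VectorStore asRetriever and getRelevantDocuments API rather than anything Momento-specific:

// Expose the existing index as a retriever that returns the top 2 matches.
// Assumes the `vectorStore` instance from the example above is in scope.
const retriever = vectorStore.asRetriever(2);

const relevantDocs = await retriever.getRelevantDocuments("hello");
console.log(relevantDocs.map((doc) => doc.pageContent));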