A vector store stores embedded data and performs similarity search over it.
Pick an embeddings model:
OpenAI
Install dependencies:
npm i @langchain/openai
Add environment variables:
OPENAI_API_KEY=your-api-key
Instantiate the model:
import { OpenAIEmbeddings } from "@langchain/openai";

const embeddings = new OpenAIEmbeddings({
  model: "text-embedding-3-large"
});
Azure OpenAI
Install dependencies:
npm i @langchain/openai
Add environment variables:
AZURE_OPENAI_API_INSTANCE_NAME=<YOUR_INSTANCE_NAME>
AZURE_OPENAI_API_KEY=<YOUR_KEY>
AZURE_OPENAI_API_VERSION="2024-02-01"
Instantiate the model:
import { AzureOpenAIEmbeddings } from "@langchain/openai";

const embeddings = new AzureOpenAIEmbeddings({
  azureOpenAIApiEmbeddingsDeploymentName: "text-embedding-ada-002"
});
AWS Bedrock
Install dependencies:
npm i @langchain/aws
Add environment variables:
BEDROCK_AWS_REGION=your-region
Instantiate the model:
import { BedrockEmbeddings } from "@langchain/aws";

const embeddings = new BedrockEmbeddings({
  model: "amazon.titan-embed-text-v1"
});
Google Vertex AI
Install dependencies:
npm i @langchain/google-vertexai
Add environment variables:
GOOGLE_APPLICATION_CREDENTIALS=credentials.json
Instantiate the model:
import { VertexAIEmbeddings } from "@langchain/google-vertexai";

const embeddings = new VertexAIEmbeddings({
  model: "gemini-embedding-001"
});
MistralAI
Install dependencies:
npm i @langchain/mistralai
Add environment variables:
MISTRAL_API_KEY=your-api-key
Instantiate the model:
import { MistralAIEmbeddings } from "@langchain/mistralai";

const embeddings = new MistralAIEmbeddings({
  model: "mistral-embed"
});
Cohere
Install dependencies:
npm i @langchain/cohere
Add environment variables:
COHERE_API_KEY=your-api-key
Instantiate the model:
import { CohereEmbeddings } from "@langchain/cohere";

const embeddings = new CohereEmbeddings({
  model: "embed-english-v3.0"
});
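Whichever provider you pick, the resulting embeddings object exposes the same interface, so the vector store code below does not change. As a quick sanity check (a minimal sketch; the input text is illustrative), you can embed a single string directly:

// All of the embeddings classes above implement embedQuery (one string)
// and embedDocuments (an array of strings).
const vector = await embeddings.embedQuery("Hello, world!");
console.log(vector.length); // dimensionality depends on the model, e.g. 3072 for text-embedding-3-large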
Pick a vector store:
Memory (in-memory)
Install dependencies:
npm i langchain
Instantiate the vector store:
import { MemoryVectorStore } from "langchain/vectorstores/memory";

const vectorStore = new MemoryVectorStore(embeddings);
Chroma
Install dependencies:
npm i @langchain/community
Instantiate the vector store:
import { Chroma } from "@langchain/community/vectorstores/chroma";

const vectorStore = new Chroma(embeddings, {
  collectionName: "a-test-collection",
});
FAISS
Install dependencies:
npm i @langchain/community
Instantiate the vector store:
import { FaissStore } from "@langchain/community/vectorstores/faiss";

const vectorStore = new FaissStore(embeddings, {});
MongoDB Atlas
Install dependencies:
npm i @langchain/mongodb
Instantiate the vector store:
import { MongoDBAtlasVectorSearch } from "@langchain/mongodb";
import { MongoClient } from "mongodb";

const client = new MongoClient(process.env.MONGODB_ATLAS_URI || "");
const collection = client
  .db(process.env.MONGODB_ATLAS_DB_NAME)
  .collection(process.env.MONGODB_ATLAS_COLLECTION_NAME);

const vectorStore = new MongoDBAtlasVectorSearch(embeddings, {
  collection,
  indexName: "vector_index",
  textKey: "text",
  embeddingKey: "embedding",
});
PGVector
Install dependencies:
npm i @langchain/community
Instantiate the vector store:
import { PGVectorStore } from "@langchain/community/vectorstores/pgvector";

// Requires the "pg" driver. The connection string and table name are placeholders.
const vectorStore = await PGVectorStore.initialize(embeddings, {
  postgresConnectionOptions: { connectionString: process.env.DATABASE_URL },
  tableName: "testlangchainjs",
});
Pinecone
Install dependencies:
npm i @langchain/pinecone @pinecone-database/pinecone
Instantiate the vector store:
import { PineconeStore } from "@langchain/pinecone";
import { Pinecone as PineconeClient } from "@pinecone-database/pinecone";

// Reads the PINECONE_API_KEY environment variable by default
const pinecone = new PineconeClient();
// PINECONE_INDEX here is a placeholder for the name of your Pinecone index
const pineconeIndex = pinecone.Index(process.env.PINECONE_INDEX ?? "");
const vectorStore = new PineconeStore(embeddings, {
  pineconeIndex,
  maxConcurrency: 5,
});
Qdrant
Install dependencies:
npm i @langchain/qdrant
Instantiate the vector store:
import { QdrantVectorStore } from "@langchain/qdrant";

const vectorStore = await QdrantVectorStore.fromExistingCollection(embeddings, {
  url: process.env.QDRANT_URL,
  collectionName: "langchainjs-testing",
});
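Once a vectorStore is instantiated (any of the options above), usage is the same: add documents, which the store embeds with your chosen embeddings model, then run a similarity search. The following is a minimal sketch; the document contents and query are illustrative:

import { Document } from "@langchain/core/documents";

const documents = [
  new Document({
    pageContent: "Mitochondria are the powerhouse of the cell",
    metadata: { source: "bio" },
  }),
  new Document({
    pageContent: "Buildings are made out of brick",
    metadata: { source: "misc" },
  }),
];

// Embed and store the documents
await vectorStore.addDocuments(documents);

// Embed the query and return the single most similar document
const results = await vectorStore.similaritySearch("How do cells produce energy?", 1);
console.log(results[0].pageContent);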
LangChain.js integrates with a variety of vector stores. You can check out the full list on the vector store integrations page.