Xata is a serverless data platform, based on PostgreSQL and Elasticsearch. It provides a Python SDK for interacting with your database, and a UI for managing your data. With the XataChatMessageHistory class, you can use Xata databases for longer-term persistence of chat sessions.

This notebook covers:

- A simple example showing what XataChatMessageHistory does.
- A more complex example using a REACT agent that answers questions based on a knowledge base or documentation (stored in Xata as a vector store) and that also has a long-term searchable history of its past messages (stored in Xata as a memory store).
Setup
Create a database
In the Xata UI create a new database. You can name it whatever you want; in this notebook we’ll use langchain. The LangChain integration can auto-create the table used for storing the memory, and this is what we’ll use in this example. If you want to pre-create the table, ensure it has the right schema and set create_table to False when creating the class. Pre-creating the table saves one round-trip to the database during each session initialization.
Let’s first install our dependencies:
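With recent LangChain releases the integrations live in split packages, so a setup along these lines is a reasonable starting point (exact package names depend on your LangChain version):

```shell
pip install xata langchain-community langchain-openai langchain
```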
Next, we need the Xata API key and the URL of the database we just created. The database URL looks something like this: https://demo-uni3q8.eu-west-1.xata.sh/db/langchain.
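One way to pull these two values into the notebook is via environment variables. The names XATA_API_KEY and XATA_DATABASE_URL are this notebook’s convention, not required by the SDK; any mechanism that yields the key and URL as strings works:

```python
import os

# Assumed variable names; export these in your shell before running,
# or use getpass.getpass() to prompt for them interactively instead.
api_key = os.environ.get("XATA_API_KEY", "")
db_url = os.environ.get("XATA_DATABASE_URL", "")
```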
Create a simple memory store
To test the memory store functionality in isolation, let’s use the following code snippet. It creates a session with the ID session-1 and stores two messages in it. After running it, if you visit the Xata UI, you should see a table named memory and the two messages added to it.
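A snippet along these lines exercises the store (assuming the Xata credentials from the setup step are exported as XATA_API_KEY and XATA_DATABASE_URL; those env var names are this notebook’s convention):

```python
import os

from langchain_community.chat_message_histories import XataChatMessageHistory

history = XataChatMessageHistory(
    session_id="session-1",
    api_key=os.environ["XATA_API_KEY"],       # assumed env var names
    db_url=os.environ["XATA_DATABASE_URL"],
    table_name="memory",
)

# Store one message from each side of the conversation.
history.add_user_message("hi!")
history.add_ai_message("whats up?")
```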
You can retrieve the message history for a particular session with the following code:
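With XataChatMessageHistory this is just the messages property (history.messages), which returns the session’s messages in order. To make the session-scoped behavior concrete, here is a minimal in-memory sketch of the same interface; the class name and dict backing are illustrative only, not part of the Xata SDK:

```python
class InMemoryChatHistory:
    """Toy stand-in for a chat message history keyed by session ID."""

    _store: dict = {}  # shared across instances, like a database table

    def __init__(self, session_id: str):
        self.session_id = session_id
        self._store.setdefault(session_id, [])

    def add_user_message(self, text: str) -> None:
        self._store[self.session_id].append(("human", text))

    def add_ai_message(self, text: str) -> None:
        self._store[self.session_id].append(("ai", text))

    @property
    def messages(self):
        # Return a copy of the ordered messages for this session.
        return list(self._store[self.session_id])


history = InMemoryChatHistory("session-1")
history.add_user_message("hi!")
history.add_ai_message("whats up?")
```

Reading history.messages back now yields both messages for session-1, while other session IDs remain empty.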
Conversational Q&A chain on your data with memory
Let’s now see a more complex example in which we combine OpenAI, the Xata Vector Store integration, and the Xata memory store integration to create a Q&A chat bot on your data, with follow-up questions and history. We’re going to need to access the OpenAI API, so let’s configure the API key.

To store the documents the chat bot will search, add a table named docs to your langchain database using the Xata UI, with the following columns:
- content of type “Text”. This is used to store the Document.pageContent values.
- embedding of type “Vector”. Use the dimension used by the model you plan to use. In this notebook we use OpenAI embeddings, which have 1536 dimensions.
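Configuring the OpenAI API key mentioned above can be done with getpass, so the key never appears in the notebook’s saved output:

```python
import getpass
import os

# langchain-openai reads the key from the OPENAI_API_KEY environment variable.
os.environ["OPENAI_API_KEY"] = getpass.getpass("OpenAI API key: ")
```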
Let’s now create the vector store and add some documents to it. After running this, if you go to the Xata UI, you should see the documents loaded together with their embeddings in the docs table.
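A sketch of that step using the XataVectorStore integration and OpenAI embeddings follows; the sample texts are placeholders for your own content, and the env var names are this notebook’s convention:

```python
import os

from langchain_community.vectorstores.xata import XataVectorStore
from langchain_openai import OpenAIEmbeddings

# Illustrative documents; replace with your own knowledge base.
texts = [
    "Xata is a serverless data platform based on PostgreSQL.",
    "Xata provides a Python SDK and a web UI for managing data.",
]

# OpenAI embeddings are 1536-dimensional, matching the embedding column above.
embeddings = OpenAIEmbeddings()

vector_store = XataVectorStore.from_texts(
    texts,
    embeddings,
    api_key=os.environ["XATA_API_KEY"],       # assumed env var names
    db_url=os.environ["XATA_DATABASE_URL"],
    table_name="docs",
)
```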
Let’s now create a ConversationBufferMemory to store the chat messages from both the user and the AI.
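Backing the ConversationBufferMemory with XataChatMessageHistory persists the buffer across restarts; a sketch, again assuming the env var names from the setup step:

```python
import os
from uuid import uuid4

from langchain.memory import ConversationBufferMemory
from langchain_community.chat_message_histories import XataChatMessageHistory

# A fresh session ID per conversation keeps histories separate.
chat_memory = XataChatMessageHistory(
    session_id=str(uuid4()),
    api_key=os.environ["XATA_API_KEY"],       # assumed env var names
    db_url=os.environ["XATA_DATABASE_URL"],
    table_name="memory",
)

memory = ConversationBufferMemory(
    memory_key="chat_history",
    chat_memory=chat_memory,
    return_messages=True,
)
```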