Is it possible to query a vector DB without relying on the LLM?

For example, I want to show the chunks of documents from the vector DB that match a query text, but without any LLM generation.

Is it possible?

Hi @pierrealex, great question! Since I do not have the answer for that, I asked the AI model myself:

Q: If you had a ton of text data, say 1000 PDFs' worth of text, would it be possible for you to just return the chunks of the documents where my input matched what is in the PDFs, without any text generation of your own?

A: Yes, it is possible to extract chunks of text from a large volume of documents, such as 1000 PDFs, based on input queries without generating additional text.

I would try doing this by giving the AI Resource specific instructions in its input field.
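More generally, retrieval on its own needs no LLM at query time: you embed the query, score it against the stored chunk embeddings, and return the top matches as-is. Here is a minimal sketch in plain Python, assuming the chunk embeddings were already computed at ingestion time (the tiny hand-made vectors and the `top_k_chunks` helper are illustrative, not any particular vector DB's API):

```python
from math import sqrt

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = sqrt(sum(x * x for x in a))
    nb = sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def top_k_chunks(query_vec, chunks, k=3):
    """Return the k stored chunks most similar to the query embedding.

    No text is generated; the stored chunk text is returned verbatim,
    which is exactly what a vector DB similarity query does.
    """
    ranked = sorted(chunks,
                    key=lambda c: cosine(query_vec, c["embedding"]),
                    reverse=True)
    return ranked[:k]

# Toy 3-dimensional "embeddings" standing in for a real embedding model.
chunks = [
    {"text": "Invoices are due in 30 days.", "embedding": [0.9, 0.1, 0.0]},
    {"text": "The cafeteria opens at 8 am.", "embedding": [0.0, 0.2, 0.9]},
    {"text": "Late payments incur a fee.",   "embedding": [0.8, 0.3, 0.1]},
]
query = [1.0, 0.2, 0.0]  # pretend embedding of a query like "payment terms"

for c in top_k_chunks(query, chunks, k=2):
    print(c["text"])
```

Most vector stores expose this directly as a similarity/query call, so you can skip the generation step entirely and just render the returned chunks.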

Let us know how it goes! :slightly_smiling_face: