Use a smart query for summarizing text over the token limit

I'm currently building an app with a smart query that takes a .txt file as input and summarizes the information inside it. The only problem is that most of these files are well over the token limit. Is there a way to either raise the token limit or break the text into smaller chunks?

  1. Have you tried using the gpt-4-32k model to increase your token limit to 32k tokens?
  2. Another option is to break your file into prompt/completion pairs and fine-tune GPT-3/3.5 with them (AFAIK we currently can't fine-tune GPT-4, so if you need GPT-4's advanced capabilities this option is out). A rough sketch of kicking off such a fine-tune follows after this list.
  3. Finally, you can break your text files into chunks, turn them into vectors, and use embeddings to search through large amounts of text, passing only the relevant chunks into the completion (see the embeddings guide on the OpenAI Platform; a minimal sketch of this approach is also below).
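
For option 2, here is a rough sketch of creating a fine-tuning job, assuming the `openai` Python package (v1 client), an API key in the environment, and a `training.jsonl` file you have already prepared from your documents. The file name and the base model are illustrative choices, not requirements:

```python
from openai import OpenAI

client = OpenAI()

# training.jsonl: one JSON object per line in chat format, e.g.
# {"messages": [{"role": "user", "content": "<long excerpt>"},
#               {"role": "assistant", "content": "<summary>"}]}
uploaded = client.files.create(file=open("training.jsonl", "rb"), purpose="fine-tune")

# Start the fine-tuning job on the uploaded file (gpt-3.5-turbo as an example base model).
job = client.fine_tuning.jobs.create(training_file=uploaded.id, model="gpt-3.5-turbo")
print("fine-tune job:", job.id)
```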
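
And for option 3, a minimal sketch of chunking + embeddings retrieval, again assuming the `openai` v1 Python client plus `numpy`. The chunk size, the `text-embedding-ada-002` model, the `gpt-3.5-turbo` completion model, and the helper names are all illustrative; a token-aware splitter and a proper vector store would be better in production:

```python
import numpy as np
from openai import OpenAI

client = OpenAI()  # expects OPENAI_API_KEY in the environment

def chunk_text(text: str, chunk_chars: int = 3000) -> list[str]:
    """Naive fixed-size chunking; a token-aware splitter (e.g. tiktoken) is better."""
    return [text[i:i + chunk_chars] for i in range(0, len(text), chunk_chars)]

def embed(texts: list[str]) -> np.ndarray:
    resp = client.embeddings.create(model="text-embedding-ada-002", input=texts)
    return np.array([d.embedding for d in resp.data])

def summarize_relevant(path: str, query: str, top_k: int = 5) -> str:
    chunks = chunk_text(open(path, encoding="utf-8").read())
    chunk_vecs = embed(chunks)
    query_vec = embed([query])[0]

    # Cosine similarity between the query and every chunk.
    sims = chunk_vecs @ query_vec / (
        np.linalg.norm(chunk_vecs, axis=1) * np.linalg.norm(query_vec)
    )
    top_chunks = [chunks[i] for i in np.argsort(sims)[::-1][:top_k]]

    # Only the most relevant chunks go into the completion, so the prompt
    # stays under the model's context limit.
    resp = client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=[
            {"role": "system", "content": "Summarize the provided text."},
            {"role": "user", "content": query + "\n\n" + "\n\n".join(top_chunks)},
        ],
    )
    return resp.choices[0].message.content

# Example:
# print(summarize_relevant("report.txt", "Summarize the key findings."))
```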