It seems I've found a solution.
I just need to divide my dataset into separate parts, each smaller than 1M tokens, and upload them as separate documents to Vector.
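For anyone wanting to automate this, here's a rough sketch of the splitting step. Exact token counts depend on the model's tokenizer, so this uses a conservative ~4 characters-per-token heuristic as an approximation; the function name and threshold are my own, not from any official API.

```python
MAX_TOKENS = 1_000_000
CHARS_PER_TOKEN = 4  # rough average for English text; adjust for your data
MAX_CHARS = MAX_TOKENS * CHARS_PER_TOKEN

def split_dataset(text: str, max_chars: int = MAX_CHARS) -> list[str]:
    """Split text on line boundaries so each part stays under max_chars."""
    parts, current, current_len = [], [], 0
    for line in text.splitlines(keepends=True):
        # Flush the current part before it would exceed the budget
        if current and current_len + len(line) > max_chars:
            parts.append("".join(current))
            current, current_len = [], 0
        current.append(line)
        current_len += len(line)
    if current:
        parts.append("".join(current))
    return parts
```

Each returned part can then be uploaded as its own document. Splitting on line boundaries keeps records intact, assuming one record per line.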
@Oleksandr_Dovgopol Glad you're excited about our new AI features, and thank you so much for sharing this solution; others may find it useful as well. That does seem like the right approach to stay under OpenAI's tokens-per-minute rate limit.
In parallel, our engineers have been working with OpenAI to increase our limits, so this type of workaround may become less necessary as larger inputs are accepted. Our limits were bumped to 10M tokens/min on Friday, for example.