I have a Retool workflow where a JavaScript code block fetches an Excel file from an S3 bucket and reads it into an array, which a Postgres Bulk Insert query then consumes via the {{ code.data }} value.
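For context, here is roughly the shape of data the code block hands to the Bulk Insert query (a simplified sketch; the column names and the `rowsToRecords` helper are placeholders, not my real schema):

```javascript
// Simplified sketch: turn parsed spreadsheet rows (array-of-arrays from the
// Excel parser) into the array of objects that {{ code.data }} exposes to
// the Bulk Insert query. The first row supplies the column names.
function rowsToRecords(sheetRows) {
  const [header, ...rows] = sheetRows;
  return rows.map((row) =>
    Object.fromEntries(header.map((col, i) => [col, row[i]]))
  );
}

// Example with placeholder columns:
const records = rowsToRecords([
  ["id", "name"],
  [1, "alpha"],
  [2, "beta"],
]);
// records → [{ id: 1, name: "alpha" }, { id: 2, name: "beta" }]
```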
I am seeing that with an Excel file of 3k+ rows, the bulk insert takes over 50 seconds, and I would like to improve this speed.
Is there a more performant way to bulk insert this volume of data into my Postgres database from an Excel file?