Hi everyone,
I have a large table that pulls all of its data from my database in one shot, and this can sometimes take up to a minute.
I would like to pull it in batches instead, say 200 rows at a time (out of ~2000 rows, and growing over time).
I can change the database query itself to pull in batches, but I haven't managed to display the results iteratively, appending each batch to the table's data as it arrives.
I've also seen that the table has a pagination configuration, but that way a new page is queried on every page change, and global table filters and statistics can't be used properly.
I tried to implement this functionality myself:
// Cache the current table data, then clear it so the table can be re-filled batch by batch
table_data_cache.setValue(table_data.value);
table_data.setValue({});

let offset = 0;
const batchSize = page_size.value; // should match the LIMIT used by FetchTableBatch
let data;

try {
  do {
    // Fetch the next batch, passing the current offset into the query
    const batch = await FetchTableBatch.trigger({ additionalScope: { table_offset: offset } });
    data = batch.data;
    if (data) {
      // Append each column's new values to what is already in table_data
      Object.keys(data).forEach((key) => {
        const currentData = table_data.value || {};
        const newData = data[key] || [];
        table_data.setIn([key], [...(currentData[key] || []), ...newData]);
      });
      offset += batchSize;
    } else {
      break;
    }
    // Keep looping as long as the last batch came back full
  } while (data && Object.values(data).some((values) => values.length === batchSize));
} catch (error) {
  console.error('An error occurred while fetching data:', error);
}
where FetchTableBatch is:
SELECT * FROM candidates
ORDER BY date
LIMIT {{ batch_size.value }}
OFFSET {{ table_offset.value }}
It worked, and the table that uses table_data as its data source grew dynamically as each batch came in.
But it seems that Retool runs on a single thread (handling the tasks asynchronously), so even though the data is pulled in batches, the application is stuck until it has finished pulling all of the database data (which now takes much longer than pulling everything once with an unlimited query).
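I was also wondering whether explicitly yielding back to the browser between batches would let the UI update while the loop runs. Something along these lines is what I had in mind (the setTimeout pause and its 50 ms length are just a guess on my part, I haven't confirmed it actually lets the table repaint):
// Untested idea: pause briefly after each batch so the browser
// gets a chance to repaint before the next fetch starts.
const yieldToUI = () => new Promise((resolve) => setTimeout(resolve, 50));

let offset = 0;
let data;
do {
  const batch = await FetchTableBatch.trigger({ additionalScope: { table_offset: offset } });
  data = batch.data;
  if (!data) break;
  Object.keys(data).forEach((key) => {
    const currentData = table_data.value || {};
    table_data.setIn([key], [...(currentData[key] || []), ...(data[key] || [])]);
  });
  offset += page_size.value;
  await yieldToUI(); // let the UI breathe before the next batch
} while (Object.values(data).some((values) => values.length === page_size.value));
But I'm not sure that actually helps if everything still runs on the same thread.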
Is my use case possible to handle in Retool?
I could try to optimize how the data is stored in the database so that I pull less of it, but it's already pretty optimized.
Thanks