Update CSV prior to batch upload

:wave: looking for expertise and guidance!

We've built a portal for external partners to submit/update MySQL data rows via a form. We're now looking to give them the ability to use a CSV file to insert several rows into the table at once.

I've followed this tutorial that works well: CSV to Database, and I've been able to get some of the columns in the table to look up values from other columns (e.g. returning an id based on a name) thanks to this community post.

And I'm able to get some visual data validation (e.g. if the id isn't found, the cell is highlighted in red).
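For reference, the lookup part boils down to something like this (the partners table and its id/name columns here are placeholders, not the real schema):

```sql
-- Hypothetical lookup: resolve the id from the name typed into a row.
-- `partners`, `id` and `name` are placeholder names for illustration;
-- :partner_name stands for however the edited row's name gets bound.
SELECT id
FROM partners
WHERE name = :partner_name;
```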

The ideal workflow I'm envisioning would be:

  • Once the file is selected/parsed, the user can update the data in the table, and when they change a name, it looks up the corresponding id again
  • For each row, there would be a column with a data validation message based on criteria from the row's data (e.g. "id not found" if the id isn't found, "date incorrectly formatted" if the date is malformed, with the messages combined when several checks fail, etc.); a rough sketch of what I have in mind follows this list
  • Once everything is green, users can click one button to upload all the rows in the table (instead of doing it row by row like in this tutorial)
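To make that second point concrete, assuming the parsed rows end up somewhere a query can read them (a hypothetical csv_upload staging table below, with made-up column names), what I have in mind is roughly:

```sql
-- Sketch of a per-row validation message (all table/column names are
-- placeholders): the LEFT JOIN detects unknown names, STR_TO_DATE catches
-- badly formatted dates. CONCAT_WS skips NULLs, so rows that pass every
-- check end up with an empty message.
SELECT
  u.*,
  CONCAT_WS(', ',
    CASE WHEN p.id IS NULL THEN 'id not found' END,
    CASE WHEN STR_TO_DATE(u.start_date, '%Y-%m-%d') IS NULL
         THEN 'date incorrectly formatted' END
  ) AS validation_message
FROM csv_upload u
LEFT JOIN partners p ON p.name = u.partner_name;
```

"Everything is green" would then simply mean validation_message is empty for every row, which could also be the condition for enabling the upload button.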

Question: I'm stuck at the first step and wondering whether the structure I'm putting in place (table scripts/transformers and one query to insert the data rows) can lead somewhere, or whether another route would be more efficient (e.g. API calls like this doc, or something I haven't thought of)?

Any documentation, tutorials or guidance you can provide to get on the right path would be much appreciated :pray:

Hello @salutcestcool,

Thank you for your question.

Is your aim to edit the data in-line - you gave the example of editing the 'name' - so that after the name is changed, the ID updates to match the 'new' name?

I'm slightly unclear about the desired outcome, but would be eager to try and help you once I have a better understanding!

-Brett

Have you considered loading the parsed CSV file into a temporary/staging table and letting the user modify that (temporary) data, with a save/submit query that writes each change to the DB and pulls back the updated rows? Once the user is happy with their data, they can click a button to trigger a query that pushes the staging table data into the target table. You could disable that button if your validation checks fail for any of the data.
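In rough terms, that final "push" could be a single statement along these lines (all table and column names below are placeholders):

```sql
-- Sketch of the push from a hypothetical staging table to the target
-- table, resolving partner ids from names on the way. A per-user key on
-- the staging table (omitted here) would keep concurrent uploads apart.
INSERT INTO submissions (partner_id, start_date, amount)
SELECT p.id, u.start_date, u.amount
FROM csv_upload u
JOIN partners p ON p.name = u.partner_name;

-- Clear the staging rows once the insert has succeeded.
DELETE FROM csv_upload;
```

The inner JOIN naturally drops rows whose name can't be resolved, but normally you'd only enable the button once every validation check passes anyway.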

Or perhaps (if it isn’t too much data and you’re comfortable with not persisting it in the event of a disruption) you can skip the DB entirely, store the parsed data in a variable, and use setIn() to manage changes. It could be more complex, but it might be a workaround if managing a temp table is a bigger lift than you want to deal with (or you just prefer to work with variables :grinning:).

As @brettp mentioned, a bit of additional context on what you’re stuck on in the first step would help, in case we’re going down the wrong path here.