I have a table filled with SQL data. I am trying to send this to a workflow, loop through the table data, and perform a SQL insert with each record. I have imported a workflow into my app and am sending the data like so:
However, no matter what I do, I can't seem to get it to return valid data. I have tried passing it differently when calling the workflow, in hopes I can get "something" to return correctly:
{"copyData": {{ tblFilters.data }} }
I also tried to have this as my return node:
{"data": startTrigger.data.value.copyData[0].id }
But still, all I get is either no data returned as my output OR this error here:
```
error: true
message: "there was a problem parsing the JSON body of webhookReturn2"
```
Which is the same error that is in the logs. So, my main questions here would be: How can I get table data to push across to a workflow so I can loop through it? And, would there just be a better way to do a bulk insert with the table data than what I am doing? Thanks!
To test this, inside the workflow just make that object the test parameter and map your return over the copyData array, or call out the specific entry via index:
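A sketch of that return node, assuming the `{"copyData": [...]}` object above is the test parameter (the exact path depends on how the trigger wraps the payload):

```javascript
// Map the return over the copyData array to label each id...
{ response: startTrigger.data.value.copyData.map(num => { return "This is " + num.id }) }
// ...or call out a specific entry via index:
// { response: startTrigger.data.value.copyData[0].id }
```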
Hmm... the example worked. But it's still not working with {{ tblFilters.data }} (which is the actual data I need to pass over).
This seems like this should be a straightforward task. I am just needing to either loop through the table data somehow and insert each row into a DB or I need to do a bulk sql insert.
Is your DB already set up as a resource (or are you using Retool DB)? If so, you should be able to use that SQL resource in your App and choose to do a Bulk Insert or Bulk Update/Upsert via Primary Key without needing a workflow or other query to process the table rows.
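For example, a MySQL resource query in GUI mode can take the table data directly. A sketch (the exact action and field labels may differ slightly by resource type):

```javascript
// GUI-mode query sketch (not SQL mode):
//   Action type: Bulk insert records (or Bulk update via a primary key)
//   Array of records field:
{{ tblFilters.data }}
```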
Thank you for this (and I have used this before). But, in this instance, I have to modify some of the data in the table first before it is bulk inserted in our MySQL DB.
For example, the table data is structured like this:
| id  | assigned_user | filter_content | another_field |
| --- | ------------- | -------------- | ------------- |
| 123 | 4444          | users-data1    | some_data     |
| 456 | 4444          | users-data2    | some_data     |
| 789 | 4444          | users-data3    | some_data     |
However, I have to change the "id" fields and the "assigned_user" fields when performing the bulk insert, so that it ends up something like this:
| id              | assigned_user | filter_content | another_field |
| --------------- | ------------- | -------------- | ------------- |
| random_new_uuid | 5555          | users-data1    | some_data     |
| random_new_uuid | 5555          | users-data2    | some_data     |
| random_new_uuid | 5555          | users-data3    | some_data     |
I thought about just creating a hidden table that loops in this data and then generates the fields, but then I thought that there has to be a better way than that.
By the way - side note - thanks for all your help so far! Really appreciate it
The thing you need is a Transformer. It updates automatically as its inputs change, so you can use it to make the changes you require with just a little bit of JavaScript.
Something like this increased the ids by one in my rows:
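A minimal sketch of that kind of transformer, assuming the rows live in tblFilters.data:

```javascript
// Transformer: return a new array with every id increased by one,
// passing the remaining fields through untouched.
return {{ tblFilters.data }}.map(row => ({
  ...row,
  id: row.id + 1
}));
```

The same pattern should cover the case above: swap `row.id + 1` for a fresh UUID (Retool preloads the uuid library, so something like `uuid.v4()` would work) and set `assigned_user: 5555`.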
That transform definitely worked well. Now the data is being sent across correctly. However, I am still not able to return the simple ID to test and see if I can get the workflow data.
I simplified the webhook return to this:
{"response": startTrigger.data.value[0].id }
But, it returned this:
messsage:"there was a problem parsing the JSON body of webhookReturn2"
I also tried this one as well:
```javascript
{ response: startTrigger.data.value.copyData.map(num => { return "This is " + num.id }) }
```
Which is what you had suggested in an earlier post but that one produced an error as well.
Thought it might help to include an example of the data that is coming in from the startTrigger below:
Without seeing the exact setup of the workflow I am just hazarding a guess that the quotes around "response" caused the issue with this return block.
Is there a specific reason you need this to happen within a workflow? It seems like you should be able to use the transformer directly in your App with the table data and a JavaScript query to loop through the results and trigger the update query using additionalScope (another thing I'd be happy to help you out with, if necessary).
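That in-app loop might look something like this (the transformer and query names here are placeholders):

```javascript
// JS query: trigger the insert query once per transformed row, handing each
// row in through additionalScope so the SQL query can reference {{ row }}.
const rows = rowTransformer.value; // output of the transformer above
return Promise.all(
  rows.map(row => insertRowQuery.trigger({ additionalScope: { row } }))
);
```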
I think, since you are sending the payload from the screenshot, you want to return:
```javascript
{ response: startTrigger.data.map(val => { return val.id }) }
```
This should produce an array of just the id fields from every value.
I suggest taking a single entry from the value array in your screenshot and copying its JSON directly into the startTrigger Test Parameters area. Keep the form of an array, but only leave a single object entry. What you want is to see whether the return node captures the id or other properties as part of code completion (this can be a pretty reliable way to know you've accessed a JSON field properly).
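For example, pared down to a single entry (using field values from the example table earlier in the thread):

```javascript
[
  { "id": 123, "assigned_user": 4444, "filter_content": "users-data1", "another_field": "some_data" }
]
```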
I got on the office hours call yesterday and they were able to help me finally sort it out. We ended up using the transformer by itself (the workflow was not needed) and they showed me how to correctly use the bulk SQL insert with the transformer result. Works great now!