Looping through table data in a workflow

I have a table filled with SQL data. I am trying to send this to a workflow, loop through the table data, and perform a SQL insert with each record. I have imported a workflow into my app and am sending the data like so:

In the workflow, I have created a quick response that is very simple (just so I can verify it is receiving data)

However, no matter what I do, I can't seem to get it to return valid data. I have tried passing it differently when calling the workflow, in hopes I can get "something" to return correctly:

{"copyData": {{ tblFilters.data }} }

I also tried to have this as my return node:

{"data": startTrigger.data.value.copyData[0].id }

But still, all I get is either no data returned as my output OR this error here:

error: true
message: "there was a problem parsing the JSON body of webhookReturn2"

Which is the same error that is in the logs. So, my main questions here would be: How can I get table data to push across to a workflow so I can loop through it? And, would there just be a better way to do a bulk insert with the table data than what I am doing? Thanks!

Hello!

I believe that in the response node, the property key shouldn't contain the quotes, since it is parsed as raw JSON data.

This: {"copyData": {{ tblFilters.data }} }
Should become this: {copyData: {{ tblFilters.data }} }

Unfortunately, I received the same error message after making that change

Ah, yes. You also do not need the {{ }}. This works:
(screenshot)

Don't be fooled by the linter telling you it is invalid JSON

Hmmm... I only have the one { and the one closing }:

{copyData: startTrigger.data.value.copyData[0].id}

Can you test with a code block that returns a string or other object?

ETA: This could be related to the startTrigger data being unavailable, so the response block is caught up on a null reference.

When I changed it to a code block I received a return response:

  • id:"doSomthing"
  • message:"this is also a message"

I still need to get the startTrigger data in my loop though

Okay, so the data you want would be sent to the workflow as an object of the form:

{ value: { copyData: [ { id: 1 } , { id: 2 }, { id: 3 } ] } }

To test this, inside the workflow just make that object the test parameter and map your return over the copyData array, or call out the specific entry via index:
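For example, a return node along these lines should echo the ids back (a sketch, assuming the payload shape above; firstId is just an illustrative key name):

{ copyData: startTrigger.data.value.copyData.map(row => row.id) }

Or, to pull a single entry by index:

{ firstId: startTrigger.data.value.copyData[0].id }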

In an app, you can Import Workflow as a query and send the object the same way:
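The body of that query might look something like this (a sketch; tblFilters is the table component from your first post):

{ value: { copyData: {{ tblFilters.data }} } }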

Hmm... The example worked, but it's still not working with the {{ tblFilters.data }} (which is the actual data I need to pass over).

This seems like it should be a straightforward task. I just need to either loop through the table data somehow and insert each row into a DB, or do a bulk SQL insert.

Is your DB already set up as a resource (or are you using Retool DB)? If so, you should be able to use that SQL resource in your App and choose to do a Bulk Insert or Bulk Update/Upsert via Primary Key without needing a workflow or other query to process the table rows.

You want to put the table data in the Array of Records field and select the appropriate PK to use.

(screenshot)
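For what it's worth, the Array of Records field can typically just point at the table data directly (a sketch, using the tblFilters table from your earlier posts):

{{ tblFilters.data }}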


Thank you for this (and I have used this before). But in this instance, I have to modify some of the data in the table before it is bulk inserted into our MySQL DB.

For example, the table data is structured like this:

| id  | assigned_user | filter_content | another_field |
| --- | ------------- | -------------- | ------------- |
| 123 | 4444          | users-data1    | some_data     |
| 456 | 4444          | users-data2    | some_data     |
| 789 | 4444          | users-data3    | some_data     |

However, I have to change the "id" and "assigned_user" fields when performing the bulk insert, so that it ends up something like this:

| id              | assigned_user | filter_content | another_field |
| --------------- | ------------- | -------------- | ------------- |
| random_new_uuid | 5555          | users-data1    | some_data     |
| random_new_uuid | 5555          | users-data2    | some_data     |
| random_new_uuid | 5555          | users-data3    | some_data     |

I thought about just creating a hidden table that loads in this data and then generates the fields, but there has to be a better way than that.

By the way - side note - thanks for all your help so far! Really appreciate it :+1:

No problem!

The thing you need is a Transformer. It updates automatically as its inputs change, so you can use it to make the changes you require with just a little bit of JavaScript.

Something like this increased the ids by one in my rows:
(screenshot)
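Adapted to your case, a transformer along these lines should work (a sketch: it assumes your table is tblFilters and that the uuid library available in Retool's JS environment is what you want for the new ids - swap in your own UUID helper if not):

// Copy each row, replacing id with a fresh UUID and reassigning the user
const rows = {{ tblFilters.data }};

return rows.map(row => ({
  ...row,
  id: uuid.v4(),        // new random UUID per row (assumes the uuid lib)
  assigned_user: 5555,  // the new assigned user from your example
}));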


That transformer definitely worked well. Now the data is being sent across correctly. However, I am still not able to return a simple ID to test and see if I can get the workflow data.

I simplified the webhook return to this:
{"response": startTrigger.data.value[0].id }

But, it returned this:
message: "there was a problem parsing the JSON body of webhookReturn2"

I also tried this one as well:
{response: startTrigger.data.value.copyData.map(num => { return "This is " + num.id })}

Which is what you had suggested in an earlier post, but that one produced an error as well.

Thought it might help to include an example of the data coming in from the startTrigger below.

Any ideas on how I could just return the simple ID or any of the fields? Thanks!

Without seeing the exact setup of the workflow I am just hazarding a guess that the quotes around "response" caused the issue with this return block.

Is there a specific reason you need this to happen within a workflow? It seems like you should be able to use the transformer directly in your App with the table data and a JavaScript query to loop through the results and trigger the update query using additionalScope (another thing I'd be happy to help you out with, if necessary).
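For reference, that pattern is usually just a few lines (a sketch: rowsTransformer and insertRow are hypothetical names for your transformer and your SQL insert query, and insertRow would reference {{ row }} in its statement):

// JS query: trigger the insert once per transformed record
const rows = rowsTransformer.value;

const results = rows.map(row =>
  insertRow.trigger({
    additionalScope: { row }  // exposes {{ row }} inside insertRow
  })
);

return Promise.all(results);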

I think, since you are sending the payload from the screenshot, you want to return:

{ response: startTrigger.data.map(val => { return val.id }) }
This should produce an array of just the id fields from every value.

I suggest taking a single entry from the value array in your screenshot and copying its JSON directly into the startTrigger Test Parameters area. Keep the form of an array, but leave only a single object entry. What you want to see is whether the return node picks up the id and other properties in the code completion (this can be a pretty reliable way to know you've accessed a JSON field properly).
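For instance, the Test Parameters might look something like this (illustrative values only, borrowed from the table in your earlier post):

[ { "id": 123, "assigned_user": 4444, "filter_content": "users-data1", "another_field": "some_data" } ]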

I got on the office hours call yesterday and they were able to help me finally sort it out. We ended up using the transformer just by itself (the workflow was not needed), and they showed me how to correctly use the bulk SQL insert from the transformer result. Works great now!

Thanks again for all your help!
