Converting data from a REST call with JS to insert it into a database

Hello,

I would like to create a workflow that performs the following three tasks:

  1. Retrieve JSON data from a REST API.
  2. Transform this data using JS.
  3. Write the converted records to a table.

Retrieving the JSON data works flawlessly. There are multiple records and each record consists of over 30 data fields. I already have the table to write the records to, with all the appropriate fields.

What I am still missing is the conversion of the JSON records with JS code so that they can be looped through and inserted into the table. Are there any suitable examples for this?

I realize that this must actually be very simple, but unfortunately nothing has worked so far.

Hey @ConDev!

Would you mind sharing what the JSON data from the REST API looks like? How you need to transform it depends largely on what you're getting back, but generally you'll want it to end up as one of the following two formats:

Array of objects
[
  {
    id: 1,
    name: "Hanson Deck",
    email: "hanson@deck.com",
    sales: 37,
    image: "https://picsum.photos/id/1003/200/200",
  },
  {
    id: 2,
    name: "Sue Shei",
    email: "sueshei@example.com",
    sales: 550,
    image: "https://picsum.photos/id/1016/200/200",
  },
  {
    id: 3,
    name: "Jason Response",
    email: "jason@response.com",
    sales: 55,
    image: "https://picsum.photos/id/1018/200/200",
  },
  {
    id: 4,
    name: "Cher Actor",
    email: "cher@example.com",
    sales: 424,
    image: "https://picsum.photos/id/1022/200/200",
  },
  {
    id: 10,
    name: "Erica Widget",
    email: "erica@widget.org",
    sales: 243,
    image: "https://picsum.photos/id/1025/200/200",
  },
]
Object of arrays
{
  id: [1, 2, 3, 4, 10],
  name: [
    "Hanson Deck",
    "Sue Shei",
    "Jason Response",
    "Cher Actor",
    "Erica Widget",
  ],
  email: [
    "hanson@deck.com",
    "sueshei@example.com",
    "jason@response.com",
    "cher@example.com",
    "erica@widget.org",
  ],
  sales: [37, 550, 55, 424, 243],
  image: [
    "https://picsum.photos/id/1003/200/200",
    "https://picsum.photos/id/1016/200/200",
    "https://picsum.photos/id/1018/200/200",
    "https://picsum.photos/id/1022/200/200",
    "https://picsum.photos/id/1025/200/200",
  ],
}
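
If what you get back doesn't already match one of these, a small JS transform usually gets you there. For example, here's a rough sketch that turns an object of arrays into an array of objects, assuming every array has the same length:

const objectOfArrays = {
  id: [1, 2, 3],
  name: ["Hanson Deck", "Sue Shei", "Jason Response"],
  sales: [37, 550, 55],
};

// Pair up the i-th entry of every array to build one object per row
const arrayOfObjects = objectOfArrays.id.map((_, i) => {
  const row = {};
  for (const key of Object.keys(objectOfArrays)) {
    row[key] = objectOfArrays[key][i];
  }
  return row;
});

// arrayOfObjects is now:
// [{ id: 1, name: "Hanson Deck", sales: 37 }, { id: 2, name: "Sue Shei", sales: 550 }, ...]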

Following this, same question. I would like to pull data from various explorers by address and store new records in a table.

Hello,

The JSON data from the REST API is an array of objects and looks like this:

JSON DATA
[
  {
    "Ort": "Vitoria",
    "PLZ": "01015",
    "Sel4": false,
    "PfOrt": "",
    "EMail2": "augusto@forwardergroup.com",
    "Na1": "Firma",
    "Sel5Kz": false,
    "LandBez": "Spanien",
    "Tel": "+3494555123",
    "EMail1": "beatriz@forwardergroup.com",
    "AspNr": 0,
    "ErstBzr": "LS",
    "Verteiler": 0,
    "ID": 7241,
    "StdLiKz": true,
    "PfPLZ": "",
    "Sel3": 0,
    "AendBzr": "EK",
    "PLZOrtInfo": "E-01015 Vitoria",
    "Art": 0,
    "LtzAend": "03.07.2020 14:18:49",
    "Namen": "Firma FORWARDER Group, SL",
    "Sel1": false,
    "ErstDat": "01.09.2016 14:58:59",
    "AnsNr": 0,
    "LcManuellKz": false,
    "StdReKz": true,
    "Str": "Mendigorritxu, 12",
    "Land": 724,
    "LandKennz": "E",
    "TelAbglManuellKz": false,
    "IDString": "Adr.7040.7241",
    "Sel2": false,
    "AdrNr": "74464",
    "Fax": "+3455598765",
    "AendDat": "03.07.2020 14:18:49",
    "Na2": "FORWARDER Group, SL",
    "InfoKz": false,
    "PLZInfo": "E-01015"
  },
  {
    "Ort": "Přeštice",
    "PLZ": "33401",
    "Sel4": false,
    "PfOrt": "",
    "EMail2": "tereza@forwarder.com",
    "Na1": "Firma",
    "Sel5Kz": false,
    "LandBez": "Tschechische Republik",
    "Tel": "+42044555123",
    "EMail1": "info@forwarder.com",
    "AspNr": 0,
    "ErstBzr": "LS",
    "Verteiler": 0,
    "ID": 7242,
    "StdLiKz": true,
    "PfPLZ": "",
    "Sel3": 0,
    "AendBzr": "EK",
    "PLZOrtInfo": "CZ-33401 Přeštice",
    "Art": 0,
    "LtzAend": "03.07.2020 14:18:37",
    "Namen": "Firma International Forwarder",
    "Sel1": false,
    "ErstDat": "01.09.2016 15:03:06",
    "AnsNr": 0,
    "LcManuellKz": false,
    "StdReKz": true,
    "Str": "Hlávkova 54",
    "Land": 203,
    "LandKennz": "CZ",
    "TelAbglManuellKz": false,
    "IDString": "Adr.7041.7242",
    "Sel2": false,
    "AdrNr": "74465",
    "Fax": "+4204875467",
    "AendDat": "03.07.2020 14:18:37",
    "Na2": "International Forwarder",
    "InfoKz": false,
    "PLZInfo": "CZ-33401"
  }
]

Thanks for sharing that! I think I may have misunderstood the question here. What are the transformations you're trying to do on each record? And does the table you're trying to write to exist in an external database? If so, what kind of database?

I want to insert/update these records in a table that exists in the Retool database. What would be the easiest way to do this in a workflow? How do I need to convert these JSON records with JS to make this work?

It looks like it's already formatted correctly! You should be able to reference that data directly using something like {{yourRestQuery.data}} in a bulk upsert query. Is that something you've tried already?
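
If you ever did need to reshape the records first, for example to keep only a subset of columns, a JS code block between the REST query and the upsert could handle it. Here's a rough sketch, where getAddresses is just a placeholder name for your REST query block:

// getAddresses is a placeholder; use the name of your own REST query block
const records = getAddresses.data;

// Keep only the fields that exist as columns in the target table
return records.map((r) => ({
  ID: r.ID,
  AdrNr: r.AdrNr,
  AnsNr: r.AnsNr,
  Namen: r.Namen,
  Str: r.Str,
  PLZ: r.PLZ,
  Ort: r.Ort,
  LandKennz: r.LandKennz,
  LtzAend: r.LtzAend,
  AendDat: r.AendDat,
}));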

Thanks, the bulk upsert works, but only if I change the datetime fields in the database to text, because the datetime values in the JSON data seem to be in the wrong format. What if I want to loop through the records in the workflow to parse the JSON datetime fields into the correct format?

There's a doc here that goes over how you can iterate through data in workflows!

You can try using something like the following as JS code for your loop block:

value.LtzAend = moment(value.LtzAend).format(/* correct format here */);
value.AendDat = moment(value.AendDat).format(/* correct format here */);
return value;

Does that work?

You can read more about the moment library here!

Unfortunately not. I tried all possible formats, and also some that are not in the moment.js description. The result is always the same: "Invalid date".

I assume this is due to the German date format 'Day.Month.Year', but that is the string we get from the REST API. Is there a way to convert/parse this specific datetime format in the JS code block? I tried a few things like splitting the string and building a new datetime variable, but that does not work in the code block.

Maybe there is something else I could try?

Ah, it looks like it's expecting the date in MM.DD.YYYY format instead of DD.MM.YYYY. Can you try specifying the input format when constructing the moment, as described here?
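
For example, since the date strings in your data look like "03.07.2020 14:18:49", the loop block could pass that input format explicitly when parsing. A rough sketch; the output format below is just an example, so adjust it to whatever your database column expects:

// Parse the German-style day-first strings explicitly, then re-format them
const inputFormat = "DD.MM.YYYY HH:mm:ss";
const outputFormat = "YYYY-MM-DD HH:mm:ss"; // example target format, adjust as needed

value.LtzAend = moment(value.LtzAend, inputFormat).format(outputFormat);
value.AendDat = moment(value.AendDat, inputFormat).format(outputFormat);
value.ErstDat = moment(value.ErstDat, inputFormat).format(outputFormat);
return value;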

That works, the date strings get parsed correctly. Thank you very much, you have really been a great help so far.
Now there is one more thing to do to write the data to the database. We can't use the bulk upsert, for reasons related to the data source from which the REST API retrieves the data. Instead of using the ID of the records, we need to use two fields for filtering, AdrNr and AnsNr.
Now when I try to insert/update all the records from the REST call into the database in a loop, I need an index for the data array.

If I specify an exact index in the data array, e.g. "0" or "17", then the insert/update works. When I don't use the specific index of a single data record, all the inserted values are null.

But what exactly do I need to specify as the index so that all the records in the array are inserted correctly in the loop?

Do you have an idea how I can iterate through my data and insert/update the records into my database table in a loop?

In the loop, two fields (AdrNr, AnsNr) must be checked to see whether the record already exists and whether an update must be performed instead of an insert.

When matching the fields in the loop, I need to specify an index, but this index would have to increment automatically. How could this work?

It might be worth using the value variable instead of index, e.g. {{ value.AspNr }}, or using the "Object" setting of the query changeset and just passing {{ value }}.

You should be able to grab the index dynamically with the index variable, though; I would expect the syntax you're using to work.
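
To make the difference concrete, here's a rough plain-JS sketch of what the loop does conceptually; transformDates is just a placeholder name for whichever block feeds the loop:

const records = transformDates.data; // placeholder for the loop's input array

records.forEach((value, index) => {
  // index counts 0, 1, 2, ... and value is records[index], so the upsert query
  // can filter on {{ value.AdrNr }} and {{ value.AnsNr }} and pass {{ value }}
  // as the changeset, without hard-coding an index like data[0] or data[17].
});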

Just for some clarification, are you only seeing this issue for records that don't already exist in your database?

Edit:
Was able to reproduce the bug specifically when using the "Key value pairs" setting for the changeset and have filed an internal bug report. Let me know if switching over to "Object" works for you for now!

Using the value variable was indeed the solution to this problem. Now I can iterate through the data and insert/update the records. Thank you for your help.