Trouble with a Transformer

All I need is a very flat table with

meterid, date_time, T1, T2,....T48
meterid, date_time, T1, T2,....T48
meterid, date_time, T1, T2,....T48

with T being half-hour periods (the data is half-hourly), stored in ExportDB as consumption_kWh

function transformExportDB() {
  const tableData = formatDataAsArray({{ exportDB.data }});
  const transformedData = _.reduce(tableData, (result, row) => {
    const { date_time, meterid, consumption_kwh } = row;
    const dateKey = moment(date_time).format('YYYY-MM-DD');
    
    if (!result[dateKey]) {
      result[dateKey] = { date: dateKey, meterID: meterid, kWh: {} };
    }
    result[dateKey].kWh[moment(date_time).format('HH:mm')] = consumption_kwh;
    
    return result;
  }, {});
  
  return Object.values(transformedData);
}

return transformExportDB();

I get the consumption nested as kWh. I just don't know JS well enough to untangle what the AI has given me.

There is a bonus if the first value could be 00:30 through to 00:00 of the next day. The values are period ending, so 23:30 to 00:00 today is timestamped 00:00 tomorrow, and padded accordingly if times are missing (only ever likely when I first download).
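
For illustration, a minimal sketch of those 48 period-ending labels (assuming moment is available, as in the code above), which could be used to pad missing periods:

const labels = [];
for (let i = 1; i <= 48; i++) {
  // Periods end at 00:30, 01:00, ..., 23:30, and finally 00:00 of the next day
  labels.push(moment.utc('2024-10-20').add(i * 30, 'minutes').format('HH:mm'));
}
// labels: ['00:30', '01:00', ..., '23:30', '00:00']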

Help appreciated, trying to download PV data and get it into a useful format.

Dave

Hello @Footsore,

Very interesting use case. Could you help me better understand the ideal final schema that you want the data to be formatted in?

From your screenshot it looks like you are getting the date_time values as keys and the consumption_kwh as values, all nested inside of kWh. How would you like this to be set up instead?

For your bonus at the end, it sounds like you are looking to change the date_time values by offsetting them so that the first value is 00:30 instead of what it currently is in the screenshot (01:30), so this could be done by removing one hour from every date_time value, correct?

Or are you thinking that you need to "rotate" the order because the correct value for the time of 00:30 is at the end, and you just need a JavaScript sorting algorithm to re-order the values instead of changing the times?
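
For illustration, a quick sketch of that "rotate" option (values are placeholders):

// Move the first reading (the period ending 00:00, which belongs to the
// previous day) to the end so the day runs 00:30 ... 00:00.
const values = ['v@00:00', 'v@00:30', 'v@01:00']; // illustrative
const rotated = [...values.slice(1), values[0]];  // ['v@00:30', 'v@01:00', 'v@00:00']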

I would highly recommend continuing to leverage the purple robot AI query builder tool: you can provide it the input data schema, give it an example of the output you want, specify ordering/sorting, and test different options until you get the code you need!

You might need to break it down into separate steps, as this is some tricky formatting/sorting but I am sure it can be done!

Hi, thanks for the comment. I have been making some progress. The data is utility consumption data in half-hour increments, and it is destined for a .csv file. Each line would be MeterID, Date, Value1, Value2, ... Value48 in Base Time (GMT for us), not Daylight Saving.

My JS is non-existent but my VB is good so learning fast hopefully.

Dropping a level turned out to be easy; I just needed to remove .kWh from

  result[dateKey].kWh[moment(date_time).format('HH:mm')] = consumption_kwh;
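
For clarity, the shape change that one edit makes (illustrative values):

// Before: half-hourly values nested under kWh
const before = { date: '2024-10-20', meterID: 'M1', kWh: { '00:30': 0.42, '01:00': 0.40 } };
// After: values sit directly on the row object
const after = { date: '2024-10-20', meterID: 'M1', '00:30': 0.42, '01:00': 0.40 };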

I then ran into some issues:

It was using just the date as dateKey, so each meter was overwriting the last; I fabricated dateKey as date + meterid, which worked.

    const dateKey = moment(date_time).tz("UTC").format('YYYY-MM-DD') + meterid;

Some Daylight Saving issues, resolved with .tz("UTC").

And I sorted the 00:00 issue. With utility data it's 'always' a period-ending time stamp, so today at 00:00 covers consumption yesterday between 23:30 and 00:00 and belongs to yesterday, and today starts with the 00:30 value, which runs from 00:00. Some American systems use the abomination that is 24:00, which carries today's date but is the same instant as tomorrow's 00:00 (yuck). I subtracted a microsecond from all the time stamps in the SQL read of data from the DB, so midnight less a microsecond has yesterday's date; then I just had to drop the first line from each output. (I need the previous value, as the data is meter readings, so I use LAG to get the difference, i.e. the consumption.)
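
For illustration, a minimal JS sketch of the same two ideas (the real fix was done in the SQL read; values are made up):

// 1) Shift each period-ending timestamp back a fraction so midnight falls
//    on the previous day's date.
// 2) Diff successive meter readings to get consumption (LAG in SQL).
const rows = [
  { date_time: '2024-10-20T23:30:00Z', reading: 1041.2 },
  { date_time: '2024-10-21T00:00:00Z', reading: 1041.9 }, // period 23:30-00:00 of the 20th
];
const withDay = rows.map(r => ({
  ...r,
  day: moment(r.date_time).subtract(1, 'millisecond').utc().format('YYYY-MM-DD'),
}));
const consumption = rows.slice(1).map((r, i) => ({
  date_time: r.date_time,
  kwh: +(r.reading - rows[i].reading).toFixed(3), // 0.7
}));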

And I think I am there. Got some final testing to do to prove data matches all the way through but initial checks look good.

function transformExportDB() {
  const tableData = formatDataAsArray({{ exportDB.data }});
  // const firstElement = tableData.shift(); // only gets rid of the first midnight value

  const transformedData = _.reduce(tableData, (result, row) => {
    const { date_time, meterid, consumption_kwh } = row;
    // Composite key of date + meter so meters don't overwrite each other
    const dateKey = moment(date_time).tz("UTC").format('YYYY-MM-DD') + meterid;

    // If statement to get rid of the first orphaned value (the period-ending
    // 00:00 reading that belongs to the day before the data starts)
    if (moment(date_time).tz("UTC").format('YYYY-MM-DD') >
        moment({{ exportDB.data.date_time[0] }}).tz("UTC").format('YYYY-MM-DD')) {
      if (!result[dateKey]) {
        result[dateKey] = { date: moment(date_time).tz("UTC").format('YYYY-MM-DD'), meterID: meterid };
      }
      result[dateKey][moment(date_time).tz("UTC").format('HH:mm')] = consumption_kwh;
    } else {
      console.log(meterid + " - " + moment(date_time).tz("UTC").format('YYYY-MM-DD HH:mm:ss') + " - Orphan Line");
    }
    return result;
  }, {});

  return Object.values(transformedData);
}
return transformExportDB();

Now I just need to work out how the Workflow bit works. It seems I can't run the JS script that calls the API with the relevant meter details from the app to automate it. Do I need to rebuild the JS script and the associated queries that pick up all the meters queried via the API, and build this in Workflow?

Of course no problem!

Very impressive work :raised_hands:

Working with times is a nightmare and very tricky so I am really happy you found a way to get that all set :sweat_smile:

Glad to hear you were able to get the sorting and data shaping aspects taken care of!!!

So we do have some great docs for better understanding workflows and how to fit them into use cases along with Retool Apps. Check those out here to learn more about all the cool stuff workflows can do.

Let me give you some options and ask more questions about the use case and how you are getting those meter IDs/data that you are querying via the API :handshake:

Apps can't really be automated: they can send data to a workflow when a user triggers an event like a button press, but they can't receive data from a workflow.

Workflows can run based on two triggers. The first is a 'schedule', which can be set to certain times/days to run automatically.

The second is a 'webhook', which effectively turns your workflow into an API: the workflow starts with data passed in from another computer making a request to your workflow's URL.
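
For illustration, a minimal sketch of what that request could look like (the URL and payload are placeholders; copy the real URL from the workflow's webhook trigger settings):

// Start a webhook-triggered workflow from any machine with a plain HTTP POST.
await fetch('https://your-workflow-webhook-url', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({ startDate: '20241020' }), // any JSON your workflow expects
});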

You could have your workflow, at a specific time (using the schedule trigger), make an API call to get the meter data you need. Then, once it has the data, the next block can run the JS script you have to format it, and you can send that data wherever you want with a query (such as to a database, converted to a CSV/PDF, or out by email).
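
A minimal sketch of that middle formatting block (assuming the API block before it is named getMeterData; workflow blocks can reference an earlier block's result via its name and .data):

// Runs after the scheduled API call; reshapes rows before they are sent on.
const rows = getMeterData.data;
return rows.map(r => ({
  ...r,
  day: moment(r.date_time).tz('UTC').format('YYYY-MM-DD'), // group key for the next block
}));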

Inside of your workflow, when you click the 'function' block on the left side, you should be able to see queries that you have already created and select the one with the JS script you built out, which saves you the time of having to copy everything over :sweat_smile: Just make sure things are all named the same so the code can reference the right data!


In short:

  • A JS script gets a list of the meters needed from the database
  • Calls the API query (with suitable time pauses)
  • Which then calls another query to get the date of last data for the meter
  • Makes the API request
  • Upserts into a table
  • And finally a table is generated with a button for me to press to export CSV.

I have a Resource set up which is just https://www.emig.co.uk/p/api/ & the API key that gets sent.

The API then calls
https://www.emig.co.uk/p/api/meter/{{dynamic_meterID}}/readingsAfter?startDate=20241020&startTime=000000.000000&minIntervalS=1800

There are two dynamic elements: the meter {{dynamic_meterID}} and the startDate. startDate is set in the URL parameters, so it appears complete in the query above.

{{dynamic_meterID}}
The {{dynamic_meterID}} is provided by a JS query, which gets the list of meters from the database via a query (getMeters), so if I add meters they automatically get picked up. I will write another query to update the list of meters, but it won't change fast, if at all.

async function runJuggleReadings() {
  const meterIds = getMeters.data.MeterID; // column of meter IDs from the getMeters query
  const results = [];

  for (const dynamic_meterID of meterIds) {
    // Pause between calls so the API requests stay suitably spaced
    await new Promise(resolve => setTimeout(resolve, 2000));
    await juggleReadingsAPI.trigger({
      onSuccess: (data) => results.push(data),
      onFailure: (error) => console.error(error),
      additionalScope: { dynamic_meterID } // fills {{dynamic_meterID}} in the API query
    });
  }
  return results;
}
return runJuggleReadings();

startDate
This is one of the URL parameters; it calls the query getLastDate so as to only bring back data I don't already have (though it pulls back to the start of that day and lets the upsert resolve any overlap).

SELECT
  date_time AS date_time
FROM
  byss
WHERE
  meter = {{ dynamic_meterID }}
  AND date_time = (
    SELECT
      MAX(date_time)
    FROM
      byss
    WHERE
      meter = {{ dynamic_meterID }} -- filter per meter; without this the subquery returns the global max
  )
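
For illustration, a hedged sketch of turning that result into the startDate URL parameter (assumes getLastDate returns a date_time column, as above):

// Format the last-data timestamp as YYYYMMDD, matching
// ...readingsAfter?startDate=20241020... in the URL.
const startDate = moment(getLastDate.data.date_time[0]).tz('UTC').format('YYYYMMDD');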

I am guessing most of this needs to be built into Workflow, which is fine, especially if that can then export to CSV automatically. That would certainly bump the value of Retool for us.

Cheers
Dave

Hi @Footsore

Thank you for providing all the details of the use case :raised_hands: These details are incredibly useful for helping us to help you. Very impressive work as well Dave!

If you have these SQL queries and JS scripts built out already in an app, you should be able to reference them from the query drop-down menu that I shared in the screenshot of my last post.


This will definitely save you time. You can also build this code into workflow code blocks/query blocks manually, but this would of course take more time. I am sure most of it could be copy/pasted, but I would recommend typing out the query name into the search bar at the top of a query block.

On your final point about exporting to CSV, I just did some testing and found another forum thread about how to automatically email out the CSV file containing the data using an SMTP workflow block. Check out this thread and the solution comment!

I do not believe it is currently possible to have a workflow download a CSV directly to your local machine/computer, but I will look into this. The next best option is to automate sending an email with the newly created CSV file attached.

The JS library you would use to generate the CSV from the data you have (Papa Parse) is already included in workflows.

You just need to call const csv = Papa.unparse(data); then pass csv to the SMTP email block and provide it in the correct format in the 'Attachments' area, something like this: [{ data: '{{btoa(code.data)}}', name: 'fileName.csv', contentType: 'csv' }]
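
For illustration, a minimal end-to-end sketch (the block name formatData and the file name are placeholders):

// In a workflow code block: turn an array of row objects into CSV text.
// Papa Parse is preloaded in workflows, so no import is needed.
const csv = Papa.unparse(formatData.data);
return csv;

// Then in the SMTP block's 'Attachments' field, referencing this block by name:
// [{ data: '{{ btoa(csvBlock.data) }}', name: 'meter_readings.csv', contentType: 'csv' }]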

Let me know if this works for you. I also love this use case and think our team would like to get in touch with you and your team to potentially do a case study on your setup for other similar utility companies to learn from!


Jack,

Awesome, this has been a massive help.

I couldn't find the App query from within Workflow, just the raw API query, which needs quite a bit added to it.

I have, however, managed to rebuild it in Workflow. Some changes were required, but I got there in the end, and improved some bits along the way. I now have two workflows.

Getting Data

  • Gets the list of meters and the last updated date for each meter from the database
  • Runs the API query in a loop to get data (calls must be at least 1.2 seconds apart)
  • Transforms the data
  • Upserts into the database

Sending Data

  • Gets data for the last 7 days
  • Calculates the difference between readings to give consumption
  • Transforms into the industry format in GMT
  • Sends an email with the .csv file attached

Getting data like a boss.

And it makes sense for me to separate the retrieve and the send. If I add other sources, I might want to mix meters from different sources in the send part, or send meters from the same source to different people.

(We get a lot of data via email, so PowerAutomate can strip the attachment, save the file, and do its normal thing.)

Massive thanks for your help.

Dave


So great to hear!

My pleasure, you did all the hard work :sweat_smile:

I just double-checked and it seems that the Query Library is the best option for building a query in one location and having access to it in both apps and workflows :saluting_face: