Setting Types on Uploaded Data for Bulk Import to BigQuery

Hi
I'm working on a bulk update for BigQuery.
I have a button to upload and parse a CSV.
The results populate a table.
Using the 'table1' widget, I manually set the column types to what I need.
When I run the upload, it is rejected with the error 'type is string and not BIGNUMERIC'.

But if I make the columns editable, change all the numeric values slightly, and switch the bulk import from {{ table1.data }} to {{ table1.recordUpdates }}, everything works.

So it seems {{ table1.data }} is not respecting the column types I set in the 'table1' widget.
Is there a way to refresh the table data so it picks up the types?
I checked the various {{ table1.xxxxx }} properties that return an array, with no success. And if I don't make an update to each row, {{ table1.recordUpdates }} won't contain all the rows.
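A quick way to confirm what the query is actually receiving (a sketch; 'id' stands in for one of my numeric columns):

{{ typeof table1.data[0].id }}

This still returns 'string' even after changing the column type in the table.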

Hey @James_76 :wave: Changing the column type in the table doesn't actually reflect back in the table1.data property :pensive: I think your best bet is to do this using JS, actually!

Something like:

{{ table1.data.map(row => ({...row, id: parseFloat(row.id)})) }}
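And if several columns need converting, the same pattern extends (column names here are just examples):

{{ table1.data.map(row => ({ ...row, id: parseFloat(row.id), amount: parseFloat(row.amount) })) }}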

Would that work for you?

Hi!
Thanks for the reply.
Where would I do the mapping?
As a transformer in the insert bulk records query, or somewhere in table1?
If it's in table1, will the reference name change?
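One possible placement, as a sketch: a standalone transformer (hypothetical name convertTypes) that holds the mapping and leaves table1 untouched:

// transformer: convertTypes (hypothetical name)
return table1.data.map(row => ({ ...row, id: parseFloat(row.id) }));

The insert bulk records query would then reference {{ convertTypes.value }} instead of {{ table1.data }}, so no existing reference names change.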

Are there any tutorials on setting data types?

Found my solution in Retool Fundamentals lesson 5 :grinning:
But I'm using BigQuery and trying to push nulls, so now I'm facing this issue:


:pensive: :pensive: darn. Let me check in on that request again!
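In the meantime, if the nulls are coming out of the CSV parse as empty strings, a mapping like this might tide you over (a sketch; 'amount' is a hypothetical column name):

{{ table1.data.map(row => ({ ...row, amount: row.amount === '' ? null : parseFloat(row.amount) })) }}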

So the mapping does not work consistently: it only works if the result is an integer. If the value has decimals, it will not convert. I tried slicing the string to cap the decimals, and that fails too. Interestingly, if I slice(0, 1) it does work, because the result is an int.
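To illustrate, with made-up values:

parseFloat('41.25')             // 41.25 (a float): rejected as BIGNUMERIC
parseFloat('41.25'.slice(0, 1)) // 4 (an integer): accepted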

I tried importing the BigNumber and BigDecimal libraries, but could not get either to work.
Is there any way to get this package loaded into Retool? It would give me access to the BIGNUMERIC type :pray:
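For anyone following along, once bignumber.js is loaded (e.g. as a preloaded library for the app), the mapping might look like this (a sketch; 'amount' is a hypothetical column, and whether the BigQuery resource accepts the string bignumber.js produces is untested):

{{ table1.data.map(row => ({ ...row, amount: new BigNumber(row.amount).toFixed() })) }}

Calling toFixed() with no argument returns the full-precision value as a decimal string, which avoids the float rounding that parseFloat introduces.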