Dear community:
I am having a lot of trouble understanding the timing of JS queries. I have read that it may have to do with promises and async functions, but truthfully, I don't understand the documentation.
I believe what I'm trying to do is simple: run a JS query only after a certain block of code has finished running.
// User selects a product with a dropdown select
const productId = orderProductSelect.value
//Find the default cost of the product
const newCost = getProducts.data.find( prod => prod.id == productId ).default_cost
// update temporary state value for productId
orderItems.setIn([i,"productId"], productId)
// Set the default cost in the cost input
orderProductCost.setValue(newCost)
// Call the JS query that updates the product cost in temporary state,
// using the value just set on the orderProductCost input.
// This query should only run after the orderProductCost input has
// been updated, since it reads that value.
updateItemCost.trigger()
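I suspect the core issue is ordering: if setValue is asynchronous, trigger() can run before the input actually holds the new value. The pattern can be sketched outside Retool in plain JavaScript — the mock input and query below are hypothetical stand-ins, assuming setValue resolves a promise once the value has landed (check your platform's docs for whether its setValue and trigger return promises):

```javascript
// Mock input component: setValue resolves asynchronously, like a UI update.
function makeInput() {
  return {
    value: null,
    setValue(v) {
      // Simulate the value landing on a later tick.
      return new Promise(resolve => setTimeout(() => {
        this.value = v;
        resolve();
      }, 0));
    },
  };
}

const orderProductCost = makeInput();
const costLog = [];

// Stand-in for the updateItemCost JS query: it reads the input's current value.
const updateItemCost = {
  trigger() {
    costLog.push(orderProductCost.value);
    return Promise.resolve();
  },
};

async function selectProduct(newCost) {
  // Without this await, trigger() would read the stale (old) value.
  await orderProductCost.setValue(newCost);
  await updateItemCost.trigger();
}

selectProduct(42).then(() => console.log(costLog[0])); // logs 42
```

Dropping the first `await` reproduces the symptom described above: the query fires while the input still holds its previous value.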
This is the updateItemCost code. It works when I edit the input manually, but not when it is triggered from the other query. It seems that it fires before the input has actually been updated.
orderItems.setIn(
[i,"productCost"],
orderProductCost.value
)
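One way to sidestep the stale-read entirely is to pass the new cost to the query explicitly instead of having the query read the input component's current value. Below is a minimal plain-JavaScript sketch of that idea (the names and the scope-object argument are hypothetical; in Retool itself, trigger accepts an additionalScope option for this kind of parameter passing):

```javascript
// Mock temporary-state object with a setIn like the one used above.
const orderItems = {
  data: [{}, {}],
  setIn(path, value) {
    const [i, key] = path;
    this.data[i][key] = value;
    return Promise.resolve();
  },
};

// Hypothetical version of updateItemCost: the cost arrives as an
// argument, so it cannot depend on whether the input updated in time.
async function updateItemCost({ i, productCost }) {
  await orderItems.setIn([i, "productCost"], productCost);
}

async function demo() {
  await updateItemCost({ i: 1, productCost: 9.5 });
  console.log(orderItems.data[1].productCost); // 9.5
}
demo();
```

The design point is that the query's input travels with the call rather than through shared component state, so ordering between setValue and trigger stops mattering.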
I have given up on doing it this way and just copied the code from the other query. It works this way, but there are other parts of the project in which I will need to do something very similar to this, for example:
- I create an order
- This order has a detail, to be written on another table
- Both are separate queries. I first need to trigger the order query, so I can use the ID of the new order in the items detail query.
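That order-then-detail case is the same sequencing problem, and await handles it if each query's trigger returns a promise that resolves with the query result. Here is a hedged sketch with mock queries (createOrder, createOrderDetail, and the shape of their results are assumptions for illustration, not real API):

```javascript
// Mock queries: createOrder resolves with the new order's id;
// createOrderDetail needs that id to link the detail rows.
let nextId = 100;
const savedDetails = [];

const createOrder = {
  trigger() {
    // Simulate an insert that returns the generated id asynchronously.
    return new Promise(resolve => setTimeout(() => resolve({ id: nextId++ }), 0));
  },
};

const createOrderDetail = {
  trigger({ orderId, items }) {
    savedDetails.push({ orderId, items });
    return Promise.resolve();
  },
};

async function saveOrder(items) {
  // Wait for the order insert so the new id exists before inserting details.
  const order = await createOrder.trigger();
  await createOrderDetail.trigger({ orderId: order.id, items });
  return order.id;
}

saveOrder(["widget"]).then(id => console.log(id)); // logs 100 on first call
```

The same shape works with .then chaining if async functions aren't available, but await keeps the two dependent steps readable in order.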