Why doesn't Retool support SQL transactions?

Not supporting SQL transactions opens the door to a lot of bugs that can easily be avoided otherwise. Plus, operations take more time as one needs to make several requests instead of just one.

Why does Retool support writing SQL directly but not transactions?

4 Likes

Let me take a stab at this; if I am incorrect or inaccurate, hopefully the Retool Gurus can correct me.

I am assuming this is because:

  1. Each query opens and closes a connection to the database and transactions cannot be held open across connections.

  2. The nature of parameterized queries does not allow multiple queries at a time. There is no reason to have transactions if you are only running a single query.

So I think the limitation lies with SQL itself in a distributed environment (which is where it almost always lives) and the database engine, rather than with Retool.

One way around this, which I have used myself, is to create a stored procedure that wraps all the queries in your transaction. This of course only works if you have the right access level to your database.
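
As a very rough sketch of the idea (Postgres syntax, made-up table names), everything inside the procedure body runs atomically within the single CALL:

CREATE OR REPLACE PROCEDURE transfer_stock(p_from INT, p_to INT, p_qty NUMERIC)
AS $$
BEGIN
    -- all three statements succeed or fail together within the one CALL
    UPDATE stock SET qty = qty - p_qty WHERE warehouse_id = p_from;
    UPDATE stock SET qty = qty + p_qty WHERE warehouse_id = p_to;
    INSERT INTO stock_log (from_warehouse, to_warehouse, qty) VALUES (p_from, p_to, p_qty);
END;
$$ LANGUAGE plpgsql;

-- then a single Retool query just does (component names are placeholders):
-- CALL transfer_stock({{ fromSelect.value }}, {{ toSelect.value }}, {{ qtyInput.value }});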

Now, if I am wrong about #1 above and a connection can be held open, then it may be theoretically possible for Retool to enable transactions.

I do believe this would be a mistake, however. My philosophy is that if you need multiple queries for a business process (which describes most scenarios where a transaction is required), they should be as "close to the metal" as possible. Retool adds layers of connections and abstractions between the client and the server; let's call it greater distance. This increases latency and reduces the performance and reliability of your query stack.

3 Likes

This is assuming you want transactions across multiple different retool queries. Sometimes you just want multiple SQL queries within a single transaction in a single retool query.

That should be possible without needing to write a stored procedure to handle it (which is what's currently required).

And also, why not allow a single transaction for, say, a Retool code query/JS block? That would be easier to implement within the scope of a single connection, and would cover most use cases.

1 Like

Hey All, just wanted to chime in. @bradlymathews has laid out a lot of the reasons here, thanks for your reply!

You can use transactions for multiple SQL queries in a single Retool query, as long as those are non-parameterized queries. In practice, I understand that those queries aren't really that common or helpful in a Retool app, but they do work.
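
For example (Postgres syntax, with illustrative table names), a single non-parameterized Retool query could contain something like:

BEGIN;
UPDATE stock SET qty = qty - 5 WHERE sku = 'ABC-123';
INSERT INTO stock_log (sku, delta) VALUES ('ABC-123', -5);
COMMIT;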

I spoke with an engineer on the team responsible for our SQL integrations, and it looks like support for transactions in a single Retool query with parameters may be possible, but isn't high on the roadmap right now. Support for transactions over multiple Retool queries is likely not possible. I did link this thread to that internal ticket, and can update this thread as I get any more information to share. Thank you!

1 Like

@nick.durcholz I'm pinning you here so you get notifications when there are any updates. :slightly_smiling_face:

1 Like

+1 for support for transactions in a single Retool query with parameters

2 Likes

Hi @ShayaM,

Thanks for your interest in this feature request. This is not something our team is currently prioritizing on our roadmap, but we appreciate feedback on the product and still log requests. If anything changes, a Retool employee will update this thread.

big sad face :frowning: -- I'm struggling with this now and don't want to turn prepared statements off.

I need to update 2 tables and insert records into another 3 as a single transaction that must fail as a whole if any one operation fails (think depleting stock, accounting entries, and movement logs).

Workflows naturally don't help, as the queries will run either in parallel or serially, leading to bugs, and the same goes for chaining queries via event handlers.

Such a 101 thing to do, we can't :frowning:

Will explore the stored proc route as I really need this to work properly.

1 Like

Update on my end -- stored procs do work, and you can pass a whole set of arguments from the result of a JS Query or transformer as a single JSONB argument to simplify the whole declaration and mapping process. I can now be at peace that my whole DB entry will execute as a single transaction and not leave the door open for the many bugs and unexpected behaviours that can definitely create headaches, especially for a solo dev like moi.... :slight_smile:

So I prepped my entry in a JS Query and then the result gets passed as a single argument into a call for the stored proc.

Within your stored proc you'd refer to keys as you would in any JSONB query, using the '->' and '->>' operators, but make 100% sure to cast to the correct data types within the proc itself, or otherwise it'll be prone to errors.

I defined the language as plpgsql and everything worked normally.
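
As a rough sketch of the pattern (table, column, and query names here are made up and simplified from my real ones):

CREATE OR REPLACE PROCEDURE proc_save_entry(p_data JSONB)
AS $$
BEGIN
    -- cast every key to the right type explicitly, or the proc will be prone to errors
    INSERT INTO movements (entry_ref, entry_ts, amount_kg)
    VALUES (p_data->>'entry_ref',
            (p_data->>'entry_ts')::TIMESTAMPTZ,
            (p_data->>'amount_kg')::NUMERIC);

    UPDATE stock
       SET qty_kg = qty_kg - (p_data->>'amount_kg')::NUMERIC
     WHERE box_id = (p_data->>'box_id')::INT;
    -- a single CALL runs atomically, so both statements succeed or fail together
END;
$$ LANGUAGE plpgsql;

-- the Retool SQL query then just passes the JS Query result as one JSONB argument:
-- CALL proc_save_entry({{ JSON.stringify(prepEntry.data) }}::JSONB);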

Given that transactions are often the last step before hitting the DB, I think it's a much better approach than creating a clunky workflow that could fail mid-execution or, worse, chaining SQL queries via event handlers.

Maybe a GUI could be built to simplify this process and make it a bit more user friendly for everyone on Retool.

Oh, and Cursor saved me a ton of time writing the repetitive bits, translating a JSON shape into a proper SQL query. :smiley: "AI" sometimes does rock.

4 Likes

@the_irav thank you for your message. I have the same problem. Could you please share an example?

Greetings

Georg

You mean an example of how to achieve this?

I can try to come up with something a bit simpler/smaller than my current queries, as they have become a bit too large/complex to post here while keeping everyone's sanity :sweat_smile:

1 Like

Yes, this would be wonderful. Thanks in advance.

In the meantime I gave it a go myself. Here is the code I implemented:

I need a transaction in a procedure to prevent data conflicts (two people editing the same row at the same time). If you edit the data, a modal dialog opens with a form. When you press the submit button, a MySQL query runs:

The tricky part was finding the right syntax. I suggest adding one MySQL resource with prepared statements for simple calls like an INSERT, and another without prepared statements for complicated stuff like what you see below.

SET @tableName = 'definition_keyword_pcb';
SET @rowId = '{{ keywordPage_table_keyword.selectedRow.id }}';
SET @updatedAtLast = '{{ keywordPage_table_keyword.selectedRow.updatedAt == '' ? null : keywordPage_table_keyword.selectedRow.updatedAt }}';
SET @data = '{{ JSON.stringify({..._.mapValues(keywordPage_form_editKeywordPcb.data, v => v === '' ? null : v), ...{"updatedById": global_var_appUser_userId.value, "updatedAt": new Date().toISOString().slice(0, 19).replace('T', ' ')} }) }}';

CALL eddy_db.update_table_withoutId(@tableName, @rowId, @updatedAtLast, @data, @success);

SELECT @success AS success, @tableName AS tableName, @rowId AS rowId, @data AS data;

Yes, I wrote a completely generic MySQL procedure to update a table with the data from the form, adding explicit key-values on top. I think the tableName and the rowId explain themselves. The updatedAtLast is the row's timestamp as of the last refresh; I compare it with the current value in the MySQL table inside the transaction procedure, see below:

CREATE DEFINER=`admin`@`%` PROCEDURE `update_table_withoutId`(IN pi_tableName VARCHAR(255), IN pi_rowId INT, IN pi_updatedAtLast DATETIME, IN pi_json JSON, OUT po_success BOOL)
procedure_block: BEGIN
  DECLARE v_sql TEXT DEFAULT '';
  DECLARE v_key VARCHAR(255);
  DECLARE v_value TEXT;
  DECLARE done INT DEFAULT 0;
  DECLARE v_json_keys JSON;
  DECLARE v_json_key_count INT;
  DECLARE idx INT DEFAULT 0;

  -- Handler for end of cursor (when no more data is found)
  DECLARE CONTINUE HANDLER FOR NOT FOUND SET done = 1;

  -- Security check: is JSON structure valid?
  IF JSON_VALID(pi_json) = 0 THEN 
      SIGNAL SQLSTATE '22000'
      SET MESSAGE_TEXT = 'Invalid JSON format';
  END IF;

  -- Start transaction
  START TRANSACTION;

  -- Check the current timestamp in the database
  SET @sql1 = CONCAT('SELECT updatedAt INTO @updatedAtLast FROM ', pi_tableName, ' WHERE id = ?');
  PREPARE stmt1 FROM @sql1;
  SET @id1 = pi_rowId;
  EXECUTE stmt1 USING @id1;
  DEALLOCATE PREPARE stmt1;
  
  -- If the timestamp does not match, abort
  IF pi_updatedAtLast <> @updatedAtLast THEN
      ROLLBACK;
      SET po_success = 0;
      SIGNAL SQLSTATE '45000'
      SET MESSAGE_TEXT = 'Update failed: timestamp mismatch';
      LEAVE procedure_block;
  END IF;

  -- Get JSON keys as JSON array
  SET v_json_keys = JSON_KEYS(pi_json);
  SET v_json_key_count = JSON_LENGTH(v_json_keys);  -- Number of JSON keys

  -- Initialize dynamic SQL query for update
  SET v_sql = CONCAT('UPDATE ', pi_tableName,' SET ');

  -- Iterate through the JSON array of keys
  key_value_loop: WHILE idx < v_json_key_count DO
    -- Get the current JSON key
    SET v_key = JSON_UNQUOTE(JSON_EXTRACT(v_json_keys, CONCAT('$[', idx, ']')));

    -- If the key is 'id', skip it
    IF v_key = 'id' THEN
        SET idx = idx + 1;
        ITERATE key_value_loop;
    END IF;

    -- Get the value for the current key from JSON
    SET v_value = JSON_UNQUOTE(JSON_EXTRACT(pi_json, CONCAT('$.', v_key)));

    -- Handle NULL values correctly
    IF v_value = 'null' THEN
        SET v_sql = CONCAT(v_sql, v_key, ' = NULL, ');
    ELSEIF v_value REGEXP '^-?[0-9]+(\\.[0-9]+)?$' THEN
        -- If it is a number (integer or decimal), store it without quotes
        SET v_sql = CONCAT(v_sql, v_key, ' = ', v_value, ', ');
    ELSE
        -- If it is a string, enclose it in quotes
        SET v_sql = CONCAT(v_sql, v_key, ' = ', QUOTE(v_value), ', ');
    END IF;

    -- Increment the index
    SET idx = idx + 1;
  END WHILE;

  -- Remove the last comma and space
  SET v_sql = LEFT(v_sql, LENGTH(v_sql) - 2);

  -- Add WHERE clause
  SET v_sql = CONCAT(v_sql, ' WHERE id = ?');

  -- Execute dynamic query
  SET @sql2 = v_sql;  -- Store the generated SQL statement in a new variable
  PREPARE stmt2 FROM @sql2;  -- Prepare the SQL statement
  SET @id2 = pi_rowId;
  EXECUTE stmt2 USING @id2;  -- Execute the prepared statement
  DEALLOCATE PREPARE stmt2;  -- Deallocate the prepared statement

  -- Successfully completed
  SET po_success = 1;
  COMMIT;

END

If the key 'id' is in the form data, I hide it from the update process. You could filter it out at the JavaScript level, but this way I filter it inside the procedure's comparison. I struggled a lot of hours with the right syntax. That was enough time spent.

If somebody changed the row in the meantime, the procedure runs into a timestamp mismatch error, and I handle that in the Retool app.

I know this example is not very basic, but it might be a blueprint for somebody who has the same problem.

Cheers Georg

1 Like

Funny, you caught me preparing an answer to this --

So, to describe this a bit in basic terms, I made this chart (at the bottom).

Essentially, from whatever GUI you have (a form, a table, or anything else), you create a JS Query that prepares your entry for the database.

Before you can submit it, you'll need to create whatever stored procedures are necessary.

In my case, the form in the picture (and chart) essentially hits 4 tables (sometimes 5). Given that a couple of them are very mundane like accounting entries and activity logs, I created a stored procedure for each.

This is how my stored proc looks for the accounting entry.

 CREATE OR REPLACE PROCEDURE 
    proc_NEW_ACCOUNTING_ENTRY(
        entry_reference TEXT, 
        entry_timestamp TIMESTAMPTZ,
        amount_kg NUMERIC, 
        amount_money NUMERIC, 
        debit_account TEXT, 
        credit_account TEXT, 
        action_done_by TEXT, 
        weighing_session_id NUMERIC DEFAULT NULL,
        purchase_order_id NUMERIC DEFAULT NULL,
        movement_id NUMERIC DEFAULT NULL,
        entry_commentary TEXT DEFAULT NULL)
  AS $$
BEGIN
  
 -- Insert accounting entries
    INSERT INTO f_powder_accounting (
        ws_id,
        po_id,
        mov_id,
        entry_ref,
        entry_ts,
        mtype,
        amount_kg,
        amount_money,
        account_id,
        done_by,
        is_valid,
        entry_comments
    )
    VALUES 
    -- Debit entry
    (
        weighing_session_id,
        purchase_order_id,
        movement_id,
        entry_reference,
        entry_timestamp,
        'debit',
        amount_kg,
        amount_money,
        debit_account,
        action_done_by,
        true,
        entry_commentary
    ),
    -- Credit entry
    (
        weighing_session_id,
        purchase_order_id,
        movement_id,
        entry_reference,
        entry_timestamp,
        'credit',
        -amount_kg,
        -amount_money,
        credit_account,
        action_done_by,
        true,
        entry_commentary
    );

    RAISE NOTICE 'All operations completed successfully';
    EXCEPTION WHEN OTHERS THEN
        -- If any error occurs, raise
    RAISE NOTICE 'Error occurred: %', SQLERRM;
    RAISE;
END;
$$
LANGUAGE plpgsql;

You can naturally pass info in as arguments or create variables as needed.

Given that I re-use this somewhat often, I can call this procedure from another procedure; a (very) simplified version of that looks like this:
(this contains errors, is missing variables, and a few things, but the idea is there)

CREATE OR REPLACE PROCEDURE proc_FORM_ENTRY(p_data JSONB)
  AS $$
DECLARE
    v_entry_ts TIMESTAMPTZ;
    v_debit_entry_id NUMERIC;
    v_credit_entry_id NUMERIC;
    v_pct_id TEXT;
    v_ct_qty INTEGER;
    v_is_new BOOLEAN;
    amt_money NUMERIC;

BEGIN
    v_entry_ts := (p_data->>'entry_ts')::TIMESTAMPTZ;
    amt_money := (p_data->>'amt_money')::NUMERIC;

    CALL proc_MOVEMENTS(p_data, amt_kg, amt_money);

    -- Accounting
    CALL proc_ACCOUNTING(entry_ref, entry_ts, amt_kg, amt_money, debit_acc, credit_acc, done_by);
    CALL proc_WAREHOUSE(entry_ref, entry_ts, box_id, amt_kg, amt_money, done_by);

    -- Check if new
    IF v_is_new THEN
        CALL proc_NEW_BOX(box_data::JSONB);
    END IF;

    RAISE NOTICE 'All operations completed successfully';
    EXCEPTION WHEN OTHERS THEN
        -- If any error occurs, raise
    RAISE NOTICE 'Error occurred: %', SQLERRM;
    RAISE;
END;
$$
LANGUAGE plpgsql;

Then in the Retool editor, you would call this "proc_FORM_ENTRY" procedure from a regular SQL query, which boils down to a single CALL along these lines (the JS query name below is just a placeholder for whatever yours is named):
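
CALL proc_FORM_ENTRY({{ JSON.stringify(formEntryPrep.data) }}::JSONB);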

And in turn, that SQL gets executed via an event handler after the initial "form entry prep" JS query runs.


It's a fair bit contrived, but it has worked quite well for me so far, and it seems to be the most solid way to guarantee a transaction goes through in full.

Given that you can call other procedures or functions from within procedures, I have been creating a few key procedures that sit right in front of the tables, and those get called from another procedure shaped like whatever transaction is needed. So when something only needs 2 or 3 of them, I just call those rather than writing a full procedure for each movement, which would mean more codebase to be maintained and serviced. (At the cost of breaking more things if the shape is changed, naturally.)

Lastly, here's a diagram with the overview of all this.

Good luck and hope it helps!

2 Likes

I really appreciate the detailed back and forth, @the_irav and @bauergeorg! You've created some great documentation for the community. :+1:

To echo what others have shared previously, it's not architecturally feasible to execute multiple distinct queries as a single transaction. The recommended approach, as modeled above, is to define and CALL a stored procedure. I'm happy to put together a dedicated guide if anybody is interested.

If you have any other feedback, don't hesitate to reach out here!

2 Likes