Runtime error: exited workflow with code 139

While debugging something I got this error message in the workflow logs:

Runtime error: exited workflow with code 139

Googling it I saw:

"When a container exits with status code 139, it's because it received a SIGSEGV signal. The operating system terminated the container's process to guard against a memory integrity violation. It's important to investigate what's causing the segmentation errors if your containers are terminating with code 139."

As I had been editing a loop over an array, I figured I must have screwed up, so I wrote some guard code around the subscript on the last iteration and tried again without thinking much of it. My next test worked, so I figured that was it.
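
Roughly, the guard looked like this (a simplified sketch; the block and variable names here are made up):

```js
// Simplified sketch of the guard (block and variable names are made up).
const rows = query1.data || [];

for (let i = 0; i < rows.length; i++) {
  const next = rows[i + 1]; // undefined on the last iteration
  if (next === undefined) {
    break; // stop instead of reading past the end of the array
  }
  // ... compare rows[i] against next ...
}
```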

Later on I was looking at the logs for a different reason and saw the error again, this time in a different location. However, that code also runs to completion and otherwise works.

I have no idea now whether this message has been showing up for some time and I've just been ignoring it. The run history shows a red X for everything... but nothing unusual about that. Click into a run and you see that the webhookreturn block is green. Again, nothing too unusual about these conflicting signals...

Searching https://community.retool.com/ I don't find anything else reporting this error.

I also see that the IDE "expand" button for editing JavaScript in a bigger window has disappeared, so I suppose a new build got pushed...

I'm having the same issue.

Our team is investigating these errors! If anyone can share the specific day/time that the "exited workflow with code 139" error came in, that would be helpful.

Here's one! Hope that helps!

Thank you! Shared this with the team that is working on this issue :pray:

Hello,
I'm seeing this error as well. Has there been any update on how to avoid it?

I am attempting to read a ~3 MB CSV file before doing an update operation to a Retool DB.
The workflow works with no issues on a smaller test file.
Ultimately I need to be able to run with much larger files (> 30 MB).

Please advise. Thanks. Mike

Hi @Mike_Simmons thanks for checking in! It looks like our team is still investigating this error :disappointed:, but I'll add an internal update noting that we have another example of it.

I'll reach back out here with any updates that I get internally.

Hi @Tess, I believe I'm having a similar issue here: Workflow - Runtime error: exited workflow with code 139. Any workarounds or updates on this one?

Hi @Kovon, thanks for checking in! Unfortunately, I don't have any news on a fix yet :disappointed:

Since you mentioned it is working for smaller sets of data, could you loop through the data in smaller batches?
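
Something like this, as a rough sketch; the block names are made up and the batch size would need tuning to your data:

```js
// Rough sketch of processing the data in smaller batches (block names are made up).
const rows = getRows.data || [];
const BATCH_SIZE = 500; // tune to whatever stays under the memory limit
const results = [];

for (let start = 0; start < rows.length; start += BATCH_SIZE) {
  const batch = rows.slice(start, start + BATCH_SIZE);
  // Do the per-batch work here, so the whole dataset is never held/transformed at once.
  results.push(batch.length);
}

return results;
```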

I programmed an error into my JavaScript Code Block in a Workflow. That logic error of mine caused an infinite loop, and this code 139 error message. When I fixed my logic and the infinite loop, everything went back to functioning correctly.
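
For illustration, the shape of the mistake was something like this (not the exact code):

```js
// Illustrative only: a loop whose index never advances keeps allocating
// until the run hits the memory limit and dies with code 139.
const items = ["a", "b", "c"];
const out = [];

// Buggy version, commented out so this snippet terminates:
// let i = 0;
// while (i < items.length) {
//   out.push(items[i].toUpperCase()); // i is never incremented -> infinite loop
// }

// Fixed version: the index advances, so the loop ends.
for (let i = 0; i < items.length; i++) {
  out.push(items[i].toUpperCase());
}

return out;
```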

Thanks for reporting this! Will update if I hear anything further internally, but I'm glad you were able to move forward.

Any update on this one? I started getting the same error just today, and I didn't change anything in the workflow.

Anyone ever get this resolved? As of today I'm seeing this on workflows that previously worked.

Hi there,

Thanks for checking in! We haven't shipped a fix for this error yet, but we are still tracking reports. It indicates a memory limit is getting breached. Has the amount of data being processed in the workflow changed? Any chance you could try adding limits or splitting some of the work into a separate workflow?
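
For example, something along these lines, assuming fetch is available in the Code block and the second workflow has a webhook trigger; the names and URL here are placeholders:

```js
// Sketch of handing half the rows to a second workflow (names and URL are placeholders).
// The second workflow would be set up with a webhook trigger that accepts { rows: [...] }.
const OTHER_WORKFLOW_WEBHOOK_URL = "https://example.com/your-other-workflow-webhook";

const rows = getRows.data || [];
const half = Math.ceil(rows.length / 2);

// Send the second half elsewhere and only process the first half in this run.
await fetch(OTHER_WORKFLOW_WEBHOOK_URL, {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({ rows: rows.slice(half) }),
});

return rows.slice(0, half);
```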

Have the same issue, trying to upload a 50 MB file to Retool Storage after downloading it from a remote URL.

Interestingly, the download works, but then if I try to upload either to Retool Storage or to our own S3 bucket, we get error 139.

If I split the download workflow and the upload workflow, then at least I do not get error 139, but instead: "The workflow run exceeded the memory limit of around 1GB".

Hmm, thanks for chiming in @whynot! Does the upload work in an app query? It may be a more general file limit :thinking:

I've been having issues with this as well. I've had the same workflow running daily since September.

Every few weeks I've needed to reduce the number of rows used in the workflow. I started with 25-row runs, which worked fine for the first week, then it was 10, then 6, and now I am down to 4 rows of data per run to get the workflow to complete successfully. The size of the data might vary by +/-10-20% daily.

Also seeing this issue with one of our several workflows, the one with the largest dataset. See the screenshot. Note that the code we use definitely runs successfully with fewer rows.

I just had this issue when using Retool to move data between a BigQuery and a Postgres instance. I believe it is due to the amount of data processed. The last successful run before the workflow failed processed 32.4 MB of data. I added a date filter to the query to only process recently updated records, and that worked while processing 5.9 MB of data.
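
In case it helps anyone, the pattern was roughly this; the block name is made up, and the actual filter lives in the BigQuery query:

```js
// Sketch of the incremental approach (the block name "cutoff" is made up).
// Return a cutoff date from a Code block, then reference it in the BigQuery query,
// e.g. WHERE updated_at >= {{ cutoff.data }}, instead of pulling the whole table.
const DAYS_BACK = 7;
const cutoffDate = new Date(Date.now() - DAYS_BACK * 24 * 60 * 60 * 1000);
return cutoffDate.toISOString();
```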

Thanks for sharing more examples of how this error is coming up! The error indicates that the run hit a memory limit. We are still tracking instances of the error internally, but we recommend either limiting the amount of data processed or splitting some of the work into a separate workflow.