I'm using an S3 action to upload a file.
I'm getting the error "The workflow run exceeded the memory limit of around 1GB".
This is an image of the workflow:
I'm reading a file from S3, decoding it from base64, and uploading it again.
The file is 24MB (it worked for a smaller 2MB file).
Hey @noamovich - thanks for reaching out. Unfortunately, there's not a lot that can be done to specifically circumvent the memory limit that you're seeing, unless you choose to self-host a Retool instance.
One thing that might help lower your overall RAM usage, though, is to use a JS block instead of a Python block for decoding the file. Give that a shot and let me know if it helps! If not, we can explore other options.