Some runs show detailed steps under "Blocks" -- for instance, the start trigger and subsequent steps -- and I can click into the inputs / data / JSON / logs, but others do not and only show logs.
Hi @ChiEn ,
Yes, I was (and still am) using an S3 resource. Here are the IDs for a run that errored:
workflowId: d989639e-34c6-4fa7-955e-88a460bf1683
runId: 0197f6e8-c927-76de-8bfe-dc28737380b3
I believe we were able to solve this issue by moving our S3 resource into a function within a workflow. However, I'm more curious about how I could see in the logs what exactly was erroring, or which block was having issues. Is this possible?
Thanks for sharing your run IDs @brpark and @npr! For performance reasons, we skip loading logs from S3 blocks when their total size exceeds 20MB. This keeps things running smoothly and avoids potential crashes on your side.
The UI team is working on making this behavior clearer by showing a message when log data is truncated, so it’s easier to understand what’s happening.
Let us know if there’s anything else we can do to help, and thanks again for understanding!