When I click the link “streaming results to datadog”, the page does not exist. Does anyone know how I can get the run history from my workflow if the data exceeds the 20 MB limit? I am running the cloud version. What's weird here is that the report clearly states my current metadata size is less than the limit, so I'm really hoping we are not about to be forced into a higher-tier package just to be able to use the observability options to get at this indispensable reporting.
In my own testing today on the latest Cloud release (v3.320.0), I generated a workflow with some basic JavaScript in it to create an object. I was able to tweak the script so the object could be over or under 20 MB pretty easily.
I just wanted to check in today to see how things are going and whether this is something you can still reproduce. If so, let me know if there is maybe something unique about your setup. Here is a screenshot of what I did for reference.
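In text form, a minimal sketch of that kind of test block might look like the following (the `sizeMb` knob and the string-chunk approach are illustrative, not necessarily the exact code in the screenshot). It assumes it is running inside a workflow JavaScript block, where a top-level `return` sets the block's output:

```js
// Build an object whose serialized size is roughly `sizeMb` megabytes.
// Nudge sizeMb up or down to land the output over or under the 20 MB limit.
const sizeMb = 10;
const oneMbChunk = 'x'.repeat(1024 * 1024); // ~1 MB of ASCII text
const payload = {
  generatedAt: new Date().toISOString(),
  chunks: Array.from({ length: sizeMb }, () => oneMbChunk),
};
return payload; // becomes the block's output in the run history
```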
Hi @John_V. I am getting the same results as you when I run your test script. The workflows I experience this in are large, with lots of blocks: some blocks do simple database storage, while others retrieve file data from a resource and then store that data to our file system. All the files are generally small.
For my example above, does it still reproduce the error when you swap the math problem to 9 instead of 10? That should generate 9 MB of data, which then gets doubled for the whole workflow when it's returned. I still needed to generate more than 20 MB to get it to fail, which is expected.
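In terms of the hypothetical sketch above, that swap is just the one constant:

```js
// ~9 MB of block output; the whole-workflow return doubles it to ~18 MB,
// which stays under the 20 MB limit (10 doubles to ~20 MB and trips it).
const sizeMb = 9;
```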
I saw in yours and the other community post that it shows 17 MB or other values below the 20 MB limit. Do you have any particular steps to reproduce this behavior? I can use those to file the bug, as I can't reproduce it just yet.
I am going to mark this community post as waiting for the bug to be fixed, since we are aware of the issue now and Dev is handling it. I will keep you posted in the other community thread on this topic that you are currently active in.