Out of memory - how to work around it - Workflow

Hi everyone,

I have read many posts here about Workflow being a 32-bit app and the out-of-memory issues that come with processing big jobs.

I have a process that gets triggered via a Capture folder. I thought that if I could somehow split the process via branching, I could work around the limitation. For example:

The first file dropped into the folder runs the 1-25% portion of the process and skips all the other branches until the end. At the end it drops another file into the Capture folder, but this time the process skips ahead and runs the 25-50% portion, then drops another file, and so forth.

This approach doesn't work and I'm still getting the out-of-memory error.

Is there any other way to work around this, so the process is not interrupted and finishes by itself without intervention? I find that if I drop the trigger files one at a time, it works. I suppose it gives Workflow time to free up memory?
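To make that concrete, something along these lines is what I imagine I'd have to script to feed the batches in one at a time (a rough sketch only; the paths, batch size and delay are placeholders, not my actual setup):

```csharp
// Rough sketch of a feeder that splits a large input into smaller batch
// files and drops them into the Capture folder one at a time, so each
// batch is picked up as its own job. Paths, batch size and the delay
// are placeholders, not values from any real configuration.
using System;
using System.IO;
using System.Linq;
using System.Threading;

class BatchFeeder
{
    static void Main()
    {
        const string inputFile = @"C:\Data\client_raw.txt";   // hypothetical
        const string captureFolder = @"C:\Workflow\Capture";  // hypothetical
        const int batchSize = 5000;                            // records per drop

        var lines = File.ReadLines(inputFile).ToList();
        int batch = 0;

        for (int start = 0; start < lines.Count; start += batchSize)
        {
            string batchFile = Path.Combine(captureFolder, $"batch_{batch++:D3}.txt");
            File.WriteAllLines(batchFile, lines.Skip(start).Take(batchSize));

            // Give Workflow time to pick up and finish the previous batch
            // (and release memory) before the next file appears.
            Thread.Sleep(TimeSpan.FromMinutes(5));
        }
    }
}
```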

Thanks in advance.

There is unfortunately too little information here to provide any useful tips. You'd have to tell us what kind of data files you are processing (type of data, average size, etc.), and you'd have to explain what kind of process each file goes through. Workflow can easily handle thousands of files, but then again it could very well choke on just a few if the process is extremely complex.

I have found over the years that in most cases where memory becomes a problem, it’s because the processes have become very long and convoluted over time. It’s almost always better to have shorter, targeted processes. But there’s no magical wand to convert a Workflow configuration. The processes have to be analyzed and possibly redesigned.

I don’t think this is something we can do on this forum as it would probably require you to share sensitive information. You might want to contact our Support team instead.

Hi Phil,

Thanks for your response.

I have already contacted Support prior to this and passed on the relevant info.

What I'm looking for from this post is collective/general knowledge about efficiency, so that Workflow doesn't hit the ceiling.

The process in question does have very intricate requirements:

  • Record count: 65,000
  • It needs to process a RAW (dirty) client file; convert/manipulate it; extract records to separate channels (digital and print); group all items into DLX, C4 and Box sizes, taking inserts into account; sort them through postal software; report on each channel and bundle for production and client purposes; and split the final output into individual PDFs for the digital channels.

This is the summarized version of the process. An important note: the tasks in the process mostly run external programs; the only Connect tasks are PDF generation and the data mapping configuration.
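Conceptually, the envelope grouping step handled outside of Workflow is along these lines (purely a hypothetical sketch; the MailPiece type, page-count thresholds and classification rule are made up for illustration, not the real production rules):

```csharp
// Hypothetical sketch of grouping mail-piece records into packaging
// categories (DLX, C4, Box) by effective page count, with inserts counted in.
using System;
using System.Collections.Generic;
using System.Linq;

record MailPiece(string Id, int PageCount, bool HasInsert);

class Grouping
{
    // Assumed classification rule: thin pieces fit a DLX envelope,
    // medium pieces a C4 envelope, everything else goes in a box.
    static string Classify(MailPiece p)
    {
        int effectivePages = p.PageCount + (p.HasInsert ? 2 : 0);
        if (effectivePages <= 6) return "DLX";
        if (effectivePages <= 60) return "C4";
        return "Box";
    }

    static void Main()
    {
        var pieces = new List<MailPiece>
        {
            new("A001", 4, false),
            new("A002", 12, true),
            new("A003", 80, false),
        };

        // Group the pieces into the three production streams and report counts.
        foreach (var group in pieces.GroupBy(Classify))
        {
            Console.WriteLine($"{group.Key}: {group.Count()} pieces");
        }
    }
}
```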

I have surmised that this should be an efficient process, as I'm handing off the bulk of the data processing, file routing and report generation to an external program (a .NET program), but without concrete evidence I can't be sure.

For example, the Merge task in Workflow took 1.5 hours to finish, whereas when I let an external program do it, it only took 10 minutes. So I have replaced that task with the external program.
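For reference, the external merge is essentially just streaming each input file into a single output, something like this (a minimal sketch; the command-line arguments and line-based record format are assumptions, not the actual program):

```csharp
// Minimal sketch of an external merge utility: concatenate several record
// files into one output file, streaming line by line so memory stays flat.
using System;
using System.IO;

class MergeRecords
{
    static void Main(string[] args)
    {
        // args: output file followed by one or more input record files (assumed).
        string outputPath = args[0];

        using (var output = new StreamWriter(outputPath))
        {
            for (int i = 1; i < args.Length; i++)
            {
                // Read each input lazily instead of loading it whole.
                foreach (string line in File.ReadLines(args[i]))
                {
                    output.WriteLine(line);
                }
            }
        }

        Console.WriteLine($"Merged {args.Length - 1} files into {outputPath}");
    }
}
```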

These are the sorts of things I'm looking for. I thought splitting the process into sections via the Capture folder would work, but from the log it seems only one process has run. Meaning that having a Capture folder at the beginning of the process as a trigger still constitutes a single process as more files are picked up.

Hope I'm not talking nonsense; if I am, I apologize. Any insight into this is appreciated.