I have a process that starts when Count.txt is found in a specific folder. It then compares the number in the Count.txt file to the number of files in another folder. If the numbers are not equal, it sends the Count.txt file back to the original folder.
I have this process set up with a 20-second polling interval and have the Self Replicating Process box unchecked (screenshot below). The problem is that this process seems to be running every second rather than at the 20-second polling interval I have set. This is causing my log file to become much larger than needed and is using additional system resources. Why is this process polling faster than every 20 seconds, and is there a workaround to correct this issue?
I have searched the forums with no success. Another post mentioned turning off the Messenger service and then turning it back on, but that did not resolve the issue.
It’s because you’ve started the process with a Folder Capture, which acts as a loop on its own.
So on the first run, we've waited our 20 seconds and picked up the count file. We do our stuff and drop the file back in the input folder, finishing the job… or not quite. Because Folder Capture is a self-contained loop, the process automatically goes back and checks for more files that meet the capture criteria, and immediately runs again.
There are lots of ways you could deal with this, but one very simple way would be to introduce a wait timer into the process itself. You could do this with the Watch.Sleep method; more on that here: PlanetPress Workflow 2018.2 User Guide
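If it helps, here's a rough sketch of what that could look like in a Run Script task placed at the end of the process (JavaScript shown; I'm assuming Watch.Sleep takes a duration in milliseconds and that Watch.Log takes a numeric message level, so double-check both against your version of the user guide):

    // Pause before the process loops back to the Folder Capture,
    // effectively restoring a 20-second gap between passes.
    var delayMs = 20000; // assumed to be milliseconds, i.e. 20 seconds
    Watch.Log("Waiting " + delayMs + " ms before checking for Count.txt again", 3);
    Watch.Sleep(delayMs);

That way, even though Folder Capture loops back on itself, each pass is at least 20 seconds apart.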
Alternatively, remove the Folder Capture and replace the first action with a Create File task. Then, to pick up the file when you need it, rather than using a Folder Capture, simply use a Load External File task. Neither of these tasks loops, so when they're done, they're truly done and the polling interval is respected. Load External File also does not delete the file from disk, so you wouldn't need to worry about putting it back in its original location and potentially deleting a newer count file in the process.
Each file captured by the input is sent down through the process, one at a time. When a file is finished, the process goes back to the input, which feeds another file down, as long as there are files in the queue. Once all the files are gone, the task polls the input folder again to see if new files are present and, if so, the process continues with those files. Otherwise, the process ends.
My guess is that this process runs so fast that the file is back in the original folder before the process has finished, therefore triggering the process again. You've just made an infinite loop…