We have set up a watch folder on the network to print statements. However, currently we can’t always get the statements to print in ascending order. We do have Self-Replicating turned off. How do we make the workflow delay long enough to give the system time to get all of the files, then sort and print?
We increased the polling interval, and that seemed to let the files reach the location first, so a lot more of them printed in order. However, if we set it to 60 seconds instead of 4, how do you know what part of the 60 seconds you’re going to hit when you send the files?
We want to send a large number of PDF statements to this folder and consistently get the correct order.
Hey Beth,
If you don’t want your process to capture files as soon as they land in the capture folder, then you’ll need to either:
a) Implement a schedule that your process will run on. So let’s say you schedule the process to run at 10am: as long as all the data files are in the capture folder by then, you should be good. (There’s a tiny sketch of this idea right after these options.)
or
b) Have the initial input for your process look for a “trigger” file, which someone places manually after all the files are in the capture folder. Once the process sees the trigger file, it will then do a second folder capture on the folder containing your variable data.
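Just to make option (a) concrete, here’s a rough plain-Python sketch of the “wait until a scheduled time, then grab everything in one sorted pass” idea. This isn’t PlanetPress code, and the folder path and 10am time are just placeholders:

    import glob
    import os
    import time
    from datetime import datetime, timedelta

    CAPTURE_DIR = r"\\server\statements"  # placeholder path for the capture folder

    def wait_until(hour, minute=0):
        # Sleep until the next occurrence of the scheduled time (e.g. 10am).
        now = datetime.now()
        target = now.replace(hour=hour, minute=minute, second=0, microsecond=0)
        if target <= now:
            target += timedelta(days=1)
        time.sleep((target - now).total_seconds())

    wait_until(10)  # nothing is picked up before the scheduled run time
    # By 10am all the data files are assumed to be in place, so a single
    # pass picks everything up, sorted ascending by file name.
    for path in sorted(glob.glob(os.path.join(CAPTURE_DIR, "*.pdf"))):
        print("printing", path)  # stand-in for the actual print step

The point is simply that nothing gets captured before the scheduled time, so it no longer matters when during the polling interval the files arrived.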
Since you have an open ticket with me, I’ll call you and show you how to implement these over a remote session.
What about the Workflow controller software on a desktop? My boss read something about that. Does that give a user the ability to set a process active or inactive?
By “Workflow controller software” I’m assuming you’re referring to the PlanetPress Workflow Configuration.
You can toggle the active state of your process that way, sure, but it would be a manual procedure every time you wanted to run the job. Your process would be inactive by default; once all the files have been placed in the capture folder, you would set the process to active and send your workflow configuration. Then, once your job completes, you’d have to go back into the Workflow Configuration and set it back to inactive so that it doesn’t affect your next job.
In my opinion, if you’re willing to do a bit of manual work every time you run this job in order to make sure that all the data is ready before it starts processing, then the trigger file solution is the optimal one here.
Here’s how it would work:
- Your current folder capture input would become the second task of your process. This folder capture will be what picks up your variable data.
- The first input task would be a new folder capture (either on the same folder or a separate one) that looks for a file called “start.trigger”. The file name and extension can be whatever you want; it just has to be distinct from your variable data if you’re capturing from the same folder. Use the “file mask” field inside the folder capture so it looks only for your trigger file. (There’s a rough sketch of this flow after the list.)
- You’ll have to create the trigger file using a text editor like Notepad. Open Notepad, type “start”, and hit Enter to include a CRLF (new-line character). Then save the file as “start.trigger”. I’d recommend keeping a copy of it somewhere so you can simply copy and reuse it every time you want to run the job. (The short snippet after the list does the same thing in code.)
- Once all the variable data has been placed in the capture folder and you’re ready to start the job, make a copy of your trigger file and drop it into whatever folder your initial input task is monitoring. It will meet the file mask condition and start your process.
- Your process will then begin to capture and process your variable data in whatever order you have set, using the second folder capture task in your process.
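To make that flow concrete, here’s a small plain-Python mock-up of the same two-stage capture. Again, this isn’t PlanetPress code; the folder path, the “*.trigger” mask, and the sort-by-file-name assumption are just stand-ins for whatever you actually configure:

    import fnmatch
    import glob
    import os
    import time

    WATCH_DIR = r"\\server\statements"  # placeholder; the folder being monitored
    TRIGGER_MASK = "*.trigger"          # the "file mask" from the first folder capture

    def find_trigger(folder, mask):
        # First "folder capture": return a file matching the trigger mask, if any.
        for name in os.listdir(folder):
            if fnmatch.fnmatch(name, mask):
                return os.path.join(folder, name)
        return None

    while True:
        trigger = find_trigger(WATCH_DIR, TRIGGER_MASK)
        if trigger:
            break
        time.sleep(5)  # polling; only the trigger file is being watched for

    # Second "folder capture": all the variable data is already in place, so one
    # sorted pass over it gives a deterministic, ascending processing order.
    for pdf in sorted(glob.glob(os.path.join(WATCH_DIR, "*.pdf"))):
        print("processing", pdf)  # stand-in for the actual processing/print step

    os.remove(trigger)  # clean up so the next run needs a fresh trigger file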
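And if you’d rather script the trigger file than build it in Notepad, this writes the same “start” plus CRLF content described above (the file name is just the one from this example):

    # Create the trigger file: "start" followed by a CRLF (new-line) character.
    with open("start.trigger", "wb") as f:
        f.write(b"start\r\n")

Dropping a copy of that file into the watched folder is then all it takes to kick off the process.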