That behavior is normal; I have seen the OL devs explain it a few times. The way around it is to have your client upload all of the data files first, then upload a text file called trigger.txt (which can be empty, but if you are using a Folder Capture it must then be set to “Include empty files”). You then set your process’s FTP Input mask to trigger.txt. Once Workflow sees that file it grabs it, and you can then use another FTP Input task to fetch the files you actually want.
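On the client side, the trigger-file convention is just “data first, trigger last.” A minimal sketch in Python using the standard `ftplib` module; the helper name `upload_with_trigger`, the file paths, and the trigger name are illustrative assumptions, not part of any OL tooling:

```python
import io
import os
from ftplib import FTP  # standard library (S)FTP would use paramiko instead

def upload_with_trigger(ftp, local_paths, trigger_name="trigger.txt"):
    """Upload every data file first, then an empty trigger file last.

    The Workflow process watches only for `trigger_name`, so it cannot
    fire until all data files have finished uploading.
    """
    for path in local_paths:
        with open(path, "rb") as fh:
            ftp.storbinary(f"STOR {os.path.basename(path)}", fh)
    # Empty trigger file -- on a Folder Capture this requires the
    # "Include empty files" option mentioned above.
    ftp.storbinary(f"STOR {trigger_name}", io.BytesIO(b""))
```

Called with a connected `FTP` object and a list of local paths, the trigger is always the final `STOR`, which is the whole point of the convention.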
An alternative to sharne’s advice would be to have your FTP process run on a fixed schedule. For example, your client has until 10am to drop their files for the day; at 10:30 (to give you a little leeway) your process runs and picks up the files. As long as it is clear to your client that they must not drop files after the cutoff, you shouldn’t run into any issues.
There are at least two more options I can think of, besides those already proposed.
The uploader could upload the file under a name you are not watching for (for example “data.csv.writing”) and rename it after the upload finishes (to “data.csv”, which you are watching for). This is fairly easy to do with almost every (S)FTP client.
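The upload-then-rename trick can be sketched like this (again with Python’s standard `ftplib`; the helper name, temporary suffix, and file names are assumptions for illustration). On most servers the rename is effectively atomic, so the watched name only ever appears once the file is complete:

```python
import os
from ftplib import FTP

def upload_then_rename(ftp, local_path, suffix=".writing"):
    """Upload under a temporary name the watcher ignores, then rename.

    While the transfer is in progress the file is e.g. "data.csv.writing",
    which does not match the input mask; the final rename makes the
    complete file appear as "data.csv" in one step.
    """
    final = os.path.basename(local_path)
    temp = final + suffix              # e.g. "data.csv.writing"
    with open(local_path, "rb") as fh:
        ftp.storbinary(f"STOR {temp}", fh)
    ftp.rename(temp, final)            # complete file appears under final name
```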
An extension to the trigger-file proposal: if you cannot ask the client to supply the trigger, create two processes watching the same input folder. The second one checks for the trigger file and does the work when it is there. The first one checks the last-modified date of the uploaded file, if there is one, and if the file has not been modified within some predefined period of time, it creates the trigger so process no. 2 can start.
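The logic of that first “watchdog” process can be sketched as a small function (a hypothetical Python sketch of the quiet-period check; in Workflow itself you would build this from a scheduled process with a script or condition task, but the decision is the same). The folder path, the 60-second quiet period, and the trigger name are all assumptions:

```python
import os
import time

def create_trigger_if_quiet(folder, quiet_seconds=60, trigger_name="trigger.txt"):
    """Create the trigger file once no data file has changed for a while.

    Returns True if the trigger was created, False if the folder is empty
    or an upload still looks to be in progress.
    """
    data_files = [
        os.path.join(folder, name)
        for name in os.listdir(folder)
        if name != trigger_name
    ]
    if not data_files:
        return False
    newest_mtime = max(os.path.getmtime(p) for p in data_files)
    if time.time() - newest_mtime < quiet_seconds:
        return False  # a file changed recently -- upload may still be running
    # Quiet long enough: drop the trigger so process no. 2 can start.
    open(os.path.join(folder, trigger_name), "w").close()
    return True
```

Run on a short schedule (say every minute), this only releases the trigger once the uploads have gone quiet for the predefined period.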
The site was down so I could not reply earlier. I had another thought, though it is untested: what if your process monitors an email inbox? The client can set up the FTP server to email you once the files are uploaded. Your process then waits for that email and processes the files. I read a few suggestions here.
But again, I have not tested this theory, just an idea.