Trigger file to start process

Hi - I have a process where I drop three files into the input folder: two PDF files and one Excel file. I have the folder capture set to look for ‘trigger.txt’, because I do not want the process to start until all three files are in the folder. This worked for a while, then jobs started getting stuck in the input folder. I copy the three files in, then copy the trigger file in separately. Sometimes the files get picked up, and sometimes they all just sit there. When I stopped and restarted the services, they got picked up immediately; I ran three jobs through and the third job got stuck again. Also, when I try to stop the services, I first check to make sure nothing is running, then I stop them. The last time I did this it literally took over 25 minutes.

Any ideas as to what is happening? Once the files get picked up, the process runs great.
Thank you!

EDIT - I think the reason the trigger.txt does not get picked up sometimes is that the process isn’t ending properly. I’m not sure if it is because of bad data or what. There is a _.pdf file in my output folder that I keep trying to delete, and it keeps coming back, as if the process won’t stop running. I do not see it running in the Console, though. How can I find out if or why it is still running, or at least why it is still producing the _.pdf file?

Most likely, one of your files occasionally takes longer to process.

If, when that happens, you can reproduce it by resubmitting the exact same three files, then you should look into them and see what is unique to those files.

You could also turn on the sequential logger. This logs everything as soon as it happens, as opposed to once the process is done (the normal log).

To turn this on:

  • Click on the Workflow button in your Configurator (the big orange W).
  • Then go to Preferences -> Plug-in -> General.
  • Then press SHIFT-CTRL-ALT-F12 simultaneously. A checkbox will appear, labeled “Log each process events synchronously in a separate log file” or something quite similar, depending on your Workflow version.
  • Resend your configuration.

That will start logging each process in its own log file (named after the process), at every step.

BE WARNED…THIS OPTION MUST BE TURNED OFF AT THE END OF THE DAY. It will tax your resources. The created logs are only time-stamped, not date-stamped, BUT you will be able to see in them where your problematic process hangs.
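If it helps to picture the difference, here is a rough Python analogy (illustration only, not Workflow’s actual implementation) of buffered end-of-run logging versus per-event synchronous logging:

```python
import time

# Rough analogy only -- not Workflow's actual implementation.

# Normal log: events are buffered and written once the process is done,
# so a process that hangs leaves you nothing to look at.
def log_at_end(events, path="process.log"):
    with open(path, "w") as f:
        f.write("\n".join(events))

# Sequential log: write and flush on every single event, so the last line
# on disk always shows the last step reached -- at the cost of one I/O
# operation per step, which is why this option taxes resources.
def log_event(event, path="process-sequential.log"):
    with open(path, "a") as f:
        f.write(f"{time.strftime('%H:%M:%S')} {event}\n")
        f.flush()
```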

To turn it off, simply go back to your Preferences, uncheck the box, and resend your config.

These new logs can be found in the same folder as the normal ones (which are still generated):
C:\ProgramData\Objectif Lune\PlanetPress Workflow 8\PlanetPress Watch

That should help you troubleshoot this behaviour.

Thank you - I have gotten to the point where I resend my config. This happened last night also: it just seems to be stuck at the ‘stopping’ point with the progress bars going across. There are no jobs running that I can see in the Console. However, the log file states over and over: Process “FundChangeNotices” is still running. This is the process I am having trouble with. I don’t know how to get past this point; I don’t see where the process is running.

Kill everything, then turn on the sequential logger.

Then, when it gets stuck again, go into the FundChangeNotices log file and look at the last line written…that is where it is getting stuck.

OK - I did all that, and it looks like it keeps looping. I have an Emulated Data Splitter, and my current test file has only 5 records in it, yet it is on loop 35. My process is like this:
Emulated Data Splitter, then
Set Job Variables, then
Load External File (finalmerged.pdf), then
Execute Data Mapping, then
Create Print Content, then
Create Job, then
Create Output.

Is it getting stuck in the loop because the finalmerged.pdf is still in the folder after all data files have been processed?

When it is stuck…what is the last line written in the sequential log of the process?

Once you have that info, it will be easier to answer you. I can tell you that the faulty step will be the one right after the last line logged.

If it is looping more than it should, then the problem is probably with the splitter.

Would it not be the Load External File? Maybe a Folder Capture with the archive attribute on or off would be a better method? EDIT: Seems like an infinite loop of sorts.

It isn’t stuck per se. It keeps looping and won’t stop. So I think the issue is the placement of the Load External File plugin, but if I move it outside of the Emulated Data Splitter loop, then it obviously doesn’t work. (The process is the same as listed above.)

Can I ask why you are using Load External File? You are splitting your original data using the Emulated Data Splitter - what type of data is it? After that you are picking up a PDF via Load External File, which should require a Change Emulation if your Execute Data Mapping uses a PDF-based mapping config. Or am I lost?

Then it is as I supposed…the problem is in your Emulated Data Splitter, which is the source of your loop.

You could run your process in Debug and look at the data every time you go back through the loop. That should give you a hint as to why that original data file loops longer than it should.

I have two PDF files and an Excel file that I place into an input folder. The Excel file contains the addresses. I merge the two PDFs into one and place the merged PDF in a folder to use with the data mapping config; that is where I am loading the PDF from, for use with the OL plugins. I am using the splitter to split the Excel data in order to place the addresses on the PDF.

I think it keeps looping because it is looking at the Load External File folder and finds the PDF file in there.

The loop is controlled purely by the Emulated Data Splitter, so that is where you need to look for the source of the problem.
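To illustrate (a hypothetical Python sketch, not Workflow internals): the splitter alone decides how many iterations the branch runs, one per record it detects, so extra loops mean extra “records” in the data.

```python
# Hypothetical sketch -- not Workflow internals. The splitter is the only
# thing driving the loop: the branch below it runs once per detected record.
def emulated_data_splitter(raw_records):
    for record in raw_records:
        yield record

data = ["rec1", "rec2", "rec3", "rec4", "rec5"]

loops = 0
for record in emulated_data_splitter(data):
    loops += 1   # Set Job Variables, Load External File, Execute Data
                 # Mapping, etc. would all run here, once per record

print(loops)     # 5 -- so if Workflow shows loop 35, the splitter is
                 # detecting 35 "records" in what looks like a 5-record file
```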

Take a look at the Excel file. Clients are notorious for selecting large numbers of rows and hitting the Delete key, which leaves that many empty rows in the Excel file. (To actually remove rows, one needs to select them, right-click, choose Delete from the context menu, and if prompted select shift rows up, before saving.) Perhaps your Excel file is plagued with thousands of empty rows. I had one with over a million empty rows.
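If the data reaches Workflow as CSV, a quick way to check for this is to count the rows where every field is blank. A hypothetical standalone sketch (the file name is illustrative):

```python
import csv

# Hypothetical check: count rows where every field is blank once stripped,
# which is what deleted-but-not-removed Excel rows tend to export as.
with open("addresses.csv", newline="") as f:   # illustrative file name
    rows = list(csv.reader(f))

empty = sum(1 for row in rows if not any(field.strip() for field in row))
print(f"total rows: {len(rows)}, empty rows: {empty}")
```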

@hamelj Is splitting of Excel files supported by the Emulated Data Splitter? I thought it handled only text/CSV, etc.

I thought that was what we were talking about (a CSV file).

If it is an Excel sheet then, although it is supported by Connect as a valid input, Workflow will not know what to do with it; there is no emulation in Workflow that can handle that type of data. That could very well be the problem here.

Many people see CSV and automatically think Excel file. Until the OP replies, I’m out of ideas. I would still like to see a picture of the OP’s Workflow process.

Here is a screenshot of my process. Hopefully you can see it.

Ahaddad, what is the Output File Emulation of your Database Query plugin?

Have you tried, as I asked, going through it in Debug step by step and looking at the data right after the Emulated Data Splitter?

My output is a PDF. And yes, I have gone through it in Debug. I think I finally figured it out: after catching the text file just right, I saw that everything after the last line of actual data is “”,“” (this example had 606 rows of nothing but “”,“”). So now I have to figure out how to trim that, and I think it will be good to go.

Thank you for helping me. I like getting help on how to troubleshoot something for the future. I kept focusing on the Load External File plugin, when it was the actual data file itself.
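For reference, the kind of clean-up that could trim those rows before the file is picked up might look like this (a minimal hypothetical sketch, not an official OL script; the paths and the definition of an “empty” row would need adjusting to the real data):

```python
import csv

# Hypothetical clean-up: keep only rows with at least one non-blank field,
# which drops the trailing "","" rows seen in Debug.
src, dst = "addresses.csv", "addresses-clean.csv"   # illustrative paths

with open(src, newline="") as fin:
    kept = [row for row in csv.reader(fin) if any(f.strip() for f in row)]

with open(dst, "w", newline="") as fout:
    csv.writer(fout).writerows(kept)

print(f"kept {len(kept)} rows")
```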

Is your data CSV or TXT? If so, would you like a script that OL gave me that removes the unwanted commas?