Folder pick up and reroute

When OL Automate picks up a file from a folder, does it use an event-driven model that would allow two servers running OL Automate to watch the same folder?

The goal would be to have two servers watching the same folder, each moving files to a local folder that Workflow would monitor, effectively load balancing the intake.

If a server is down it won’t pull files, and if the services are down then OL Automate would not run and would not pull files, so technically the Workflow and server services are the critical pieces. With two servers the load would be effectively balanced, and with one server down files would still process, just more slowly.

This can’t be done with Workflow because of how Workflow polls for files; it would result in errors. Is this possible with OL Automate?

I assume that this can be achieved by a flow containing the Inject and File operations nodes:

  • The Inject node to make sure that the flows on both servers are being executed at a different time¹
  • The File operations node to move a file to another location in the file system

¹ Caution: Make sure there is enough time in between so that both servers do not get in each other’s way, resulting in them trying to move the same file.

These files are critical, so we can’t have duplicate files on both servers or a missing file, and this is also a time-sensitive process.

Considering these factors, it sounds like leaving enough time between the servers could make the process too slow. However, if there was not enough time allotted, or there was some sort of latency, how would OL Automate react?

In this scenario the source folder would be a UNC path and the destination would be a folder local to each instance of OL Automate. I’m trying to determine if it is feasible or too risky.
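
My understanding of the risk, independent of OL Automate, is that a move from a UNC share to a local folder is really a copy followed by a delete, so both servers could copy the same file before either deletes it (the duplicate case), or the slower one could simply hit a file-not-found or sharing-violation error (the missing-file case). A quick sketch of that, with placeholder paths:

```python
import shutil

UNC_FILE = r"\\fileserver\hotfolder\job_0001.xml"   # placeholder UNC source file
LOCAL_DIR = r"C:\OLAutomate\input"                   # placeholder local destination

try:
    # Across volumes this is copy-then-delete, not an atomic rename,
    # so two servers running it at the same time can collide.
    shutil.move(UNC_FILE, LOCAL_DIR)
    print("This server claimed the file.")
except FileNotFoundError:
    # The other server already finished moving the file.
    print("File already taken by the other server; skipping.")
except PermissionError:
    # On Windows the source can be locked while the other server is still copying it.
    print("File is locked by the other server; will retry on the next pass.")
```

So whichever tool does the move, the flow would need to treat "file already gone" as a normal outcome rather than as a failure.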

Another thought is a PowerShell script running via Windows Task Scheduler that monitors the folder, checks whether the OL Connect Server and Workflow services are running, and if so places a lock file to prevent the other server running the same script from pulling the same files. Of course, the script would check for an existing lock file before performing any move.
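
Roughly, I picture the script’s logic like this (sketched in Python just to show the flow; the real script would be PowerShell, and the service names, lock file, and folders are placeholders I made up):

```python
import os
import shutil
import subprocess

WATCH_DIR = r"\\fileserver\hotfolder"                    # placeholder UNC hot folder
LOCAL_DIR = r"C:\OLAutomate\input"                       # placeholder local folder Workflow monitors
LOCK_FILE = os.path.join(WATCH_DIR, "pull.lock")         # shared lock file, visible to both servers
SERVICES = ["OLConnect_Server", "PlanetPressWorkflow"]   # placeholder Windows service names

def services_running() -> bool:
    """Return True only if every required Windows service reports RUNNING."""
    for name in SERVICES:
        result = subprocess.run(["sc", "query", name], capture_output=True, text=True)
        if "RUNNING" not in result.stdout:
            return False
    return True

def pull_files() -> None:
    if not services_running():
        return                              # this server is not healthy; let the other one pull
    try:
        # Exclusive create fails if the lock already exists, so creating it is also the check.
        lock = open(LOCK_FILE, "x")
    except FileExistsError:
        return                              # the other server holds the lock; skip this cycle
    lock.close()
    try:
        for name in os.listdir(WATCH_DIR):
            src = os.path.join(WATCH_DIR, name)
            if os.path.isfile(src) and name != os.path.basename(LOCK_FILE):
                shutil.move(src, os.path.join(LOCAL_DIR, name))
    finally:
        os.remove(LOCK_FILE)                # always release the lock this server created

if __name__ == "__main__":
    pull_files()
```

The lock file would live on the shared UNC path so both servers see it, and creating it in exclusive mode doubles as the "is someone else already pulling" check.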

The desire, if possible, is to keep the process within the OL Connect toolkit.

I’m not an expert, but I think you could set up both machines to access the same folder and write entries to a central database (if that’s an option). You could use a hash of the file name as a unique identifier (like a primary key) and store it with the file path in the database to create a working list.

Then, each machine could run a process to pick up entries from the database one by one and write the files to its own file system. You could mark each entry as processed or delete it once it’s done.
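
As a rough sketch (the paths, table, and column names are just examples), the primary key on the hash plus a "claim" update is what keeps both machines from grabbing the same entry:

```python
import hashlib
import os
import shutil
import sqlite3

DB_PATH = r"\\fileserver\queue\worklist.db"   # example central database location
WATCH_DIR = r"\\fileserver\hotfolder"         # example shared source folder
LOCAL_DIR = r"C:\OLAutomate\input"            # example folder local to this machine

conn = sqlite3.connect(DB_PATH)
conn.execute("""CREATE TABLE IF NOT EXISTS worklist (
                    file_hash TEXT PRIMARY KEY,
                    file_path TEXT NOT NULL,
                    processed INTEGER DEFAULT 0)""")

# Build the working list: the primary key on the hash means each file is only ever
# inserted once, no matter which machine scans the folder.
for name in os.listdir(WATCH_DIR):
    path = os.path.join(WATCH_DIR, name)
    if os.path.isfile(path):
        file_hash = hashlib.sha256(name.encode("utf-8")).hexdigest()
        conn.execute("INSERT OR IGNORE INTO worklist (file_hash, file_path) VALUES (?, ?)",
                     (file_hash, path))
conn.commit()

# Pick up one entry, claim it first, then move the file to this machine's own file system.
row = conn.execute("SELECT file_hash, file_path FROM worklist WHERE processed = 0 LIMIT 1").fetchone()
if row:
    file_hash, path = row
    claim = conn.execute("UPDATE worklist SET processed = 1 WHERE file_hash = ? AND processed = 0",
                         (file_hash,))
    conn.commit()
    if claim.rowcount == 1:        # 0 means the other machine claimed it first
        shutil.move(path, os.path.join(LOCAL_DIR, os.path.basename(path)))
conn.close()
```

The claim update before the move is what turns "pick up entries one by one" into something safe when two machines are polling the same list.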

I tried this concept using SQLite, and it seemed to work well, even though I was running everything from a single machine. I hope this helps or gives you some ideas!

Erik