Is there a file size limit for zip files to un-zip using the standard decompress plugin?
Looking at a situation where the average zip file is just over 3 GB.
There is no limit per se, but given that Workflow is a 32-bit application, I would be wary of such large files (it's not the compressed size that concerns me, it's the uncompressed size of the individual files in the ZIP). I don't know whether the internal library we use decompresses into memory first before writing the files to disk, or whether everything is streamed directly to disk.
I guess the only way to find out is to try it.
I understand that there is an unzip component for Node-RED. Would it be viable for these large zip files? Essentially we would be storing an archive, unzipping it, potentially renaming the video files, and saving the extracted video files.
Would this be in the 64-bit environment? I believe the server has 32 GB of RAM and 8 cores.
Yep, Node-RED should be able to handle that without a hitch.
Do you know of a zip node distribution that can handle files over 3 GB?
Using node-red-contrib-zip, we crash after 40 MB.
I just took a look and I am surprised that there aren’t more contributions in that area. The one you mention is based on buffers, which seems to me like a recipe for memory errors. I would have expected a contribution based on streams instead.
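To illustrate the difference: a streams-based extractor copies each entry to disk in fixed-size chunks, so memory use stays flat no matter how big the archive or its entries are. A rough sketch of the technique in Python's stdlib `zipfile` (just to show the idea, not what node-red-contrib-zip does; paths are placeholders):

```python
import os
import shutil
import zipfile

def extract_streamed(zip_path, dest_dir, chunk_size=1024 * 1024):
    """Extract a zip by streaming each entry to disk in 1 MB chunks,
    so memory use stays bounded regardless of the uncompressed size."""
    with zipfile.ZipFile(zip_path) as zf:
        for info in zf.infolist():
            target = os.path.join(dest_dir, info.filename)
            if info.is_dir():
                os.makedirs(target, exist_ok=True)
                continue
            os.makedirs(os.path.dirname(target) or ".", exist_ok=True)
            # zf.open() returns a file-like object; copyfileobj pulls it
            # through in chunk_size pieces rather than one big buffer
            with zf.open(info) as src, open(target, "wb") as dst:
                shutil.copyfileobj(src, dst, chunk_size)
```

A buffer-based node instead reads each entry fully into memory before writing it out, which is exactly where a 32-bit (or memory-constrained) process falls over on multi-gigabyte entries.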
So it looks like NR is not the answer either. At this stage, your best option might be to call a command line tool from Workflow. The command-line tool will run out-of-process, which means it won’t be limited by Workflow’s 32-bitness.
I’m working on a workflow process that uses 7-Zip (64-bit).
I’m creating a bat file that I execute on the fly.
It seems to decompress the files quicker than I expected.
I’ve gone past the 400 MB mark where we crashed with node-red-contrib-zip: a 518 MB zip expanded to 4.5 GB.
I still need to add some housekeeping, such as deleting or moving the original zip to an archive location.
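For anyone following along, a minimal sketch of what such a bat file could look like, assuming 7-Zip is installed in its default location; all paths are placeholders for your own environment:

```bat
@echo off
rem Placeholder paths; adjust for your environment.
set SEVENZIP="C:\Program Files\7-Zip\7z.exe"
set ZIPFILE="D:\inbound\videos.zip"
set OUTDIR="D:\extracted"
set ARCHIVEDIR="D:\archive"

rem x = extract with full folder structure, -o sets the output
rem directory, -y answers all prompts with yes.
%SEVENZIP% x %ZIPFILE% -o%OUTDIR% -y
if errorlevel 1 exit /b 1

rem Only move the original zip to the archive location if extraction succeeded.
move %ZIPFILE% %ARCHIVEDIR%
```

Note that 7-Zip's `x` command preserves the folder structure inside the archive; `e` would flatten everything into the output directory.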