Use pigz to Multi-Thread tar and zip with All Available CPU Cores
Using zip and tar to compress files and directories can take up a lot of your time. Many DevOps CI/CD pipelines involve zipping the artifact during the build stage, then unzipping it once it has been copied to its destination. Several of the Jenkins-to-AWS pipelines I work with follow this pattern:
* build and zip the artifact
* move the artifact to ec2 instances
* unzip the artifact on ec2 instance
* run post-deployment steps, etc
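A runnable local sketch of those steps, with a plain `cp` standing in for the copy to the ec2 instance (all paths and names here are hypothetical, and gzip serves as a single-threaded fallback where pigz isn't installed):

```shell
#!/bin/sh
set -e

# Stage 0: pretend build output lives in ./build (hypothetical path)
mkdir -p build deploy
echo "app v1" > build/app.txt

# Prefer pigz; fall back to plain gzip if it isn't installed
GZ=$(command -v pigz || command -v gzip)

# Build stage: compress the artifact
tar -c --use-compress-program="$GZ" -f artifact.tar.gz build

# "Deploy": a local copy stands in for the scp to the ec2 instance
cp artifact.tar.gz deploy/

# On the instance: unzip the artifact (tar invokes the program with -d)
tar -x --use-compress-program="$GZ" -f deploy/artifact.tar.gz -C deploy

# Post-deployment step: verify the deployed file
cat deploy/build/app.txt
```

The only line that changes versus a stock gzip pipeline is the `--use-compress-program` flag, which is what makes pigz a near drop-in swap.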
This is fairly common in modern software engineering; you're probably running a similar setup.
(If the "artifact" terminology is unfamiliar, it just means the files of the application to be deployed.)
But do you know about pigz?
pigz (a parallel implementation of gzip) is a Unix shell utility that uses all the available processing power at your disposal to minimize the time it takes to compress a file or directory.
If you're not using pigz, the compression routine is limited to a single CPU core. pigz lets you use every available core of the machine compressing the artifact, which is usually the server running Jenkins. These machines aren't typically underpowered, so leveraging pigz can really speed up your deployments.
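pigz uses all cores by default, but you can cap the thread count with `-p`, which is worth knowing on a shared Jenkins box. A small demo (the file name is made up, and the guard skips the demo where pigz isn't installed):

```shell
if command -v pigz >/dev/null 2>&1; then
  # How many cores does this machine have?
  getconf _NPROCESSORS_ONLN

  echo "some sample data" > sample.txt
  # Compress with at most 2 threads; -k keeps the original file
  pigz -p 2 -k sample.txt
  ls -l sample.txt.gz
else
  echo "pigz not installed"
fi
```

Leaving `-p` off is the common case: pigz detects the core count itself.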
Or maybe you just have a hoss laptop and want to minimize the time it takes to compress your backups? That's another obvious use of pigz. Any time you want to compress something, think pigz if you want to save time and make full use of the available processing power.
Use pigz with tar like this:
$ tar -c --use-compress-program=pigz -f name-of.tar.gz /directory/to/compress
(The output is standard gzip format, hence the .tar.gz extension.)
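pigz also works on its own, outside tar, as a near drop-in for gzip; its output is standard gzip format, so gunzip can read it. A quick round trip (the file name is illustrative, and the guard skips gracefully where pigz isn't installed):

```shell
if command -v pigz >/dev/null 2>&1; then
  echo "some data" > notes.txt
  pigz notes.txt        # replaces notes.txt with notes.txt.gz
  unpigz notes.txt.gz   # decompresses it back (same as pigz -d)
  cat notes.txt
else
  echo "pigz not installed; gzip/gunzip behave the same way"
fi
```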
That's it. I wrote this article because I recently started using pigz, and after asking around among my colleagues, I found that none of them had heard of it either. Spread the love.
Timothy D Beach