7-Zip -v switch fails to use the CPU properly?

QUESTION:

I have been using this script for years, dating back to when using more than 2 cores was a challenge in 7z and I had just treated myself to a shiny i7:

"C:Program Files7-Zip7z.exe" u -m5=lzma2 -mmt=8 %1.7z %1

This has the advantage of using all 8 threads, LZMA2 compression, and pushing the CPU frequency (usually sitting in some reduced SpeedStep state) to max + turbo. Awesome speed and convenience (in Windows SendTo).

I was recently reminded, the hard way, that files over 4 GB are hard to recover after accidental deletion (NTFS). I had just wiped a backup volume by mistake…

Lesson learned, I decided to amend my script to span the archive across 2GB volumes.

"C:Program Files7-Zip7z.exe" a -m5=lzma2 -mmt=8 %1.7z %1 -v2g

Losing the update convenience in the process.

What I found disturbing, and this is the topic of my question, is that I couldn't hear the CPU and case fans kick in as they used to when the Westmere Xeon got hotter. I thought I had a hardware problem… Checking, it turned out that not only was the CPU only about 50% busy (on all cores), but it hadn't stepped out of the reduced state it was in (12x multiplier out of 22). This is aggravating: just 25% of the available processing power being used, instead of 100% without volume spanning. I estimate that archiving time was multiplied by a factor of 10, rather than the 4 you would expect from the utilization drop alone.

Bug or feature? Am I missing something? This is Windows 8.1 x64, a 4-core Westmere Xeon with 24 GB RAM, 7-Zip 9.20 x64.

I use this script quite often, sometimes to archive multiple folders at once:

for %%i in (%*) do call "..\..\archive.cmd" %%i

Quite handy in Windows, life will be harder without it 🙂
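
For reference, a minimal sketch of what such an archive.cmd could look like (a hypothetical reconstruction, not the asker's actual file; the %~1 quoting is my addition to handle paths with spaces, and the switches are the ones quoted above):

rem archive.cmd -- %1 is the file or folder passed from SendTo or from the loop above.
rem %~1 strips any surrounding quotes so the path can be re-quoted safely.
"C:\Program Files\7-Zip\7z.exe" a -m5=lzma2 -mmt=8 "%~1.7z" "%~1" -v2g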

Does anyone have an idea how to fix this? I did quite a bit of googling without any luck…

Thanks for any input. Have a nice day.

ANSWER:

7-Zip is probably spending more time on I/O than on compressing data. My suggestion is to use -mx9 (Ultra mode) and -md30 (1 GB dictionary size).
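
Applied to the question's command, that suggestion would look something like this (everything except -mx9 and -md30 is carried over from the question):

"C:\Program Files\7-Zip\7z.exe" a -mx9 -md30 -mmt=8 %1.7z %1 -v2g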

The default dictionary size is 16 MB; in Ultra mode it is 64 MB. 7-Zip uses about 10.5 times the dictionary size for memory buffers, so -md30 raises total memory use to around 11 GB. Of course, a large dictionary also means that the computer extracting the archive has to allocate that large a dictionary. You could also try -md28 (a 256 MB dictionary), as in the sketch below.
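
As a rough sketch of the memory math (using the 10.5x figure above; sizes are approximate):

rem -md30: 2^30 = 1 GB dictionary, about 10.5 x 1 GB = ~11 GB while compressing
rem -md28: 2^28 = 256 MB dictionary, about 10.5 x 256 MB = ~2.7 GB while compressing
"C:\Program Files\7-Zip\7z.exe" a -mx9 -md28 -mmt=8 %1.7z %1 -v2g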


According to the command-line documentation, increasing the number of threads or using the -v switch reduces the compression ratio. Setting -v4095m keeps each volume just under 4 GB, which should still allow recovery of lost files.
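
In that case the question's command would become something like this (only the volume size changes; 4095 MB keeps each volume just under the 4 GB threshold mentioned in the question):

"C:\Program Files\7-Zip\7z.exe" a -m5=lzma2 -mmt=8 %1.7z %1 -v4095m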

The -slp (Set Large Page mode) option might speed compression. Review the description and cautions at http://sevenzip.sourceforge.jp/chm/cmdline/switches/large_pages.htm
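
If you try it, -slp is just another switch on the same command line; note that using large pages on Windows requires the account to hold the "Lock pages in memory" privilege, so review the linked cautions first. Combining it with the earlier suggestions, as below, is my assumption rather than part of the original answer:

"C:\Program Files\7-Zip\7z.exe" a -mx9 -md28 -mmt=8 -slp %1.7z %1 -v2g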


In general, with that much memory you do not need a paging file (except for saving crash dumps). If you have a large paging file, Windows may page out much of the program's memory and page parts of it back in as needed. That will also slow the program considerably.
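
One way to check how the paging file is currently set up is with wmic from a command prompt (standard on Windows 8.1):

rem Configured page files (empty output usually means system-managed):
wmic pagefileset list /format:list
rem Current page-file usage:
wmic pagefile list /format:list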
