macOS – How to split a .zip file into multiple segments

macos, terminal

I have all the command-line utilities installed, and I need to split either an existing .zip or a new set of files into 50MB .zip segments in Terminal.

For example: Folder X (900MB) > create a self-extracting .zip archive > split the .zip archive into 50MB segments (e.g. Folder.X.001.zip).
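Roughly, the splitting step is the part I'm after; a sketch of the kind of command I have in mind (the archive name is a placeholder, and this covers only the splitting, not the self-extracting part):

zip -r -s 50m FolderX_split.zip "Folder X"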

According to zip's help output, here are the available options:

Copyright (c) 1990-2008 Info-ZIP - Type 'zip "-L"' for software license.
Zip 3.0 (July 5th 2008). Usage:
zip [-options] [-b path] [-t mmddyyyy] [-n suffixes] [zipfile list] [-xi list]
  The default action is to add or replace zipfile entries from list, which
  can include the special name - to compress standard input.
  If zipfile and list are omitted, zip compresses stdin to stdout.
  -f   freshen: only changed files  -u   update: only changed or new files
  -d   delete entries in zipfile    -m   move into zipfile (delete OS files)
  -r   recurse into directories     -j   junk (don't record) directory names
  -0   store only                   -l   convert LF to CR LF (-ll CR LF to LF)
  -1   compress faster              -9   compress better
  -q   quiet operation              -v   verbose operation/print version info
  -c   add one-line comments        -z   add zipfile comment
  -@   read names from stdin        -o   make zipfile as old as latest entry
  -x   exclude the following names  -i   include only the following names
  -F   fix zipfile (-FF try harder) -D   do not add directory entries
  -A   adjust self-extracting exe   -J   junk zipfile prefix (unzipsfx)
  -T   test zipfile integrity       -X   eXclude eXtra file attributes
  -y   store symbolic links as the link instead of the referenced file
  -e   encrypt                      -n   don't compress these suffixes
  -h2  show more help

With -h2 I get:

Splits (archives created as a set of split files):
  -s ssize  create split archive with splits of size ssize, where ssize nm
              n number and m multiplier (kmgt, default m), 100k -> 100 kB
  -sp       pause after each split closed to allow changing disks
      WARNING:  Archives created with -sp use data descriptors and should
                work with most unzips but may not work with some
  -sb       ring bell when pause
  -sv       be verbose about creating splits
      Split archives CANNOT be updated, but see --out and Copy Mode below

…

Using --out (output to new archive):
  --out oa  output to new archive oa
  Instead of updating input archive, create new output archive oa.
  Result is same as without --out but in new archive.  Input archive
  unchanged.
      WARNING:  --out ALWAYS overwrites any existing output file
  For example, to create new_archive like old_archive but add newfile1
  and newfile2:
    zip old_archive newfile1 newfile2 --out new_archive
  Cannot update split archive, so use --out to out new archive:
    zip in_split_archive newfile1 newfile2 --out out_split_archive
  If input is split, output will default to same split size
  Use -s=0 or -s- to turn off splitting to convert split to single file:
    zip in_split_archive -s 0 --out out_single_file_archive
      WARNING:  If overwriting old split archive but need less splits,
                old splits not overwritten are not needed but remain

Best Answer

You have existing.zip but want to split it into 50MB parts.

zip existing.zip --out new.zip -s 50m

will create

new.zip
new.z01
new.z02
new.z03
....

To extract them, first gather all the parts into the same directory, then run zip -F new.zip --out existing.zip or zip -s 0 new.zip --out existing.zip to recreate your existing.zip. Then you can simply unzip existing.zip.
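Assuming all the parts (new.zip, new.z01, new.z02, and so on) sit in the same directory, the full sequence looks like this:

zip -s 0 new.zip --out existing.zip
unzip existing.zip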


You'd expect unzip new.zip to work, but unfortunately multi-part support isn't implemented in unzip:

warning [new.zip]:  zipfile claims to be last disk of a multi-part archive;
  attempting to process anyway, assuming all parts have been concatenated
  together in order.  Expect "errors" and warnings...true multi-part support
  doesn't exist yet (coming soon).

In my tests, concatenating the parts as the warning suggests (i.e. with cat) and then running unzip failed to extract all my files.
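For completeness, the concatenation that the warning hints at looks something like this, with the .zNN parts in order and the .zip file last (listing however many parts were actually produced); as noted above, it did not extract everything correctly in my tests:

cat new.z01 new.z02 new.z03 new.zip > combined.zip
unzip combined.zip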