Split and archive a large backup file in constant working space

Tags: backup, pipe, tar

I'm currently attempting to use tar and split to get DVD-sized archives of my Time Machine backup database. It's going to work this time, as I have a partition on my external drive large enough to hold two copies of the massive Backups.backupdb folder, but I want to know if there is a smaller and faster method.

What I'd like to do:

- Create 4.7GB archives of my folder one at a time, and write each to disc in turn.
- End up with a set of DVDs whose contents can be copied back to a hard drive and extracted with one command, reproducing my original 100+GB folder.
- Ideally, have the archive survive the destruction of one (or more) of the discs.
- Never use more than 4.7GB of working space for the whole process. Better still, pipe the data straight into my disc-burning program and never write anything to the hard disk at all.

Can I really pipe tar into split into drutil (or another disc-burning app), and expect this to work over a couple dozen DVDs with all those interruptions?
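If the burn can really happen per chunk, GNU split's `--filter` option (coreutils 8.4+; not in the BSD split that ships with OS X) hands each chunk to a command instead of writing a chunk file, so only one chunk's worth of scratch data ever exists at a time. A minimal sketch with a stand-in filter that just records chunk sizes — in the real pipeline the filter would stage the chunk and invoke the burner; the drutil step mentioned in the comment is an untested assumption:

```shell
# Sketch: GNU split's --filter runs a command once per chunk instead of
# writing chunk files, keeping working space to roughly one chunk.
# Here the filter only records each chunk's byte count; a real filter
# might be something like:
#   cat > /tmp/$FILE && drutil burn /tmp/$FILE && rm /tmp/$FILE   # untested guess
export WORKDIR=$(mktemp -d)
head -c 1000000 /dev/urandom > "$WORKDIR/data"    # stand-in for the backup data
tar cf - -C "$WORKDIR" data \
  | split -b 300k --filter='wc -c >> "$WORKDIR/sizes"' -
cat "$WORKDIR/sizes"    # one line per 300KB chunk (the last one is smaller)
```

`split` exports the would-be chunk name to the filter as `$FILE`, which is how each piece could be staged and burned in turn.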

What I'm doing now:
I run tar to make the archive, then I split its output into a bunch of smaller archives:

    sudo tar cvjsp /Volumes/BackupDisk/Backups.backupdb/ | split -d -b 4480m - Backups.backupdb.tar.bz2.

This gives me a bunch of files (Backups.backupdb.tar.bz2.00, Backups.backupdb.tar.bz2.01, …), ready to be written to DVDs. Together they take up as much space as the compressed archive (of course), but since the data now exists both as these pieces and as the original, I need nearly double the size of my backups in free space. I've got it for now, but that won't last much longer…
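For the restore side, the pieces only need to be concatenated back in order before tar sees them; split's suffixes sort correctly under a shell glob, so the real command would presumably look like `cat Backups.backupdb.tar.bz2.* | sudo tar xvjpf -`. A small self-contained round trip of the same idea, using throwaway temp paths:

```shell
# Round-trip demo: split a bzip2'd tar into pieces, then restore by
# concatenating the pieces in glob order and extracting.
export W=$(mktemp -d)                      # scratch area (demo only)
mkdir "$W/src" "$W/out"
echo "hello backup" > "$W/src/file.txt"
tar cjf - -C "$W/src" . | split -b 1k - "$W/piece."   # archive + split
cat "$W"/piece.* | tar xjpf - -C "$W/out"             # rejoin + extract
cat "$W/out/file.txt"                                 # prints: hello backup
```

The same glob trick works whether the pieces come off DVDs one at a time or sit together in a folder, since cat only needs them in lexical order.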

Any ideas? How do you back up hard drives to optical media?

Update: I started the process early this morning; it was still running when I had to leave for work, and I came back to this: