[00:42] http://www.thedailyshow.com/watch/mon-june-18-2012/parmy-olson
[00:42] friendster joke :)
[01:50] whoa - you can now buy a 90'' TV that's not rear projection - Sharp has an LCD TV at that size
[02:03] Can't imagine the price
[02:04] Or, for that matter, the heat
[02:05] 11k
[09:01] since i failed to find a working compressed fuse fs, i guess i can extract all the forums' 7z files, make one tar and use an archive fs on that
[10:59] and extracting will take a day or so
[10:59] :)
[16:17] Schbirid: compressed FS is kinda tricky. I might be able to hack up something that works on gz or bz2, though. (and if there is tar inside it, I guess you'd need a separate layer for that, too)
[16:18] i'll probably use avfs. used it successfully in the past. i wonder how it will handle an archive this big though
[16:18] or archivemount
[16:18] http://sourceforge.net/apps/mediawiki/fuse/index.php?title=ArchiveFileSystems says it supports tar.gz
[17:16] extraction is about half done :D
[17:41] my implementation of the bz2 or gz fuse layer would essentially serve up a single file (the decompressed file) with full seeking capabilities, without having to keep a full decompressed copy.
[17:43] (as it would do a decompress pass and build an index. for bz2 it would simply be an index of bz2 position to decompressed position. for gz it would be gz position, decompressed position, and however many bytes of the window are needed to begin decompression at that point)
[17:45] the tar fuse module would keep an index of the files in the tar file, so it could serve up directory listings quickly, as well as get file contents quickly
[17:45] even with multi-gigabyte tar files
[18:07] http://news.ycombinator.com/item?id=4136682
[18:07] Oh god
[18:50] "oh god" what?
[18:54] imo full of facepalminess
[19:34] yay, 10 hours later, the forums are extracted
[19:39] is it better to tar first and gzip later? or how does gzip know how to compress well? does it even care?
[19:40] i have a LOT of duplication
[19:40] usually close together on a filename basis though
[19:54] gzip just compresses everything it gets as one stream. tar.gz files are tarred first and then gzipped, which makes them a solid archive
[19:55] you might get better results by sorting the list of files in some way and feeding that list to tar, which places similar files close together, so the compressor might benefit (depending on window size. gzip's deflate only has a 32KB window at most)
[20:00] i just did "tar -czf forums.tar.gz forums" so far, let's wait and see what comes out of it :)
[20:02] the z causes tar to pipe its archive through gzip and emit the compressed stream as its own output
[20:10] (moving from #archiveteam) this particular article has a supposed "original set" of rules. rule 1 was "do not talk about rules 2-33" rather than "do not talk about /b/"
[20:10] and then further down they have a later set with the usual 1/2/34/35
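A minimal sketch (in Python, which the log itself doesn't use) of the index-building pass described at 17:41-17:43: one decompression pass over a .bz2 file, recording checkpoints of compressed position versus decompressed position. Restarting decompression at a checkpoint (block boundaries for bz2, the saved 32 KB window for gz) is the part a real fuse layer would add on top; it is not shown here, and the function name and checkpoint spacing are made up for the example.

    import bz2

    CHUNK = 1 << 20           # feed 1 MiB of compressed data at a time
    MARK_EVERY = 64 << 20     # record a checkpoint roughly every 64 MiB of output

    def build_bz2_index(path):
        """One pass over a single-stream .bz2 file, returning a list of
        (compressed_offset, decompressed_offset) checkpoints."""
        index = [(0, 0)]
        decomp = bz2.BZ2Decompressor()
        compressed = decompressed = 0
        next_mark = MARK_EVERY
        with open(path, "rb") as f:
            # single bz2 stream assumed; a multi-stream file would need a new
            # decompressor fed decomp.unused_data when decomp.eof becomes True
            while not decomp.eof:
                chunk = f.read(CHUNK)
                if not chunk:
                    break
                out = decomp.decompress(chunk)
                compressed += len(chunk)
                decompressed += len(out)
                if decompressed >= next_mark:
                    index.append((compressed, decompressed))
                    next_mark = decompressed + MARK_EVERY
        return index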
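The tar index mentioned at 17:45 could look roughly like this: a sketch assuming an uncompressed tar and Python's standard tarfile module, with build_tar_index and read_member as names invented for the example. Once the offsets are cached, directory listings and file reads never have to rescan the archive, even for multi-gigabyte tars; for a tar.gz you would layer this on top of a seekable gz view like the one described above.

    import tarfile

    def build_tar_index(tar_path):
        """Map member name -> (data offset in the archive, size in bytes)."""
        index = {}
        with tarfile.open(tar_path, "r:") as tf:   # "r:" = uncompressed tar only
            for member in tf:                      # streams over the headers once
                if member.isfile():
                    index[member.name] = (member.offset_data, member.size)
        return index

    def read_member(tar_path, index, name, offset=0, size=None):
        """Serve (part of) one member's contents with a plain seek+read."""
        data_offset, data_size = index[name]
        if size is None:
            size = data_size
        size = max(0, min(size, data_size - offset))
        with open(tar_path, "rb") as f:
            f.seek(data_offset + offset)
            return f.read(size)

    # A FUSE front end (e.g. via the third-party fusepy package) would answer
    # readdir() from the keys of this index and read() through read_member().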
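And a sketch of the 19:55 suggestion: collect the file list, sort it so similar filenames sit next to each other, and write one solid tar.gz. The output filename and compresslevel=9 are assumptions, not something from the log; given deflate's 32 KB window, the sorting mainly helps duplication between neighbouring files.

    import os
    import tarfile

    def make_sorted_targz(src_dir="forums", out_path="forums-sorted.tar.gz"):
        paths = []
        for root, dirs, files in os.walk(src_dir):
            for name in files:
                paths.append(os.path.join(root, name))
        paths.sort()                      # group similar filenames together
        with tarfile.open(out_path, "w:gz", compresslevel=9) as tf:
            for p in paths:
                # only regular files are added; empty directories are skipped
                tf.add(p, recursive=False)

    if __name__ == "__main__":
        make_sorted_targz()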