[00:32] hahahaha [00:32] Just found a tiny cache of friendster. [00:32] 1.7 Terabytes [00:32] TINY CACHE [00:32] A LITTLE PILE IN THE CORNER [00:33] * chronomex gets the broom [00:41] I will need help soon [00:43] more brooms? [00:46] Specifically Berlios and Splinder, but likely others. [00:46] I have to start setting them up to be uploaded, and soon. [00:46] alard: ping [00:57] hi [01:00] underscor: see my cmoment eralry [01:00] comment earlier [01:02] which? [01:02] i got a euro bank account [01:02] so can get euserv sometime maybe :p [01:02] would only do it with a few people who wanted to share though and pay paypal i think [01:02] oh okay, awesome! [01:02] yeah, that'd be p cool [01:02] you'd be interested? [01:02] yeah [01:03] not for a month or so though [01:03] and in case i hadnt mentioned i speak german so no probs there :) [01:03] as I don't have cash inflow at the moment [01:04] the two boxes i got from server4you are pretty good [01:04] that's cool [01:04] been maxing the 100mbit on each since i got them basically [01:04] hahaha [01:04] nice! [01:06] wonder if you can upgrade those Filer boxes easily [01:06] e.g. if you start at lower one you can slowly add more hds [01:06] im guessing its not easy cause the higher ones have 3ware controllers [01:06] yeah, prolly not [01:07] they might let you swap the system disks though [01:07] yeah that shouldnt cause dramas really, but later if you go from one without 3ware controller to one with it [01:07] then there'd be the hiccup [01:07] that said by the time you get to that level they could just deploy a new one, migrate all across, cancel old one [01:08] lol another gigabit port is 10 euro per month [01:09] on the filer m giga [01:09] er, filer l giga i meant [01:10] my ears are burning [01:11] SketchCow: i found out you have maximum cds from 1999 [01:11] but i can't find them on your cd.textfiles.com [01:18] cd occasionally lags now.
[01:19] also i did find torrents for maximum cds but most are dead (no seeds) [01:20] one of them works though [01:21] sweet [01:21] its a special issue from summer of 2007 [01:21] the cd i mean [01:22] wait its 2010 [01:22] there was one torrent with 9.2gb of maximum cds but sadly thats dead [02:15] Why you should not let underscor have your boxen: http://dl.dropbox.com/u/230717/otherstuff/Screen%20Shot%202012-04-10%20at%208.43.01%20PM.png [02:15] :D [02:15] (it's fully condoned) [02:19] * yipdw bans underscor from all projects [02:20] an equally precipitous drop! [02:23] So, what the fuck happened with proust? [02:23] No clear marking it's going away now. [02:23] I mean, fuck'em, we're putting up the download. [02:24] But I can't find a cite [02:30] http://www.proust.com/story/proustfuckedyou [02:30] haha i actually love you dude [02:35] but that's weird, allthingsd reported it staying up and it's like "oh yeah it's not shutting down btw" WHERE DID THAT COME FROM [02:35] yipdw: :D [02:35] huh, proust is still alive? [02:36] FOR NOW [02:36] *cue, dramatic music* [02:37] Proust only got 1k Facebook likes. [02:37] No wonder they are in trouble [02:37] SketchCow: the best reference I could find is http://allthingsd.com/20120131/proust-will-live-on-separate-from-iac/ [02:37] I never got the email, though [02:37] Ah hah [02:38] it's run by anonymous! oh hehhe [02:41] I've never really looked at Proud. That's quite a complex website. [02:41] *proust [02:43] frame_at: luckily, it seems to have been developed by people who tried to adhere to progressive enhancement principles [02:43] archiving it was actually pretty easy [02:43] well, the main profiles, anyway [02:43] if you wget (wgot?) the CSS, Javascript, and HTML, you could recreate it in stages or pretty much anywhere else [02:44] the whole site gives me the creeps. [02:44] well, the contents [02:45] all those private questions... facebook is a monk in comparison.
[02:47] what'll we do when people get bored of facebook and we need to archive the site? :P [02:47] haha they really cover all typical "secret questions" from big sites. [02:48] what was your favorite food when you were a kid, etc :) [02:49] i'm uploading my pc advisor archive cd [02:49] it has issues from 2002 to 2004 of pc advisor [02:50] but looks like it only works if you use their software [02:50] cause the pdf issues are broken up [02:50] excellent. [02:51] i have a lot of cds/dvds to upload [02:52] with this one i also have a scan image of the cd [02:55] http://archive.org/details/mp3com-skeleton [02:55] Any of you that are smart and have the space should immediately grab that. [02:55] (10gb) [02:57] ahha mp3.com i remember that site [02:57] * oli downloads [02:58] It is very raw. It needs a lot of love. [02:58] But I want to have it captured. [02:58] I am grabbing it now [03:00] same [03:00] needs MOAR bandwidth [03:00] replicating to different locations >.> [03:00] same here underscor :P [03:00] hah [03:00] I'm pushing 185mbps to batcave atm >:D [03:01] By different locations, I do NOT mean to batcave or an archive.org machine [03:01] of course [03:01] rofl [03:01] the 185mbps to batcave is s3 traffic [03:01] What, mobileme? [03:01] yeah [03:01] We really really have to stop using batcave for this. [03:02] I'm trying to take batcave down. [03:02] I'd rather we track down The Thing That Sucks [03:02] yeah [03:02] it's weird that proxying through batcave fixes it [03:02] Although these boxen are on ISC, so it seems to behave fine not-proxied too [03:02] what is batcave? one of your boxes at archive.org ?
[03:03] what's ISC [03:03] (I just need to figure out how to change it with alard) [03:03] batcave.textfiles.com = teamarchive-0.us.archive.org [03:03] It's one of Jason's machines at IA [03:03] ISC is a peering consortium [03:03] k cool [03:03] Sorta like a non-profit hurricane electric or nlayer [03:14] alard kennethre: are all the users downloaded using seesaw-s3 marked in a big batch at the end? [03:14] just wondering cause I don't see any on the dashboard as they finish in the terminal [03:14] where's the link to that seesaw-s3 stats page? :/ [03:14] http://memac-tamer.heroku.com/ [03:15] that? [03:15] thanks, that's it [03:15] underscor: it uploads the names once they're successful [03:15] ah ok [03:15] thanks! [03:15] the "batch" of 5 or 10GB [03:15] np :) [03:16] i'll be devastated if i cant get ahead of underscor [03:17] oli: <3 [03:17] im too lazy to go get coke from the fridge so i just opened the window behind me and have the bottle on the window sill [03:18] since its 4c outside anyway, about the same as the fridge [03:18] problem is now the window is open so its getting kind of cold in here [03:18] *first world problems* [03:18] lol [03:18] where: &w_identifier=archiveteam-mobileme-hero* | size: 62,193,803,965 KB [03:18] good job guys [03:18] :D [03:47] > x-archive-meta-title:Archive Team: The SOPA World Tour [03:50] I guess we'll need another one for CISPA [03:51] VAST majority of uploads to fos are oli [03:51] Oli NEEDS to get onto direct s3 [03:51] SketchCow: I just emailed you [03:52] OH BOY [03:52] * SketchCow stops everything [03:52] * SketchCow chucks the ballast out of the balloon [03:52] * SketchCow calls the press [03:52] at least i email you [03:52] And then tell me on here you mailed me [04:02] ok [04:02] hang on [04:03] godane: ? [04:03] hey [04:03] hi :) [04:04] emailed me? [04:04] SketchCow: just STOP'd mine will switch it over as they end [04:05] email you for what?
[04:05] 13:21:35 at least i email you [04:05] ok [04:05] slitazemulator gmail.com [04:05] sorry are you asking me to email you? what for? hehe [04:06] try slaxemulator gmail.com [04:06] No, I meant godane. [04:06] oli: swap over as you can, it will be appreciated. [04:07] ok [04:07] FOS is really for small fry who aren't shooting for the moon. [04:07] hmm im getting errors anyway [04:08] you know me, always shooting for the moon :p [04:10] haha [04:18] http://archive.org/details/archiveteam-sopa-world-tour [04:18] That was awesome work. [04:20] SketchCow: it would be nice if Archive Team did this for the yearly april fools jokes. [04:22] haha that is great [04:22] 13.6GB of anti SOPA pages [04:22] amazing [04:47] oh [04:47] I remember getting those [04:48] I tried to find pro-SOPA articles that were not written by Chris Dodd [04:48] I can't remember if I succeeded [04:48] that or there just weren't any [04:49] also: "Santorum pulls out after consistently coming in number two" [04:49] I am juvenile, so find that hilarious [04:49] hahahahahaha [04:49] nice [04:58] Oh wow [05:05] http://www.codinghorror.com/.a/6a0120a85dcdae970b0167648b3804970b-800wi <- Fucking server racks! [05:13] SketchCow: if i download stuff like that mp3com thing what's the best way to help/make it available etc? [05:13] apart from just storing it on my drives :P [06:02] 4.6T . [06:02] MOBILEME-SETS# du -sh . [06:02] Niiiice [06:07] :o [06:09] what, is that all [06:13] That's all I've found SO FAR [06:14] oh boy [06:14] * ersi shrugs [06:46] SketchCow, what about splinder? [06:47] http://www.archiveteam.org/index.php?title=Splinder was incomplete so we don't know if everyone uploaded everything [07:10] I still have some splinder, I think [07:11] hey. you should upload that. [07:11] okay. what's the proper thing to give to upload-finished.sh ? [07:12] what do you mean [07:12] $ ./upload-finished.sh chronomex [07:12] chronomex does not look like a proper rsync destination. 
[07:12] Usage: ./upload-finished.sh [dest] [bwlimit] [07:12] whatever the download script downloaded is what it expects [07:12] remind me what I give it for [07:12] oh... ehhh, I think you have to ask sketchcow [07:15] don't know if anyone remembers, but 3dporch.com was archived by us because the owner threatened shutdown due to costs [07:15] he was so flattered that we wanted a copy of his site that he kept it going [07:15] and it's still going today [07:17] haha, I remember 3dporch [07:19] Yes [07:25] chronomex, batcave.texfiles.com::chronomex/splinder/ [07:25] if that's your slot name [07:25] sweet, ok [07:26] oh dear, "name or service not known" [07:27] godane: you might want to write par2 files to your DVDs instead of md5sums, that way you'll be able to recover it if it bitrots slightly [07:27] sensible [07:28] hey SketchCow, do I have a slot on batcave? [07:29] I hope not [07:29] Nobody should [07:29] oh, right [07:29] so where should one upload splinder now? [07:30] exactly [07:31] One does not simply upload into batcave [07:34] chronomex: there was also a typo [07:34] oh hah [07:39] ivan`: how do i do that? [07:40] godane: I use QuickPar on Windows, and there is par2cmdline and many optimized forks [07:40] i have par2 cmdline [07:40] i'm on linux [07:41] something like: par2 c -r10 filename filename [07:55] see: par2 --help [08:12] anyone here experienced with bind? i have two ns setup and the slave is responding to requests but the master is not [08:12] not sure how to diagnose it, it appears to be running but dnstop shows nothing [09:57] oli: Is your wget-warc compiled with gzip support? The mobileme files you're uploading have non-gzipped warcs. [10:00] wuh woh [10:22] alard: no idea, i just used the thing that came with the git package [10:22] just followed hte steps there :/ [10:23] You probably didn't apt-get install zlib1g-dev ? (One of the other 'essential' development tools. 
:) [10:24] i did yum groupinstall 'Development Tools' before [10:25] Perhaps try yum install zlib-devel ? [10:26] (I thought you were using apt-get.) [10:26] zlib-devel was not installed and i am installing it now [10:27] Ah, great. If you recompile wget-warc and swap it for the current version it should probably work. [10:27] It's a bit more efficient than uncompressed warcs. [10:27] you msg'd me that command for the seesaw s3 script [10:27] how many threads of that should i run? [10:28] 100mbit link [10:28] and i need to duplicate the directory after re-running the compile now for each thread, right? [10:28] Yes, but you can just copy or symlink the files. [10:29] A seesaw-s3 script is equal to one seesaw script, it's downloading one file at a time. So you can run a few. (Just keep in mind that each instance needs a bit of disk space.) [10:29] what like ln -s mobileme m1 then m2 etc ... ? [10:30] ls -alrt [10:30] No, the files themselves. ln -s wget-warc [10:30] er wrong window [10:31] Each seesaw-s3 should be running in a separate directory, because they can't share a data/ directory. [10:31] sorry i dont follow, im in a dir and have mobileme-grab, then inside it is the stuff i compiled [10:32] The easiest way is to copy the mobileme-grab dir. [10:32] how can i check if this new wget-warc has zlib installed, just recompiled it after doing the yum install zlib-devel and im guessing that's ok, but wouldn't hurt to be sure :p [10:32] And then you run one seesaw-s3 in each mobileme-grab-# directory. [10:33] ./wget-warc --warc-file=test http://www.archiveteam.org/ [10:33] yes ok, and how many threads you suggest [10:33] It should make a .warc.gz [10:33] 10? 20? [10:34] I think 10 is enough.
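The par2 commands mentioned a bit earlier ("par2 c -r10 filename filename") can be sketched roughly as below. This is a dry run with a hypothetical filename (disc.iso): it only prints the par2cmdline invocations instead of executing them, since par2 may not be installed everywhere.

```shell
#!/bin/sh
# Sketch of a par2cmdline workflow for protecting archived discs against
# bit rot. "disc.iso" is a hypothetical filename; the commands are
# printed (dry run), not executed.
f=disc.iso
echo "par2 create -r10 $f.par2 $f"   # create recovery data, ~10% redundancy
echo "par2 verify $f.par2"           # later: check the file for corruption
echo "par2 repair $f.par2"           # rebuild damaged blocks if verify fails
```

The `-r10` flag asks for roughly 10% redundancy; more redundancy survives heavier damage at the cost of larger recovery files ("c" is shorthand for "create" in par2cmdline).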
[10:34] lame, it didnt compress it: [oli@falcon m1]$ file test.warc [10:34] test.warc: WARC Archive version 1.0\015 [10:34] oh hangon [10:37] test.warc.gz: gzip compressed data, extra field, from Unix [10:37] much better :) [10:42] thx alard you are the man [10:46] oli: Thanks for adding the gzip. [10:47] If you haven't started copying the scripts: I'm working on a small update that will allow multiple seesaw-s3's to run in the same directory. [10:47] ive started it on one system but i will wait to do it on the other one till you have completed that small update, just let me know when its done please :) [10:54] oli: Okay, done. Here's how to run it: [10:54] 1. git pull to get the latest versions [10:55] 2. Run DATA_DIR=data-$N ./seesaw-repeat.sh $youralias $accesskey $secret , where the alias, accesskey and secret are what you already have, and $N is the number of the instance. [10:56] eg. DATA_DIR=data-1 ./seesaw-repeat.sh ... ; DATA_DIR=data-2 ./seesaw-repeat.sh .... [10:57] ok cool, internet connection is fucked here at the moment [14:27] "WikiTeam is the Archive Team subcommittee on wikis" [14:28] subcommittee, really? :-O [15:07] https://www.societyinforisk.org/content/sira-monthly-webinar-4122012-17-gmt12-est9-pst-caroline-wong-security-metrics-risk-and-compl [15:07] oops wrong channel :) [15:10] what is that [15:11] nothing worth clicking on, I assure you [15:11] clicked [15:11] doh! [15:11] who is that chinese womenz [15:19] > @brewster_kahle: RT @edwardbetts: The best value hard drive you can buy is now 3TB, up from 2TB (price data from Newegg). http://t.co/nrfdDi9R [15:19] keen-o [15:46] SketchCow: Didn't you link a picture with a truckload of portable harddisks a few days ago? [15:49] http://home.us.archive.org/~edward/unloading_truck/ [15:49] thanks, that's it.
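The DATA_DIR instructions above (one seesaw-s3 per data-$N directory) can be sketched as a small launcher loop. A dry run, with placeholder credentials (ALIAS/ACCESSKEY/SECRET are assumptions, and seesaw-repeat.sh is assumed to be in the current directory): it prints one command line per instance rather than launching anything.

```shell
#!/bin/sh
# Sketch: launch N seesaw-s3 instances, each with its own DATA_DIR
# (data-1, data-2, ...), per the instructions above. Dry run: the
# command lines are printed, not executed. Credentials are placeholders.
N=10                 # ~10 instances was the count suggested for a 100mbit link
ALIAS=youralias
for i in $(seq 1 "$N"); do
  echo "DATA_DIR=data-$i ./seesaw-repeat.sh $ALIAS ACCESSKEY SECRET &"
done
```

Replacing each `echo` with an actual backgrounded invocation would start the instances; each still needs its own chunk of free disk space, as noted above.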
[17:38] Emergency Download, who wants it: [17:38] Hi, [17:38] Some of you have probably heard this already, but Digiplay Initiative and [17:38] its valuable Bibliography of game studies (http://digiplay.info/search) will [17:38] be closing down any day now. Dr Jason Rutter who has been doing all the hard [17:38] work is moving on, he's not working for academia anymore and Jase has [17:38] announced that the Digiplay domain will soon be replaced by a link to his [17:38] new project (http://vintagetwists.co.uk/). The registered users can still [17:38] today download the database (of more than 3000 references) in a variety of [17:38] formats (BibTeX, Tagged, XML). [17:38] Since I am sure many of us will be missing this kind of interdisciplinary [17:38] source of publication data in our work, it would be great to see someone [17:38] else taking up the torch, and build a new initiative, preferably utilizing [17:38] the data from Digiplay. It has been possible for authors to add their work [17:44] it is only a bunch of links [17:44] the articles are on external sites [17:44] download this http://digiplay.info/node/447 from 1 to 9999 [17:45] it is probably in the wayback machine [17:50] Well, I didn't think it was difficult, I just have to sit here cleaning up mobileme uploads. [17:50] Also, emijrp, when you have time, I'd like suggestions of anti-spam tools we can re-add that are compatible with this version of mediawiki. [17:50] Right now new user adds are disabled, because we just had the dogs of rape loosed [17:52] "Modern dressing with vintage style" is the new project of Dr. Jason Rutter? [17:52] that sounds like some late april joke. [17:55] SketchCow: use this https://www.mediawiki.org/wiki/Extension:QuestyCaptcha enabling this in LocalSettings.php $wgCaptchaTriggers['createaccount'] = true; [17:58] Thank you, I'll do it, hopefully today [17:58] add a Cap.
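The node-by-node grab suggested above (http://digiplay.info/node/1 through node/9999) could be sketched as a simple loop. Shown here as a dry run that prints the wget commands; LAST is kept small for illustration and would be raised to 9999 for a real crawl.

```shell
#!/bin/sh
# Sketch of a brute-force crawl of digiplay.info node pages, as suggested
# above. Dry run: prints the wget commands instead of fetching anything.
LAST=3               # illustration only; the log suggests going up to 9999
for n in $(seq 1 "$LAST"); do
  echo "wget -nv -O node-$n.html http://digiplay.info/node/$n"
done
```

A real run would also want a polite delay between requests (e.g. `sleep 1` inside the loop) and retry handling for nodes that 404.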
Obvious question, bots are stupid for questions [18:00] SketchCow: I have to admit, some of that spam was so ridiculously absurd I'll kind of miss getting articles about "Potent Love Spells" or "Advantages of Having Additional Information About Jogos Da Polly Pocket" in my rss [18:01] Misty [18:01] Misty Mateo? [18:03] De Meo. For some reason I can't even remember it made sense to use the second part of my last name. [18:03] (I only assume it made sense at the time.) [18:03] Yes, yes, De Meo! [18:03] One of my favorite followers on twitter. [18:03] <3 Well thanks [18:03] I always know I've twittered something more generally clear if you retweet it [18:05] Haha [18:05] Hey, I meant to ask. What software do you use for analogue video capture? [18:06] Blackmagic's own software sucks p. bad, and Virtualdub was desyncing my audio. I can probably fix the issue in Virtualdub but figured I should check what else is worth using too. [18:09] Well, believe it or not, I use something a little more pedestrian - Live2USB [18:10] Oh, and it's got its own software? [18:10] yeah, very straightforward, blows it right into .ts files. [18:10] Now, that said, for BetaCAM SP I use a go-between M-Audio XLR to USB [18:10] And I have to use a second machine or do a second audio-only run [18:10] And this is for standard-def tapes [18:10] Also, I make sure the tapes are kept around after, I do not throw them out [18:11] So if something is much more critical, then someone can blackmagic 24-bit that shit [18:11] throwing_shit_away-- [18:12] Same when I saw the guy who was digitizing Byte magazine [18:12] And was doing it by ripping out pages, and then throwing them away [18:12] When I called him on it, he really literally was completely confused. [18:12] Yeah, I don't get it but that's been a deeply ingrained idea for ages. [18:13] cf. those libraries who microfilmed decades of historic newspapers and then threw the originals out. [18:14] I read about that and I feel pained. 
Actual physical pain! Yet someone weighed all the options and thought that was a good idea. Okay. [18:15] Oh, that's the stuff from Nicholson Baker [18:15] re: video, I'm doing all standard-def right now, but kind of an exotic source. I just *know* I'm going to utterly destroy frame timing on one of the Laseractive discs and have to do it over. But I guess it's better to actually run into the problems and solve them than try to make sure everything's perfect before I even do anything. [18:17] I should probably find out what MAME people do/did for arcade LDs. [18:17] aaron giles of the mame project has done work on proper laserdisc game archiving methods [18:17] efb [18:17] DFJustin: Will look into that, thanks. [18:18] Yes, I agree [18:18] Non-destructive shitter-than-perfect scanning is better than nothing [18:18] And then you store for later and try again [18:18] There are some especially crazy cases here, with things like multiple videos multiplexed together on even/odd scanlines. [18:19] you also need to capture overscan for some of the laserdisc stuff because they encode stuff there [18:20] Yeah, I capture the overscan area. But not so much of a problem here because Laseractive games have an actual, honest-to-goodness on-disc data track, so there wasn't a need to hide stuff in overscan. [18:20] if you wanna talk laseractive there are probably folks in #messdev who would be quite interested [18:21] Will do, thanks! [18:21] OK, I'm going to do it, I'm going to put the Bytes up. [18:21] Going to regret it. [18:21] There's a guy on a message board who seems very close to capturing the data track, too. [18:21] Or, I should say, it's not going to last. [18:21] But let's do it. [18:23] in before legal shitstorm [18:23] hi cow, misty [18:23] Hi winr4r [18:30] Won't be a shitstorm [18:30] It'll be a letter and a takedown [18:31] That's less exciting. 
[18:38] I need less exciting [18:51] SketchCow: yay europe, we have http://geizhals.at/eu/?cat=hde7s&sort=r which does sort by price per TB too [19:34] for my SD video conversions, i'm thinking of using a grass valley (formerly canopus) advc110 or advc300 and linux using dvgrab [19:36] preferring the 300 for the noise reduction and tbc features [19:40] http://archive.org/details/byte-magazine-1981-09&reCache=1 [19:40] it's coming along [19:41] It's 500 pages, so it's going to take a while to work through the machine. [19:41] They're so fucking huge. [19:44] mmm [19:45] grass valley no longer lists the 300 on their site :-\ [19:52] Coderjoe: DV isn't my delivery/storage format so not necessarily the best fit in my workflow. What's your use case, out of curiosity? [19:55] Holy crap, I do believe the contribution from Nemo_bis has arrived. [19:55] Jesus holy crap [19:59] SketchCow: what is it? [19:59] Hundreds of CD-ROMs, DVD-ROMs, floppies [20:01] oooh [20:01] btw may I PM you? [20:01] I got an email I want to share [20:01] WHO ASKS PERMISSION TO PM [20:01] it's nothing bad [20:01] I do [20:01] May I use this keyboard? [20:01] This one, this one in front of me [20:01] because some people get pissy if I pm them and not ask [20:01] Just checking, it is a keyboard, after all [20:01] Some people get pissy if you ask [20:01] i.e. you're screwed [20:01] ergo fuck them [20:01] ergo just do what you want [20:01] ergo be me [20:03] A guy wants to store a Gopher crawl into Stanford and the university replied with a document to sign. [20:04] hahaha [20:04] By the way, the conversation started in 2010, and the document to sign arrived some days ago. [20:04] Stanford rocks. [20:04] lol [20:06] http://article.gmane.org/gmane.network.gopher.general/3843 [20:06] The Gopher crawls are those 2 famous gopher torrents. [20:06] Saved at iBiblio and probably IA. [20:07] SketchCow, already?? :-O [20:08] emijrp: lol and sigh at the same time [20:08] Do you like my stuff?
:) [20:08] it's a lot of stuff. [20:08] Did it arrive in a good state? [20:08] Seems to have. [20:08] I tried to place things in a smart way. [20:08] Good. [20:08] I mean, the box got the usual mailing love [20:08] but it's essentially a brick of shit [20:09] Hard to break [20:09] lol [20:09] :D [20:09] http://archive.org/details/2007-gopher-mirror [20:09] http://archive.org/details/quux-gopher-mirror [20:11] Yes, I'm going to reply to that guy. [20:17] Hi John; [20:17] Don't sign any document you don't want to. [20:17] Thanks for your work crawling all that Gopher stuff. [20:17] Those Gopher crawls were uploaded to Internet Archive some time ago.[1][2] So, they will be preserved many many years. [20:17] Regards [20:21] in the other side of the interwebz http://www.ufodigest.com/article/wikipedia-considering-dropping-exopolitics-author-alfred-lambremont-webre [20:21] "exopolitics" is a fantastic word. [20:22] i prefer exolinguistics, there is an article at WP about that [20:23] But for HOW LONG [20:23] nobody knows [20:24] until some 13 year old looks at it, hasn't heard of it, quietly nominates it for deletion [20:24] mistym: lots of VHS and SVHS (including SVHS-ET) tapes that I would like to digitize and possibly upload parts of. though I also might try re-encoding into a lossless format and use that for long term storage. (the audio could definitely use it, as DV uses raw PCM. the video data could very likely expand) [20:26] Coderjoe: Video data would definitely expand in size. If you want lossless storage, I wouldn't go to DV as an intermediate since you've already introduced a lossy step. (Conversely, if you capture to DV, I don't know that you have much to gain by reencoding to lossless.) [20:27] i know dv isn't lossless [20:28] it just looked like the easiest means, at this time, to get the tapes digitized. I wasn't planning on discarding them, as nice as it would be to get the space back [20:28] Makes sense.
[20:28] shrug, video tapes conundrum [20:29] And there's a value in easy means. A tape that's digitized imperfectly is infinitely better than a tape that isn't digitized. [20:29] (I'd been doing stuff in the field of digital video since the late 90's) [20:30] mistym: true that [20:32] expansion is going to depend on the codec involved and the quality and type of footage. I was thinking I would try the lossless compression and if it didn't work, just keep the dv video. [20:35] I'd be curious to see, but my instinct tells me that DV's going to be smaller than even, say, highest-efficiency FFV1. I ought to check though, think I have some DV lying around. [20:35] i have other crazy ideas I would like to try out, like sampling the signals coming from the tape and reconstructing the video frame digitally, or sampling the ntsc signal into a waveform and reconstructing the frames programmatically. [20:35] Cool! [20:36] the latter to get away from having to tweak brightness/contrast/etc settings in hardware before capturing [20:36] I am pro-crazy for sure. [20:37] one of the x264/libav/ffmpeg developers has a patch for an ffv2, which looks pretty cool. I wish they would finish it and put it in the main codebase. [20:37] I played around with the patch on my own build [20:38] ffv1 is only just getting its documentation officially written up and took ages to solidify, so I guess it'll be a similarly long route. [20:38] Haven't heard too much about ffv2, what's it looking like? [20:40] neat, I've seeded 500GB+ of ED [20:41] ffv2 has per-plane block decisions for P or B, hpel motion vectors, and a variable length code system. plus for blocks with small enough differences per pel can pack the data quite well [20:41] iirc [20:41] I'm not completely sure I am remembering the per-plane part properly [20:42] er [20:42] and I mean I or P [20:43] hmm [20:44] I wish I had recorded uncompressed sizes for the sample clips.
I have a page comparing the two macroblock decision mode option values for several clips [20:45] to point out what appeared to be a bug with the decision process. (the "bits" mode was supposed to see which coding came out smaller and go with it, but it didn't come out with a smaller file in most cases) [20:45] http://wegetsignal.org/tmp/ffv2compare.php [20:46] Thanks! [20:49] here's a graph Dark Shikari posted on doom9 back in feb 2009 : http://i39.tinypic.com/2uojolv.png [20:50] not that it really helps much [20:50] and I can't find the patch location at the moment [20:51] though I have a copy of the patch from dec 2010 [22:15] got the last 2 call for help episodes uploaded to archive.org [22:31] Know what's awesome? Sliding your computer out to get to the USB ports in back and accidentally mashing the power button :| [22:32] know what's better than that? a computer that randomly powers off when you plug something in one of the USB ports [22:34] Haven't found one of those yet. But as old as this thing is you may have just predicted the future. [22:56] we should copy our stuff to usenet [22:56] :> [23:28] underscor: would you be able to get a high enough retention period? [23:33] interesting point I just noticed on that ssd price-per-tb page edward betts made: the best deal is for an add-on card rather than a sata-interface drive [23:37] hell, it isn't an SSD, but a hybrid rust/ssd [23:45] dashcloud: supernews has ~1300 days [23:46] so you just have to reup every 3 years [23:54] giga claims 1345 days of bin (groups) and 8.5 years of text (groups) [23:55] wow [23:55] greetings all to the newest VM runner :) [23:56] or maybe better called Virtual Appliance?