[01:09] just noticed that youtube-dl can actually grab whole blip.tv channels
[01:09] though I still need to patch it to get the HD versions
[01:13] how can you get it to download whole channels?
[01:39] Famicoman: just give it a blip.tv/username
[01:39] at least it works with http://blip.tv/linuxconfau
[01:40] that does work, thanks
[01:46] I'm not sure how many of the sites listed in this book are still around, but there could be some good things in the links there: https://archive.org/details/cdrom-internetgamesdirectorycd
[01:50] here's a useful site that tracks (and keeps copies of) youtube videos that get flagged or taken down: http://youtomb.mit.edu/
[01:52] neat, do they get the advance drop or something?
[01:53] oh, hm
[01:53] here's the about page: http://youtomb.mit.edu/about
[01:54] iirc, IA sucks in copies of youtube videos mentioned on twitter's abbreviated firehose
[01:56] so I was wrong about it keeping copies of the videos - it doesn't
[01:57] aye
[02:28] I am curious if they still do
[02:29] and just don't publicize it for obvious reasons
[02:41] mostly dicks
[02:54] there are 3.5 billion dicks in the world
[14:05] hah, my rsync to SketchCow's box is still going
[18:52] hmm... I wonder if I should pack up this old mirror of ftp.cavedog.com
[18:52] cavedog was a game developer, behind the Total Annihilation series. the hostname no longer resolves.
[19:57] I want to maintain a mirror of all of github, but I don't think I have the disk space, and it isn't really suitable for a one-time archiving job
[19:57] anyone have 10TB+ available?
[19:58] the main reason to do it is that people rm their repos all the time
[19:58] Even by today's standards, 10 TB is just damn huge
[19:59] 10T is big, but not huge
[19:59] ;)
[19:59] well, for you maybe :)
[19:59] I have about 60000 repos already
[20:00] seems like the kind of thing that could really benefit from a compressing filesystem
[20:00] git repos are already pretty compressed
[20:01] git packs are already very compressed
[20:01] dedup might be useful
[20:01] ninja'd
[20:01] wait who was first? :P
[20:01] balrog was
[20:01] you can actually share objects/ between repos, but I haven't tried it
[20:01] I think at that point you never gc
[20:02] I was going to say something but then I didn't
[21:02] norbert79: 10TB is actually just 3.3 hard drives, without redundancy
[21:02] that's HYUUGE
[21:02] which is like $500
[21:10] ersi: Over there maybe, it's a bit more expensive around my place
[21:12] oops, I was calculating with 2TB disks, so that's too low actually
[23:13] ivan`: there is some code out there that will allow you to put all the forks into one .git folder
[23:13] i would think that would save you some space
[23:14] hm, that would
[23:14] git-gc would take forEVer though
[23:15] there is stuff like this: https://github.com/joeyh/github-backup
[23:15] joeyh == closure ;)
[23:15] cool
[23:17] i wish he'd made it in bash, python, or perl
[23:17] * chronomex shrugs
[23:18] maybe you should learn haskell ;)
[23:18] haskell needs itself just to compile haskell
[23:19] woop woop woop off-topic siren
[23:22] http://git-annex.branchable.com/
[23:22] you guys seen that project?
[23:22] git-annex is my lord and saviour
[23:23] :)
[23:25] closure really loves his haskell! ;D
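
A minimal sketch of the whole-channel grab from [01:09]-[01:40], using youtube-dl's Python API. The channel URL is the one from the chat; the format and output-template options are illustrative assumptions standing in for the HD patch mentioned at [01:09].

    # Sketch: mirror every video in a blip.tv channel via youtube-dl's
    # Python API. Options here are illustrative, not the patch from the chat.
    import youtube_dl  # pip install youtube-dl

    options = {
        'format': 'best',      # ask for the highest-quality format on offer
        'outtmpl': '%(uploader)s/%(title)s-%(id)s.%(ext)s',
        'ignoreerrors': True,  # don't abort the channel if one video fails
    }

    ydl = youtube_dl.YoutubeDL(options)
    # youtube-dl treats a channel page as a playlist and expands it
    # into the individual videos
    ydl.download(['http://blip.tv/linuxconfau'])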
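
The objects/ sharing mentioned at [20:01] is git's alternates mechanism. A sketch, with hypothetical paths and a hypothetical fork URL, of cloning a fork so it borrows objects from an existing local clone instead of storing its own copies:

    # Sketch of sharing objects/ between repos ([20:01]) via git alternates.
    # Repo paths and the fork URL are hypothetical.
    import subprocess

    base = '/archive/github/project.git'               # existing local clone
    fork = 'https://github.com/some-user/project.git'  # fork of the same project

    # --reference records the base repo in objects/info/alternates, so the
    # new clone only stores objects the base doesn't already have
    subprocess.check_call([
        'git', 'clone', '--mirror', '--reference', base, fork,
        '/archive/github/some-user-project.git',
    ])

The caveat at [20:01] is real: a repo that borrows objects this way can break if the base repo is gc'd or deleted out from under it, so in practice you stop gc'ing the base.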
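
Working out the drive math from [21:02]-[21:12]: 10TB is 3.3 drives only if the drives are 3TB, which is where the correction comes from. The per-drive prices below are rough assumptions in line with the chat's ~$500 figure.

    # Drive math from [21:02]-[21:12]; per-drive prices are rough assumptions.
    import math

    target_tb = 10
    for drive_tb, price in [(3, 150), (2, 100)]:
        drives = math.ceil(target_tb / drive_tb)
        print('%dTB drives: %d needed, ~$%d' % (drive_tb, drives, drives * price))

    # 3TB drives: 4 needed, ~$600   (10/3 = 3.3, the figure in the chat)
    # 2TB drives: 5 needed, ~$500   (hence "too low" at [21:12])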
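
One common way to get "all the forks into one .git folder" ([23:13]) is to fetch every fork into a single repository as a separate remote, so the history the forks share is stored only once. A sketch with hypothetical fork names and URLs:

    # Sketch of consolidating forks into one repo ([23:13]): each fork
    # becomes a remote, and objects shared between forks are stored once.
    import subprocess

    repo = '/archive/github/project-all-forks'
    subprocess.check_call(['git', 'init', '--bare', repo])

    forks = [
        ('upstream', 'https://github.com/upstream/project.git'),
        ('some-user', 'https://github.com/some-user/project.git'),
    ]

    for name, url in forks:
        # each fork's branches end up under refs/remotes/<name>/
        subprocess.check_call(['git', 'remote', 'add', name, url], cwd=repo)
        subprocess.check_call(['git', 'fetch', name], cwd=repo)

As [23:14] warns, running git-gc over a repository holding thousands of forks' objects takes a very long time.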