[00:11] Where's my hug!
[00:12] * arrith looks in couch cushions
[00:24] http://imgur.com/bestof2011
[00:39] SketchCow: is there any kind of official policy or procedure for contacting sites about database dumps for Archive Team? as in, are there any people that do it, or just anyone? or should it at least be run by someone like you first?
[00:41] no official policy that I know of. I've contacted owners a few times, with results ranging from "Wow, this is awesome, I am flattered!" to "fuck you fuck you fuck you"
[00:41] don't ever claim you represent archive.org or Jason Scott
[00:42] yeah, for sure. but what about claiming to be kinda part of Archive Team? i mean, AT is quite loose afaik
[00:43] sure, why not
[00:43] I think that is ok
[00:43] when I emailed AO3 I said that I was "a volunteer with" AT
[00:43] they of course refused to provide dumps, but I liked that phrase
[00:43] it'd be a good idea to run such letters by several other people, at least one or two of whom have been around for a while
[00:44] chronomex: is there a list of people like that besides the ops in this channel?
[00:45] not really
[00:45] ah, alright
[00:45] so run ideas and letters by ops
[00:47] if a database dump of a site is acquired, is there an Archive Team server for such things, or should the person who has the dump just try to hold onto it, and maybe mention it on the wiki, or make a torrent or something?
[00:47] what is the goal?
[00:48] well, for fanfiction.net and reddit.com, to maintain a live mirror. if that isn't possible, then to maintain an offline mirror.
[00:50] i'm thinking like the reocities-type sites. but if a torrent is the best one can do, then i'll take the best there is
[00:52] first of all, there is no way that reddit or fanfiction are going to be in any way okay with you hosting a mirror of their site
[00:53] archiveteam doesn't exactly have a server, but there is some hardware at archive.org that SketchCow uses to ingest the stuff we grab. If you download something and Jason wants it, you'll most likely be uploading it there
[00:54] alright
[01:45] Archiveteam has a tank
[01:46] archivetank?
[01:46] You can be part of Archive Team in terms of asking.
[01:51] neat :)
[01:51] * arrith pins honorary badge on self
[02:00] SketchCow: i'm making something called slitaz-tank
[02:00] it's an archive of Linux sources that can maybe rebuild itself without the internet
[02:01] or very little of it
[02:57] godane: to what degree?
[02:58] godane: as in, does it build binaries for a distro, or is it a git repo of the kernel?
[03:28] arrith: it builds binaries in dependency order
[03:28] it also has packages in the iso too
[03:29] i recompressed the source tarballs with .tar.lzma and compressed the PNGs with optipng
[03:45] Hey, everyone.
[03:46] Pleased to say... the Google Groups upload of pages and files begins.
[03:48] I started at the beginning: kz
[03:48] http://www.archive.org/details/archiveteam-googlegroups-kz
[04:04] woo!
[04:04] godane: what distro?
[04:10] my own
[04:10] called slitaz-tank
[04:10] based on the SliTaz project
[04:10] ah
[04:11] the full iso is like 8gb
[04:11] well, if you have the sources for any distro, you can rebuild it without the internet
[04:12] yes, but sometimes it's not as clear how to do it
[04:12] godane: a side project might be to put together documentation on how to do it for major distros
[04:13] i also mirror the slitaz sites too
[04:13] i even fit xkcd and linuxgazette
[04:14] that's good
[04:53] Oh yeah, this is going to be NUTS.
[04:53] NUTS.
[04:53] what is NUTS?
[04:53] Now Uploading This Stuff?
[04:54] Blowing Google Groups into archive.org.
[04:54] ooooh
[04:54] Delicious.
[04:54] http://www.archive.org/details/archiveteam-googlegroups-00&reCache=1
[04:54] What's the derive process for tarballs?
[04:54] my posts from 1991 will live FOREVER
[04:54] Or... zips, it looks like?
[04:54] Is there one?
[04:55] Not posts.
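Building "binaries in dependency order", as godane describes for slitaz-tank, amounts to a topological sort over the package dependency graph. A minimal sketch in Python; the package names and the dependency graph are hypothetical, purely for illustration:

```python
# Minimal sketch: compute a package build order such that every package
# is built after all of its dependencies (depth-first topological sort).
# The package names below are hypothetical examples, not slitaz-tank's
# actual package set.

def build_order(deps):
    """deps: dict mapping package -> list of packages it depends on.
    Returns a build order; raises ValueError on a dependency cycle."""
    order, done, visiting = [], set(), set()

    def visit(pkg):
        if pkg in done:
            return
        if pkg in visiting:
            raise ValueError("dependency cycle at " + pkg)
        visiting.add(pkg)
        for dep in deps.get(pkg, []):
            visit(dep)
        visiting.discard(pkg)
        done.add(pkg)
        order.append(pkg)

    for pkg in deps:
        visit(pkg)
    return order

# Hypothetical graph: gcc needs binutils, binutils needs glibc
deps = {"gcc": ["binutils"], "binutils": ["glibc"], "glibc": []}
print(build_order(deps))  # → ['glibc', 'binutils', 'gcc']
```

Any order where each package follows its dependencies would do; depth-first post-order is just the simplest way to get one.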
[04:55] These are just the page files, and the file collections, all of them destroyed by Google this year.
[04:55] oh
[04:55] wow
[04:56] On September 22, 2010, Google announced plans to turn off the group pages, suggesting users move their content to Google Docs or Google Sites. Starting in November 2010, the group pages became read-only (allowing only viewing/downloading of existing content), and in February 2011 they were turned off completely.[16]
[04:56] sounds sizeable
[04:58] I've got scripts, calling scripts, calling scripts.
[04:58] It's just going to keep running.
[04:58] I am worried about that problematic buffer thing.
[04:59] But, like http://www.archive.org/details/archiveteam-googlegroups-00 - that's DONE.
[04:59] can it resume?
[04:59] Yeah, it can resume fine.
[05:01] you've done a man's job... too bad she won't live (Blade Runner)
[05:02] 985383
[05:02] root@teamarchive-0:/3/googlegroups# find . -name \*.zip | wc -l
[05:02] 985,000 individual sets of files (many groups have a pages.zip and a files.zip)
[05:02] That's before the second and third waves of Google Groups uploads.
[05:03] impressive
[05:03] then again, this must be old hat for you now
[05:04] Just have to step carefully.
[05:04] But then, yeah, I have a program called Groupgrope that makes a collection, then assembles the file, then shoves them into grouphug, which uploads the individual files into the collection and slaps it on the ass to derive.
[05:05] Haha, script naming after my own heart!
[05:05] hehe
[05:06] The main limit is that you can't have more than 1000 items in a given directory, which means I need to create special cases.
[05:06] But a minor thing I can work around.
[05:08] that's an odd limit
[05:09] It's related to archive.org and not any filesystem.
[05:10] 2242164919 2011-12-04 23:56 ggroups_zipdl-wyatt.tgz
[05:10] See? I have that of yours to integrate too.
[05:11] SketchCow, is that first number a file number or unix time or something?
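The 1000-items-per-directory limit mentioned above can be handled by sharding group names into two-letter prefix buckets (matching item names like archiveteam-googlegroups-kz seen in this log) and flagging any bucket that overflows for special-casing. A rough Python sketch; the bucket naming follows the item URLs in the log, but the overflow handling and example group names are assumptions:

```python
from collections import defaultdict

# Rough sketch: shard group names into two-letter prefix buckets, then
# report any bucket exceeding the ~1000-item directory limit so it can
# be split as a special case. The limit and the bucket naming come from
# this log; the group names below are made up for illustration.

LIMIT = 1000

def shard(groups):
    buckets = defaultdict(list)
    for name in groups:
        buckets["archiveteam-googlegroups-" + name[:2].lower()].append(name)
    oversized = [b for b, names in buckets.items() if len(names) > LIMIT]
    return dict(buckets), oversized

buckets, oversized = shard(["kzinti-fans", "kzoo-cyclists", "whinybitches"])
print(sorted(buckets))
# → ['archiveteam-googlegroups-kz', 'archiveteam-googlegroups-wh']
```

This is also what makes lookup trivial for users: a group's files are always under the bucket named by its first two letters.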
[05:12] size
[05:12] ah
[05:12] then date/time
[05:12] Penis length
[05:12] in lightyears
[05:12] lol
[05:12] * SketchCow thrusts into the Horsehead Nebula
[05:12] haha
[05:12] take it horsey, take it
[05:12] ha
[05:13] It'll be centuries before you hear the pitiful whinny
[05:13] in space no one can hear you upload all of google groups
[05:14] imagine how long it takes to reel it all in when flaccid
[05:14] It finished 02!!
[05:14] http://www.archive.org/details/archiveteam-googlegroups-02
[05:15] http://www.archive.org/details/archiveteam-googlegroups-03
[05:15] Also just finished. It's gearing up for 04.
[05:15] And so it will go.
[05:15] So other than making sure it doesn't go into conniptions, which it will eventually, this is what I'll be having it do for probably two weeks.
[05:19] 05.
[05:19] Enough live updating. The summary is: this is now happening.
[05:23] huzzah
[05:24] http://www.techdirt.com/articles/20111229/00243317220/as-godaddy-deals-with-sopa-fallout-hollywood-wants-to-punish-godaddy-enabling-infringement.shtml
[05:25] SketchCow: could script that for an #at-status channel
[05:30] SketchCow: why are the google groups made up of lots of tiny zip files? wouldn't it make more sense to bundle them up until they were significant sizes, maybe 50mb at least
[05:33] what's the benefit?
[05:34] how big is this torrent of url shorteners
[05:34] I want someone who wants "files from group whinybitches" to know they just need to go down to archiveteam-googlegroups-wh and it'll be there.
[05:34] chronomex: fewer files, shorter, etc
[05:35] bsmith093: I don't see the benefit there :P
[05:35] Otherwise I might as well make one massive-ass tar.bz2
[05:35] Which I might
[05:36] well, ok then. that's what I would do, is all I'm saying :d
[05:36] and how big would that tar actually be?
[05:36] But first, I don't want our stuff to keep following the trend of "12 people on the planet would suffer the pain to extract what they need", especially with something like this, where millions of people come at it from different angles.
[05:37] That tar is at least half a terabyte, at least.
[05:37] oy, well ok then, i hadn't realized it was so freakin huge, wow, that's rather impressive
[05:38] Google. Groups.
[05:38] You expected a USB key?
[05:39] hey, random thought: you know WikiTaxi, and how it takes the pages-articles bz2 file from a wiki dump and turns it into a taxi file, which is then basically portable?
[05:39] I've got the latest dump all taxied up; want a copy?
[05:40] it's faster than making another, and this way they (WikiTaxi users) don't have to do it
[05:45] So SketchCow, what are your plans for MAGFest? Is it just a field test for your new gear, or do you have some interviews lined up?
[05:55] Just field test, capture interviews as I can
[05:56] Cool.
[05:56] A lot of musicians I respect end up going there, so I was curious.
[06:49] SketchCow: geocities should be sorted like google groups
[06:50] mostly because you can get at a site without extracting the full geocities backup
[10:06] But why didn't Google just automatically move all those files to new Google Sites...? (Which is what I did by hand with multiple groups.)
[15:42] http://i.imgur.com/nYanb.jpg
[15:52] Soojin, solution here: fireworks forbidden to reduce smog (Milan and other big cities in Italy)
[15:52] :)
[16:19] Nemo_bis: http://kuvaton.com/kuvei/chris_and_sun_comic.jpg
[16:31] :D
[16:31] I guess that's chronomex after moving, to upload Splinder data
[16:51] the stick guy would need a computer, rather than two halves of a shirt box on top of some other empty furniture box acting like a desk
[17:12] Brewster stepped in, I'm doing the google groups slightly differently.
[17:26] he even restarted a derive of mine today
[19:45] Archive.org is looking into making public 798 infomercials.
[19:45] Ranging from an hour to multiple hours.
[19:46] whoa
[19:56] wtf? http://www.us.archive.org/log_show.php?task_id=92391694 22 hours of deriving to derive nothing?
[19:56] hm, JPEG Thumb
[19:58] and 14 hours of crawling
[20:56] Splinder under maintenance
[21:04] Nemo_bis: is today the day?
[21:06] no, still a month, but not available now
[21:45] bsmith093: Really? the file is named "applediskimages", no extension?
[21:45] it's a zip
[21:45] Yeah, just sussed
[21:46] sorry for the possibly horrible organization, wasn't me.
[21:49] SketchCow, the new format for Google Groups is tidy, but what's the size limit of the zip before zipview.php fails?
[21:51] Not clear
[21:52] It handled a 3gb fine.
[21:59] I think the limit is not much more, perhaps 5 or 7 GB
[22:04] 5 works http://ia600506.us.archive.org/tarview.php?tar=/31/items/wiki.guildwars.com/wikiguildwarscom-20110717-images.tar
[22:04] (but that's tar)
[22:06] 19 definitely doesn't :) http://ia700508.us.archive.org/tarview.php?tar=/29/items/Infictive.com/infictivecom-20110712-images.tar&file=infictivecom-20110712-images/Ztar.jpg
[22:55] I must say that the fireworks prohibition is not being respected very much here.
[23:55] interesting observation: a commercial music video made up entirely of material taken from archive.org (with the exception of the dude singing) https://www.youtube.com/watch?v=fK0_PVaF8Pg
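Given the observations above (a 3 GB zip and a 5 GB tar render fine in zipview.php/tarview.php, a 19 GB tar does not), one practical takeaway is to cap bundle size well under the failure point when packing files into archives. A first-fit packing sketch in Python; the 4 GB cap is an assumed safety margin, not a documented archive.org limit, and the filenames are made up:

```python
# Sketch: pack files into volumes that each stay under a size cap, so
# web viewers (which handled ~3-5 GB but choked on 19 GB in this log)
# can still read each volume. The 4 GB cap is an assumption; sizes and
# filenames below are illustrative.

CAP = 4 * 1024**3  # bytes

def pack(files):
    """files: list of (name, size) pairs -> list of volumes (name lists).
    Greedy first-fit in input order; a single oversized file still gets
    its own volume."""
    volumes, current, used = [], [], 0
    for name, size in files:
        if current and used + size > CAP:
            volumes.append(current)
            current, used = [], 0
        current.append(name)
        used += size
    if current:
        volumes.append(current)
    return volumes

files = [("a.zip", 3 * 1024**3), ("b.zip", 2 * 1024**3), ("c.zip", 1024**3)]
print(pack(files))  # → [['a.zip'], ['b.zip', 'c.zip']]
```

Greedy packing in input order keeps related files (which tend to sort together) in the same volume, at the cost of slightly more volumes than optimal bin packing.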