[06:24] im finishing downloading the wikitravel wikis
[06:24] wikitravel fails to export entire histories, so im doing a --curonly dump
[06:32] aww
[13:21] 97 wikis at WikiTeam.
[18:54] shoutwiki: 136 wikis completed, 1023 MiB downloaded
[19:11] nice
[19:13] share your scripts
[19:15] they're just shell scripts
[19:16] (and I don't know anything about shell scripts)
[19:16] i haz a bash interpreter too
[19:16] like http://p.defau.lt/?OoyUMOe2_WF0sgY5lPg_6g
[19:17] --logs doesn't work, it's a dummy option
[19:17] yes I noticed, but it doesn't harm :-p
[19:17] a TODO feature (download the list of deleted pages, page moves, and other Special:Log entries)
[19:18] I don't know how to append --path=foo to those commands easily :-/
[19:20] emijrp, there are still several dumps you didn't upload in https://docs.google.com/leaf?id=0By9Ct0yopDdVNzExNzIxMWQtN2Q5Ny00NzQzLTgyOWQtMTdkZjcwNDNhY2E0&sort=moddate&desc=true&layout=list&num=50 https://docs.google.com/leaf?id=0B0bTq2pGEiCoNTBiOTE5ZDgtZjEwMS00MzVhLTliOWItNWNkNGVkMTQ4MjY2&sort=moddate&desc=true&layout=list&num=50
[19:20] yes, i have a big backlog
[19:20] Are we near the disk quota?
[19:20] it's in my todo list
[19:20] no
[19:21] good
[19:21] we have 4 GB; not sure how much is used, but no more than 500 MB
[19:21] ok.
[19:21] Well, in the worst case I'll have to create some more Google accounts :-D
[19:22] by the way, i asked the Google Code team to raise the max upload file size and quota; no reply so far
[19:22] i guess they are doing circles in Google+
[19:22] lol
[19:23] Or it's just that your request is a bit unusual.
[19:23] I don't know how many projects use the download section that way.
[19:23] (i.e. to store *output* of their code)
[19:23] Google Code projects have a link to request more quota, but not to modify the max file size
[19:23] :-/
[19:24] Oh, perhaps you could ask sumanah
[19:24] i think they have a backlog too
[19:24] I guess she'll be in contact with them, due to GSoC
[19:25] uh, *he
[19:25] Projects needing more download space may request more quota. http://code.google.com/p/support/issues/entry?summary=Download%20quota%20increase&comment=Project%20location:%20http://code.google.com/p/wikiteam/%0dRequested%20new%20quota:%20%0dReason:%20%0d&labels=Type-Task,Component-Downloads
[19:26] oh, they replied
[19:26] http://code.google.com/p/support/issues/detail?id=5553&can=1&q=wikiteam&colspec=ID%20Type%20Status%20Milestone%20Priority%20Stars%20Owner%20Summary
[19:26] "We're happy to continue hosting your project, but I don't think we will raise your quotas for this purpose."
[19:26] no, I was right, *she
[19:26] :-(
[19:27] Creating an item for each small wiki could be overkill
[19:27] Although, if done in batches, not so much, thanks to the super-sweet Perl bulk uploader
[19:29] IA items also have the benefit that you can upload more recent dumps to the same item, and it's easy to find them
[19:31] Moreover, you can handle metadata better
[20:00] so, where can we host the dumps?
[20:00] the Internet Archive is a bit chaotic
[20:00] i liked the Google Code download section because it's all well sorted
[20:25] why chaotic?
[21:02] Nemo_bis: do you know how to create a collection?
[21:03] emijrp, on archive.org?
[21:03] i think only admins can
[21:04] you can ask SketchCow, he mentioned he was going to create one
[21:07] for wikiteam?
[21:25] emijrp, no, for another thing
[21:25] just to say that he can
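The batch-dumping and "append --path=foo to those commands" problem discussed above can be sketched as a small shell loop. This is a hedged illustration, not the actual script from the pastebin: `--api`, `--xml`, `--curonly`, and `--path` are real dumpgenerator.py options from WikiTeam, but the wiki URLs, the `dumps/` directory naming, and the wrapping loop here are assumptions for the example (the `echo` makes it a dry run that only prints the commands).

```shell
#!/bin/sh
# Sketch: generate one dumpgenerator.py command per wiki, appending a
# per-wiki --path so each current-revisions-only dump lands in its own
# directory. The URLs below are illustrative examples.
for api in \
    http://wikitravel.org/wiki/en/api.php \
    http://www.shoutwiki.com/w/api.php
do
    # Derive a directory name from the host part of the API URL.
    dir=$(echo "$api" | sed 's|https\?://||; s|/.*$||')
    # Dry run: print the command instead of executing it.
    echo python dumpgenerator.py --api="$api" --xml --curonly --path="dumps/$dir"
done
```

Dropping the `echo` would actually run the dumps; keeping it lets you inspect the generated commands (or pipe them to `sh`) first.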