[00:59] Nemo_bis: Good. 2GB so far
[00:59] only 4 finished wikis though
[00:59] geez these take a while
[00:59] lots of "Server is slow... Waiting some seconds and retrying..."
[00:59] :(
[01:12] underscor: you need many instances at a time
[01:13] some hundreds, probably
[01:18] yeah, I'm spinning up a bunch
[05:01] Nemo_bis: uploader.py isn't doing anything
[05:01] do you know why it would do that?
[05:01] http://p.defau.lt/?0vmgCtwo8derfr3oPU8zDQ
[05:02] same output even if I run it multiple times
[05:03] oh, maybe my mistake
[05:03] one second
[06:26] underscor: ok?
[06:27] seems it is
[06:30] yeah
[06:30] my fault
[06:31] fixed it
[06:31] some of these wikis are dead though
[06:31] do I need to mark that somewhere, or just ignore it and move on?
[06:33] ^ Nemo_bis
[06:54] underscor: just move on
[06:54] not some, most of them
[06:54] yeah
[06:54] :(
[06:54] okay, will do
[06:54] there are lots of DB errors and other broken configs
[06:55] yeah
[06:56] underscor: when you're done with a batch, it's useful to run launcher.py over it again (keeping all the 7z files in the dir) and paste the output like this: http://p.defau.lt/?xFZdHVoVpJBhI5fA5icN3g
[06:56] so that we can see the errors for the wikis which have not been downloaded
[06:57] note that you can't leave launcher.py completely alone; quite often it will get stuck
[06:57] yeah
[07:04] Nemo_bis: is p.defau.lt your site?
[07:14] underscor: no
[07:14] oh okay
[07:14] I just always see you use it :)
[08:29] Nemo_bis: does this actually download the media too?
[08:29] seems like not a lot of data going through the pipe
[08:48] hi emijrp :)
[08:51] hi underscor
[10:49] 12:40 It was a real heavy weight taken off my shoulders when WikiTeam took the action of backing up http://develop.consumerium.org/wiki/ onto archive.org .. before that it was me and my neck on the line should backups fail and catastrophic wiki data loss occur
[10:49] 12:40 that was 1.5 months ago
[10:49] 12:42 for over 10 years I needed to worry about a looming data loss catastrophe, should the server hall and my flat and my friend's flat all burn down simultaneously
[10:50] from #wikimedia-tech on freenode
[11:21] A little dramatic for such a small wiki
[11:40] 1000+ pages
[11:47] Is it? I couldn't determine that from the page listings.
[11:47] It looked to be 200 at most.
[11:53] he's quite a dramatic guy, yes
[11:54] and newbies too
[11:54] *newbie-ish
[18:10] wow, he's more fanatical about data loss than even I am
[18:50] Nemo_bis: you can claim my list in the task force
[18:51] upload bandwidth in Spain is shitty
[18:51] and I can't upload a damn thing
[19:55] 2500 wikis...
[19:56] using a script of 1000 lines
[19:56] that is a performance of 2.5 wikis/line
[19:56] and growing, go go go
[21:35] does this actually download the media too?
[21:35] seems like not a lot of data going through the pipe
[22:36] underscor: yes it does; most wikis don't have many files
[22:36] except some I already downloaded, of course :/
[22:38] underscor: like this one, which you promised to download for me a while ago: http://archive.org/details/wiki-citywiki.ugr.es
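Nemo_bis's warning that launcher.py can't be left completely alone because it often gets stuck suggests wrapping the re-run in a watchdog that kills and retries a hung process. A minimal sketch of that idea in Python; the function name, timeout values, and the stand-in command are assumptions for illustration, not anything from the log or the WikiTeam tools:

```python
# Hypothetical watchdog sketch: run a command (e.g. a launcher.py batch
# re-run) with a timeout, killing and retrying it if it hangs.
import subprocess


def run_with_retries(cmd, timeout=3600, retries=3):
    """Run cmd, killing and retrying if it exceeds timeout seconds.

    Returns the CompletedProcess on success, or None if every
    attempt timed out.
    """
    for _ in range(retries):
        try:
            # subprocess.run kills the child itself when the timeout expires
            return subprocess.run(cmd, timeout=timeout,
                                  capture_output=True, text=True)
        except subprocess.TimeoutExpired:
            continue  # process hung; try again
    return None


# Example with a harmless command standing in for a launcher.py invocation:
result = run_with_retries(["echo", "done"], timeout=10)
print(result.stdout.strip())  # -> done
```

In practice one would point `cmd` at the real launcher.py invocation and grep the captured output for the per-wiki errors Nemo_bis asks to be pasted.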