#wikiteam 2012-08-09, Thu


Time Nickname Message
00:59 🔗 underscor Nemo_bis: Good. 2GB so far
00:59 🔗 underscor only 4 finished wikis though
00:59 🔗 underscor geez these take a while
00:59 🔗 underscor lots of "Server is slow... Waiting some seconds and retrying..."
00:59 🔗 underscor :(
01:12 🔗 Nemo_bis underscor: you need many instances at a time
01:13 🔗 Nemo_bis some hundreds probably
01:18 🔗 underscor yeah, I'm spinning up a bunch
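
How to spin up "some hundreds" of instances isn't shown in the log; here is a minimal sketch in Python, assuming wikiteam's dumpgenerator.py sits in the current directory and a hypothetical wikis.txt lists one api.php URL per line — the file name and worker count are illustrative, not from the log:

    # Spawn many dumpgenerator.py instances at once; wikis.txt and the
    # worker count are assumptions, not taken from the log.
    import subprocess
    from concurrent.futures import ThreadPoolExecutor

    def dump(api_url):
        # One instance per wiki: full XML history plus media files.
        subprocess.call(["python", "dumpgenerator.py",
                         "--api=" + api_url, "--xml", "--images"])

    with open("wikis.txt") as f:
        urls = [line.strip() for line in f if line.strip()]

    # "some hundreds" of concurrent instances, as suggested above
    with ThreadPoolExecutor(max_workers=200) as pool:
        list(pool.map(dump, urls))
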
05:01 🔗 underscor Nemo_bis: Uploader.py isn't doing anything
05:01 🔗 underscor do you know why it would do that?
05:01 🔗 underscor http://p.defau.lt/?0vmgCtwo8derfr3oPU8zDQ
05:02 🔗 underscor same output even if I run it multiple times
05:03 🔗 underscor oh, maybe my mistake
05:03 🔗 underscor one second
06:26 🔗 Nemo_bis underscor: ok?
06:27 🔗 Nemo_bis seems it is
06:30 🔗 underscor yeah
06:30 🔗 underscor my fault
06:31 🔗 underscor fixed it
06:31 🔗 underscor some of these wikis are dead though
06:31 🔗 underscor do I need to mark that somewhere or just ignore it and move on?
06:33 🔗 underscor ^ Nemo_bis
06:54 🔗 Nemo_bis underscor: just move on
06:54 🔗 Nemo_bis not some, most of them
06:54 🔗 underscor yeah
06:54 🔗 underscor :(
06:54 🔗 underscor okay, will do
06:54 🔗 Nemo_bis there's lots of DB errors and other broken configs
06:55 🔗 underscor yeah
06:56 🔗 Nemo_bis underscor: when you're done with a batch, it's useful to run launcher.py over it again (keeping all 7z on the dir) and paste the output like this: http://p.defau.lt/?xFZdHVoVpJBhI5fA5icN3g
06:56 🔗 Nemo_bis so that we can see what the errors are for the wikis that haven't been downloaded
06:57 🔗 Nemo_bis note that you can't leave launcher.py completely unattended; quite often it will get stuck
06:57 🔗 underscor yeah
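
A minimal sketch of that re-check pass, assuming launcher.py accepts a list file as its argument; the file names here are illustrative:

    # Re-run launcher.py over a finished batch. Keeping the existing .7z
    # files in the directory lets completed wikis be skipped, so the
    # captured output shows only the errors for wikis that failed.
    import subprocess

    with open("relaunch.log", "w") as log:
        subprocess.call(["python", "launcher.py", "wikis.txt"],
                        stdout=log, stderr=subprocess.STDOUT)
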
07:04 🔗 underscor Nemo_bis: Is p.defau.lt your site?
07:14 🔗 Nemo_bis underscor: no
07:14 🔗 underscor oh okay
07:14 🔗 underscor I just always see you use it :)
08:29 🔗 underscor Nemo_bis: does this actually download the media too?
08:29 🔗 underscor seems like not a lot of data going through the pipe
08:48 🔗 underscor hi emijrp :)
08:51 🔗 emijrp hi underscor
10:49 🔗 emijrp 12:40 <jubo2> It was a real heavy weight taken off my shoulders when WikiTeam took the action of backing http://develop.consumerium.org/wiki/ up onto archive.org .. before that it was me and my neck on the line should backups fail and catastrophic wiki data loss occur
10:49 🔗 emijrp 12:40 <jubo2> that was 1.5 months ago
10:49 🔗 emijrp 12:42 <jubo2> for over 10 yrs I had to worry about a looming data-loss catastrophe should the server hall, my flat, and my friend's flat all burn down simultaneously
10:50 🔗 emijrp from #wikimedia-tech on freenode
11:21 🔗 SketchCow A little dramatic for such a small wiki
11:40 🔗 emijrp 1000+ pages
11:47 🔗 SketchCow Is it? I couldn't determine that from the page listings.
11:47 🔗 SketchCow It looked to be 200 at most.
11:53 🔗 Nemo_bis he's quite a dramatic guy, yes
11:54 🔗 Nemo_bis and newbies too
11:54 🔗 Nemo_bis *newbiesh
18:10 🔗 underscor wow, he's more fanatical about data loss than even I am
18:50 🔗 emijrp Nemo_bis: you can claim my list in the task force
18:51 🔗 emijrp upload bandwidth in Spain is shitty
18:51 🔗 emijrp and I can't upload a damn thing
19:55 🔗 emijrp 2500 wikis...
19:56 🔗 emijrp using a script of 1000 lines
19:56 🔗 emijrp that is a performance of 2.5 wikis/line
19:56 🔗 emijrp and growing, go go go
21:35 🔗 underscor <underscor> does this actually download the media too?
21:35 🔗 underscor <underscor> seems like not a lot of data going through the pipe
22:36 🔗 Nemo_bis underscor: yes it does; most wikis don't have many files
22:36 🔗 Nemo_bis except some I already downloaded of course :/
22:38 🔗 Nemo_bis underscor: like this one, which you promised to download for me a while ago: http://archive.org/details/wiki-citywiki.ugr.es
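
On the media question above: dumpgenerator.py fetches files only when --images is passed alongside --xml, which is why a text-only run moves little data through the pipe. A minimal sketch; the api.php path is an assumption:

    # --xml alone grabs just the page histories; adding --images also
    # downloads every file and its description page.
    import subprocess

    subprocess.call(["python", "dumpgenerator.py",
                     "--api=http://citywiki.ugr.es/api.php",  # assumed path
                     "--xml", "--images"])
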
