[15:38] https://archive.org/details/wikia_dump_201402-b and https://archive.org/details/wikia_dump_201402-c are up
[15:40] i passed --header 'x-archive-meta01-collection:wikiteam', not sure why it did not end up there
[15:41] will transfer ownership to you guys once it is done if you want
[15:58] Schbirid: because you're not an admin of the collection; we'll get them moved to the collection later
[15:58] ok
[19:35] Can a wiki be split into different parts and uploaded part by part?
[19:37] dud1: yes, but what for?
[19:38] The one I have atm will be ~50 GB for the titles alone (before compressing), and there are about 55k images to download as well.
[19:39] 50 GB for the titles??????
[19:40] That's 210 million pages even if every one had a title of the maximum length MediaWiki allows (255 bytes), I doubt it
[19:42] Well, it had gotten to 7.2 GB after getting ~55k titles (out of 382,947). And sorry, there are 58.5k images
[19:42] ;)
[19:45] If you mean the XML, that will probably compress 100-1000 times
[19:52] Sorry, yes, the XML.
[21:59] should I start trying to back up the wikis in https://wikiapiary.com/wiki/Category:Website_not_archived ?
[21:59] or are they done already, or special cases or something? or is there another list I should try to work on?
[22:44] gui7: most of them; we have no idea why they fail
[22:45] gui7: I could make a list of some wikis I gave up on, if you want; probably not this evening though, so you can just start with any of those sites
[22:46] gui7: just avoid the big wikifarms, like Wikia, as well as wiki-site and editthis, which are very hard to deal with
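
The --header flag quoted at 15:40 is an Internet Archive S3-style metadata header. A minimal upload sketch with curl, assuming a hypothetical item identifier and file name; as the 15:58 reply explains, the collection header is only honored when the uploading account is an admin of that collection, otherwise the item has to be moved into it later:

    # Upload a dump to archive.org via the S3-compatible API.
    # <ACCESS>:<SECRET> are the account's IA-S3 keys; the item
    # identifier and file name below are assumptions for illustration.
    curl --location \
         --header 'authorization: LOW <ACCESS>:<SECRET>' \
         --header 'x-archive-auto-make-bucket:1' \
         --header 'x-archive-meta-mediatype:web' \
         --header 'x-archive-meta01-collection:wikiteam' \
         --upload-file wikia_dump_201402-b.7z \
         https://s3.us.archive.org/wikia_dump_201402-b/wikia_dump_201402-b.7z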
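
On the 19:35 question about splitting a wiki into parts: one common approach (a sketch, not necessarily what was used here) is to compress the XML dump with 7-Zip into fixed-size volumes and upload them one at a time. Wikitext XML is highly repetitive, which is why the 100-1000x compression ratio mentioned at 19:45 is plausible; the file name below is a placeholder:

    # Compress the XML dump and split it into 1 GB volumes;
    # produces wiki-history.xml.7z.001, .002, ...
    7z a -v1g wiki-history.xml.7z wiki-history.xml

    # To reassemble and extract on the receiving end, point 7z
    # at the first volume:
    7z x wiki-history.xml.7z.001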
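
For anyone picking up the 21:59 suggestion to archive wikis from the WikiApiary list: WikiTeam's dumpgenerator.py is the usual tool for this. A hedged example, assuming the target wiki exposes api.php at the placeholder URL below:

    # Grab the full page history (XML) plus all images;
    # the URL is a stand-in for a wiki from the list.
    python dumpgenerator.py --api=http://example.com/w/api.php --xml --images

    # Many of these wikis fail partway through (22:44); a dump can
    # be resumed from its existing directory (placeholder path):
    python dumpgenerator.py --api=http://example.com/w/api.php --xml --images \
        --resume --path=examplecom-wikidump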