#wikiteam 2014-02-24,Mon


Time Nickname Message
15:38 🔗 Schbirid https://archive.org/details/wikia_dump_201402-b and https://archive.org/details/wikia_dump_201402-c are up
15:40 🔗 Schbirid i passed --header 'x-archive-meta01-collection:wikiteam', not sure why it did not end up there
15:41 🔗 Schbirid will transfer ownership to you guys once it is done, if you want
15:58 🔗 Nemo_bis Schbirid: because you're not an admin of the collection; we'll get them moved to the collection later
15:58 🔗 Schbirid ok
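
The header Schbirid mentions is part of archive.org's S3-style upload API, where item metadata is passed as x-archive-meta-* HTTP headers. A minimal sketch of such an upload with curl, assuming hypothetical item and file names, with IA_ACCESS/IA_SECRET standing for the uploader's own S3 keys:

    # upload a dump and request placement in the wikiteam collection;
    # as noted above, the collection header is ignored unless the
    # uploading account is an admin of that collection
    curl --location \
         --header "authorization: LOW $IA_ACCESS:$IA_SECRET" \
         --header 'x-archive-meta01-collection:wikiteam' \
         --header 'x-archive-meta-mediatype:web' \
         --upload-file wikia_dump_201402-b.7z \
         http://s3.us.archive.org/wikia_dump_201402-b/wikia_dump_201402-b.7z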
19:35 🔗 dud1 Can a wiki be split into different parts and uploaded part by part?
19:37 🔗 Nemo_bis dud1: yes, but what for
19:38 🔗 dud1 One I have atm will be ~50gb for the titles alone (before compressing) and there are about 55k images to download as well.
19:39 🔗 Nemo_bis 50 GB for the titles??????
19:40 🔗 Nemo_bis That's 210 million pages even if they all had a title of the max length allowed by MediaWiki, I doubt it
19:42 🔗 Schbirid BuY VI4grA Now! (1542)
19:42 🔗 dud1 Well it had gotten to 7.2gb after getting ~55k titles (out of 382947). And sorry, there are 58.5k images
19:42 🔗 Schbirid ;)
19:45 🔗 Nemo_bis If you mean XML, that will probably compress 100-1000 times
19:52 🔗 dud1 Sorry yes the xml.
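
A sketch of the compression Nemo_bis describes, using 7z as WikiTeam dumps typically do; the filename is hypothetical, and the -v1g volume size is just one example of how a dump could be split for part-by-part upload as dud1 asked:

    # maximum compression; MediaWiki full-history XML is highly
    # repetitive, so ratios of 100x or more are plausible
    7z a -mx=9 examplewiki-20140224-history.xml.7z examplewiki-20140224-history.xml

    # same, but split into 1 GB volumes (.7z.001, .7z.002, ...)
    # that can be uploaded one at a time
    7z a -mx=9 -v1g examplewiki-20140224-history.xml.7z examplewiki-20140224-history.xml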
21:59 🔗 gui7 should I start trying to back up the wikis in https://wikiapiary.com/wiki/Category:Website_not_archived ?
21:59 🔗 gui7 or are they done already or special cases or something, or is there another list I should try to work on?
22:44 🔗 Nemo_bis gui7: for most of them, we have no idea why they fail
22:45 🔗 Nemo_bis gui7: I could make a list of some wikis I gave up on, if you want; probably not this evening though, so you can just start with any of those sites
22:46 🔗 Nemo_bis gui7: just avoid the big wikifarms, like Wikia, as well as wiki-site and editthis which are very hard to deal with
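
A sketch of how one of those unarchived wikis would be grabbed with WikiTeam's dumpgenerator.py; the wiki URL and dump directory names are hypothetical:

    # full-history XML plus images, pointing --api at the wiki's api.php
    python dumpgenerator.py --api=http://wiki.example.com/api.php --xml --images

    # if the run is interrupted, resume it from the existing dump directory
    python dumpgenerator.py --api=http://wiki.example.com/api.php --xml --images \
        --resume --path=wikiexamplecom-20140224-wikidump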
