02:13 <SketchCow> I think you know
04:01 <Danneh__> whoo, backing up wikis is fun
07:52 <Nemo_bis> Danneh__: I'm glad you think so! :)
07:52 <Nemo_bis> did you already upload some to archive.org?
07:58 <Danneh__> Nemo_bis: Yeah, just a few game ones. Downloading Combine Overwiki now, about 12720 pages in (been running for two-three days so far!)
07:59 <Danneh__> Should be done today or tomorrow, depending on how long it takes to grab the images
08:02 <Nemo_bis> Danneh__: nice, and you know about uploader.py right?
08:03 <Nemo_bis> if you use that one, you can easily add all the metadata and then, for instance, wikiapiary.com will automatically create a page on that wiki (which they don't have) and link the dump
08:03 <SketchCow> aa/iwin 14
08:04 * Nemo_bis just *knew* someone was going to use the metadata in the end
08:05 <Danneh__> haven't messed around with much in the way of metadata yet, to be honest
08:05 <Nemo_bis> SketchCow: I use this Finnish trick http://niklas.laxstrom.name/page/eng/irssi
08:05 <Nemo_bis> Danneh__: one more reason to let the script handle it :P
08:05 <Danneh__> I know if I throw a list of URLs in a text file and point uploader.py to it, I can get them on archive.org, haven't looked into it more in depth yet :P
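
That list-file workflow, sketched as shell commands: the list file name and the wiki URL below are only illustrative, and the exact uploader.py arguments (plus the archive.org S3 keys it needs) are documented on the NewTutorial page linked a few lines further down.

```
# one wiki API URL per line; file name and URL are just examples
echo "http://combineoverwiki.net/api.php" > mywikis.txt

# point uploader.py at the list so it pushes the finished dumps to
# archive.org with the wiki metadata filled in (see the tutorial for
# the precise invocation and the archive.org keys setup it expects)
python uploader.py mywikis.txt
```
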
08:05 <Nemo_bis> let me know if you have problems with it, it's quite hacky still
08:06 <Nemo_bis> I just added some docs for you ;) http://code.google.com/p/wikiteam/wiki/NewTutorial#Automatic_publishing long overdue
08:06 <Danneh__> ah, thank you very much!
08:07 <Danneh__> I will need to add that --help option and submit a patch, but been super busy lately
08:10 <Nemo_bis> sure
08:11 <SketchCow> NICE INNOCULATION SCAR, PERFECT ROBNOT
08:14 <Danneh__> just wondering, if they're using Semantic MediaWiki, know if that's currently backed up using dumpgenerator.py?
08:22 <Nemo_bis> Danneh__: SMW help pages say that all the properties and stuff are stored on the wiki
08:23 <Nemo_bis> what I'm not sure about is what happens with features using a different contenthadler
08:23 <Nemo_bis> contenthandler
08:24 <Nemo_bis> hmm seems straightforward enough, no real problems https://meta.wikimedia.org/wiki/Special:Export/Schema:MediaWikiVagrantUsage
08:25 <Nemo_bis> though that adds to the complexity of the installation configs... we save special:version but that's hardly enough
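
That Special:Export check can be repeated from a shell: fetching the schema page linked above returns the usual MediaWiki export XML, which is essentially the path dumpgenerator.py uses for page text, so content stored through a non-wikitext content handler should still land in the dump (curl is assumed to be available).

```
# dump the export XML for the schema page to stdout and inspect it
curl -s 'https://meta.wikimedia.org/wiki/Special:Export/Schema:MediaWikiVagrantUsage'
```
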
08:26 <Danneh__> fair enough, was looking into a wiki that uses SMW, wanted to make sure before I back it up
08:26 <Danneh__> well, before I try at least :P
08:28 <Nemo_bis> :)
08:29 <Danneh__> just using home server, 'bout 10 seconds delay, hasn't given me any problems yet
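
For context, that per-request delay corresponds to dumpgenerator.py's --delay option; a sketch of a slow, polite full-dump run, where the wiki URL is a placeholder and the flag set is recalled from the WikiTeam docs:

```
# full text + image dump, waiting 10 seconds between requests
python dumpgenerator.py --api=http://wiki.example.com/api.php --xml --images --delay=10
```
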
08:29 <Danneh__> :)
09:30 <Nemo_bis> ah yes that's slow :)
09:59 <Danneh__> aha, slow but safe, don't wanna get myself banned on here!
10:00 <Nemo_bis> the trick is to be fast enough that they don't notice in time :P
10:00 <Nemo_bis> does someone want to extract all the domains of their wikis from the page in http://www.shoutwiki.com/wiki/Category:Flat_list_of_all_wikis ? we need to update https://code.google.com/p/wikiteam/source/browse/trunk/listsofwikis/shoutwiki.com
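
A rough sketch of that extraction, assuming each wiki appears in the category HTML as a link to its own <name>.shoutwiki.com subdomain; the pattern is a guess at the markup, category pagination is not handled, and the output should be eyeballed before anything goes into the listsofwikis file.

```
# collect candidate wiki domains (output file name is just an example)
curl -s 'http://www.shoutwiki.com/wiki/Category:Flat_list_of_all_wikis' \
  | grep -oE 'http://[a-z0-9-]+\.shoutwiki\.com' \
  | grep -vx 'http://www\.shoutwiki\.com' \
  | sort -u > shoutwiki_domains.txt
```
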
10:04 <Danneh__> aha, fair enough
16:36 <Nemo_bis> https://archive.org/details/wikia_dump_20140125
21:08 <Nemo_bis> I'm so tempted to just start download of a few thousand wikis on my shared server...