17:10 <Ryz73> Someone by the name of ooogabooga wanted to archive https://apple.fandom.com/ - can Fandom wikis be archived through here?
17:10 <phuzion> Ryz73: This channel has moved to Hackint. Please ask there.
17:10 <phuzion> (See the topic)
17:11 <Ryz73> Oops, I somehow clicked the EFnet version and not the hackint version >_<;
17:11 *** Ryz73 is now known as Ryz
17:11 <phuzion> No worries. Just figured I'd let you know since you're unlikely to actually get a response in here.
19:07 *** kiska has joined #wikiteam
21:33 <Nemo_bis> actually he's more likely to get an answer here
21:34 <Nemo_bis> the answer is that Wikia wikis work with dumpgenerator, yes, but sometimes there are weird limits so it's better to use --xmlrevisions
21:35 <Nemo_bis> Which should be easier than ever now that they upgraded to MediaWiki 1.33, a leap of ten years or something from the version they were previously using
21:36 <Nemo_bis> That said, the most complete dump is from 2020 https://wiki.archiveteam.org/index.php/Wikia (see infobox), and it's not like anything important has happened at Apple in the space of one year. ^_^
21:37 <Nemo_bis> I'm debating whether it's worth archiving the Wikia images again. It's mostly a mass of unfree content. I think the past two image dumps were made by underscor and spirit.
21:38 <Nemo_bis> Ryz: ^
21:39 <Nemo_bis> People at the University of Washington were loading the XML dumps into their parsing system a few days ago, so I might soon know whether they actually work or not. :)
21:43 <JAA> There have been 179 messages in hackint #wikiteam this year vs. 12 in here. So no, you're about 15 times more likely to get a response on hackint.