[14:58] So what wiki work is there? Finding more wikis?
[14:58] Downloading more? Looking at the wiki page it looks like things are just sailing along
[17:12] omf_: what we'd need now is a crawler to list all MediaWiki instances on the web
[17:12] alternatively, someone to do this: https://meta.wikimedia.org/wiki/WikiTeam/Dumpgenerator_rewrite
[18:12] So a crawler just for MediaWiki
[19:51] omf_: like the ones alard makes to find all users/IDs on a website to archive, but searching Google for all existing MediaWiki websites (we only need the URL to each api.php)
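(The last message boils down to: for every candidate site a crawler or search engine turns up, all that has to be recorded is its api.php endpoint. A minimal sketch of that check follows; it is not part of WikiTeam's tooling, just an illustration. The `find_api` function name, the list of candidate paths, and the example URL are assumptions; it relies on the standard MediaWiki siteinfo query, `api.php?action=query&meta=siteinfo&format=json`.)

```python
# Minimal sketch: check whether a candidate URL exposes a MediaWiki api.php.
# Assumes Python 3 with the `requests` library installed.
import requests

def find_api(base_url, timeout=10):
    """Probe common api.php locations under base_url; return the first one
    that answers like a MediaWiki API, or None."""
    candidates = ["api.php", "w/api.php", "wiki/api.php"]  # typical install paths (assumption)
    for path in candidates:
        url = base_url.rstrip("/") + "/" + path
        try:
            r = requests.get(
                url,
                params={"action": "query", "meta": "siteinfo", "format": "json"},
                timeout=timeout,
            )
        except requests.RequestException:
            continue
        if r.status_code != 200:
            continue
        try:
            data = r.json()
        except ValueError:
            continue
        # A real MediaWiki API answers the siteinfo query with a 'query' -> 'general' block.
        if "query" in data and "general" in data["query"]:
            return url
    return None

if __name__ == "__main__":
    # Hypothetical example: any URL a search crawler turned up.
    print(find_api("https://www.mediawiki.org"))
```

A crawler built around this would feed it the result URLs from search-engine queries and keep only the base URLs for which it returns an api.php endpoint.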