[00:29] :o
[00:31] it annoys me that emijrp leaves
[00:31] lol
[08:09] yeah, hehe
[09:59] hello emijrp
[10:00] I had to clean up some XML with broken ends. Drop the part after the last </page>, delete the 7z and resume. :/
[10:09] you don't need to drop any
[10:09] that is dumpgenerator.py's work
[10:13] anyone want to help me write a post about WikiTeam for the Internet Archive blog?
[10:19] you will get laid for that, for sure
[10:28] Nemo_bis: Wikibase will be the name for the extension? not Wikidata https://www.mediawiki.org/wiki/Extension:Wikibase
[10:28] Wikidata the project
[10:30] no idea
[10:30] that's not the script's work, it's not able to fix XML which doesn't end as it should
[10:30] lul
[10:30] or at least it didn't
[10:32] oh boy, are you going to write a blog post em?
[10:36] yes Nemo_bis, I won that discussion, you remember
[10:36] and read this
[10:36] The following requirements are not negotiable:
[10:36] Wikidata will be a Wikimedia project, eventually maintained and operated by the Wikimedia Foundation.
[10:37] WIN.
[10:37] https://meta.wikimedia.org/wiki/Wikidata/Notes/Requirements
[10:37] you don't remember well
[10:37] [[Wikimedia project]]
[10:38] that's the Wikidata website, not the Wikidata project
[10:38] the result of the Wikidata project, not what the Wikidata project currently is
[10:38] HAHA.
[10:38] or rather, a byproduct of the Wikidata project
[10:38] HAHAHAHA.
[10:39] : )
[10:39] no need to laugh, I already told you that you were confusing terminology
[10:39] Wikidata as in the software development project is a WM-DE project. Period.
[10:39] Its aim is obviously to produce something usable by Wikimedia projects run by the WMF, including a specific wiki.
[10:40] yes ersi
[10:40] The fact that we call wikis "Wikimedia projects" doesn't mean that everything which happens on them, around them, or under them is a WMF project.
[10:40] Although of course that's what the WMF loves to think.
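The manual cleanup described at 10:00 (truncating a broken dump after the last complete page, then resuming) can be sketched as below. This is a hedged illustration, not dumpgenerator.py's actual code, and assumes the dump is a standard MediaWiki XML export where each page ends with a `</page>` close tag; the function name is hypothetical.

```python
# Sketch: truncate a broken MediaWiki XML dump after the last complete
# <page> element, discarding any half-written page at the end of the file.
# This mirrors the manual fix described in the log; dumpgenerator.py is
# said to handle this itself when resuming.

def truncate_after_last_page(path):
    with open(path, "r+b") as f:
        data = f.read()
        end = data.rfind(b"</page>")
        if end == -1:
            raise ValueError("no complete <page> element found")
        # Keep everything up to and including the last </page>.
        f.seek(end + len(b"</page>"))
        f.truncate()
```

After truncation the file lacks its closing `</mediawiki>` tag; resuming the dump is expected to append the remaining pages and close the document.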
[10:41] Cool logos https://commons.wikimedia.org/wiki/Category:Wikidata_logo_proposals
[10:44] Nemo_bis: don't get angry, I don't want to delete all the wikis you have downloaded
[10:44] you to*
[10:44] ?
[10:46] : )
[10:53] emijrp, I have a wiki in an infinite loop like this:
[10:53] MediaWiki error for "Main_Page", network error or whatever...
[10:53] Trying to save only the last revision for this page...
[10:53] XML for "Main_Page" is wrong. Waiting 20 seconds and reloading...
[10:53] XML for "Main_Page" is wrong. Waiting 40 seconds and reloading...
[10:53] XML for "Main_Page" is wrong. Waiting 60 seconds and reloading...
[10:53] XML for "Main_Page" is wrong. Waiting 80 seconds and reloading...
[10:53] We have retried 5 times
[10:53] me too
[10:54] skip it, or file your #1000000 bug
[10:54] heh
[10:54] I'll surely file a bug if there isn't one
[10:55] hmpf, 4 days in that loop
[10:56] what a nice message http://gig-project.3d0g.org/wiki/
[11:00] emijrp, I also committed some checks for the 7z's content, did you notice?
[11:07] I don't read the issues section...
[11:08] makes me sad
[11:19] heh
[11:40] File "/usr/lib/python2.7/socket.py", line 571, in create_connection
[11:40] raise err
[11:40] IOError: [Errno socket error] [Errno 101] Network is unreachable
[11:41] first image of this dump is http://images.uncyc.org/commons/b/b2/!!!Pedia.jpg
[11:41] can those characters give problems?
[11:55] network error
[12:10] emijrp, what do you mean?
[12:10] the network works perfectly, I can load that image in Firefox
[12:10] network is unreachable, it says
[12:11] I know what it says
[12:11] resume the dump, probably a temporary glitch
[12:11] but that's not the problem
[12:11] no
[12:11] Or rather, I was already resuming
[12:11] I can try again...
[12:13] http://uncyc.org/ doesn't open for me
[12:27] emijrp, doesn't it resolve?
[12:28] on the subdomain I get 2012-04-14 14:28:03 ERROR 403: Forbidden.
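The "Waiting 20/40/60/80 seconds and reloading" loop quoted in the log is a retry-with-growing-backoff pattern. Below is a minimal sketch of that pattern, not dumpgenerator.py's real code; the function name and parameters are hypothetical.

```python
import time

def fetch_with_retries(fetch, max_retries=5, step=20):
    """Retry a flaky fetch with linearly growing waits (20s, 40s, 60s...),
    mirroring the 'Waiting N seconds and reloading' messages in the log.
    `fetch` is any callable that returns the XML or raises on failure."""
    for attempt in range(1, max_retries + 1):
        try:
            return fetch()
        except Exception:
            if attempt == max_retries:
                # Give up after the configured number of retries instead
                # of looping forever ("4 days in that loop").
                raise
            wait = step * attempt
            print("XML is wrong. Waiting %d seconds and reloading..." % wait)
            time.sleep(wait)
```

To skip a persistently broken page rather than abort, the caller can catch the final exception and move on to the next title.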
[12:28] but that's normal
[12:57] Nemo_bis: what is the size of all your -history.xml.7z files?
[12:59] Nemo_bis: du *-history.xml.7z -ch
[13:01] Nemo_bis: http://wikiindex.org/index.php?title=ShoutWiki&curid=21505&diff=116617&oldid=115949&rcid=128906
[13:02] I've deleted most of them
[13:02] oh, 7z
[13:02] 1.8 GB
[13:03] and 43 GB the wikidump
[13:03] emijrp, ^
[13:04] But I'm redoing some large wikis whose dumps ended incorrectly.
[16:10] emijrp> skip it, or file your #1000000 bug
[16:10] hahaha
[18:35] kamelopedia dump consuming 3 GB of memory before starting? :/
[18:39] dumpgenerator.py is optimized for supercomputers
[18:42] lol
[18:49] The problem is that for some reason the script got a lot of titles, so many dots on my screen that I couldn't see anything else, but the wiki has only 40k pages.
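The 3 GB-before-starting symptom at 18:35 is consistent with materializing the whole title list in memory at once. A generator-based reader, sketched below, would keep memory flat regardless of how many titles the script fetched; this is an illustration of the idea, not dumpgenerator.py's actual implementation, and the filename handling is hypothetical.

```python
def iter_titles(path):
    """Yield page titles from a titles file one at a time (Python 3
    sketch). Streaming avoids holding the entire list in memory, which
    the log suggests was the cause of the 3 GB footprint on a 40k-page
    wiki; blank lines are skipped."""
    with open(path, encoding="utf-8") as f:
        for line in f:
            title = line.strip()
            if title:
                yield title
```

The caller can then loop over `iter_titles(...)` and dump each page as it goes, instead of building a list first.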