#archiveteam 2012-05-03,Thu


Time Nickname Message
01:23 ๐Ÿ”— DFJustin http://consumerist.com/2012/05/rock-band-for-iphone-game-will-self-destruct-in-29-days.html
01:30 ๐Ÿ”— BlueMax let's archive it! and break a million laws in the process!
01:33 ๐Ÿ”— underscor http://i.imgur.com/IP8pI.jpg
01:46 ๐Ÿ”— dashcloud actually- no it won't: http://www.theverge.com/gaming/2012/5/2/2995299/ea-rock-band-ios
02:48 ๐Ÿ”— chronomex yaay laws
02:49 ๐Ÿ”— SketchCow Nemo_bis: There are gaps in some of these runs of magazines - that's known, right
03:05 ๐Ÿ”— SketchCow AWWWWWWW shit
03:05 ๐Ÿ”— SketchCow I got the scanner running.
03:49 ๐Ÿ”— SketchCow http://archive.org/details/gamesmachine-034
06:06 ๐Ÿ”— Nemo_bis SketchCow, what gaps? I sent you a list of all the stuff I was shipping you, with numbers included; that's all I know
06:10 ๐Ÿ”— Nemo_bis hm or maybe I didn't
06:11 ๐Ÿ”— Nemo_bis the main part of it was
06:11 ๐Ÿ”— Nemo_bis *Pc Action: 93 magazines, 46 floppy disks, 55 CD-ROMs/DVDs (1-7, 9-94)
06:11 ๐Ÿ”— Nemo_bis *Pc World: 160 CD-ROMs/DVDs, 3 booklets (1996-2010)
06:11 ๐Ÿ”— Nemo_bis *The Games Machine: 94 magazines (87-180), 197 CD-ROMs/DVDs
06:11 ๐Ÿ”— Nemo_bis Content: 187 magazines, 412 CD-ROMs/DVDs, 106 floppy disks, 6 books:
06:12 ๐Ÿ”— Nemo_bis I don't remember what I checked and I added some more stuff later.
06:32 ๐Ÿ”— SketchCow No worries
06:32 ๐Ÿ”— SketchCow I just wanted to make sure that when I was digitizing shizzle and then I was missing here and there, I wasn't, you know, losing stuff.
06:46 ๐Ÿ”— SketchCow http://archive.org/details/gamesmachine-034
06:46 ๐Ÿ”— SketchCow Tah dah!!!
06:52 ๐Ÿ”— Nemo_bis :D
06:52 ๐Ÿ”— Nemo_bis The file listing is empty
06:53 ๐Ÿ”— SketchCow Yeah, the new machine isn't loving ISO life.
06:53 ๐Ÿ”— SketchCow Anyway, minor setback.
06:56 ๐Ÿ”— Nemo_bis Internet Explorer 4.0
09:46 ๐Ÿ”— SmileyG err
09:46 ๐Ÿ”— SmileyG how long ago was the "migrate videos from Google video to youtube" thing?
09:50 ๐Ÿ”— SmileyG as I *just* got notification of it and switched mine.
11:23 ๐Ÿ”— underscor SmileyG: Looong time
11:23 ๐Ÿ”— underscor Over 6 months ago
11:23 ๐Ÿ”— underscor probably like 9
11:25 ๐Ÿ”— SmileyG heh weird.
11:25 ๐Ÿ”— SmileyG tho my videos were uploaded....
11:25 ๐Ÿ”— * SmileyG can't remember the date now, Nov 2009 I think...
13:42 ๐Ÿ”— Schbirid i need wget help. i would like to mirror http://forum.jamendo.com/ but exclude any URLs that contain /comment/ /profile/ or /entry/
13:42 ๐Ÿ”— Schbirid those are NOT directories, but arguments to index.php, eg http://forum.jamendo.com/index.php?p=/profile/331639/509
13:44 ๐Ÿ”— aggro I thought the "--reject" argument took care of that. Can use it to get rid of links that have things like "&action=print," and aren't really different content.
13:45 ๐Ÿ”— Schbirid i thought that would only look at the extension (as in end of the url string)
13:45 ๐Ÿ”— Schbirid i totally asked this before i think
13:45 ๐Ÿ”— aggro I can test it right quick. Thought it could match patterns as well.
13:45 ๐Ÿ”— Schbirid yeah
13:45 ๐Ÿ”— Schbirid Note that if any of the wildcard characters, *, ?, [ or ], appear in an element of acclist or rejlist, it will be treated as a pattern, rather than a suffix.
13:45 ๐Ÿ”— Schbirid sweet
13:54 ๐Ÿ”— Schbirid can't figure it out. will resume later
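A sketch of the exclusion Schbirid is after, using `--reject-regex` — note this option only exists in wget 1.14 and later (newer than what was common in May 2012), and unlike `--reject` suffix matching it is applied to the *entire* URL, so it also catches paths hidden inside the query string:

```shell
# Sketch only, untested against the live forum.
REJECT='/(comment|profile|entry)/'

# The mirror command would be (not run here):
#   wget -m --reject-regex "$REJECT" http://forum.jamendo.com/

# The pattern matches the query-string paths Schbirid wants to skip:
echo 'http://forum.jamendo.com/index.php?p=/profile/331639/509' | grep -Eq "$REJECT" && echo skip
echo 'http://forum.jamendo.com/index.php?p=/discussion/12345'   | grep -Eq "$REJECT" || echo keep
```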
16:06 ๐Ÿ”— Schbirid fileplanet is shutting down
16:07 ๐Ÿ”— Schbirid http://www.fileplanet.com/
16:07 ๐Ÿ”— Schbirid "Note: FilePlanet is no longer being updated and is in the process of being archived."
16:07 ๐Ÿ”— Schbirid ha ha ha
16:07 ๐Ÿ”— Schbirid fuck
16:07 ๐Ÿ”— mistym "archived"
16:07 ๐Ÿ”— Schbirid fuckfuckfuck
16:07 ๐Ÿ”— Schbirid it's IGN
16:07 ๐Ÿ”— Schbirid should be fairly easy to grab, you can wget urls you get from the site without cookies
16:07 ๐Ÿ”— Schbirid iirc even the timestamps are correct
16:09 ๐Ÿ”— Schbirid the terrible thing is that many files are from the old hosted planet* sites and probably not listed on the fileplanet pages
16:09 ๐Ÿ”— mistym fileplanet isn't the one that requires an obnoxious downloader and enforces browser user-agent strings, is it?
16:09 ๐Ÿ”— * shaqfu tries
16:10 ๐Ÿ”— shaqfu No, seems to work fine
16:11 ๐Ÿ”— Schbirid i am logged in, so it might be different but example:
16:11 ๐Ÿ”— Schbirid http://www.fileplanet.com/224345/220000/fileinfo/Elder-Scrolls-V:-Skyrim---Macho-Dragons-v1.0
16:11 ๐Ÿ”— Schbirid click download
16:11 ๐Ÿ”— Schbirid you get http://www.fileplanet.com/224345/download/Elder-Scrolls-V:-Skyrim---Macho-Dragons-v1.0
16:11 ๐Ÿ”— Schbirid <a id="default-file-download-link" href="javascript:void();" onclick="window.location='http://m3x12.fileplanet.com/^542977820/ftp1/012012/Macho_Dragons_1_0.zip'">Click here</a>
16:11 ๐Ÿ”— Schbirid you can wget that just fine
16:11 ๐Ÿ”— shaqfu Phew
16:12 ๐Ÿ”— shaqfu Now I'm curious how far back their downloads go, given how old the site is
16:12 ๐Ÿ”— Schbirid they have a "Customer Support Fax"
16:12 ๐Ÿ”— Schbirid shaqfu: 1996 at least
16:12 ๐Ÿ”— shaqfu Yeah, I saw Half-Life 1 patches
16:12 ๐Ÿ”— Schbirid it started with planetquake and they migrated its downloads to fileplanet afaik
16:12 ๐Ÿ”— mistym Yeah, they're def. p. oldskool. I don't think they ever pruned stuff.
16:13 ๐Ÿ”— mistym "Customer Support Fax", wow.
16:14 ๐Ÿ”— Schbirid http://www.fileplanet.com/174/0/section/Gaming looks like a good starting point
16:14 ๐Ÿ”— shaqfu We should ask for a fax of their game demos...
16:15 ๐Ÿ”— Schbirid hm, 1999 for the oldest quake dm map
16:15 ๐Ÿ”— mistym Get a screenshot faxed back
16:15 ๐Ÿ”— SketchCow Adrian Chen - Gawker
16:15 ๐Ÿ”— mistym http://achewood.com/index.php?date=11222006
16:15 ๐Ÿ”— Schbirid thanks for keeping this place gawker free
16:15 ๐Ÿ”— shaqfu Trying to get an interview?
16:16 ๐Ÿ”— shaqfu Or just hoping something interesting happens to write up a clickmagnet headline?
16:16 ๐Ÿ”— SketchCow I assume same.
16:17 ๐Ÿ”— SketchCow I also assume he'll do the Sekrit Maneuver of coming back in here under an "assumed name".
16:19 ๐Ÿ”— Schbirid heh, wget -m --spider http://www.fileplanet.com/
16:20 ๐Ÿ”— Schbirid http://www.fileplanet.com/index.html works
16:20 ๐Ÿ”— yipdw "Jason Scott Bans Me From #archiveteam, Potentially Involved With Chickens"
16:21 ๐Ÿ”— yipdw or at least that's the caliber of writing Gawker seems to trade in
16:21 ๐Ÿ”— ersi lol!
16:22 ๐Ÿ”— Schbirid FPOps@IGN.com seems like a high contact
16:22 ๐Ÿ”— SketchCow Hey, some of you might know this.
16:22 ๐Ÿ”— SketchCow So, jsmess.textfiles.com no longer shows the javascript
16:22 ๐Ÿ”— SketchCow I know the big change - new apache server.
16:22 ๐Ÿ”— SketchCow But what's the setting to get it back?
16:23 ๐Ÿ”— ersi Do you get a 'download this javascript' file instead of the javascript?
16:23 ๐Ÿ”— ersi Or nothingness?
16:24 ๐Ÿ”— SketchCow Nothingness. Check it out.
16:25 ๐Ÿ”— yipdw SketchCow: it looks like the Content-Encoding header on the javascript is missing
16:25 ๐Ÿ”— SketchCow I am SURE it's a handler I don't have enabled.
16:25 ๐Ÿ”— SketchCow OK, where would I enable that in apache2.2
16:25 ๐Ÿ”— SketchCow I went from 1.3 something to 2.2 something so it was a fairly nasty jump.
16:25 ๐Ÿ”— yipdw in an appropriate configuration file (httpd.conf, virtualhost config, etc):
16:25 ๐Ÿ”— yipdw <FilesMatch "\.js\.gz$">
16:25 ๐Ÿ”— yipdw Header set Content-Encoding gzip
16:25 ๐Ÿ”— yipdw Header set Content-Type "text/javascript; charset=utf-8"
16:25 ๐Ÿ”— yipdw </FilesMatch>
16:25 ๐Ÿ”— yipdw I *think* that'll work
16:27 ๐Ÿ”— yipdw SketchCow: you can set that in a <VirtualHost> or <Directory> context, so scope to whichever one is most specific, etc
16:27 ๐Ÿ”— yipdw oh, fuck, I forgot that that needs mod_header loaded
16:28 ๐Ÿ”— yipdw well, maybe it's already loaded, we'll see
16:30 ๐Ÿ”— yipdw "You have already activated rake 0.9.2.2, but your Gemfile requires rake 0.9.2. Using bundle exec may solve this."
16:30 ๐Ÿ”— yipdw motherfucking christ I hate Ruby environments sometimes
16:31 ๐Ÿ”— SketchCow yipdw: Did it, no load
16:31 ๐Ÿ”— SketchCow LoadModule headers_module libexec/apache22/mod_headers.so
16:31 ๐Ÿ”— yipdw hmm
16:31 ๐Ÿ”— SketchCow it's definitely loading it.
16:32 ๐Ÿ”— yipdw hmm, ok
16:32 ๐Ÿ”— yipdw well, let's pull that FilesMatch thing out then
16:33 ๐Ÿ”— SketchCow I enabled compress
16:33 ๐Ÿ”— SketchCow already pulled
16:33 ๐Ÿ”— yipdw ok
16:33 ๐Ÿ”— SketchCow Still not working, but enabled compress.
16:33 ๐Ÿ”— yipdw the error Chrome is throwing is "Uncaught SyntaxError: Unexpected token ILLEGAL", so it's interpreting the gzipped javascript without decompressing it
16:33 ๐Ÿ”— yipdw I'm trying to figure out what combination of headers it needs to go "oh, this is gzipped"
16:34 ๐Ÿ”— Schbirid yay, we can go with http://www.fileplanet.com/NUMERICID/download/
16:34 ๐Ÿ”— yipdw I thought Content-Encoding was it, and it probably is
16:35 ๐Ÿ”— Schbirid pretty slow though
16:35 ๐Ÿ”— yipdw but there's probably something else wrong in there
16:35 ๐Ÿ”— yipdw one sec
16:35 ๐Ÿ”— devesine it is both content-encoding and content-type
16:36 ๐Ÿ”— yipdw devesine: yeah, but I'm not sure what was wrong with either
16:36 ๐Ÿ”— devesine <FilesMatch .*\.js\.gz$>
16:36 ๐Ÿ”— devesine </FilesMatch>
16:36 ๐Ÿ”— devesine ForceType text/javascript
16:36 ๐Ÿ”— devesine Header set Content-Encoding: gzip
16:36 ๐Ÿ”— yipdw what does ForceType do that Header set Content-Type doesn't?
16:36 ๐Ÿ”— devesine heck if i know
16:36 ๐Ÿ”— yipdw uhh
16:36 ๐Ÿ”— Schbirid yeah, nice. the /download/ HTML includes both the direct download link AND the location of the file, eg "Home / Gaming / RPG / Massively Multiplayer / Gas Guzzlers: Combat Carnage / Game Clients"
16:37 ๐Ÿ”— devesine looks like it causes all internal apache bits to treat it as that type
16:37 ๐Ÿ”— yipdw wtf
16:37 ๐Ÿ”— devesine though i don't know what aside from the content-type header cares
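Assembling devesine's lines (which arrived slightly out of order) into one stanza — a sketch of the presumably intended config, assuming mod_headers and mod_mime are loaded:

```apache
<FilesMatch "\.js\.gz$">
    ForceType text/javascript
    Header set Content-Encoding gzip
</FilesMatch>
```

ForceType overrides Apache's own MIME detection for the matched files, which is why it can win where a plain `Header set Content-Type` sometimes doesn't.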
16:37 ๐Ÿ”— yipdw wtf apache
16:37 ๐Ÿ”— yipdw well, ok
16:37 ๐Ÿ”— yipdw I guess that can be tried
16:38 ๐Ÿ”— yipdw I'm not sure why the default MIME type detection would screw it up
16:38 ๐Ÿ”— Schbirid mail to ign sent
16:39 ๐Ÿ”— devesine possibly the mime type Content-Type header gets set late in the process?
16:39 ๐Ÿ”— yipdw devesine: I dunno -> https://gist.github.com/14cb242afb08c4c6714a
16:39 ๐Ÿ”— devesine (it's been a blissfully long time since i had to know details about apache internals that fine-grained - these days i'm all about nginx)
16:39 ๐Ÿ”— yipdw that's what I got with the previous FilesMatch in place, which I thought would work
16:39 ๐Ÿ”— yipdw but evidently it didn't
16:39 ๐Ÿ”— yipdw and I'm not sure why
16:40 ๐Ÿ”— devesine huh, that looks reasonable to me
16:40 ๐Ÿ”— yipdw yeah, me too
16:40 ๐Ÿ”— yipdw unless the charset thing is screwing it up
16:41 ๐Ÿ”— Schbirid http://www.fileplanet.com/024884/download/ is the same as http://www.fileplanet.com/24884/download/ and they both work
16:41 ๐Ÿ”— yipdw where by "it" I mean Chrome
16:42 ๐Ÿ”— devesine https://gist.github.com/14cb242afb08c4c6714a#gistcomment-298537 is my server's response
16:42 ๐Ÿ”— yipdw it looks like Apache is doing the right things there, unless it's mangling the response body
16:42 ๐Ÿ”— devesine yeah, i'm using chrome too
16:42 ๐Ÿ”— yipdw hm
16:42 ๐Ÿ”— yipdw i wonder what happens with no charset set
16:43 ๐Ÿ”— devesine huh, i'm getting Content-Encoding: x-gzip back from jsmess.textfiles.com
16:43 ๐Ÿ”— devesine (in curl)
16:44 ๐Ÿ”— yipdw the configuration might have changed
16:47 ๐Ÿ”— devesine chrome thinks the headers I got back include
16:47 ๐Ÿ”— devesine Content-Type: application/x-gzip
16:47 ๐Ÿ”— Schbirid fuckers return 302 for non existing links...
16:50 ๐Ÿ”— Schbirid gah. can i not tell wget not to follow a 302 to a directory?
16:50 ๐Ÿ”— Schbirid "wget -X error http://www.fileplanet.com/1/download/" still downloads "http://www.fileplanet.com/error/error.shtml?aspxerrorpath=/autodownload.aspx"
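A sketch of a workaround, assuming a wget new enough to have `--max-redirect` (the network call is left commented out; `fetch_id` is a hypothetical helper, not from Schbirid's script):

```shell
# With --max-redirect=0, the 302 served for a missing ID fails instead of
# being followed to error.shtml.  -X only excludes *directories* during
# recursive (-r/-m) retrieval, so it never applied to a redirect target.
fetch_id() {
  # real call (not run here):
  #   wget --max-redirect=0 -O "page_$1.html" "http://www.fileplanet.com/$1/download/"
  echo "would fetch id $1 with --max-redirect=0"
}
fetch_id 1
```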
16:50 ๐Ÿ”— devesine hm, chrome wasn't believing that the cache was cleared, it looks like
16:51 ๐Ÿ”— devesine clearing the cache, quitting chrome, starting it up again, clearing the cache, quitting chrome, starting it up again, and /then/ going to jsmess.textfiles.com loaded it fine
16:53 ๐Ÿ”— Schbirid http://blog.jamendo.com/2012/05/03/response-to-the-community/
17:03 ๐Ÿ”— Schbirid i wonder how soon fileplanet links expire
17:12 ๐Ÿ”— Schbirid so, we need to download http://www.fileplanet.com/NUMERICID/download/ , grep that file for the dl link. and download the file (to the same dir i guess, sorting can happen later)
17:12 ๐Ÿ”— Schbirid i love bash
17:14 ๐Ÿ”— Schbirid and jedit
17:16 ๐Ÿ”— Schbirid hm, the actual download links are interesting too. eg /ftp1/fpnew/action/quake/levels/dm/armory2.zip
17:17 ๐Ÿ”— Schbirid not sure if i should rather save the files in that part or in www.fileplanet.com/NUMERICID/download/
17:17 ๐Ÿ”— Schbirid thoughts?
17:19 ๐Ÿ”— Schbirid nah
17:19 ๐Ÿ”— Schbirid the url is saved so if needed one could do that later
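The pipeline described above can be sketched like this (`extract_url` is a hypothetical helper; the sample HTML is the link pasted earlier, and the real network calls are left commented out):

```shell
# Pull the real file URL out of the onclick attribute of the /download/ page.
extract_url() {
  grep -o "window\.location='[^']*'" | head -n1 | cut -d"'" -f2
}

sample="<a id=\"default-file-download-link\" href=\"javascript:void();\" onclick=\"window.location='http://m3x12.fileplanet.com/^542977820/ftp1/012012/Macho_Dragons_1_0.zip'\">Click here</a>"

printf '%s\n' "$sample" | extract_url

# For a real ID (not run here):
#   wget -O page.html "http://www.fileplanet.com/$id/download/"
#   wget "$(extract_url < page.html)"
```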
17:21 ๐Ÿ”— Schbirid script is ready \o/
17:22 ๐Ÿ”— Schbirid hm, any suggestion how i can make a loop like: for i in {$1..$2}?
17:22 ๐Ÿ”— Schbirid i would like to pass the starting and last id to download from the commandline
17:22 ๐Ÿ”— Schbirid doing this i get "{1466..1470}"
17:22 ๐Ÿ”— Schbirid for $i itself
17:23 ๐Ÿ”— Schbirid "for i in $(seq $1 $2)" seems to work
17:24 ๐Ÿ”— bayleef` maybe for i in {$1..$2}?
17:24 ๐Ÿ”— Schbirid nope, that was my first try
17:25 ๐Ÿ”— bayleef` Ah
17:25 ๐Ÿ”— Schbirid this works great, yay
17:25 ๐Ÿ”— * bayleef` grins
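The reason `{$1..$2}` fails is that bash performs brace expansion *before* parameter expansion, so the braces are left literal; `$(seq ...)` expands afterwards, which is why Schbirid's version works. A small sketch:

```shell
# Brace expansion runs before parameter expansion, so {$1..$2} stays
# literal ("{1466..1470}").  $(seq ...) is expanded after, so it works.
set -- 1466 1470          # simulate the script's $1 and $2
for i in $(seq "$1" "$2"); do
  echo "$i"
done
```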
17:34 ๐Ÿ”— SketchCow http://jsmess.textfiles.com/ is back. Thanks for the help, everyone.
17:34 ๐Ÿ”— SketchCow As a bonus - some keys work!
17:36 ๐Ÿ”— Schbirid https://github.com/SpiritQuaddicted/fileplanet-file-download/blob/master/download_pages_and_files_from_fileplanet.sh
17:36 ๐Ÿ”— Schbirid we'll need to assign ranges. starting from 1 and ending somewhere in 200k
17:36 ๐Ÿ”— Schbirid it takes about one second for non-existing IDs so far
17:36 ๐Ÿ”— Schbirid the downloads must be big
17:39 ๐Ÿ”— Schbirid i'll start with 1-2000 just to see what happens
17:39 ๐Ÿ”— shaqfu Looks good
17:40 ๐Ÿ”— shaqfu Is it roughly chronological in terms of file ID?
17:40 ๐Ÿ”— Schbirid yeah, i think very much so
17:40 ๐Ÿ”— shaqfu Gotcha. Should go quicker early, then
17:43 ๐Ÿ”— shaqfu If it does its job, let me know and I'll run it
17:46 ๐Ÿ”— Schbirid sweet :)
17:47 ๐Ÿ”— Schbirid ahaha http://www.kb.cert.org/vuls/id/520827
17:49 ๐Ÿ”— shaqfu I'll give it a shot now, actually, on 2001-4000
17:49 ๐Ÿ”— Schbirid thanks
17:50 ๐Ÿ”— Schbirid http://jsdosbox.appspot.com
17:54 ๐Ÿ”— Schbirid <15 minutes per 1k so far
17:54 ๐Ÿ”— Schbirid but only single digits of actual found files ;)
17:55 ๐Ÿ”— Schbirid and i am quake fanatic enough to know all of them
17:55 ๐Ÿ”— Schbirid i'll go 4001-10000
17:56 ๐Ÿ”— shaqfu Sigh; I really need to get a real OS on this server, and not one with flaky wget that doesn't have --max-redirect
17:57 ๐Ÿ”— Schbirid haha
17:59 ๐Ÿ”— shaqfu Got it this time; doing 10k-15k
17:59 ๐Ÿ”— Schbirid yay
18:00 ๐Ÿ”— shaqfu Seems to be moving a bit slower for me, although those are probably much larger files than Quake maps
18:01 ๐Ÿ”— Schbirid might also be just fileplanet sucking
18:01 ๐Ÿ”— shaqfu That too
18:01 ๐Ÿ”— Schbirid earlier i sometimes had 5-10 seconds per html page download
18:03 ๐Ÿ”— Schbirid hm, i have a bug
18:04 ๐Ÿ”— shaqfu ?
18:05 ๐Ÿ”— shaqfu And it looks like there are ~220k fileIDs...
18:05 ๐Ÿ”— Schbirid ah, haha. i try to download even if there is no link. so files_$1_$2.log gets a lot of "http://: Invalid host name."
18:05 ๐Ÿ”— Schbirid i should make that nicer really
18:05 ๐Ÿ”— Schbirid scared me for a moment
18:05 ๐Ÿ”— shaqfu Ah, phew; nothing critical then
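The fix Schbirid describes amounts to guarding the wget call on a non-empty extracted URL; a minimal sketch:

```shell
# Skip wget when no link was extracted, instead of handing it an empty URL
# (which is what fills the log with "http://: Invalid host name.").
url=""   # grep found no download link on this page
if [ -n "$url" ]; then
  echo "downloading $url"
else
  echo "no link, skipping"
fi
```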
18:06 ๐Ÿ”— DFJustin holy hell this jsdosbox
18:25 ๐Ÿ”— SketchCow Feel inspired.
18:32 ๐Ÿ”— Schbirid doing fileplanet 15k-16k
18:36 ๐Ÿ”— shaqfu Are there any IGN network sites at-risk like this?
18:36 ๐Ÿ”— shaqfu any other*
18:37 ๐Ÿ”— Schbirid i'd consider them all at risk :(
18:38 ๐Ÿ”— Schbirid interesting, the new planet* files are not served from fileplanet anymore, i never noticed
18:38 ๐Ÿ”— Schbirid eg http://planetquake.gamespy.com/View.php?view=Quake4.Detail&id=157
18:39 ๐Ÿ”— shaqfu I don't see the warnings on planetquake; guess it's not at immediate risk
18:39 ๐Ÿ”— Schbirid haha, no way. they seem to have dropped forumplanet without any notice
18:39 ๐Ÿ”— Schbirid wtf
18:40 ๐Ÿ”— Schbirid jesus
18:40 ๐Ÿ”— Schbirid and i mirrored by pure random intent
18:40 ๐Ÿ”— shaqfu Ouch
19:01 ๐Ÿ”— Schbirid hm, the forums might just have been moved. eg http://www.ign.com/boards/forums/diablo.5218/ exists
19:47 ๐Ÿ”— Schbirid 15k-16k took 70 minutes
19:47 ๐Ÿ”— Schbirid ~300 mb
19:48 ๐Ÿ”— Schbirid bedtime for me. anything above 16k is free
19:48 ๐Ÿ”— Schbirid night
21:33 ๐Ÿ”— closure http://eindbazen.net/2012/05/php-cgi-advisory-cve-2012-1823/ I'll bet this is how the old wiki kept getting hacked
21:33 ๐Ÿ”— closure (current one does not seem vulnerable)
21:36 ๐Ÿ”— Ymgve does your old wiki use php-cgi?
21:37 ๐Ÿ”— closure according to the page, it was a common configuration on dreamhost
21:37 ๐Ÿ”— mistym Oh PHP.
21:37 ๐Ÿ”— Ymgve oh
21:41 ๐Ÿ”— shaqfu Oh Dreamhost
21:41 ๐Ÿ”— mistym That too...
21:49 ๐Ÿ”— shaqfu Well, today's foray into scripting taught me something: line breaks suck
22:03 ๐Ÿ”— yipdw "We found that giving the query string '?-s' somehow resulted in the '-s' command line argument being passed to php, resulting in source code disclosure."
22:03 ๐Ÿ”— yipdw what the fuck
22:03 ๐Ÿ”— chronomex not "somehow", some guy removed protection against that in like 2004 because it made unit testing more complicated
22:05 ๐Ÿ”— yipdw unit testing of what
22:05 ๐Ÿ”— yipdw I guess I should look at the commit history
22:05 ๐Ÿ”— chronomex probably unit testing that didn't exist
22:06 ๐Ÿ”— yipdw because the fact that an application query string can affect the command line seems ridiculous on its face
22:06 ๐Ÿ”— chronomex the query string is commandlineified
22:06 ๐Ÿ”— chronomex I probably sound like I'm making excuses, don't I?
22:07 ๐Ÿ”— yipdw oh
22:07 ๐Ÿ”— chronomex fuck those guys. they couldn't program themselves out of a paper bag.
22:07 ๐Ÿ”— yipdw "From the regression testing system we
22:07 ๐Ÿ”— yipdw use -d extensively to override ini settings to make sure our test
22:07 ๐Ÿ”— yipdw environment is sane."
22:07 ๐Ÿ”— yipdw that's still a what teh fuck
22:07 ๐Ÿ”— yipdw why not just put in a "sane" ini file as a precondition
22:07 ๐Ÿ”— chronomex too complicated
22:46 ๐Ÿ”— ersi yipdw: Testing is hard.
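The mechanism chronomex describes ("the query string is commandlineified") comes from the CGI spec's "indexed query" rule: a query string containing no `=` may be word-split and passed as command-line arguments, and php-cgi stopped filtering out option-like arguments. A minimal shell emulation of that rule (not php-cgi itself):

```shell
# RFC 3875: a query string with no '=' is an "indexed" query and may be
# split into argv; that is how ?-s became `php -s` (source disclosure).
QUERY_STRING='-s'
case "$QUERY_STRING" in
  *=*) set -- ;;                  # form-style query: no argv
  *)   set -- $QUERY_STRING ;;    # indexed query: word-split into argv
esac
echo "argv passed to the interpreter: $*"
```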
