[01:09] hot damn, I just noticed the warrior is uploading to archive.org now
[01:09] and multiple uploads at once
[01:09] I like
[01:45] great talk at defcon SketchCow
[01:55] http://ia600109.us.archive.org:8088/mrtg/networkv2.html
[01:55] Guess where webshots started
[01:55] xD
[01:59] hmmmmmmmmm
[02:57] Things are slowing down on the webshots side for FOS, which is good.
[02:59] if i happen to have an ip-address change during the rsync process, is that a problem?
[03:04] SketchCow: it's slowing down because a few of us are having issues running the script
[03:06] No, no.
[03:06] They should not be going to FOS anymore.
[03:06] There's some stragglers.
[03:07] oh
[03:50] uploaded: http://archive.org/details/cdrom-linuxformatmagazine-130
[04:09] SketchCow: do you keep FOS running for rsync over the weekend?
[04:10] since i need to leave here soon and won't be able to fix stuff until monday, and i'm currently getting some nasty errors with the new version of the script
[04:11] (it's a small node, nooon)
[04:14] ah nevermind, it won't let me run outdated code :/
[04:23] fos should be fine to accept pending rsyncs afaik
[04:23] but you can't get new items from the tracker now
[04:23] on the old code
[04:24] yep, just realized
[04:24] and thx :)
[05:04] webshots standalone users getting async error: do which curl
[05:13] http://archive.org/details/archiveteam-city-of-heroes-forums-megawarc-5
[05:13] Gaze upon the future!
[05:14] MEGAWARC
[05:14] I am chronomex and I approve this message.
[05:16] Watching Breaking Bad
[05:16] I also approve of Breaking Bad.
[05:17] What I like is that it used/uses all this music that, it was later reported, the bands had no say in going in.
[05:18] So you get someone getting his face shot off or some prostitute ruining her life, and a sad little musician has to see that it was paired with their music.
[05:19] i found the glenn beck forums
[05:20] hahaha, I didn't know that
[05:20] what is funny is that there are 'pirated' copies of old glenn beck radio shows on it
[05:20] lol
[05:22] i found 3 days from 2007
[05:26] Uploading the SOPA blackout collection.
[05:26] That should be interesting.
[05:28] It'll be a nice complete grab.
[05:29] I think we've about hit the end of the webshots uploads through FOS
[05:32] looks like i need to grab the glenn beck forums
[05:32] based on the wayback machine there are only grabs from 2009 and 2010
[05:34] Well, I'll give you an inside tip, godane.
[05:34] By the end of October, the Wayback machine will have doubled its content.
[05:34] Subsequently, it might be worth it to just wait and see what flies in first.
[05:34] ok
[05:38] I've just killed webshots uploading on FOS
[05:38] Since it's going to be replaced with a much more powerful system
[05:38] And I need to dump a bunch of stuff through FOS to get it into Wayback
[05:39] City of Heroes Forum is getting prepped for wayback.
[05:39] That should be exciting in the extreme for them.
[05:40] Wayback access
[06:20] Breaking Bad.... perfect background for archiving work.
[06:31] all glory to the megawarc
[06:34] Yay, all my BT users are done
[07:57] i know why there is no real archive of the glenn beck forums
[07:58] why?
[07:58] you have to pay for access to it
[07:58] i think that's the case
[07:58] the link is only on the archive mp3 page anyway so it sort of makes sense
[07:59] hmm
[08:02] so i may be the only hope of archiving this
[08:03] that's what i tell myself, cause stuff like dl.tv and crankygeeks would have been lost if i'd left it up to you guys
[08:04] good thing i archived them when i did
[08:17] Go for it!
[08:25] We're now out of btinternet usernames. 4 hard cases left, but that's it.
[09:00] hurrah
[09:01] Haha I was the last upload? \o/
[11:04] hm, how can I add a warc file to the internet archive?
[11:20] C-Keen: http://archive.org/create/
[11:20] alard: and then just upload the warc?
[11:20] Yes, make a new item.
[11:21] alard: will that get put in the wayback machine?
[11:21] Ah, that I do not know.
[11:21] Although if you upload it there and put it on SketchCow's list there might be a chance.
[11:22] alard: alright, will do
[11:30] alard: hm, should a website mirror that contains mainly educational texts and source code be put in the Community Texts collection? I am unsure what to pick here
[11:43] hah, uploaded my first item to the archive...
[11:53] I think Community Texts is the only collection you can pick. The others are protected (SketchCow can move items).
[12:00] errrr
[12:00] why has my webshots stopped generating new processes
[12:00] Oh, restarting project ¬_¬
[13:41] I HAVE to start heading south.
[13:42] But we just had a CDX derive fail off of a megawarc generator.
[13:43] alard: http://www.us.archive.org/log_show.php?task_id=127674813
[13:46] SketchCow: Is the original tar somewhere?
[13:47] Sadly, no.
[13:47] I should have left it.
[13:47] root@teamarchive-1:/2/CITY# ls
[13:47] BOARDS-COH-01.tar.megawarc.json.gz BOARDS-COH-02.tar.megawarc.warc.gz BOARDS-COH-04.tar.megawarc.tar
[13:47] BOARDS-COH-01.tar.megawarc.tar BOARDS-COH-03.tar.megawarc.json.gz BOARDS-COH-04.tar.megawarc.warc.gz
[13:47] BOARDS-COH-01.tar.megawarc.warc.gz BOARDS-COH-03.tar.megawarc.tar megawarc
[13:47] BOARDS-COH-02.tar.megawarc.json.gz BOARDS-COH-03.tar.megawarc.warc.gz
[13:47] BOARDS-COH-02.tar.megawarc.tar BOARDS-COH-04.tar.megawarc.json.gz
[13:47] I'm going to gunzip one myself while I get dressed here.
[13:48] I'm downloading that failed .warc.gz now, but that will take a while.
[13:48] I wouldn't do that.
[13:48] root@teamarchive-1:/2/CITY# gunzip BOARDS-COH-01.tar.megawarc.warc.gz
[13:49] More critical, MUCH more critical, is http://www.us.archive.org/log_show.php?task_id=127728961
[13:49] Watch it, and if it fails, THEN we have some testing to do.
[13:51] http://www.us.archive.org/log_show.php?task_id=127728688 or this one.
[13:51] That'll happen faster.
[13:52] These are converted tars? Or original megawarcs?
[13:52] Converted tars.
[13:52] Wait no
[13:52] I took three different tars, uncompressed them into a file directory.
[13:53] Then megawarc'd the file directory
[13:54] greetings all! is there a way to set the upload speed? This 0.4kB/s is ridiculous....
[13:55] The BOARDS-COH-01 too? That would be strange, since it contains a directory, and the pack option isn't supposed to add directories.
[13:55] dragondon: Which project?
[13:55] webshots
[13:55] Does it say CurlUpload?
[13:55] yes
[13:56] Hmm. No, there isn't any limit.
[13:56] I did see higher speeds earlier but now it's dragging...
[13:56] been doing so for a few hours now
[13:58] alard: So:
[13:58] if http://www.us.archive.org/log_show.php?task_id=127728688 doesn't work, warning sign.
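(An aside on the gunzip test above: fully decompressing a 50 GB megawarc just to verify it doesn't require writing the output anywhere. A minimal Python sketch of a gunzip -t style check — the file name is just the one from the listing above:)

    import gzip

    def gzip_test(path, chunk_size=1 << 20):
        # Stream through every gzip member, discarding the output.
        # GzipFile verifies each member's CRC32 and length as it reads,
        # so a truncated or corrupted member raises an exception here.
        with gzip.open(path, "rb") as f:
            while f.read(chunk_size):
                pass

    gzip_test("BOARDS-COH-01.tar.megawarc.warc.gz")  # raises on corruption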
[13:58] http://www.us.archive.org/log_show.php?task_id=127728961 is the critical one.
[13:58] If that doesn't work, we have real issues; that's a webshots generator.
[13:59] I have to start driving to NYC now.
[14:00] If it doesn't work, for whatever reason (webshots), underscor needs to go back to .tar generation until we figure it out.
[14:00] Otherwise, just assume BOARDS-COH is me doing something fucked up
[14:00] Yes. (I can't reach archive.org now. www.us.archive.org works.)
[14:00] see
[14:00] i'm not crazy, i couldn't get to archive.org
[14:00] same here (South Korea): "Iceweasel can't establish a connection to the server at archive.org"
[14:01] i got the same error
[14:01] :-D
[14:01] I can ping it though
[14:01] it's not just me
[14:01] Just alerted them
[14:03] The last gzip record in BOARDS-COH-01.tar.megawarc.warc.gz is fine.
[14:04] As is the first.
[14:04] with this version of the VM, will it lose everything if I force the machine to shut down? I need to figure out some hardware issues here.
[14:05] Yes.
[14:05] I'm hoping that future updates will have a buffer to prevent that :)
[14:05] http://www.us.archive.org/log_show.php?task_id=127728961 - task failed.
[14:06] Luckily (?) it's a mysql error.
[14:06] dragondon: Wget can't resume, and upload resuming is complicated, so it's unlikely.
[14:06] :(
[14:07] alard, is there no way to generate the files first, then send, and have some sort of check/confirm/resume?
[14:08] It's complicated, and you don't have to restart that often. Resume support would also complicate error recovery (if the warrior has a problem now, you can reboot and start again).
[14:09] it's not a warrior issue; for some reason my system is reporting only half my physical memory.... kinda don't like killing all the work it did, hence why I was asking about speed mods. Guess I'll have to force a restart
[14:26] the sopa item failed
[14:28] in both of these cases, I generated it from a set of directories.
[14:28] I suspect there's an undetected invalid warc in there.
[14:30] For webshots, I am going to suggest that we go back to generating large tar files.
[14:31] Yes.
[14:31] it sounds like we need to do a few additional tests.
[14:31] We didn't do enough.
[14:32] that is just because I think of you as an unstoppable code juggernaut
[14:34] however, there is a whole range of code you have absolutely no access to.
[14:37] It would be handy if these error messages included a byte position. That would make it easier to find the problem.
[14:43] I am in the car and can't look this up easily, but I do believe there is a public repository of all this code.
[14:45] If the gzip is invalid (that's what the error message suggests, at least) that just needs to be fixed. There's nothing wrong with the indexer.
[14:45] ./megawarc --verbose pack test.tar data/infiles/
[14:45] Checking data/infiles/bad.warc.gz
[14:45] Checking data/infiles/good.warc.gz
[14:45] Copying data/infiles/good.warc.gz to warc
[14:45] Copying data/infiles/bad.warc.gz to warc
[14:46] That's wrong: bad.warc.gz isn't complete (I chopped off the last 1000 bytes) so it shouldn't go in the warc.
[14:49] The megawarc gzip-testing doesn't work, it seems. (The good news is that the positions in the json are correct, so the current megawarcs can be repaired.)
[15:00] old versions are kept.
[15:00] For webshots?
[15:11] all I had not to sure did wait 1 moment
[15:12] You shouldn't text while driving. :)
[15:12] Watch out, SketchCow is using voice recognition
[15:12] let's try again.
Any archive that I was given are still in car for bad. The new batch was being tested, but we have not fully committed to it, instead we are just feelings disks on the round robin machine.
[15:13] Sure.
[15:14] we were going to suck my nuts off
[15:15] I let that 1 go because what I said was sign off.
[15:15] obviously, voice recognition has a way to go
[15:17] although, if something with my computer end up sucking my nuts off, dad hey, what's a little problem here and there with voice recognition?
[15:18] maybe that's how the algorithm got the job in the first place
[15:29] Fatal error: Uncaught exception 'Exception' with message 'WARNING-OR-ERROR: [2] [mysql_connect(): Too many connections] [/usr/local/petabox/www/common/DB.inc] [269]' in /usr/local/petabox/deriver/derive.php:46
[15:29] Stack trace:#
[15:29] It died.
[15:29] Though that doesn't appear to be an issue with the megawarc itself, which seems good.
[15:42] I think it works better now:
[15:42] Checking data/infiles/bad-extra.warc.gz
[15:42] CRC check failed 0x5cdcbe41 != 0x30788a20L
[15:42] Invalid gzip data/infiles/bad-extra.warc.gz
[15:42] Copying data/infiles/bad-extra.warc.gz to tar
[15:42] Checking data/infiles/good.warc.gz
[15:42] Copying data/infiles/good.warc.gz to warc
[15:42] Checking data/infiles/bad.warc.gz
[15:42] CRC check failed 0xdcbe4175 != 0xc21fb9ffL
[15:42] Invalid gzip data/infiles/bad.warc.gz
[15:42] Copying data/infiles/bad.warc.gz to tar
[15:42] https://github.com/alard/megawarc/commit/fb0ba014ff4df76411cdd426a15764695a33c59e
[15:51] oh look
[15:51] http://catalysthost.com/clientarea/cart.php?gid=4
[15:51] :P
[15:51] >Unmetered 1gbit
[17:02] for $7 a month? doesn't sound like a bad deal
[17:21] SketchCow: http://www.us.archive.org/log_show.php?task_id=127728961 failed due to DB problems, rerunning.
[17:22] (DB problems that are unrelated to megawarc)
[17:25] oh shit, boxes are almost full
[17:25] better start bailing
[17:26] underscor: herp
[17:26] weren't they already doing so :S
[17:27] hm?
[17:27] There
[17:27] Why are they running outta room?
[17:27] :/
[17:27] Oh
[17:27] There's no auto-ingest to IA
[17:27] Do they not automatically pump to IA?
[17:27] Ah ok.
[17:28] Jason (and I) want a human to eyeball them
[17:28] For now
[17:28] Understandable.
[17:28] So, do YOU work at IA?
[17:31] all of my theregister.co.uk warc dumps are up
[17:31] i have not done 2011 yet
[17:31] but it's up to 2010
[17:31] which is all i have right now
[17:40] SmileyG: Yeah
[17:40] I'm part time, though
[17:40] o
[17:40] (I'm a student the rest of the time)
[17:40] Still, awesome.
[17:40] In upstate NY
[17:40] hehe, thanks :D
[17:41] IA should have some more DCs ;)
[17:41] Like one in coventry, hahah here, it's cheap (yeah right)
[17:43] looks like that megawarc has gz issues as well
[17:48] Awww
[17:56] There must be quite a few invalid warc files then.
[18:05] alard: I thought megawarc checked them out?
[18:06] :<
[18:06] hmmm this worries me
[18:06] Is there a way to check the validity of a gz on the command line?
[18:06] (besides just extracting it)
[18:12] gunzip -t file.tar.gz
[18:12] thx
[18:12] alard: should we switch to tars for now, or what do you think?
[18:12] for test :)
[18:13] also hmmm
[18:13] if you're worried about the tars, you can check them too
[18:13] gunzip -c file.tar.gz | tar t > /dev/null
[18:13] I am beginning to get close to drowning, so I need to figure out the exit strategy
[18:13] herp
[18:14] (people should save slower!)
[18:14] xD
[18:14] can you just do what the warrior would do?
[18:14] but just the upload bit (and direct it to FOS/IA)?
[18:14] You said there's... 12? servers?
[18:16] SmileyG: No, no, I *am* fos/IA
[18:16] * underscor is the servers warriors are uploading to
[18:16] Those servers are nearing full
[18:16] yeah
[18:16] but originally all the warriors were uploading to 1 location, which is now a number of locations?
[18:16] FOS is full/not accessible for this project
[18:16] Yes
[18:16] What was the plan for the original server?
[18:17] It was to upload tars, which had been happening
[18:17] can you not replicate that process over to the other servers?
[18:17] now the plan was doing the megawarcing with the script from alard, which I've been doing
[18:17] but if they're corrupt, then maybe we should go back to tars for now
[18:17] yeah
[18:18] I may just make a Command Decision(tm) since SketchCow is on the road
[18:18] and deal with the fallout later
[18:19] well if you don't, all archiving basically stops
[18:19] unless you've got his number?
[18:19] yeah, I may call him after class
[18:19] http://p.defau.lt/?Fy8RdcZOojTsFPlt6Yyzcg
[18:19] uh oh
[18:19] cc alard
[18:23] alard: I thought megawarc checked them out? <-- I think it did, but the check wasn't working? (until he fixed it just now)
[18:23] https://github.com/alard/megawarc/commit/fb0ba014ff4df76411cdd426a15764695a33c59e
[18:26] aha
[18:34] gunzip -t webshots-20121012070021.megawarc.warc.gz
[18:34] gzip: webshots-20121012070021.megawarc.warc.gz: invalid compressed data--crc error
[18:34] sigh
[18:35] so I guess this one is fucked
[18:38] sigh
[18:38] gunzip -t webshots-20121012070358.megawarc.warc.gz
[18:38] gzip: webshots-20121012070358.megawarc.warc.gz: invalid compressed data--crc error
[18:40] Rebuilding using new code
[18:40] alard: what does the script do if it encounters a "bad" .warc.gz?
[18:44] DFJustin: Where did alard post that link, btw?
[18:46] underscor
[18:46] do you not have history in here from earlier
[18:46] alard and SketchCow were talking about the corruption. i can paste in pm if you need
[18:47] woah, there we go
[18:47] what the hell, quassel
[18:48] how long does it take to rebuild :S
[18:49] S[h]O[r]T: Found it. Not sure what quassel was doing saying there wasn't more scrollback >:(
[18:49] SmileyG: Uh, I haven't timed them, actually
[18:50] I assume if a set passes gunzip -t, then it's probably safe to upload
[18:51] I *believe* so; the only better check is physically unpacking it and checking.
[18:51] which kind of negates the point.
[18:58] underscor: It turned out that the gzip check I had in megawarc didn't really check anything.
[18:59] So if there was an invalid warc, it was added to the big warc, which then became unreadable.
[18:59] I think it is fixed in the latest megawarc version (it works on my test files, at least).
[18:59] Is there a way to easily clean them from the json?
[18:59] Ouch.
[19:00] Before that fix, SketchCow suggested that we keep using tar until the megawarc is somewhat more stable and tested.
[19:00] Yes.
[19:00] schweet
[19:00] The positions of the warcs in the json are correct.
[19:00] So it's possible to untangle them.
[19:01] So it might be an idea to keep using the latest megawarc script for webshots.
[19:01] I think it works, it's a good test. We also don't lose data if it does not, it just means rebuilding things.
[19:02] (To answer your question about what happens to the invalid gzips: they're added to the tar file.)
[19:02] -Die in a Fire ?
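(For context: a sketch of the kind of per-file check the patched megawarc needs — walk the gzip members one at a time and reject any file whose members don't decompress cleanly to the end. This illustrates the idea and is not megawarc's actual code; it also reads the whole file into memory, which is fine for single-user warcs but not for 50 GB megawarcs:)

    import zlib

    def is_valid_gzip(path):
        # Decompress each gzip member in turn; zlib raises on CRC
        # mismatches, and a member that stops before end-of-stream
        # is truncated.
        with open(path, "rb") as f:
            data = f.read()
        try:
            while data:
                d = zlib.decompressobj(16 + zlib.MAX_WBITS)  # gzip wrapper
                d.decompress(data)
                if not d.eof:
                    return False          # truncated member
                data = d.unused_data      # next member starts here, if any
        except zlib.error:
            return False                  # bad CRC or malformed stream
        return True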
[19:02] ah, the "extras" tar file
[19:02] ?
[19:03] Yes. So if the tar file is not empty, that means there were things that couldn't be saved in the warc.
[19:03] What about the ones that say "extra field of 10 bytes ignored"?
[19:04] (ones = warc.gz, when testing with gunzip -t)
[19:07] Uploading the first new set
[19:08] That's the warc format: it has an extra gzip field with the length of the compressed warc record.
[19:08] hmmm
[19:08] gzip patch needed at some point then? :S
[19:08] That's handy if you want to skip through the warc, but the gzip utility doesn't know how to use it.
[19:08] Well, it does what it says: it sees an extra field and ignores it.
[19:08] :D
[19:08] least it doesn't blow up I guess
[19:09] Wonder if you can tell the test to ignore it (so it only raises errors on _real_ errors)
[19:09] I smell dinner.
[19:13] SmileyG: It still returns $? = 0
[19:13] so it's not really a big deal
[19:14] Is this a new one? http://www.us.archive.org/catalog.php?history=1&identifier=webshots-freeze-frame-20121012103518
[19:15] Yes
[19:15] Only the json is up though
[19:15] the warc is still uploading
[19:16] Ah. Was there a tar?
[19:17] 0 bytes
[19:17] So it's exciting to see if this one passes the test.
[19:17] It passed gunzip -t too
[19:17] underscor: Ah ok! I presumed it'd return some non-fatal error code
[19:17] but if it's not showing it other than in the stdout output.... no worries
[19:19] alard: Is the procedure to fix these to "create" the tar backwards, and repack, or will you be able to write a "fixme" thing? :)
[19:19] I think it will be a fixme thing.
[19:19] rad
[19:20] another one finished!
[19:20] -rw-r--r-- 1 abuie users 50G Oct 12 19:18 webshots-20121012070358.megawarc.warc.gz
[19:20] -rw-r--r-- 1 abuie users 103K Oct 12 19:18 webshots-20121012070358.megawarc.json.gz
[19:20] -rw-r--r-- 1 abuie users 388M Oct 12 19:18 webshots-20121012070358.megawarc.tar
[19:20] Hey, a tar.
[19:20] working and looking correct now?
[19:20] That's both good and bad news.
[19:20] Good for megawarc, bad for webshots.
[19:21] o_O
[19:22] We can extract out the "bad" users and requeue them, though, right?
[19:23] AH, the tars are failed users getting left over?
[19:24] SmileyG: Yeah
[19:24] The invalid warcs end up in the tar.
[19:24] Well, faulty warc.gz
[19:24] mhm
[19:25] We could make a list of the users that have made it to archive.org and compare that with the full list of users.
[19:25] But for the moment we have enough new users.
[19:26] Lots of limestone networks hosts, wonder who that is
[19:26] They're pumping a lot of data :D
[19:27] Is it Sue?
[19:27] She was saying she's gonna hit her cap shortly in #webshots
[19:29] alard: http://archive.org/catalog.php?history=1&identifier=webshots-freeze-frame-20121012103518 Here we go!
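(The extra field alard describes is an ordinary gzip FEXTRA block, as defined in RFC 1952: after the fixed 10-byte header come subfields, each with a two-byte ID, a two-byte little-endian length, and a payload; warc writers use one to record the compressed length of each record so a reader can hop from record to record. A rough Python sketch of parsing the subfields of the first member of a .warc.gz — what a given writer puts inside each subfield payload is left as an assumption:)

    import struct

    def gzip_extra_fields(path):
        # Return the FEXTRA subfields of the first gzip member as
        # {subfield_id: payload}, or {} if the member has no extra field.
        with open(path, "rb") as f:
            header = f.read(10)
            if len(header) < 10:
                return {}
            magic, method, flags = struct.unpack("<HBB", header[:4])
            if magic != 0x8b1f or not flags & 0x04:  # not gzip / no FEXTRA
                return {}
            xlen, = struct.unpack("<H", f.read(2))
            extra = f.read(xlen)
        fields, pos = {}, 0
        while pos + 4 <= len(extra):
            subfield_id = extra[pos:pos + 2]
            length, = struct.unpack("<H", extra[pos + 2:pos + 4])
            fields[subfield_id] = extra[pos + 4:pos + 4 + length]
            pos += 4 + length
        return fields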
[19:31] wooooo
[19:31] * chronomex parties
[19:34] ugh, moving 50GB takes so long
[19:39] http://www.us.archive.org/catalog.php?history=1&identifier=webshots-freeze-frame-20121012070358 is getting its replacement uploaded
[19:42] schweet
[19:42] Every box is now megawarcing
[19:43] Although these take quite a bit of time
[19:43] Wonder if I can keep up with the inflow
[19:43] INTERFLOW
[19:44] http://ia600109.us.archive.org:8088/mrtg/networkv2.html http://ia601104.us.archive.org:8088/mrtg/networkv2.html http://ia700106.us.archive.org:8088/mrtg/networkv2.html
[19:44] Y'all have been keeping them pretty nice and busy
[19:49] - Downloaded 18400 URLs - got another nice one :D
[21:02] http://www.us.archive.org/log_show.php?task_id=127779327 It's cdxing now!
[21:02] Cross fingers!!!! :D
[21:12] The CDX indexer is already running twice as long as the previous time.
[21:12] 's a good sign :D
[21:15] hooray.
[21:15] \o/
[21:15] * SmileyG waits for underscor to start groveling.
[21:15] yay, Jason's back
[21:16] 25 seconds!
[21:16] I have like 2TB processing
[21:16] :P
[21:16] megawarcing takes a fair bit of time/work, though
[21:16] still can't quite tell if I'm filling faster than I'm dumping
[21:17] hope not :S
[21:17] Also, just the sheer (super awesome!) scale of moving 50gb bricks is... interesting
[21:17] :)
[21:20] [6~[6~[6~[6~[6~[6~[6~[6~
[21:20] what he said.
[21:23] The voice recognition gets more and more interesting.
[21:25] hahahha
[21:25] It was better when it was sucking his nuts off or whatever
[21:30] so, status update please
[21:31] this android ssh client has no pgup
[21:32] also. comiccon is hell on earth.
[21:34] SketchCow: We (think) we patched the bugs
[21:35] Test derive is still running
[21:35] but it got further than any of them have
[21:35] so (probably) good
[21:35] I have like 2.5TB to ingest
[21:35] once we see how this goes
[21:44] Looks good? http://www.us.archive.org/log_show.php?task_id=127779327
[21:47] alard: you are the freakin' man
[21:47] we need to set you up on gittip :D
[21:48] i am so jeli
[21:49] SketchCow: IT WORKED IT WORKED IT WORKED
[21:50] This isn't actually that much better than before: there's no tar with invalid warcs. That already worked.
[21:50] http://archive.org/details/webshots-freeze-frame-20121012173401 the latest one lacks a warc?
[21:50] SmileyG: still uploading
[21:51] alard: oh
[21:51] o
[21:51] :D
[21:51] find one where the problem is fixed versus before.
[21:51] sopa is a good one
[21:51] alard: 3 finished, no tar
[21:51] :D
[21:52] The two that crashed with the zlib.error problem should have tars.
[21:52] I just had to restart them
[21:52] yes, those haven't finished
[21:52] I'm now testing my fix script on the sopa files.
[21:52] (Takes a while.)
[21:53] Ah, this will be the one that lets us fix a bad megawarc.warc.gz?
[21:53] underscor. ask hank when the last load-in of the wayback happens, please.
[21:54] Yes. It reads the megawarc, checks every warc.gz, sorts them into new warc/tar files, and saves the locations in a new json file.
[21:54] It worked on my tiny test file, but 15GB takes a little longer.
[21:54] alard, call it megarepair and add it to the repository. :)
[21:58] Too late, it's already called megawarc-fix. It's now in the repository.
[22:03] SketchCow: asking
[22:04] alard: pulling :D
[22:04] you're amazing
[22:04] Might need some testing first, though.
[22:11] Hmm. Apparently not every tar header is exactly 512 bytes long.
[22:12] There's a 'gnu tar' type that has headers 1024, 1536, etc. bytes long, if there are long filenames.
[22:12] As there are in the SOPA file.
[22:35] -rw-r--r-- 1 abuie users 50G Oct 12 21:37 webshots-20121012183139.megawarc.warc.gz
[22:35] -rw-r--r-- 1 abuie users 73K Oct 12 21:37 webshots-20121012183139.megawarc.json.gz
[22:35] -rw-r--r-- 1 abuie users 639M Oct 12 21:37 webshots-20121012183139.megawarc.tar
[22:35] alard: One of the ones with tars finished, fyi
[22:35] Uploading now
[23:20] So, which data is *only* available in a faulty megawarc?
[23:20] The SOPA megawarc is really hard to fix, since it has these ridiculously long file names.
[23:23] So I'd like to suggest that we 1. make new megawarcs from scratch, and test with gunzip -tv / tar -tv / megawarc restore, if we have the original data; and 2. use megawarc-fix to fix the megawarcs that we don't have in another form, such as webshots.
[23:23] alard: I'm currently running the fixer on 20121012070021
[23:23] webshots doesn't have long filenames, so the fixer should work for those files.
[23:27] There are no long filenames in 20121012070021, so that should work, I hope: curl -s -L http://archive.org/download/webshots-freeze-frame-20121012070021/webshots-20121012070021.megawarc.json.gz | gunzip | grep LongLink
[23:28] CoH can also be fixed: curl -s -L http://archive.org/download/archiveteam-city-of-heroes-forums-megawarc-1/BOARDS-COH-01.tar.megawarc.json.gz | gunzip | grep LongLink
[23:29] But SOPA cannot: curl -s -L http://archive.org/download/archiveteam-sopa-blackout/2012-sopa-day-collection.megawarc.json.gz | gunzip | grep LongLink
[23:35] I'll defer to SketchCow before fixing CoH, but I assume he'll want it to be.
[23:56] how are they corrupt?
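(Postscript: the repair alard describes comes down to re-reading the broken megawarc at the offsets recorded in its .json.gz, re-testing each gzip member, and routing good members into a clean warc and bad ones into a quarantine file for requeueing. A sketch of that idea, assuming one json object per packed file; the "offset" and "size" field names are illustrative guesses, not necessarily megawarc's actual schema:)

    import gzip, io, json, zlib

    def split_megawarc(warc_path, json_path, good_path, bad_path):
        with gzip.open(json_path, "rt") as meta, \
             open(warc_path, "rb") as big, \
             open(good_path, "wb") as good, \
             open(bad_path, "wb") as bad:
            for line in meta:
                entry = json.loads(line)
                big.seek(entry["offset"])         # assumed field name
                member = big.read(entry["size"])  # assumed field name
                try:
                    # A full read validates the member's CRC.
                    with gzip.GzipFile(fileobj=io.BytesIO(member)) as g:
                        while g.read(1 << 20):
                            pass
                    good.write(member)
                except (EOFError, OSError, zlib.error):
                    bad.write(member)             # quarantined for requeueing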