[00:02] 100mbpt
[00:02] but had 2 or 3 systems going at various times
[00:02] right now 3 at a time
[00:02] so about 300mbit :P
[00:05] Aah, I see. Still that's some really good throughput. I've been getting...looks like about 500GB/week with my 100Mbit connection.
[00:07] not running 24/7 ?
[00:47] No, running pretty much 24/7 with three clients.
[01:02] spending a lot of time waiting for wget to figure out what it wants to do
[01:03] I really should invest in a faster vps for memac...
[01:11] yes overtook Coderjoe :D
[01:11] and hi underscor
[01:11] underscor: check status board,
[01:11] 810GB here in 48 hours or something :p
[01:11] bastard. :P
[01:11] ahah
[01:11] woo
[01:12] i ordered another dedicated box but that prob wont be deployed till monday
[01:12] or tuesday if people are on holidays then i guess :(
[01:12] just a little too much capsaicin in that sub
[01:12] haha
[01:12] hahaha
[01:13] btw for those running their own servers: EasterBunny2012
[01:13] use that promocode with pingdom
[01:13] free business account for a year
[01:14] oh wait, appears it might be over :(
[01:14] bah
[01:14] freeze warning from the NWS
[01:14] NWS?
[01:15] national weather service
[02:03] Jeez. At this rate I'll be lucky to stay in the top ten.
[02:35] FYI, we're up and running in #fanfriction
[04:42] I'M EIGHTEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEN
[04:42] :D
[04:42] Many Happy Returns! Now get out there and vote.
[04:43] Also, Alice Coppers, "I"m 18", just sprang into my head. I'm much too old.
[04:44] hahah
[04:44] :D
[05:25] If I kill one of these curl processes that's stuck in the giant retry loop, what happens?
[05:49] underscor: 'grats; now go enlist :D
[05:49] (kidding!)
[05:50] haha
[05:50] thanks
[05:51] shaqfu: no, it's a federal law that you have to ...
[05:51] chronomex: For the draft, yeah
[05:51] yus
[05:51] (don't get me started on the draft...)
[05:51] <3
[05:52] I think I'm the one person on the planet in favor of it
[05:54] Oh, you can also own guns now! Depending on jurisdiction
[06:49] Wyatt|Wor: i think that the file(s) it is trying to upload never get uploaded and potentially marked as complete anyway
[06:51] Ah. Because I've had it happen once before that it went through a hundred failures and then, as far as I could tell, after that it started a new curl process that worked fine on the first try.
[06:51] Was it just a coincidence?
[06:51] alard would be able to tell you for certain
[07:18] Are you still getting 503? Are we just hitting IA too hard?
[07:18] i'm not using s3
[07:18] Ah
[07:19] http://archive.org/catalog.php?history=1&identifier=archiveteam-mobileme-hero-444x
[07:19] the queue on that has dissipated, though
[07:19] still lists the "server readonly" item, though
[07:23] Ah, I see, looks like I've got a job over here: http://archive.org/catalog.php?history=1&identifier=archiveteam-mobileme-hero-480x
[07:24] I suppose it just got backed up.
[09:41] In case anyone wants to help the world by reporting spam... http://www.youtube.com/watch?v=7NdGGU19M1M
[11:05] yes
[11:05] made it into top 10
[11:05] lol
[11:40] wolkommen Woet
[11:41] hello
[11:41] :D
[11:56] nice to have woet on board
[11:57] he is smasshing it
[12:02] http://memac.heroku.com/
[12:02] am I doing it right?
[12:03] is there any documentation for doing trackers?
[12:09] woet yes you are :D
[12:16] emijrp: Doing trackers?
[12:16] that app Woet linked
[12:17] emijrp: http://www.archiveteam.org/index.php?title=MobileMe
[12:19] emijrp: This one? https://github.com/ArchiveTeam/universal-tracker
[12:20] maybe
[12:21] Well, that's what memac, focity, splinder are running on. And the new fanfiction tracker.
[12:23] can it run in any server?
[12:23] Yes, if you can run a Ruby rack application.
[12:24] The live updates use node.js. The whole thing needs Redis.
[12:24] we need that thing in #wikiteam who can help?
[12:25] I could help you with the configuration.
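Going only by the details mentioned in the exchange above (a Ruby Rack application, Redis, node.js for the live updates), standing up the tracker on your own server would look roughly like the sketch below. The exact steps, the port, and the node script name are assumptions rather than the project's documented setup; the repository's README is the authority.

    # grab the tracker code (repository linked above)
    git clone https://github.com/ArchiveTeam/universal-tracker.git
    cd universal-tracker

    # the app needs a reachable Redis instance (local default port assumed here)
    redis-server &

    # Rack applications are normally run with Bundler + rackup
    bundle install                  # install Ruby gem dependencies
    bundle exec rackup -p 8080      # serve the Rack app (port choice is arbitrary)

    # live updates come from a separate node.js process; this script name is a guess
    node live-updates.js &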
[13:18] I didn't run anything on Fortress for a day.
[13:18] Within that say, 51% of the drive with mobileme sets filled.
[13:18] So that... that's a party.
[13:20] Actually, I forgot Fortunecity is on there.
[13:20] That's the next question - curating Fortunecity.
[13:22] how're you SketchCow ?
[13:23] They gotta put a valve on my sexy so it doesn't explode
[13:26] neat
[13:42] i found some good things when mirror pdfs from cpu magazine
[13:43] 1. 2012 and 2011 pdf torrents maybe from cpu magazine
[13:43] same md5sum on may 2012
[13:43] and one from 2011
[13:45] 2. cpu magazine website hosts 5 magazines
[13:46] comsumer electronics tips, first glimpse, pc today, smart computing, and refernce series
[13:46] *reference series
[13:57] Boy, I wish instead of discussing ways to improve access to at-risk digital works and coming up with ways to spread the message on the need to back things up, this was a 24-hour anaemic version of #bookwarez
[14:05] Also fun: I think we're breaking archive.org's s3
[14:05] I ALMOST, but did not delete 2tb of uploaded files that were not actually uploaded
[14:06] Which means I get to add a brand new step in uploading, where I later run a script that checks to see if a file exists on the server, the final one, and THEN deletes.
[14:06] shouldn't use some sort of snapshot file system
[14:07] Try that again using sparkling, clear english.
[14:07] something like zfs
[14:07] it has snapshot support i think
[14:08] Were you trying to say "Shouldn't you use some sort of snapshot file system?"
[14:08] Or were you saying "You shouldn't use some sort of snapshot file system?"
[14:08] Or "You shouldn't use a snapshot file system!"
[14:08] i mean should have
[14:10] Isn't it enough to check the result of the curl upload?
[14:10] No.
[14:10] No, no it's not.
[14:11] It can still go wrong if it says 200 - thanks for your upload?
[14:11] YEs.
[14:11] That's awkward.
[14:11] Rarely, but yes.
[14:12] Like, enough for me to want to add that extra step, that last verification.
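The "last verification" step being described - only delete the local copy after confirming the file really exists on the server - could look roughly like this sketch. The item/filename layout and the use of a plain HEAD request against the archive.org download URL are assumptions, not SketchCow's actual script.

    #!/bin/sh
    # verify-then-delete.sh <identifier> <filename>   (hypothetical arguments)
    ITEM="$1"
    FILE="$2"

    # HEAD request only; a 200 means the final file is actually readable on the server
    STATUS=$(curl -s -I -o /dev/null -w '%{http_code}' "https://archive.org/download/$ITEM/$FILE")

    if [ "$STATUS" = "200" ]; then
        echo "server copy confirmed, removing local $FILE"
        rm -- "$FILE"
    else
        echo "server returned $STATUS for $FILE, keeping local copy" >&2
    fi

Comparing the remote Content-Length against the local file size would be a natural extra check on top of the status code.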
[14:16] And on top of that, a number of failures on the hardware side.
[14:16] Just where I want to get on top of that, we have so much data coming in now.
[14:36] blame Woet if he breaks it
[14:36] he has a few gbits going to it now i think, lol
[14:37] overtook db48x :D
[14:46] #WikiTeam needs more volunteers to download 10,000 wikis. TaskForce http://code.google.com/p/wikiteam/wiki/TaskForce
[14:48] How is URLTeam coming along?
[15:16] We didn't really break s3, IA is having network issues
[15:16] and large packet loss between the s3 headends and the datanodes
[15:16] ok, thanks for the update
[15:16] so s3 backed up and caused "issues"
[15:17] (they're working on it)
[15:36] almost 100TB of memac done :D
[17:08] Where can I find the Wikileaks cables set? TPB doesn't seem to have it
[17:09] (probably putting myself on all sorts of watchlists asking...)
[17:58] hey everyone
[17:59] how do you guys get past a robot?
[18:03] i usually turn on RADAR ransparency
[18:03] *T
[18:06] pull the circuit breakers
[19:20] shaqfu: http://archive.org/details/wikileaks-cables-csv
[19:58] http://www.facebook.com/photo.php?pid=11709588&l=e63df9b592&id=517581288&_fb_noscript=1
[19:58] godane: if you want a real answer, ask a better question ;)
[20:03] mmm, the smell of pipes being flooded with data
[20:08] DFJustin: Thanks
[20:08] Need to research work
[20:45] http://arstechnica.com/gaming/news/2012/04/what-ever-happened-to-the-american-arcade.ars
[20:53] Demographics, cost, taste
[20:53] it died already back in 1985 or so
[21:07] Schbirid: i'm trying to get a lot at files on computerpoweruser.com
[21:08] but articles/pdfmagazine path is blocked
[21:08] in robots.txt
[21:08] i can download files once i know the name of it
[21:09] but i don't know if i'm missing anything in there
[21:11] you know i've never understood the point of robots.txt outside of seo
[21:13] godane: what are you using, wget?
[21:14] i tryed wget
[21:14] the -e robots=off doesn't work
[21:14] i'm also getting 403 error
[21:15] *403 forbidden error
[21:16] try changing your user agent
[21:16] "eat delicious poop" is a popular one
[21:16] or for greater transparency
[21:16] "archiveteam/from_the_future"
[21:17] still 403 error
[21:17] hrm
[21:17] do you need to send a referer?
[21:18] yes
[21:19] i use --referer and it did nothing
[21:23] wget: unable to resolve host address 'www.computerpowermagazine.com'
[21:33] 23:33:37 up 1 day, 10:44, 28 users, load average: 20.67, 17.40, 13.79
[21:33] :(
[21:42] lol one of my wget threads
[21:42] went nuts
[21:42] and ate 22GB of RAM :(
[21:43] wget sucks sometimes
[21:49] wget sucks ram :)
[22:08] godane: example url?
[22:09] www.computerpoweruser.com/articles/pdfmagazine/
[22:11] what would be a referrer?
[22:11] and what are the filenames?
[22:12] add good/cpu_0406.pdf
[22:13] you will get june 2004 pdf of cpu magazine
[22:13] ok, i am downloading it just fine
[22:13] its fine all the names thats the problem
[22:13] cause some have weird names
[22:15] do you know those names?
[22:15] not really
[22:16] i have seen some with R___0803.pdf on smart computing site
[22:16] some https://encrypted.google.com/search?q=site%3Awww.computerpoweruser.com+inurl%3Apdfmagazine+inurl%3Agood&btnG=Search&hl=en&gbv=1&prmd=imvns&filter=0
[22:16] oh
[22:17] never did that before
[22:17] :)
[22:17] its still not every url
[22:19] smart computing is worser then cpu magazine
[22:20] it disables everything in the url root
[22:20] good news is i got all pdfs that i could get
[22:23] :)
[22:23] good night
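Pulling together the wget flags that were being tried in the computerpoweruser.com discussion above - ignoring robots.txt, changing the user agent, and sending a referer - gives roughly the command below. The user-agent and referer values are only examples drawn from the conversation, and as godane found, some servers return 403 regardless of these headers.

    # fetch one known PDF while ignoring robots.txt, with an explicit UA and referer
    wget -e robots=off \
         --user-agent="archiveteam/from_the_future" \
         --referer="http://www.computerpoweruser.com/" \
         "http://www.computerpoweruser.com/articles/pdfmagazine/good/cpu_0406.pdf"

When the filenames themselves are unknown, a search-engine query like the site:/inurl: search linked above is one way to enumerate at least some of them.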