[00:45] http://owely.com/215sMqH [01:06] http://ar.to/2010/01/set-your-code-free [05:07] i just found this on reddit: http://lancesbrewerytour.com/the-beer-autism-hope-miracle/ [05:16] what is it, godane? [05:25] i got it from here: http://www.reddit.com/r/wroteabook/comments/26mxkr/56_yr_old_beer_historian_w_autism_writing_his/ [05:26] its a beer historian with autism thats writing his first book [05:27] its also on indiegogo: https://www.indiegogo.com/projects/help-save-beer-autism-hope-a-miraculous-documentary-book-and-cause [05:36] ah [06:04] #justouttv is now active [06:04] warrior-active, that is [06:05] and we've almost pushed half a TB in, uh [06:05] 4 hours? [06:05] 3 [07:24] i'm starting to upload commandN episodes [07:29] yipdw: is there a quick way to get DigitalOcean running a warrior image? [07:30] voltagex: we have a Docker image for the Warrior @ https://index.docker.io/u/archiveteam/warrior-dockerfile/ [07:30] I'm not sure if DO does Docker stuff natively, but that's one possibility [07:30] I guess it's not IO-bound so that's a possibility [07:30] otherwise setting up the manual-run environment is not too bad [07:31] voltagex: https://www.digitalocean.com/community/articles/how-to-use-the-digitalocean-docker-application [07:31] what droplet size do you think would work? [07:31] the 1GB / 30GB / 2TB plan may work best [07:32] transfer is likely to be the thing that will be scarcest [07:33] I'm just using the 20GB ones and have 27% disk usage [07:36] any hint as to where the manual script is? [07:38] https://github.com/ArchiveTeam/justintv-grab [07:40] sorry, manual setup for the warrior I meant [07:40] http://tracker.archiveteam.org/ [07:53] bah, wget-lua won't build.
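[Editor's note: the Docker route discussed at [07:30] might look like the following sketch. The image name is taken from the chat; the detached mode and 8001 port mapping are assumptions based on the warrior's usual web-interface setup, not confirmed in the log.]

```shell
# Pull the ArchiveTeam warrior image mentioned in the chat
docker pull archiveteam/warrior-dockerfile

# Run it in the background; the warrior's web interface is assumed
# to listen on port 8001, so publish that to the host
docker run -d --name warrior -p 8001:8001 archiveteam/warrior-dockerfile
```

On a DigitalOcean Docker droplet this should be all that is needed before opening the warrior's web UI in a browser.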
[07:53] uploaded: https://archive.org/details/commandN.001 [08:58] this is so backwards http://resetthenet.tumblr.com/post/84330794665/the-reset-the-net-splash-screen [08:58] "take a stand for privacy" [08:58] "expose all your visitors to us" [08:59] to github actually [09:46] * schbirid slaps midas with a harddisk? [09:46] right in the face, reminder received [09:46] :D [09:47] schbirid: did you see the announcement on op-net? [09:47] they are removing the entire vps infrastructure [09:47] that's a fucking short notice [09:47] i am on a dedicated though [09:47] me too [09:48] but still, what the hell? yeah here is 1 month, and dont forget to backup all your shit [09:49] 10 days! [09:49] err 12 [09:50] oh yeah, 12. [09:50] fuck [09:51] another reason why i need to offload that jamendo stuff [09:51] rsync: write failed on "/mnt/backups/oneprovider/vzdump-openvz-100-2014_06_01-22_03_04.tar.lzo": No space left on device (28) [09:51] ;) [09:52] yessir, will work on that when im back home :X [09:52] :D [09:52] if you need money for the shipping, i'll raise some [09:52] oh no, shipping is already paid for :p [09:52] i just have to bring it to the post office [09:53] * schbirid slaps midas REALLY HARD [09:53] :P [09:53] probably the hardest thing to do :P /me lazy [09:53] it's ATLEAST 700 meters away from my house [09:53] (for you imperialists, thats 2296 feet) [09:54] don't you have a vehicle like this woman https://www.youtube.com/watch?v=9xH7InazGDY ? [09:55] hahaha [09:55] fucking owned.
[09:55] i sooo wish it was hd [09:55] i have the smartcar :p [09:55] to see her wararllglagllablling [09:55] :D [09:55] x [09:55] xD [10:12] if anyone is interested, thebox.bz's archive is in the wayback machine: https://web.archive.org/web/20130114075153/http://thebox.bz/forums.php?action=viewforum&forumid=26 [10:17] someone should get these - http://pjsho.ws/torrents/ (anndddd ia them with sexy meta data) [10:20] ohhdemgir: i'm grabbing it now [10:21] I have it, but I'm super lazy with meta data and that collection really deserves being done right [10:22] i'd almost join #archiveteam-lazy, but im too lazy to do that [10:22] I helped start seeding when that guy first set up the site, havent spoken to him in forever though I think my reddit habits put him off once he found my profile XD [10:22] hahaha [10:23] he seemed like the straight edge family guy type lol [10:26] fuck me, that's a lot of torrents [10:26] :p [10:27] soooo, I set this off - ftp://downloads.netgear.com/ - came back a few hours later, stalled, out of disk space, du -hs... 653GB.. mv to larger array, started again, bigggg site!! [10:27] heh [10:28] was expecting like 50GB or something [10:28] i've been pulling in this uni FTP, it's 6TB atm [10:28] warick? [10:28] lemme check [10:28] *warwick [10:29] ftp.tu-chemnitz.de/ [10:29] brb, lunch [10:30] I was pulling a gov one the other day and realised after 5TB it was going to be too big for the array and removing while taring wouldn't work given the size of some of the files..
dumped it moved on [11:01] it's currently 6666GB :P [11:01] yeah, probably going to dump it, too big, too slow for now [11:09] every time I hit big site I wonder what the admin thinks XD [11:14] i archive every site we make here [11:14] as an admin, im fine with it :p [11:19] i need help with curl [11:19] i need it to take files with spaces [11:20] the best i got with an upload script for the pearl jam bootleg is this: [11:20] curl: (3) [globbing] error: bad range specification after pos 74 [11:22] godane: those are just the torrents right? [11:22] yes but i just want to upload the mp3s [11:38] so i uploaded the torrent: https://archive.org/details/pj1990-10-22.off-ramp-cafe-seattle-washington-usa [11:38] but i don't see any mp3s [11:39] ok now its downloading the stuff [11:41] now i think it only gets the first file [11:47] i figured out my problem and reuploading here: https://archive.org/details/Pearl_Jam_Bootleg_1990-10-22 [11:48] i needed to replace spaces with %20 for it to work: http://s3.us.archive.org/$id/"$(basename "$file" | sed 's| |%20|g')" [11:49] Famicoman: that could be useful to you if you have a lot of files with spaces [12:17] https://archive.org/details/Pearl_Jam_Bootleg_1991-02-08 [12:17] https://archive.org/details/Pearl_Jam_Bootleg_1991-02-11 [12:26] godane, aww yiss!! [12:29] i'm also making it easier to get the item by making it a fixed name with just the date that changes [12:30] most items have very odd names [12:48] here is my collection of pearl jam bootlegs so far: https://archive.org/search.php?query=subject%3A%22Pearl%20Jam%20Bootleg%22&sort=-publicdate [12:56] godane, looking good [13:29] I'm kind of surprised that those weren't already on IA... Pearl Jam has always had a pretty friendly taping policy.
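[Editor's note: the space-escaping fix godane lands on at [11:48] can be reduced to a small shell sketch. The item identifier and filename below are made up for illustration; the sed substitution and the s3.us.archive.org URL shape are taken from the chat.]

```shell
# Spaces in a filename break naive URL interpolation: curl interprets
# unescaped brackets/ranges as globbing syntax, hence the
# "curl: (3) [globbing] error" seen above. Percent-encode the spaces first.
id="Pearl_Jam_Bootleg_1990-10-22"        # hypothetical IA item identifier
file="01 Release (Pearl Jam).mp3"        # hypothetical filename with spaces

encoded="$(basename "$file" | sed 's| |%20|g')"
echo "$encoded"

# The encoded name is now safe to interpolate into the upload URL
# (upload line shown for shape only; auth headers omitted):
# curl --upload-file "$file" "http://s3.us.archive.org/$id/$encoded"
```

Note this only encodes spaces, as in the chat; filenames with other URL-special characters would need fuller escaping (e.g. `curl --data-urlencode` or a real percent-encoder).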
[13:32] SadDM, same, I knew they wouldn't mind them being up [13:33] from here - http://pjsho.ws [13:33] next up, metallica [13:33] * midas hides [13:38] i'm also adding the original text files to descriptions [13:38] a lot of them were in flac but then converted to mp3 at 320kbps [13:42] midas, YESS!!!!! [13:43] i was watching this only last night - https://www.youtube.com/watch?v=mH6NkFG0aPI [13:43] which led me to download a ton of their concerts [14:00] they fucking hate the internet :p [16:12] anyone here wanna take over an ovh server of mine? expired today, i wont renew it. 17€/month, 200gb, 100mbit [16:12] poor cpu [16:13] very poor [16:13] You're not really selling it [16:18] :) [16:19] yeah, actually screw that. the KS-2 is better and cheaper [16:40] HAAARRRRRHGHGHGHGHGHGH [17:07] The Internet Archive now hosts 20 GB of 1000 people's DNA. You're welcome. https://archive.org/details/OpenSNP.org_-_2014-06-03 [17:08] schbirid: hmmmm do ovh filter anything? [17:08] * Smiley just needs to redirect his torrent *searches* [17:08] never had any trouble [17:08] basically a proxy. [17:08] I know they don't like torrent *traffic* which is fine. [17:08] did lots of torrenting and also ran a tor relay (after asking) [17:09] o_O you did? [17:09] they told me "no, no wai!" [17:09] (about torrenting) [17:10] only one private tracker for renting stuff but also a bit of seeding of free content [17:10] either way, I just need an offsite proxy, and maybe other things for fun. [17:10] i'd grab something super cheap then [17:10] woohoo! [17:10] another openssl vuln! [17:10] yeeyhee [17:10] schbirid: exactly [17:11] how cheap can we go? :) [17:11] hmmm ms exchange for 2.79 a month :D [17:11] I feel like the current conversation may be relevant to my interests [17:11] what's the topic? [17:11] keep in mind that super cheap hosts might be peeping perverts [17:11] joepie91: me doing torrent searches on virgin (who are now blocking... *things*.) [17:11] plenty of proxies, no?
[17:12] Smiley: http://fucktimkuik.org/ [17:12] joepie91: dunno. [17:12] in case you were looking for TPB [17:12] will redirect you to a random TPB proxy [17:12] it was set up in response to a block of TPB in NL [17:13] which is now gone and/or in the process of going away [17:13] Je wordt nu doorgestuurd naar een willekeurige TPB proxy. (https://tpb.partidopirata.com.ar/) <<< hmmm? [17:13] Smiley: "you are now being redirected to a random TPB proxy" [17:13] joepie91: oh ok cool [17:13] that one appears to be the pirate party of... argentina? [17:13] This way is more fun ;) [17:13] heh [17:14] anyway, there's also a number of proxies for other torrent sites that are blocked in the UK [17:14] can't choose "no os" from ovh :D [17:14] which kinda goes to show how useless such a block is [17:14] nod [17:14] i got foxyproxy and one such site setup [17:14] it kinda works :D [17:14] Smiley: you can boot an OVH box from an ISO, no? [17:14] I can't recall how their custom install process worked [17:15] no custom install option given D: [17:15] hm [17:16] £2/mo for a box, that's almost as cheap as most "private" proxy services lol [17:16] ah [17:16] Smiley: https://webcache.googleusercontent.com/search?q=cache:_PsKzGKLT10J:www.virtuallifestyle.nl/2009/04/installing-your-own-debian-os-on-an-ovh/+&cd=2&hl=en&ct=clnk&lr=lang_en%7Clang_nl [17:16] how to install custom OS on an OVH box [17:18] completely unrelated, EU court ruling: "libraries can digitize physical works without permission from the IP holder if it's necessary to protect the integrity of the physical copy (eg.
rare or old book that would suffer from frequent usage), but that doesn't mean they can digitize their entire collection" [17:18] ref https://tweakers.net/nieuws/96465/bibliotheken-mogen-in-eu-boeken-zonder-toestemming-digitaliseren.html?nb=2014-06-05&u=1500 [17:19] also an explicit ruling that a copy of materials in a browser cache does not infringe on copyright [17:30] godane, plenty more PJ uploading, you rock! [17:41] Asparagir: Wow! That's pretty amazing. [17:50] antomatic: Thanks! It's mostly partial gene data, not exome data, but that will change as consumer-targeted exome testing becomes more widespread in the future. [17:51] Someday we'll be using ArchiveTeam to back up humans to the IA. :-) [17:55] This is probably exactly what Max Headroom had in mind, all those years ago. :) [17:55] 'git clone' is going to take on a new meaning... [17:57] I could finally download a new family :D [18:00] um [18:00] Asparagir: what context did I miss :P [18:01] The Internet Archive now hosts 20 GB of 1000 people's DNA. You're welcome. https://archive.org/details/OpenSNP.org_-_2014-06-03 [18:01] From earlier, while you were busy talking about cheap boxes: The Internet Archive now hosts 20 GB of 1000 people's DNA. You're welcome. https://archive.org/details/OpenSNP.org_-_2014-06-03 [18:02] oh, damn [18:02] People translated as data... [18:02] wow [18:03] Someone please submit a patch so I can eat gluten again? :-P [18:03] That's not a fun bug to have. [18:05] Asparagir: DNA pull requests... this is simultaneously amazing and really fucking scary [18:08] joepie91: Yup. And if this is just the direct-to-consumer stuff, just think what the governments of the world can do that we don't know about yet. They were the first with the Internet, the first with good crypto, they'll be the first with fun genetic mods too. [18:08] khaaaaaan [18:09] China and Russia are going to be allllll up in this shit.
[18:15] yeah, those evil countries will [18:15] thank god for the us gov, literally god [18:18] Ha, no one's hands will be clean once this arms race is on. Which it probably already is. But China and Russia will have the advantage in that they don't have to pretend to be moral. :-P [18:22] i can think of a few other countries that fit this definition [19:26] https://github.com/joepie91/python-whois/issues/19 [19:26] record time resolution :D [19:33] joepie91: btw, I think one thing you could do that would be of tremendous benefit to archivebot would be to flesh out integration tests for its CI build [19:33] um... :P [19:33] i.e. start an ircd, start the bot up, archive a test site, verify some basic properties [19:33] yipdw: let's just say that that's not my forte [19:33] I meant to do that, and started a Travis manifest for it [19:33] but [19:33] well [19:34] that's one reason why I have not done much work on it; the other is that I have 3 other things to do atm [19:34] it has gotten to the point where it really needs something like it [19:34] or, well [19:34] let me put it this way [19:34] okay, so, the relation between me and testing is as follows: I test any implementations manually, in pretty much all possible permutations, and only implement automated tests for critical components (think crypto) or things that are unreasonably likely to break (think WHOIS parsing regexes) [19:35] I've made mistakes that would have been caught by such a build [19:35] I despise tests, I find them to offer no real value to the reliability of my software, yet they consume a lot of time...
thus my experience with writing tests approaches 0 [19:35] well [19:35] I think acceptance tests add a lot [19:35] so I'm probaly not the right person to ask :P [19:35] probably * [19:35] they tell if it works [19:35] assuming that they do not overspecify the software [19:35] anyway, that's what I want to have in ArchiveBot before fixing some of the current deficiencies [19:36] thing is, the inputs and outputs for archivebot are really well-defined [19:36] input: IRC command; output: WARC [19:36] yipdw: yes, but so do my manual tests - I don't do any automated testing at all aside from the above exception scenarios, and I think you'd have trouble finding anybody with complaints about the reliability of the software I write [19:36] for me tests add pretty much notthing [19:36] :P [19:36] nothing * [19:36] my trouble with manual tests is that I don't run them consistently [19:36] and neither do many other maintainers [19:37] I don't write a snippet of code without testing everything it touches [19:37] as a result, for software to survive multiple maintainers, it has a better chance of doing so if it has automated tests that can be run with a minimum of fuss [19:37] :p [19:37] right, but I'm still not the right person to ask to write those tests [19:37] that's fine [19:37] it's just #1 on my archivebot to-do list [19:37] right [19:38] which I haven't gotten to due to other stuff [19:41] oh nice [19:41] so, AV Foundation error -11823 indicates that a file already exists at the location where you are saving [19:42] (iOS / OS X thing) [19:42] NSErrors have user info; one bit of user info is a NSLocalizedRecoverySuggestion [19:42] the recovery suggestion for error -11823 is "Try saving again" [19:42] not "rename file", "confirm deletion" or something that would actually move the target [19:44] lol [20:06] i'd like a brain implant that would understand programming so i can do my normal thinking while it creates the code i need.
[20:06] someone willing to build that for me, would be awesome [20:06] midas: or we could just, like, make programming not suck [20:06] lol [20:07] exactly, why not make a computer understand what im typing to it! [20:08] "gran all these video's, this is how the url looks. sort of." [20:08] grab* [20:09] im looking at this lua script for justin, understanding parts of it. like this i understand: if string.match(url["url"], "justin%.tv/archives/") then [20:09] it's logical so far [20:09] this on the other hand, local f = assert(io.open(item_dir .. "/status_info.txt", "w")) [20:10] io_open, must have something to do with storing something [20:14] midas: opens a file handle, as far as I can tell [20:14] for writing [20:15] ok, maybe a bad example. i understood that part by just watching the rest of the code to be honest [20:15] but still :p [20:15] wait, justin.tv's lua script is only 44 lines? [20:20] it doesn't do much [20:22] lua script is just for wget right? [20:22] it just tells wget, do this. [21:11] wow. [21:11] "As for who it will affect: Only international users, and only readers of The Onion who want access to more than five full articles within 30 days. " [21:11] so apparently non-US readers of the onion now get paywalls [21:11] wtf. [21:13] you should move to america [21:22] clearly [21:22] :p [21:22] exmic: at least I can read the Onion while I'm in jail there! [21:53] now now joepie91, you will probably be sent to guantanamo, so still paywall ;-) [21:54] BTW, any news about your case? [22:03] i'm uploading more pearl jam bootlegs [22:10] midas: nono, they say that it's also non-paywalled for overseas locations and military stuff etc [22:10] so guantanamo gets it free too [22:10] and nope, radio silence re: case [22:12] like, nothing at all? the bastards. [22:14] vpn [22:14] proxy via BiggieJon \o/ thanks mate! [22:15] (texas is part of the US right?)
;-) [22:15] ehhh, well, kinda, have seceeded yet [23:41] http://failureinthearchives.wordpress.com/ [23:42] "...covering the period 1500-1750…" [23:42] Because we're sooooo much better today, honest! [23:47] only because of technology [23:47] qualityof life is relative