[00:07] grar [00:07] my employer moved offices. today, I heard one owner tell the other "I told you not to bring all the floppies." [00:08] But they're here. if they don't want them, I'll take them [00:09] does anyone know if the me.com shutdown will include email services? [00:10] Last night I was at a fairly new Tim Horton's, and there was a sign sticker on the door saying if you were not 100% satisfied to email the owner/manager/whatever at a @me.com address [00:14] ah. yes. email will still be available [00:40] what's the deal with the steps in the MobileMe code to create a directory structure in advance because wget --mirror doesn't always do it? [00:40] did anyone ever figure out why wget --mirror wasn't creating dirs? [00:42] topaz: slide.com only gave 7 days? [00:43] yeah, just wondering. [00:47] anyone happen to have all those opensolaris bug pages that oracle killed? [00:47] happened a while ago [01:06] is there a best way to recommend a patch for mobileme-grab? [01:15] fork the github repo, push the patch to your fork, and submit a pull request? [01:16] or I suppose make a diff and post it? [01:16] I just forked, I'm exploring that now. just didn't know if there was a preferred protocol for this sort of thing. [01:17] (I'm a github newbie, see) [01:18] the normal protocol on github tends to be: fork, make changes, push to your fork, submit a pull request. [01:19] *nod* thanks. [01:39] Bummer, slide refuses AXFR [01:41] undersco2: does git-annex track metadata? 
since i'd like to track some, i know etckeeper does but git-annex is all about files not necessarily always accessible which i need [01:41] I don't know [01:41] Ask closure :) [01:41] HMM [01:43] Coderjoe: https://mashable.com/2011/08/26/google-kills-slide/ [01:43] undersco2: well what about this: might already be in git-annex, where i can have a local 'copy' of stuff in the form of git-annex-type text files with the location of something in them, then move those files around in my tree, and when say i plug in a relevant external harddrive, it shuffles files around to match how i've moved their textfile copies? [01:43] guess we only just found out about it [02:03] arrith: if I understand you right, yes git-annex can do that [02:04] closure: the moving stuff around? [02:04] yes [02:06] closure: well say i make a bunch of changes where i move stuff around, can i then say "please execute the moves" and have git-annex prompt me to then connect/disconnect relevant external drives necessary to do the moving? [02:10] ok, so you want to literally move files between drives.. so you make some directories in git, say drive1, drive2, drive23. git mv files into there. Then later you mount drive1, go into its clone of the repo, update it, go into the drive1 directory, and: "git-annex move --to ." and it'll move files from all the other drives, or complain if some of them are not mounted [02:13] hm that might work. i'll dig around with git-annex [02:35] there may be better ways, depending on what you're trying to do. For example I was running this command today: git annex --untrust=here --untrust=home copy --auto --to archive [02:35] ... which automatically identifies files that need more copies, assuming the current repo and another one are not reliable, and puts them on the archive drive [02:38] that's pretty fancy [02:39] especially nice since different directories are configured to need different numbers of files.. 
I like a ton of copies of my photos etc [02:40] i'm trying to automagically most efficiently be told by a program which resources (external drives or otherwise) i need to connect to move files around with the least amount of me connecting/disconnecting a drive [02:41] can you connect 2 drives at the same tim? [02:41] Tim won't like it, but yeah [02:42] closure: yeah the system would have to be aware of how many drives you can connect. i personally can connect 4, assuming usb. but some drives i'd like to only connect through esata since it's way faster for me [02:42] closure: btw offtopic but two things that might be neat for git-annex could be scrubbing support, where it verifies the hashes on files. and also a way to customize the command being used to copy/move. for example rsnapshot lets you add custom arguments to the rsync it runs [02:43] brb in a bit [02:43] every command git-annex runs (I think) can have options added, on a global or even per-remote basis [02:44] git config remote.foo.annex-rsync-options --bwlimit 100 [02:45] I think your thing could be built on top, since you can have git-annex --json dump where files are located, and get the info to calculate the drive swapping sequence [02:46] Help me not waste time. [02:46] We had a couple programs for geocities downloading. [02:48] hmm.. I barely remember. But I think I included the scripts in my geocities tarballs. [02:48] which I don't have anywhere now [02:55] I have them. [02:55] Also, I am making a sub-collection for Geocities stuff in the archiveteam collection [02:56] So besides our 8 part snapshot, we can add other sets people are finding in their pockets [03:12] Did someone have geocities lying around? [03:12] In the meantime, I will begin running the program that searches Google for Fortunecity URLs. [03:16] I think I found an extra DOOM levels CD- Instant Doom Levels is on cd.textfiles.com but doesn't seem to be on archive.org [03:28] Yeah, not sure of the pedigree.
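The drive-per-directory workflow closure describes above can be sketched as a plain git repository; a minimal, local-only sketch (remote name "foo", file names, and the config value syntax are assumptions — the git-annex commands themselves are shown only as comments, since git-annex may not be installed):

```shell
set -e
cd "$(mktemp -d)"
git init -q repo && cd repo
git config user.email you@example.com
git config user.name you
# one directory per physical drive, per the layout suggested above
mkdir drive1 drive2
echo data > bigfile
git add -A && git commit -qm 'initial layout'
# decide where content should live simply by moving files in the tree
git mv bigfile drive1/
git commit -qm 'bigfile belongs on drive1'
# per-remote pass-through options for the rsync that git-annex runs
git config remote.foo.annex-rsync-options "--bwlimit=100"
# with git-annex installed and drive1 mounted, from drive1's clone you would run:
#   cd drive1 && git annex move --to .
# and to dump file locations for planning the drive-swapping sequence:
#   git annex whereis --json
ls drive1
```

The tree itself records intent (where each file should end up), and the annex later reconciles content with that intent one mounted drive at a time.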
[03:35] textfiles.com down? [03:35] Looks so from here [03:36] same [03:36] Damn; just as I was hitting a stride [03:36] Oh [03:36] I got into it shaqfu [03:38] curl textfiles.com [03:38] curl: (56) Recv failure: Connection reset by peer [03:38] it is pretty slow [03:41] One moment. [03:42] Who's 124.124.65.141 and 99.85 [03:42] not me [03:42] i'm 76.28.x.x [03:43] I'm 68.39 [03:46] 173.66. [03:46] curl icanhazip.com [03:48] Someone had 75 open connections. [03:48] Going full bore. [03:48] Well [03:48] that ip you posted is from india [03:49] Someone had 75 open connections from india. [03:49] Going full bore. From india [03:49] :/ [03:49] Is that better [03:49] Anyway, I fixed it [03:49] sorry [03:49] Mister ipfw had something to say [03:49] hahaha [03:49] sign them up for archiveteam [03:49] If you open multiple connections to me past, say, 10, ban. [03:49] SketchCow: Do you have textfiles.com exported as an rsync module? [03:49] That would be pretty cool [03:52] Wow, things I'm not going to tell you [03:52] "Do you leave your porch unlocked" [03:52] "Where do you keep the cookies" [03:52] "What's the current mandatory sentencing for statutory?" [03:53] ... [03:53] I meant like read only, so it was easily mirrorable [03:53] Instead of hammering apache with wget --mirror, which doesn't even guarantee integrity [03:58] Yeah, that's not the way to do it. [03:58] SketchCow: yeah, I have 8 geocities parts that are actually one big .tar.xz that can be catted together [03:58] I... see. [03:58] Why wouldn't you want to do that? [03:59] clearly torrents of periodic db dumps. can point the new torrent at the old torrent and it'll only download the changed pieces, assuming that's how the db works [03:59] Well, that sounds crazy, tsp, but let's upload that. [03:59] How big is it? [03:59] 8.5gb [03:59] Either you can do the upload, or I can do it. [04:00] What's easier? [04:00] compressed, 16gb uncompressed. You do it, I've got most of it on my dropbox [04:00] Most.... 
or all? [04:00] arrith: textfiles.com doesn't have a DB, does it? [04:00] 7/8 parts, the last one can go in once you have one of the parts. My db ran out of space [04:01] OK, msg me the dropboxages, I'll make it happen. [04:01] I make EVERYTHING HAPPEN [04:02] This is true [04:06] undersco2: oh, right. maybe not [04:06] I could be wrong, usually am. XD [04:06] But I think it's all semi-manually curated/updated by the artistic bovine [04:10] Is there even a point in running the mobileme downloader if you're not kennethre anymore? [04:10] >:( [04:10] haha [04:16] undersco2: yes [04:17] I DOWNLOADED 14 MEGABYTES RECENTLY [04:17] I AM HELPING [04:17] hahaha [04:21] well, got on the Raspberry Pi waiting list [04:22] I guess now I just have to wait for the other 100,000 people to get one first [04:23] awesome [04:23] I can't wait to get mine either [04:24] I'd like to see what I can do with it re: mobile computing [04:24] I'd love to have a cheap inertial navigation system [04:24] stuff it in my pocket and go jump around the city, knowing that if it breaks it's only $25 [04:25] well maybe [04:25] the accelerometer and gyroscope assemblies might cost more [04:25] nunchuck [04:25] three axis accelerometer, buttons, and joystick for <$30 [04:26] Plus it uses standard I2C [04:30] that'd be useful [04:31] (you just have to clip off the plug) [04:31] http://todbot.com/blog/2007/10/25/boarduino-wii-nunchuck-servo/ [04:35] http://content.usatoday.com/communities/ondeadline/post/2012/03/live-in-relationship-of-calif-teacher-teen-ignites-furor/1?csp=34news#.T1BOR_Egc0k [04:35] LIVING THE DREAM [04:35] I saw that, made me think of you [04:35] funnily enough [04:36] undersco2: dunno if you heard but bibliotik isn't totally gone. they need new hosting and have a static page up on their site now. 
maybe you could put them in touch with some of the people at what that you know [04:36] undersco2: since it'd be awesome for them to get tons of servers so they can have a huuge userbase [04:37] yeah, 'twould be cool [04:37] can't say much obviously, but I've been in touch with various related parties [04:40] undersco2: ah alright. makin sure you knew at least that the site was back [04:40] yeah, thanks. preciate it a lot! [04:42] yipdw: which distributor did you put your preorder with? [04:42] or waiting list thing [04:42] arrith: RS [04:43] I think I may sign up with both RS and Farnell, though [04:43] also, ha, that guy's name is Hooker [04:43] One of them does uk only [04:45] Hello! My name is grayson, im 16 and i live in elkhart. I deeply enjoy your website, because it gives us teens a glimpse into the past. Unlike everyone else i know, i have a thing for old stuff, all i listen to is 80s-90s and i love all classical music composers, i also have a strong love for Claude Monet. Now, saying all of that, i think your website is the greatest thing ever created. All of the categories and info. (Thanks to your site i had f [04:45] Great, now I have a fan younger than underscor [04:45] I figured it'd be someone in the second trimester [04:46] Elkhart, Indiana? [04:46] HOW DO 16 YEAR OLDS HAVE A BLAST FROM THE PAST [04:46] That's like having a blast from last week [04:46] hahaha [04:46] that is so bottom of my wall [04:47] SketchCow: But I'll always be your biggest fan [04:47] I dunno, childhood obesity is getting out of control [04:48] Yeah, that's what they told Selena [04:48] haha [04:49] * SketchCow throws it into the fanmail folder [04:55] wikipedia says RS is shipping to the UK only [06:16] http://www.archive.org/details/archiveteam-geocities [06:18] underscor: how old are you? [06:19] 4 [06:20] hah [06:20] kennethre: Can you assemble those internet daemons? I just don't have the time. [06:20] SketchCow: context? 
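The 8-part geocities snapshot mentioned earlier works because an xz stream survives byte-level splitting: the parts are literal slices of one big .tar.xz, so concatenation restores the archive. A minimal sketch with a tiny stand-in archive (file and directory names hypothetical):

```shell
set -e
cd "$(mktemp -d)"
# build a small archive standing in for the big geocities dump
mkdir site && echo 'hello' > site/index.html
tar -cJf site.tar.xz site
# split it into byte-level parts, like the 8-part snapshot
split -b 1024 -d site.tar.xz geocities.tar.xz.
rm -r site site.tar.xz
# plain byte slices: cat the parts back together and extract
cat geocities.tar.xz.* | tar -xJ
cat site/index.html
```

No special join tool is needed; `cat` in part order is the whole reassembly step.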
[06:22] did we ever get in touch with the reocities guys to swap dumps [06:23] I archived a grand total of 2 geocities sites when it was going down, heh [06:24] The Reocities guy, I got in touch with him. [06:24] He's not going to swap dumps. [06:24] Well, I mean, he took OUR dumps [06:25] :( [06:25] what why? [06:26] I chatted with him at length about this. [06:26] We downloaded Geocities. [06:26] He downloaded Geocities. [06:26] Our downloading geocities cost us $0 as we stole it or used it from whatever. [06:26] His downloading geocities cost him $9000 [06:26] So he's trying to ad-hump his way back to solvency [06:27] $9,000, jeez [06:27] Yeah [06:27] So I'll wait for him to feel less poor, then take it from him [06:27] We can waight [06:27] Wait [06:28] wasn't there another site too [06:28] oocities [06:28] yeah [06:28] geeze [06:28] well that was very generous of him [06:29] why is this just popping up now though? [06:29] What is just popping up? [06:29] http://www.merit.edu/events/mmc/ [06:29] oh I just asked because some other people's geocities collections are going up [06:29] Look, some asshole [06:30] an asshole on the Internet [06:30] rare species [06:33] i'm wathcing hoarders [06:33] reminds me of archiveteam [06:38] Title of talk: ARE YOU SURE YOU WANT TO DELETE YOUR HERITAGE? (Y/N) [06:46] Running alongside the miracle and power of life going online is the shadowing darkness of digital oblivion. As fast as data and personal digital history goes online, so can it be deleted, and this has been happening with greater and greater frequency as dot-bomb and long-suffering sites are thrown into the incinerator to clear out a bottom line. Jason Scott is the official Mascot of Archive Team, a loose collective of archivists, developers, and a [06:46] I'll bet some got cut off. [06:46] "archivists, developers, and a" [06:48] ctivists dada_ [06:48] dedicated to making sure the conversation doesn't end with a site shutdown and a pulled plug. 
In the last few years, they've saved dozens of sites from Dark_Star [06:48] oblivion and in many cases brought access to the data right back again - often over the noise of the endless debate about the meaning of their actions. [06:48] Jason will present a fast-paced, hilarious talk about digital apocalypse and discuss day-to-day operations at Archive Team, and how you can help or at least help yourself [06:48] Copypassttaaaaa [06:54] yum, pasta. [06:55] passtttejjrr [06:55] Down to 10 e-mails in the inbox [06:55] Soon I can go home [06:57] jealous [07:03] SketchCow: his downloading cost him $9000 because he didn't know about the googlebot trick and so spun up a bazillion servers [07:05] only cost me the goodwill of my roommates [07:12] Yes [07:12] Oh, no doubt [07:13] But if someone pays for a taxi ride but didn't know to ask for a lift, they still owe the taxi [07:13] Clock is starting to run out on e-mails [07:23] That feeling of discovering 10 of your outbound e-mails were held in drafts [07:23] No wonder some people hadn't gotten back [07:35] that's how you clear your inbox [07:35] reply, don't send until it's empty [08:31] SketchCow, underscor: Any of you have that Google-search-crawler script laying around? 
[09:17] Here's my script (for fast crawling via an ipv6 tunnel), if that helps: https://gist.github.com/2788197d2db2779cb7b0 [09:30] weee, thanks for not serving 404: [09:30] 3dactionplanet/forumplanet.gamespy.com/technical_issues/b48748/4585267/p1/http/www.forumplanet.com/http/www.forumplanet.com/http/www.forumplanet.com/http/www.forumplanet.com/http/www.forumplanet.com/http/www.forumplan [09:30] et.com/http/www.forumplanet.com/http/www.forumplanet.com/http/www.forumplanet.com/http/www.forumplanet.com/http/www.forumplanet.com/http/www.forumplanet.com/http/www.forumplanet.com/http/www.forumplanet.com/http/www.forumplanet.com/http/ [09:30] www.forumplanet.com/http/www.forumplanet.com/http/www.forumplanet.com/http/www.forumplanet.com/http/www.forumplanet.com/http/www.forumplanet.com/http/www.forumplanet.com/http/www.forumplanet.com/http/www.forumplanet.com/http [09:30] just because one guy linked to "http//something" and the stupid board made a proper link [09:30] http://forumplanet.gamespy.com/technical_issues/b48748/4585267/p1/ [09:39] christ. [09:53] alard: Thanks, I'll take a look at it [11:36] anyone wanna wget some forumplanet forums for me? simple bash script, needs wget and 7z, takes around 100kilobytes/s and might be a couple of gigabytes of crap. in the end you would send me a <20MB 7z. [11:48] Schbirid, me [11:55] Nemo_bis: here is the bash script, ignore the comments on top: http://pastebin.com/R6vRV0Yu [11:56] run it and pass planetcallofduty as argument, eg "sh thatscript.sh planetcallofduty" [11:56] wait [11:56] no, dont wait. that is how it works :) [11:57] if you want to do more, planetmedalofhonor, planetunreal, planettribes [11:58] ok [11:59] thanks! 
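The pastebin script above isn't reproduced here; this is only a sketch of the shape described (wget mirror of one forum, then pack the result with 7z). The wget flags, URL layout, and file names are assumptions, and the sketch runs dry (echoing the commands instead of executing them) since the site is long gone:

```shell
#!/bin/sh
# Dry-run sketch of the forumplanet mirroring flow; swap run() for direct
# execution to actually download.
FORUM="${1:-planetcallofduty}"
BASE="http://forumplanet.gamespy.com"
run() { echo "+ $*"; }   # prints the command instead of running it
run wget --mirror --no-parent --adjust-extension --wait=1 "$BASE/$FORUM/"
echo "7zipping now"      # progress message so the long 7z step is visible
run 7z a "$FORUM.7z" "forumplanet.gamespy.com/$FORUM"
```

Invoked as `sh thatscript.sh planetcallofduty` (or planetmedalofhonor, planetunreal, planettribes), matching the usage described above.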
[14:18] ugh, the script totally needs an 'echo "7zipping now"' [14:18] :D [14:35] Schbirid, the script died and didn't seem to produce anything but a 3 MB log [14:36] uh, not true [14:36] Schbirid, 531M planetcallofduty/forumplanet.gamespy.com/ [14:43] looks untrue: Scaricati: 270 file, 13M in 0s (25354560 GB/s) [14:45] Schbirid, creating .tar.7z now (paths too long for 7z apparently and dies whining), what forum should I archive next? [15:46] Schbirid, the script says only === Mirroring http://forumplanet.gamespy.com/scripting/b50250/p1 === etc., but it actually downloads also following pages, doesn't it? [16:34] use tar.xz rather than 7z. xz is just the LZMA2 compressor without all the extra archive (multifile) overhead [16:36] and yes, it is LOADS of fun when something doesn't 404 properly. the splinder wordpress install had that issue. [16:41] Coderjoe, what's the overhead? [16:42] probably minuscule, actually, with the amount of file data stored [16:42] Also, I'm lazy, I use PeaZip to use all 7z options. [16:42] but it is like putting a tar in a zip file [16:42] gnu tar has an xz option now. iirc, it is J [16:46] j is bzip2 [16:46] at least it was [16:46] if the GNU guys changed that, I wonder how they plan to handle backwards compat [16:47] J [16:47] oh [16:47] not j [17:09] or just tar xf file, it'll decompress automatically [17:28] xa [17:33] closure: we were talking about compressing, though [17:33] well, I was. yipdw brought up decompression [18:04] closure: oh, I didn't know tar x detected compression format [18:04] that's cool [18:05] Depends on your version of tar I guess. [18:06] Newer ones generally do. [18:06] I think tars compiled to be POSIX correct don't. [18:06] X [20:06] hey guys, finally back.. quite a hiatus hehe [20:07] SketchCow: could you please reactivate my rsync slot for the batcave? I have some stuff I didn't finish uploading a couple weeks back from mobileme... still data from the old script.
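The flag confusion above (`j` vs `J`, and whether `tar xf` sniffs the format) can be checked directly; a minimal sketch, assuming GNU tar with bzip2 and xz available:

```shell
set -e
cd "$(mktemp -d)"
echo data > file
tar -cjf a.tar.bz2 file   # lowercase j: bzip2
tar -cJf a.tar.xz file    # uppercase J: xz (the LZMA2 compressor)
# modern GNU tar detects the compression format on extract,
# so a plain 'tar xf' handles either archive without j/J
mkdir out && cd out
tar -xf ../a.tar.xz
cat file
```

So `j` stayed bzip2 and `J` was added for xz — no backwards-compat break — and for extraction the flag is optional on recent GNU tar.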
[20:17] https://twitter.com/codinghorror/status/175676345353908224 [20:19] Wasn't it an old bank or something? [20:22] Speaking of the IA, fuck manuscript copyright law [20:23] I just learned that a correspondence collection I have can't be hosted anywhere (we were considering IA) due to how insane copyright is on unpublished works [20:26] shaqfu, how old is it? [20:26] Nemo_bis: 1910s-1940s [20:26] shaqfu, country of origin of author? [20:26] The author had to have died before 1942 for it to be public domain [20:26] Nemo_bis: USA [20:27] shaqfu, author and heirs known? [20:27] So even if it was written during WW1 I'd have to track down the obit [20:28] Nemo_bis: Author, yes, heirs, that'd take a lot of work [20:28] It's a couple hundred authors [20:28] shaqfu, how much work? [20:28] Nemo_bis: Enough to not make it worth it [20:28] Just claim it's an orphan work [20:28] If it's not worth it, it's orphan [20:29] shaqfu, who is the publisher and editor? [20:29] Nemo_bis: Unpublished [20:29] It's correspondence [20:29] oh, correspondence [20:29] Yeah [20:29] private correspondence? no editor? [20:29] That's why it's a huge problem [20:29] Yep [20:30] so it's not a manuscript, it a collection of papers [20:30] One guy writing to lots of other people, and he donated the full collection [20:30] Right, manuscripts collection [20:30] One guy? But the collection includes also the letters of the others? [20:30] Yep [20:31] You could also try to play with the definition of "published". [20:32] There's no way we can claim it was published/registered [20:33] So if we'd want it hosted somewhere, we'd have to do an item-level survey of dates of deaths [20:33] Fuck that [20:34] Well, archive.org doesn't care. [20:35] Wait, really? Isn't it a liability for them? [20:35] Not until nobody complains. 
[20:35] Which won't happen [20:35] Quite obviously [20:35] The content is totally benign [20:36] That's promising; I'll look more into it [20:36] And I think that even a publisher would publish this, it's quite arguably a set of orphan works. [20:36] shaqfu, if you publish it, please waive any publication right https://en.wikipedia.org/wiki/Publication_right [20:36] Nemo_bis: Eh, it's big enough to be a hassle to publish [20:37] 3.5-4 ln ft [20:37] Or you'll effectively make them copyrighted at least in EU, possibly [20:40] Yeah [20:46] shaqfu, anyway, don't overlook https://en.wikipedia.org/wiki/Publication etc. [20:51] Nemo_bis: Thanks [20:52] shaqfu, how did you find/acquire the correspondence? [20:52] and where was it conserved [20:52] *preserved (?) [20:52] kept [20:52] Nemo_bis: Local historical society [20:52] I'm volunteering there to get it digitized [20:53] It's a bunch of manuscripts in a filing cabinet right now [21:30] can i run multiple concurrent seesaw scripts? in the same directory? [21:31] gui77: it's usually not needed :) [21:32] kennethre: ok then - i just figured in case one failed, and to maximize the pipe... [21:32] gui77: that's what I mean, mobileme's so fast, the pipe is usually maxed with just one running :) [21:36] well [21:36] kennethre: depends on where you're hitting mobileme, I guess -- I think I'm rate-limited [21:37] though if I am I'm not sure how MobileMe is doing it [21:37] maybe it's just the pipes [21:38] shaqfu, is the cabinet accessible and to how many persons?
if the place where it's placed has ever been somehow public, you can claim the works are published [21:39] I'll market an internet optimizer called Drano [21:39] makes shit flow faster [21:41] hahahaha [21:47] Nemo_bis: Anyone that asks for it [21:48] And I don't think "access to original manuscripts" counts as publication [21:48] shaqfu, if you have to ask, it doesn't, but were it open access it could [21:49] Nemo_bis: Looking at the legal definition, it hinges on distribution [21:49] "access to the one copy" doesn't qualify as distribution [21:49] it does if it's open to the public [21:50] at least in some countries and cases (think of a statue) [21:50] Nemo_bis: In US law, public performance != publication [21:50] And since 1976 federal law trumps state [21:50] Well, I don't know all the details, but one can always try. [21:51] Publication is not very precisely defined, there are only some suggestions by federal agencies and so on AFAIK. [21:51] Admittedly, it's not worth fretting over, since there's zero chance of anyone really caring [21:51] Anyway, this is just playing with borderline cases and definitions, the fact is that there's no actual risk in publishing that material. [21:51] Yeah [21:51] Yep. [21:52] If you ever happened to have someone complaining, you would have lots of arguments, that's the point. :p [21:52] Oh no, someone that died in the 1950s' grandkids might get annoyed that Gramps once asked the historical society about his house [21:52] The actual risk with such things is usually libel, private data and so on. [21:52] Yeah [21:52] I'd have to survey the collection thoroughly [21:53] If it's 100% benign I'll contact IA [21:54] They don't want to be contacted. [21:54] If they don't officially know about it they're protected, as an ISP. [21:54] Ah, file locker defense? [21:55] I don't know the English terminology very well.
[21:55] Nemo_bis: The same way sites like RapidShare don't get in trouble [21:55] They claim they don't know what's being hosted; to them it's just files [21:55] Well, or shouldn't. :p [21:55] Yes. [21:56] dmca safe harbor is the terminology [21:57] DFJustin: Thanks [22:21] anyone skilled in TKinter? [22:44] SketchCow: http://games.slashdot.org/story/12/03/02/214243/sony-to-delete-virtual-goods?utm_source=feedburner&utm_medium=feed&utm_campaign=Feed%3A+Slashdot%2Fslashdot+%28Slashdot%29 [23:51] god dammit. that is exactly why I hate the locked-up downloadable content system. [23:51] I don't trust companies to not fuck me over [23:51] I still occasionally play games that are 10-15 years old [23:52] and yes, the game industry wants to kill replay because it somewhat hurts them, being competition for their newer titles. [23:56] Nemo_bis: planetmedalofhonor, planetunreal, planettribes [23:56] Nemo_bis: you can easily run them all at once [23:57] Nemo_bis: yes, the script only mentions which subforums it downloads, it will actually download all topics in those [23:57] planettribes is done, planetunreal running [23:57] i will clean up those long paths later on [23:57] awesome [23:57] so now I'll start medalofhonor if you're not doing it [23:58] wait [23:58] i got that [23:59] no wait [23:59] uh, ok [23:59] yes please [23:59] ah :) [23:59] sorry, got confused