[01:11] I love oneliners like this
[01:11] id=387;for i in $(wget -q -U "Testing/1.0" -O - "http://xxxx:xxxx@www.rusc.com/members/series.aspx?ID=$id"|grep -oP "show.aspx\?ID=[0-9]*"|grep -o "[0-9]*");do wget $(wget -q -U "Testing/1.0" -O - "http://xxxx:xxxx@www.rusc.com/members/show.aspx?ID=$i"|grep -oP "http://data.rusc.com.*?mp3");wget -O $id.html "http://xxxx:xxxx@www.rusc.com/members/series.aspx?ID=$id";done
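Unrolled for readability, the one-liner above amounts to the following script. The commands are the same; xxxx:xxxx is the redacted login from the message, ID=387 is the example series, and the series_url variable is only introduced here to avoid repeating the URL.

    #!/bin/bash
    id=387
    series_url="http://xxxx:xxxx@www.rusc.com/members/series.aspx?ID=$id"

    # Fetch the series page and pull out every show ID it links to.
    for i in $(wget -q -U "Testing/1.0" -O - "$series_url" \
                 | grep -oP "show.aspx\?ID=[0-9]*" | grep -o "[0-9]*"); do
        # Each show page carries a direct http://data.rusc.com/...mp3 link; grab the MP3.
        wget $(wget -q -U "Testing/1.0" -O - \
                 "http://xxxx:xxxx@www.rusc.com/members/show.aspx?ID=$i" \
                 | grep -oP "http://data.rusc.com.*?mp3")
        # Keep a copy of the series page itself alongside the audio.
        wget -O "$id.html" "$series_url"
    done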
[01:47] Coderjoe: that's one scenario I hope is true
[01:48] Coderjoe: it's not so much about "hurting GoDaddy", because nothing short of a metaphorical meteor strike is going to do that -- it's more just about followthrough
[01:48] on a happier note, I think I have a significant percentage of public Proust data on batcave now
[02:25] oh
[02:25] something good to know: the open-source Wayback Machine code doesn't seem to handle WARC revisits
[02:25] er
[02:25] maybe it does and my server just sucks
[02:25] hard to tell.
[02:43] http://girlwalkallday.com/watch-the-film
[03:47] Does anyone know about fast/relatively cheap ways to digitize photos?
[03:47] My family has gone over this question, and my brother even got a scanner. But a flat-bed scanner, so it's really, really slow.
[03:47] I have a ScanSnap, so I could get a lot of them done in a minimal amount of time, but I doubt the quality would be terribly great.
[03:48] But I get the feeling that no other option would result in the photos being digitized, as the other options are too slow or too expensive.
[03:49] Don't confuse the scanner with the shredder
[03:50] True. I'm intending to use the ScanSnap, regardless. I just keep hoping for a higher-quality option that doesn't take 50x as long or cost $5+ per 24 photos.
[03:51] And perhaps there's no good option, now, and thus ScanSnap now, but different scanner in five years.
[03:51] this takes too long, let's go shopping
[03:52] I have a Canon MX870 with a document feeder
[03:52] I just take 1200dpi scans with that to tiffs
[03:52] It takes forever, but I can start it and check on it */10
[03:56] Taking forever is much more reasonable when it's unattended.
[03:57] But if it's a reasonable option, I can probably convince my brother to buy it, and we do bunches of scans during Family Christmas.
[03:57] That 1200dpi end up being pretty close to what you'd get from using a flatbed scanner?
[04:04] whoa
[04:04] Paradoks: Yeah
[04:05] I mean, it's a flatbed with a document feeder
[04:05] So
[04:05] I dug a three-year-old ARM7 board out of my closet and wired up a power supply to it
[04:05] hooked it up via serial
[04:05] Technologic Systems TS-LINUX/arm 7.0
[04:05] ts7200 login:
[04:05] :D
[04:05] now I just need to remember the fucking credentials
[04:06] Paradoks: Apparently it's been replaced by the MX882
[04:06] underscor: Theoretically using the document feeder could mean it acts differently. *shrug*. But cool. I'll mention it to my brother. (Though, please, anyone else with experience, do still chime in.)
[04:06] Cannot vouch for it's quality
[04:06] its*
[04:08] i don't know if I would trust one-of-a-kind items of unusual thickness or material to an ADF, though
[04:09] I like to live on the edge
[04:09] Haven't gotten out of that "FUCK YES I'M INVINCIBLE" stage yet
[04:09] But it's seriously the best document feeder I've ever used
[04:10] Autoduplex too, f'yeah
[04:10] eek
[04:38] Coderjoe: I figure the "valuable" photos (e.g., the ones to actually make it into an album) would get flat-bed scanner treatment, but with the rest of the photos, the danger of a fire/flood/etc. seems greater than the risk of losing a photo or two.
[04:45] i have a CC dvd that i want to edit, specifically fix the menus, because i pulled out 4 gb of junk, is there ANYTHING besides qdvdauthor that does dvd with menus from an iso, because qdvdauthor will let me make them from scratch, but i would much rather just have something to load the iso into and tweak it slightly, and before you ask ive been googling for hours, ive got nothing, using ubuntu, Lucid 10.04 if it helps
[04:48] ifoedit (windows program) might be able to let you remove menu items (though just the menu links, not the on screen listing. that's burned into the mpeg2 video)
[09:56] http://img90.imageshack.us/img90/5859/airbaga.jpg
[10:02] "Use what you got at hand"
[15:12] Well, I have to say.... WHOIS modification at dotAM is pretty damned unimpressive.
[15:12] They say 72 hours
[15:12] I thought, oh, they're just being super-safe
[15:12] But now, it may in fact be 72 hours
[15:24] hey
[15:24] adding my VPS to the MobileMe project
[15:28] Excellent.
[15:28] not like me.com is any fast
[15:28] I think it's time for us to kick that into high gear.
[15:28] i will try to run two processes on my VPS
[15:28] to the coders, could any of them impose a filesize limit?
[15:28] i can only spare 15GB on the VPS
[15:29] the perfect solution would be "download ~10-15GB, upload via rsync, repeat"
[15:29] also SketchCow rsync slot please
[15:30] also, - Running wget --mirror (at least 1367 files)... <- that's not very informative of the progress
[15:33] oh hey wait
[15:33] it's fast
[15:33] and it DOES write the result
[15:35] SketchCow so 2 requests: rsync slot and a maximum download size
[15:38] if there is a maximum download size i can set up an automatic download script on my VPS
[15:38] 1. download 10GB
[15:38] 2. upload 10GB
[15:38] 3. delete data folder
[15:38] 4. repeat
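The "download, upload, delete, repeat" cycle described above could look roughly like the sketch below. This is only an illustration: download_batch is a stub (in practice it would be one of the dld-* scripts mentioned later in the log), and the data directory and rsync target are placeholders for whatever slot gets assigned.

    #!/bin/bash
    LIMIT_KB=$((10 * 1024 * 1024))                 # ~10 GiB per batch
    DATA_DIR=./data
    RSYNC_TARGET="you@example.org::mobileme-slot"  # placeholder for the assigned slot

    mkdir -p "$DATA_DIR"

    download_batch() {
        :   # stub: fetch some more profiles into $DATA_DIR here
    }

    while true; do
        if [ "$(du -sk "$DATA_DIR" | cut -f1)" -lt "$LIMIT_KB" ]; then
            download_batch                            # 1. keep downloading until ~10GB
        else
            rsync -av "$DATA_DIR"/ "$RSYNC_TARGET"    # 2. upload the batch
            rm -rf "${DATA_DIR:?}"/*                  # 3. delete the data folder
        fi                                            # 4. repeat
    done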
[15:41] @SketchCow how's the rsync slot coming?
[15:41] oh hey i'm on the status board downloading stuff
[15:49] Jesus.
[15:49] Calm down.
[15:49] sorry :<
[15:49] it's just my first time i got to help the archiveteam
[15:49] Here is what you don't do.
[15:49] You do not go "thing please" at 10:51 and then go "How is thing going" 12 minutes later.
[15:50] I'm not a fucking tech support call
[15:50] okay
[15:50] I'm trying to decipher 4 names from an audio recording
[15:51] I would play that audio so you can help but you won't be able to help.
[15:51] When I'm done with that, you'll be sent a slot.
[15:51] okay
[15:51] right
[17:02] asiekierk: if you want more information, tail the wget logs
[17:02] dumping that information to stderr or stdout would ordinarily just spam a console
[17:02] yes
[17:02] and i wasted 3GB of work by accidentally killing the whole process... oh well, newbie errors
[17:23] asiekierk: https://gist.github.com/6483375b0fcab1c5ae74
[17:24] asiekierk: let me know if that works for you. if it does, I'll add it to mobileme-grab
[17:24] ok, when i finish this batch
[17:27] yipdw so from what i presume i give it the amount of free space, in gigabytes
[17:27] and if the free space is smaller than that it stops
[17:28] [Wed Jan 11 12:26:11 UTC 2012] data filesystem: using 11 GiB, free space threshold: 10 GiB, remaining: 1 GiB
[17:28] Filesystem 1K-blocks Used Available Use% Mounted on
[17:28] /dev/xvda1 61927420 11535784 47245908 20% /
[17:28] is this what should happen?
[17:28] actually, that needs https://gist.github.com/6483375b0fcab1c5ae74/d944dd7df9482b9a58afd3464425412bce202a02
[17:28] er
[17:28] https://gist.github.com/6483375b0fcab1c5ae74/d944dd7df9482b9a58afd3464425412bce202a02
[17:28] and no, that isn't what should happen
[17:28] what version of coreutils is that
[17:29] actually no
[17:29] I'm using this
[17:29] gdf --version
[17:29] df (GNU coreutils) 8.12
[17:29] Copyright (C) 2011 Free Software Foundation, Inc.
[17:29] i didn't use -BG on df
[17:29] df (GNU coreutils) 8.5
[17:29] Copyright (C) 2010 Free Software Foundation, Inc.
[17:29] License GPLv3+: GNU GPL version 3 or later.
[17:29] use the version I linked
[17:29] it's less stupid
[17:30] the previous version of that script did some bogus calculations
[17:30] it's also grabbing the wrong parameter for me, Used rather than Available
[17:30] the revision I linked is more straightforward
[17:30] i'll just use print $4 and it should work
[17:31] oh
[17:31] oops
[17:31] yup
[17:32] Illegal option -s
[17:32] Usage: /usr/bin/which [-a] args
[17:32] also
[17:32] jesus christ
[17:32] I hate shell scripting
[17:32] We all do
[17:32] Don't worry
[17:33] nothing is ever cross-compatible
[17:33] still, big thanks
[17:33] nothing is ever cross-compatible beyond the basic features shell scripts never need*
[17:33] hmm
[17:33] how about this: I set DF to df and the people on *BSD/OS X can fix it themselves
[17:34] yeah
[17:36] ok!
[17:36] asiekierk: try this -> https://gist.github.com/6483375b0fcab1c5ae74/6429a6e3b34f89e15418dc9e867501b334b1c903
[17:36] er wait, no
[17:36] i already fixed it for myself
[17:36] don't worry
[17:36] thanks
[17:38] ok, I'll push it to the mobileme-grab git repo
[18:30] okay, after a few dumb things i set up the VPS
[18:30] 3 dld-client processes at once and a free space monitor
[18:31] as well as 2 dld-single processes, finishing 2 incomplete downloads including keith.garner which is quite large
[18:32] the speeds are 200KB/s-1.5MB/s
[18:32] total
[18:33] with spikes of 3MB/s
[21:15] is Splinder all done?
[21:19] istr needing someone (SketchCow or underscor most likely, given batcave) to run a verification script over everything and identify problem profiles to add back to the pool. or something.
[21:19] Everyday I'm shufflin'
[21:20] I finally have an internet connection that's not limited in bandwidth use per month, I want to archive something.
[21:20] you're in luck
[21:20] http://archiveteam.org/index.php?title=MobileMe
[21:20] there's a good number of terabytes
[21:26] cool. I'll fire up the linux box.
[21:38] building wget-warc: configure: error: --with-ssl was given, but GNUTLS is not available. Suggestions?
[21:39] apt-get install gnutls-dev
[21:40] yay
[21:42] is warc in the mainline repos yet? i have it but it would be more convenient to just apt-get update it like everything else
[21:50] is it in wget's canonical repo, yes: http://bzr.savannah.gnu.org/lh/wget/trunk/revision/2571
[21:50] is it in a wget release: doesn't look like it
[21:50] is it in any package repository: who knows
[23:43] SketchCow: what happened with the people queueing up in that one interview? they just showed up and stood there?
[23:44] in your test shot from magfest you posted earlier
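For reference, the free-space check being debugged around 17:27-17:33 boils down to something like this. It is not the linked gist, just a sketch under the assumptions from the chat: GNU df with -BG, column $4 ("Available"), a threshold in GiB, and a DF variable so *BSD/OS X users can point it at gdf.

    #!/bin/bash
    DF=df                 # *BSD/OS X: set this to GNU df (e.g. gdf)
    THRESHOLD_GB=10       # stop downloading below this much free space
    MOUNT=/               # filesystem that holds the data directory

    # With GNU df, -BG reports sizes in GiB and the 4th column is "Available".
    free_gb=$("$DF" -BG "$MOUNT" | awk 'NR==2 { sub(/G$/, "", $4); print $4 }')

    if [ "$free_gb" -lt "$THRESHOLD_GB" ]; then
        echo "only ${free_gb} GiB free on $MOUNT (threshold ${THRESHOLD_GB} GiB); stopping"
        exit 1
    fi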