[00:10] Dude.
[00:10] So, opinions.
[00:10] Savetz, a buddy, at least I think it's Kevin Savetz, just added a link on rescuing floppy disks with a service.
[00:10] Will take floppy disks and convert to CD-ROM.
[00:10] $20/Floppy
[00:10] Doesn't that seem exorbitant?
[00:10] $10 additional to return it
[00:11] Site has massive Google ads
[00:12] yes
[00:12] I mean, I guess the fact is, it's on a page where it has volunteers and it has a service
[00:12] So I guess we should just volunteer
[00:12] The service is a service
[00:12] free, if you're okay with sharing the data.
[00:13] return requires a postage-paid package
[00:13] and no strong guarantees on turnaround time
[00:15] For standard 3.5 inch floppies or some other format?
[00:15] Go ahead and read it.
[00:15] floppyrecovery.net
[00:15] http://www.floppyrecovery.net/
[00:15] seems kind of contrary to the spirit of archiveteam and the archiveteam wiki
[00:16] bleh
[00:16] that's easy
[00:16] $20 per disk is kinda pricey
[00:16] $20/disk doesn't sound too bad for some old disks in weird sizes
[00:16] soultcer: this is for standard 5.25 and 3.5 though
[00:17] Well if you were careless enough to throw out your floppy drive without checking your floppies for valuable data first, you deserve to pay
[00:18] I need to find my point-and-shoot camera so I can write the scripting needed to drive the autoloader, kryoflux, and camera to have it automatically run through a stack of 3.5s
[00:18] your... camera?
[00:18] (the camera image is to capture any label metadata)
[00:18] :O
[00:18] OH
[00:18] soultcer: that's pretty arbitrary to decie
[00:18] decide
[00:18] the autoloader ejects the disk and the camera takes a picture
[00:20] and the kryoflux logs are also kept. and I think a file signalling a kryoflux error, though that could be handled via log file grepping
[00:21] I'd do that for basically beer money, to be honest
[00:21] but I wouldn't be doing it as a career or real source of income
[00:22] I don't have that sexy usb floppy disk controller though
[00:22] i guess if it takes 20 minutes per floppy overall, that is somewhat acceptable... it comes down to how you value your time
[00:22] 20 minutes per floppy?
[00:23] between reading it, burning the CD, and packaging it up
[00:23] (if using a KF, it can take some time if you tell it to read each track multiple times per read)
[00:26] I don't think $60/hr is a fair price for "press butan"
[00:26] but what do I know
[00:29] it's probably not, but he's probably taking a page from the VHS conversion folks: cheap for single ones, but rather expensive in bulk, and the equipment costs just enough to make it not worth your while to do it yourself
[00:30] I think that's the real problem; $20 for the first floppy isn't massively expensive, but converting a stack of 20 disks for $400 seems really excessive.
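A rough sketch of the autoloader/KryoFlux/camera glue script described above. Everything here is hypothetical: the autoloader and camera commands are placeholders for whatever actually drives that hardware, and the dtc flags are only illustrative and should be checked against the KryoFlux DTC documentation.

#!/bin/sh
# hypothetical batch loop: image each disk, photograph its label, note failures
i=1
while load-next-disk; do                      # placeholder for whatever advances the autoloader
    dtc -f"images/disk$i" -i0 > "logs/disk$i.log" 2>&1 \
        || echo "disk $i failed" >> logs/failures.txt    # dtc flags are illustrative only
    capture-label "labels/disk$i.jpg"         # placeholder; gphoto2 might work if the camera speaks PTP
    eject-disk                                # placeholder
    i=$((i + 1))
done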
[00:31] dashcloud: Re: csoon.com. Running the same command you did, the coming-soon directory ended up with 3,167 items, totaling 68.9 MB. The warc.gz file portion is 37.3 MB. It looks about the way things looked with Splinder and MobileMe, but, again, I'm far from being an expert.
[01:29] SketchCow, your ustream.tv isn't working
[01:30] 'We're sorry, the page you requested cannot be found.'
[01:32] PatC: It doesn't like the lack of URL encoding. Try this: http://www.ustream.tv/channel/jason-scott%27s-film-school%3A-editing%2C-live!
[01:32] thank you sir
[01:39] can anyone help with a SED regex find/replace?
[01:39] It works in Dreamweaver, but in a sed/*nix environment it doesn't seem to want to work
[01:40] i'd like to match: http://blah.com/.*?/folder/
[01:40] where .*? = a folder name that could have numbers, letters, or stuff like underscores
[01:50] what do you want to do with it after you've matched it?
[01:54] s#http://blah.com/(.*)/folder/#\1#
[01:55] i want to replace the entire part, the http:// all the way through to the end slash, with something else
[01:55] well then s#http://blah\.com/.*/folder/#somethingelse#
[01:56] sed -e 's|http://blah\.com/[-_0-9a-zA-Z]*/folder/|whatever|'
[01:56] hmm ill experiment with those
[01:57] i had tried .*
[01:57] maybe i didn't escape the . in .com
[01:58] in other examples i have seen, / = the delimiter
[01:58] so you can just change to pipes or pound?
[02:00] yes
[02:01] nice
[02:01] booting a Slackware VM to test this now
[02:01] there is a limit to the chars available, but if you have / and : you will find it is easier to do pipes or hashes or the like
[02:09] I use , for my delimiter usually, because it sticks out on the bottom of the line a little bit
[02:11] nice, it worked, thanks guys
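To illustrate the delimiter point made above: sed's s command treats whatever character follows the s as the delimiter, so slash-heavy URLs need no escaping if you pick a pipe, hash, or comma instead. A minimal sketch using the same made-up blah.com pattern from the discussion:

echo 'http://blah.com/some_dir42/folder/page.html' \
  | sed -e 's|http://blah\.com/[-_0-9a-zA-Z]*/folder/|http://example.org/new/|'
# prints: http://example.org/new/page.html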
[02:17] holy glitter
[02:18] http://veryfashionblog.splinder.com/
[02:25] so is there an automatic script for knol archiving?
[02:30] http://mbebenita.github.com/Broadway/broadway.html
[04:04] bsmith094: They've talked about scripts in #klol, but there's nothing "automatic", yet. I don't know if it'll require a crowd.
[04:04] Coderjoe: Ooo... Nifty.
[04:05] yeah
[04:06] and it is also nice for countering the anti-js-emulator people... "you can't do that well in js!" "oh yeah? check out this x264 decoder implementation!"
[04:06] er, h264
[04:06] whatever
[04:07] ha ha
[04:07] Well, yes.
[04:08] Few people complain about the js idea anymore.
[04:08] Now we're just working on it.
[04:08] Got some good, good people in a huddle.
[04:08] Threw some resources.
[04:08] I have good feelings about it.
[04:09] well if it doesn't work well give it a year and browsers will be faster :v
[04:10] it is one of those things where you have to aim for what might be possible later rather than focusing on the constraints right now
[04:11] it is very much one of those situations where those who say it is impossible should stop getting in the way of those trying to achieve it
[04:23] I wonder if we should have case-squashed the profile names in the intermediate directories
[06:00] hmm, lame. for some reason wget in linux is failing on -I (--include-directories) with a wildcard
[06:30] BACK
[06:31] hmm
[06:31] twice now my rsync to the batcave was interrupted by peer
[06:32] hmm
[06:32] and now i get connection timeout
[06:32] Looks like we hit a problem.
[06:33] Independently verified.
[06:33] I think IA is having internet connectivity issues again tonight
[06:33] Yes
[06:33] We're finding this
[06:33] They're doing a lot of upgrades, dealing with a lot of upgrade issues as a result.
[06:34] http://www.archiveteam.org/index.php?title=Metadata_warriors
[06:39] does anyone have any experience using directory-based wildcards in wget?
[06:39] for -I aka --include-directories
[06:40] Make it \* instead of *
[06:42] wouldn't that escape the *, making it just a normal asterisk and not a wildcard (at least on linux; windows seems to handle * alone just fine as a wildcard)
[06:49] yea \* didn't work
[06:51] wget is basically downloading 1 file and stopping, breaking the operation
[06:51] even just using '-I /fooba?' is causing the same result as '-I /fo*ar'
[06:57] I forget what they were called again, but I hung several of those lights you showed for a couple of live TV productions. most were mounted to the drop tile ceiling framing (for 2 days at most)
[06:57] bright as hell
[06:58] they also had screens that fit between the body and the flaps. iirc, it was mainly so that if a bulb broke, it wouldn't rain glass.
[06:58] it's been a few years, though
[07:03] bingo
[07:03] finally figured it out
[07:03] http://osdir.com/ml/bug-wget-gnu/2009-06/msg00045.html
[07:08] over 2 years later and still not addressed
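For what it's worth, the non-wildcard form of --include-directories does behave as documented: it takes a comma-separated list of directory prefixes, so listing the directories explicitly is a workaround when the wildcard handling misfires. A sketch; example.com and the paths are made up:

# follow only these two directory trees, no wildcards involved
wget -r -np -I /archive,/files/2011 http://example.com/
# the wildcard form that was failing in the discussion above, for comparison:
# wget -r -np -I '/fo*ar' http://example.com/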
[10:10] What actually causes wget memory to go up so high? Is it the list of finished and unfinished links?
[11:26] :(
[11:26] someone's poems were badly corrupted
[11:26] [db48x@celebdil poems]$ file ./000/944/122/000944122{,a}.html
[11:26] ./000/944/122/000944122.html: HTML document text
[11:26] ./000/944/122/000944122a.html: data
[11:26] [db48x@celebdil poems]$ cat ./000/944/122/000944122a.html
[11:26] [¦|¦¦¦¦[db48x@celebdil poems]$
[11:28] I'd say
[11:28] :/
[11:30] [db48x@celebdil poems]$ diff -u ./011/006/587/011006587{,a}.html
[11:30] --- ./011/006/587/011006587.html 2011-11-23 13:36:53.297491846 -0800
[11:30] +++ ./011/006/587/011006587a.html 2011-05-02 22:59:15.000000000 -0700
[11:30] @@ -11,9 +11,9 @@
[11:30] -
[11:30] -
[11:30] -
[11:30] +
[11:31] +
[11:31] +
[11:31]
[11:32] not sure how I'll weed out that kind of duplicate
[12:26] SketchCow: I think you agreed with me on the Cracked post about libraries destroying stuff
[12:27] you said this in the 2007 defcon video
[12:27] dude, dudes, major problem
[12:27] like, major problem
[12:28] what is?
[12:28] global hard drive shortage
[12:28] and i need to buy lots of drives and now i cant
[12:28] im screeeewed
[12:28] i know the feeling
[12:29] well, i cant buy in bulk now
[12:29] anywhere i buy from limits purchases to a max of 2 drives
[12:29] luckily the stuff i'm backing up i can fit on DVDs
[12:29] i cant really fit 3TB of data onto DVDs lol
[12:29] there are seagate drives for $130 at walmart
[12:30] only up by $40
[12:30] im in aus though
[12:30] they went up a lot
[12:30] used to get 2TB drives for $80, now they are $160 each
[12:30] That's the good part.
[12:30] unlike WD, where it's a $100 markup
[12:30] before the shortage, the 2TB WDs were $80
[12:31] i was talking usb hard drives
[12:31] ahh, i dont use those...
[12:32] i also back up stuff on dvd
[12:32] they're just normal hard drives with a SATA to Mini-USB converter built in and casing.
[12:32] yeah, but i get impatient woth USB drives
[12:32] with*
[12:32] Simply rip them open, and use them like a normal SATA drive!
[12:32] hmm, doesnt sound like a half bad idea
[12:33] also, i worked out if i were to back up all my stuff onto DVDs, it would take
[12:33] 697 DVDs
[12:33] i have a 100-DVD stack
[12:33] more like 95 i think
[12:34] with bluray you can cut it down
[12:34] yeah, blueray is like, 60GB per disc or something isnt it?
[12:34] bluray*
[12:34] more like 22gb
[12:34] hmm
[12:34] okay, gimme a sec here
[12:34] but still 200 discs
[12:34] 136 bluray discs lol
[12:35] lol, that would be a few shoe boxes full
[12:35] dual layer are like 46gb but are $15
[12:35] $15 US per disc?
[12:35] i know you can get a 15-pack of single layer for $30
[12:35] only dual layer
[12:36] dvds are more expensive, and you cant reuse them
[12:36] i can get 20 dual layer bluray discs from ebay for like $80, which is expensive but means fewer discs
[12:36] hard disks are better : )
[12:36] except hard disks are in short supply globally lol
[12:36] but of course, I use a lot of DVDs too
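A quick sanity check on the disc counts quoted above, assuming roughly 4.4 GiB usable per single-layer DVD and about 23 GiB per 25 GB BD-R (the collection size is inferred from those counts):

echo '697 * 4.4' | bc   # about 3067 GiB spread across single-layer DVDs
echo '136 * 23'  | bc   # about 3128 GiB spread across single-layer Blu-rays

Both work out to roughly the 3TB mentioned earlier, so the 697-DVD and 136-Blu-ray figures are consistent with each other.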
[12:37] by the way, do you need so many hard disks for personal use or professional?
[12:38] for personal use, yes, i download that much
[12:38] lol
[12:38] 1TB to go until i've filled this 4TB
[12:38] then i need to buy another 8 to 10TB
[12:39] you don't download HD movies or something?
[12:39] You don't simply download HD movies into mordor
[12:39] do you have 2 copies for every disk?
[12:39] i only have about 1TB backed up, because, well, i cant get any HDs right now lol
[12:40] that 1TB is what i have deemed the necessary building block, should my drives fail
[12:40] but, the biggest portion of that drive space used is geocities
[12:41] oh
[12:41] bluray that bitch
[12:41] only way
[12:41] i dont download what is on IA servers. If I didn't trust IA, then I would never have enough hard disks.
[12:41] yes, next is videos that i backed up from VHS and other things downloaded, and all the standard things like pictures, music, laptop backups, work etc etc
[12:41] But really, I dont trust IA.
[12:42] IA?
[12:42] You forgot porn.
[12:42] lol, i dont sotre porn
[12:42] store*
[12:42] MAN.
[12:42] IA = Internet Archive.
[12:43] ok
[12:43] also gotta drop 4TB in my server, since 500GB is too small
[12:44] i would love a 6TB dvd
[12:44] I have 3 hard disks, and I dont want to buy more because it makes no sense.
[12:45] even at $100
[12:45] I cant store the entire Internet. I select what is important to me.
[12:45] i store what i use, what i deem important and necessary
[12:45] i'm backing up crankygeeks and diggnation
[12:46] i have hak5 backed up
[12:46] Yeah, you use 4 TB of data every day. Dude.
[12:46] almost 4TB
[12:47] i'm also making a linux archive
[12:47] a distro that can stay alive without internet
[12:48] like compile itself from scratch
[12:48] that would take a bit of effort
[12:48] i have been working on it with some friends
[12:49] reminds me of robots building robots
[12:49] im still trying to figure out how to set up a small linux box to act as a DNS server for my main server
[12:49] with some luck but not much more
[12:49] dnsmasq
[12:49] eh, i've been using Bind9 on this little box
[12:49] i got it partially working, not 100% though
[12:50] i used a local-mirror script i made to work with localhost or the wireless ip
[12:50] also to set up the website from the mercurial repos
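If all the little box needs to do is answer a few local names and forward everything else, dnsmasq takes far less configuration than Bind9. A minimal sketch; the addresses and the mainserver.lan name are made up:

# write a hypothetical dnsmasq config and syntax-check it
cat > /etc/dnsmasq.d/local.conf <<'EOF'
# answer this one name locally
address=/mainserver.lan/192.168.1.10
# ignore /etc/resolv.conf and forward everything else to an upstream resolver
no-resolv
server=8.8.8.8
# only listen on the little box's LAN address
listen-address=192.168.1.2
EOF
dnsmasq --test   # checks config syntax, assuming the main config includes /etc/dnsmasq.d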
[12:56] looks like i broke WebCitation
[13:01] hmm
[13:01] i suppose i should stop being a lazy ass and poke fortunecity some more but ill do that tomorrow XD
[13:19] as far as they reckon, the drive shortage probably wont be over till the start of next year :/
[13:20] you can waste christmas with your family
[13:21] lol, spending christmas with the family and the girlfriend is great (:
[13:24] I swear I can detect sarcasm
[13:24] no sarcasm whatsoever in my comment (:
[13:25] it's that smiley face, it's putting me off.
[13:25] O.o
[13:26] Yep, sarcasm.
[13:26] Admit it, you are in love with archiving.
[13:26] You want to be with archiving FOREVER.
[13:26] LOL
[13:26] not even sarcasm XD
[13:26] * NotGLaDOS hands kin37ik the ring
[13:26] Go on, propose.
[13:26] bahahaha
[13:27] eh, phone's ringing at this time of night? O.o *hopes it isnt a telemarketer*
[13:27] It's the INTERNET.
[13:27] No, it's archiving, ringing up
[13:27] Wondering why you don't love it
[13:30] lol
[13:30] well, oddly enough, it was a good mate of mine, wondering what i was up to tomorrow O:
[13:31] Did you reply with "saving the internet"?
[13:31] If you did not, please hit "redial"
[13:31] LOL no
[13:31] XD
[13:33] You're Internet Superheroes.
[13:34] We are!
[13:35] Who is the wget creator? That man had a good idea.
[13:39] Giuseppe Scrivano? https://en.wikipedia.org/wiki/Wget
[13:40] Nice surname http://en.wikipedia.org/wiki/Scrivener
[13:40] http://en.wikipedia.org/wiki/Scribe
[13:42] ohh yeah, i gotta redownload Wget after siping my system 2 weeks ago, thanks for the reminder
[13:42] wiping*
[13:44] siping > wiping
[13:44] Actually, I think siping does exist..
[13:45] http://en.wikipedia.org/wiki/Siping I do love a good rubbersing
[13:47] lolwat
[13:47] random as
[13:51] Wikipedia editors have bots creating unreal articles. I heard.
[13:52] created in 2005.
[13:53] When Yahoo! was a great corporation.
[13:53] Wait.
[13:53] LOL @ yahoo.......
[13:53] emijrp: made my night
[13:56] https://en.wikipedia.org/wiki/Toilet_paper_orientation
[13:56] 70kb. Not bad.
[14:33] hmm, ill poke some more fortunecity tomorrow, off to bed, laters
[16:29] Simply rip them open, and use them like a normal SATA drive!
[16:29] they're just normal hard drives with a SATA to Mini-USB converter built in and casing.
[16:30] not so much anymore. WD actually manufactures hard drives with built-in USB, partly because people would do this.
[16:31] yes, but that's not all drives.
[16:31] another issue is that externals usually have aggressive power-saving schemes.
[16:32] sometimes they can be disabled
[17:01] hmmm.. I seem to be able to reach much of archive.org, but still no luck on the batcave
[17:22] batcave down?
[18:04] dnova: As far as I can tell, yes. That or having connection problems. Maybe SketchCow or underscor know more.
[18:17] batcave being rebooted
[18:34] seems to be talking now
[18:34] uploading one set of splinders. other server is still splindering
[18:48] what is batcave?
[18:49] It's the server that SketchCow has at the Internet Archive
[18:50] ah, wondered if it was there or elsewhere.
[18:55] for some reason I thought the downloader gzipped the entire profile into 1 file
[18:55] guess not
[19:04] SketchCow: I have 160 GB of splinder to upload, does batcave have the space?
[19:09] I'm sure it has
[19:11] I'm surprised at how fast my upload is going.
[19:12] from ireland to (I assume) california
[19:18] Latency does little to transfer speeds these days
[19:18] plus, most links are pretty darn quick
[19:25] currently uploading one set of ~35gb or so and I'm already at the M's :D
[20:02] oh it's not quite so simple. but still going very fast :)
[20:04] Batcave has the space.
[20:05] excellent
[20:06] SketchCow, can CPU sometimes be a problem?
[20:06] I noticed that upload speed was way higher without compression, even though I had enough idle CPU
[20:07] SketchCow, do you know if you can use a standard 5 1/4" floppy drive to copy c64 disks?
[20:08] Yes
[20:08] (to PatC)
[20:08] I can't answer the other one.
[20:08] Thank you sir!
[20:09] ok
[20:14] you can use a standard drive to copy the front side of c64 disks, but if you want the back side you either need to modify the drive or cut up your disks
[20:14] or use the 1541C drive I have? :p
[20:15] yes that works too
[20:46] once the upload script finishes, is it completely 100% safe to blindly assume everything went ok?
[21:38] I haven't finished a profile in ages, but I'm still downloading from splinder at about 0.5MB/s
[21:42] good for you
[21:42] how many instances?
[21:42] down to 19 or 20
[21:43] 19 or 20 profiles
[23:50] this is the 2nd most hardcore thing I've seen this week: http://www.youtube.com/watch?feature=player_embedded&v=IY0mDRrqcVU (floppy drives playing Still Alive)
[23:52] I am of course the most hardcore thing you've seen.
[23:52] I bought my new table.
[23:56] Watch out, we've got a badass SketchCow over here.
[23:57] you're definitely tied for number 1: this guy did some rather impressive feats: he re-did his HP thin client to have SATA ports, and didn't have a spare PCIe slot for GigE, so he desoldered the GPU and reused that connection http://hackaday.com/2011/11/26/rebuilding-a-mac-se-as-a-server-again/