[00:49] sad about emulator-zone
[00:50] atari are assholes
[00:51] well, "Atari"
[00:52] "Atari" indeed.
[00:55] i think i can burn 4.6 Gbyte to dvd
[00:55] normally its 4.4GB
[00:56] i have a md5sum file so i can check if everything is there on disc after burning
[00:56] I can burn like
[00:56] I dunno 20gb to a dvd
[00:56] if it is highly compressible data and I have enough time for bzip
[00:57] i'm doing video
[00:57] so just overburning?
[00:57] dd if=/dev/zero bs=1M count=1M | lzop | gzip | gzip
[00:57] 1 terabyte => few hundred kbyte
[00:57] hehe
[00:57] lol
[00:57] i'm using recorder
[00:58] a pygtk front end for command line tools
[00:58] chronomex, I assume you're joking
[00:58] it's 100% true
[00:58] why would I make this up?
[00:58] i'm pretty sure /dev/zero = 0 <_<
[00:59] yeah?
[00:59] try it
[00:59] I can put a terabyte of data on a floppy disk!
[00:59] that's hilarious
[01:01] I remember fake compression programs on bbses in the 90s
[01:02] any amount of data down to 50 bytes!!!!
[01:03] burning 40-53 of systm
[01:04] hd version
[01:05] bzip2 would do really well on large quantities of nulls, but only because of a historical mistake
[01:05] oh?
[01:05] the first stage is an RLE compressor
[01:06] can take 256 of the same byte down to 5 bytes
[01:06] and the 900k buffer is after that RLE
[01:06] reminds me of those fake hard drives that write in continuous mode
[01:07] well there's a milestone.
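[Editorial aside on the [00:57] pipe above: the claim checks out, because /dev/zero yields nothing but null bytes and any general-purpose compressor collapses long runs of one byte almost completely. A small-scale sketch, 10 MB instead of a terabyte and gzip alone:]

```shell
# 10 MB of null bytes compresses to roughly 10 KB with a single
# gzip pass; chaining lzop | gzip | gzip as quoted above squeezes
# a terabyte of zeros into the hundreds-of-kilobytes range.
dd if=/dev/zero bs=1M count=10 2>/dev/null | gzip > zeros.gz
wc -c zeros.gz    # far smaller than the 10485760 bytes that went in
rm -f zeros.gz
```

[The same effect is why bzip2's run-length first stage, mentioned below, does so well on nulls.]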
I hit 1000 tasks on archive.org
[01:07] lol Famicoman
[01:08] I've never seen that technique on mechanical hard drives, but usb flash drives and flash memory cards yes
[01:09] ebay is littered with them
[01:27] i'm only getting the large version of systm at and after episode 66
[01:59] Bloody Atari
[03:03] not sure how many of you follow Carl Malamud, but he's got an interesting petition up here: https://wwws.whitehouse.gov/petitions/!/petition/start-national-effort-digitize-all-public-government-info/15vthgVB
[06:19] SketchCow: I don't know if you already know, but your blog's RSS/atom feeds are broken again :-/
[06:31] (Or is the 403 forbidden for certain user-agents deliberately there to stop spammers?)
[16:10] "There are no more posts to show right now."
[16:10] yes, because nothing exists before 6:30pm tuesday
[18:18] hey
[18:19] if i've only got, say 100gb of disk space and an ok home connection (without a data cap), can i still be useful?
[18:19] i do have a fairly powerful computer which i leave on 24/7, so i can compress stuff before upping
[18:20] yes!
[18:22] great :D
[18:22] i'm curious about one thing though
[18:23] typically volunteers download stuff, and then re-upload it to some server. why not just have the server do all the downloading? is it to have the data backed up sooner/for redundancy or something of the sort?
[18:23] or so sites don't block that server for too much access?
[18:24] sometimes the big iron helps directly.
[18:25] but there are lots of advantages (and some disadvantages) to making a huge distributed effort.
[18:28] oki
[18:28] i'm going to try and get something for mobileme setup as it seems splinder is done already :)
[18:29] you do have a non-windows OS on that system, right?
[18:29] yeah
[18:29] i have an ubuntu-based linux distro
[18:29] (windows unfortunately has some stupid filesystem limitations)
[18:29] ok
[18:39] "(I/O Error: None: None)" <3
[19:10] Poor blog.
[19:10] I should write that detector of mine.
[19:11] broken blog detector?
[19:13] Yeah
[19:13] I should do a few things with the blog, really.
[19:13] To shore that up.
[19:13] But my Jamendo nightmare is nearly over.
[19:13] The load-in of 55,000+ albums.
[19:14] 16010 pts/26 S+ 0:00 /bin/sh ./whoopdedo [088897] Fallback - Bad Signal -- Jamendo - MP3 - 2011.04.08 [www.jamendo.com].zip
[19:14] 16063 pts/17 S+ 0:00 /bin/sh ./whoopdedo [084930] d27m - because of an handful of stupid humans -- Jamendo - MP3 - 2011.02.05 [www.jamendo.com].zip
[19:14] 16119 pts/37 S+ 0:00 /bin/sh ./whoopdedo [097783] Monster-Kill - Vulturous (Single) -- Jamendo - MP3 - 2011.09.01 [www.jamendo.com].zip
[19:14] 17083 pts/22 S+ 0:00 /bin/sh ./whoopdedo [101246] AlecsRims - Stereotypes -- Jamendo - MP3 - 2011.11.01 [www.jamendo.com].zip
[19:14] See, four threads.
[19:14] As they each hit 0XX999, they're done.
[19:14] I see.
[19:15] Then I'll have them up to 101999.
[19:15] 102XXX now exists, but it's not full yet.
[19:15] When it is, I can run a script.
[19:17] remember to back fill older IDs
[19:17] any plans to handle updates? artists sometimes change their albums
[19:19] All the older IDs should be there.
[19:19] Any that are around.
[19:20] No, I doubt I will handle updates at all - it's a snapshot. Unless there's an efficient way to write a script to do that work.
[19:20] Just like there was a way to write a script to do the initial check.
[19:20] older IDs might not yet be published publicly
[19:20] ah ok
[19:20] what filesystem do you guys recommend that i use for a partition that'll only be used for these kinds of projects? ext2, ext3, ext4?
[19:20] Older IDs? I have IDs in there from 2008 and 2009. They couldn't possibly still be waiting, could they?
[19:21] gui77: ext2 or thereabouts
[19:21] SketchCow: you never know. but people definitely change their albums, less than 1% of them, but still
[19:23] Edge cases are always welcome.
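[Editorial aside: the four whoopdedo processes above illustrate a common pattern for bulk load-ins like this one — split a numeric ID space into slices and let each worker walk its slice independently. whoopdedo itself isn't shown in the log, so this is only a generic sketch; process_id is a hypothetical stand-in for the real per-album work, and the slice sizes are arbitrary:]

```shell
#!/bin/sh
# Hedged sketch of the four-worker pattern: each worker owns a
# contiguous slice of the ID space and is "done" when it reaches
# the end of its slice (like each whoopdedo thread hitting 0XX999).
process_id() {          # hypothetical per-item work (fetch/pack/upload)
    echo "processing id $1"
}

worker() {              # usage: worker START END
    i=$1
    while [ "$i" -le "$2" ]; do
        process_id "$i"
        i=$((i + 1))
    done
}

worker 0   249 &        # four workers, four background jobs;
worker 250 499 &        # slice sizes here are purely illustrative
worker 500 749 &
worker 750 999 &
wait                    # returns once every slice is finished
```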
[19:23] I'm going to write something to step through what's on there and make sure everything's kosher, of course.
[19:23] But that's different.
[19:23] Depends on how crazy we want to go.
[19:25] > x-archive-meta-title:Tokks Voitto - Oblivious to assisting mankind addicted to breed the starving hunter writing his reason on a blood sheet
[19:25] I have a weird job.
[19:26] :)
[19:28] I'm almost done with it, that's what matters.
[19:28] It was 1.8 terabytes of data.
[19:30] I'm down to the last, like, 15g
[19:36] after running get-wget-warc, is it already compiled or do i still need to compile it?
[19:38] nevermind, i did it
[19:41] Archive Team helps those who help themselves
[19:43] OK, I'm actually going to slam those last 300 albums in, just to be complete.
[19:51] i'm running like 7 instances and the speed seems to go between very little (200 kB/s) to maxing out my pipe. this is normal, right? any tips? :)
[19:52] oh and what happens if/when it runs out of space? does it just quit gracefully?
[19:54] oh and is there a way of running multiple instances quickly/easily, without having to open up a term for each?
[19:54] sorry for all the questions hehe :/
[19:55] gui77: ext4 or jfs is what I use normally.
[19:55] chronomex: yeah i decided on ext4 :)
[19:55] jfs, though a bit slow, is stable as bedrock
[19:56] i'm using ext4 for the rest of my partitions so i might as well use it for this one, haven't had an issue yet
[19:56] anyone know when 28c3 ends?
[20:00] tomorrow
[20:00] ~15 hours
[20:01] Yeah
[20:01] They're putting up preliminary cuts of the talks, which is also amazing.
[20:01] http://events.ccc.de/congress/2011/Fahrplan/events/4814.en.html is in 45 minutes, should be great
[20:02] "Behind the scenes of a C64 demo"
[20:02] Yes, I tweeted about that.
[20:02] Obviously it's relevant to my interests.
[20:10] ah, the good old C64 demoscene days.
When you couldn't just throw polygons at the hardware and have a few billion transistors do the maths for you... You know you're old when you remember what DYCP, DYPP, FLI, etc. mean :-D
[20:11] get your rocking chair off my lawn, grampa
[20:11] SketchCow, 28c3 will be on archive.org then?
[20:11] sure
[20:14] A lot is going to be on archive.org.
[20:14] But they also have their stuff up very well.
[20:15] A project for next year is pulling in every conference that makes its videos available, ever.
[20:15] Won't that be exciting.
[20:15] <3
[20:15] and audio recordings too i hope
[20:15] the HOPE conferences only have that iirc
[20:26] Yeah, everything.
[20:26] Every. Thing.
[20:26] http://www.archive.org/details/hackercons is the first test run-through.
[20:26] It'll be many more soon.
[20:27] I'll see if I can get you the full quality HOPE archives
[20:27] SketchCow: is it possible to start uploading stuff while still downloading more?
[20:29] what do i do if one of the instances presents an error? Running wget --mirror (at least 1064 files)... ERROR (4).
[20:29] I need more detail gui. But first, a VERY fast food run so I can watch the demo
[20:30] cool feel free, /query me whenever you have time :)
[20:32] chronomex, which HOPE archives?
[20:32] chronomex, if it's 2010 I have it too
[20:33] and I have the pcap file
[20:40] all of em
[20:40] stack of dvds a foot high
[20:42] "Behind the scenes of a C64 demo" starting right now, "Saal 3". streaming URLs: http://pastebin.com/raw.php?i=MfvqjZvc
[20:42] err "Room3" ;)
[20:46] or even the old DOS demoscene. again where you couldn't just throw polygons at the hardware
[20:47] kinda miss those.. quite impressive
[20:47] though some of the content of some of the 64k demos are still rather impressive
[20:47] it was Second Reality that got me hooked on the PC demoscene
[20:47] you want a return to the olden days?
[20:48] join the calculator hacking community!
[20:48] http://www.ticalc.org/ , /join #ti
[20:48] chronomex, let me know if you upload them anywhere
[20:48] dunno if it'll happen
[20:48] ok
[20:48] also #cemetech
[20:48] bah
[20:49] I was hacking on TI calcs 10 years ago
[20:49] get off my lawn
[20:49] so was I
[20:49] fucker that's my lawn too
[20:49] public park
[20:50] I'm going to wait for you old men to die and I will fuck your skulls
[20:50] Now watch the talk
[20:50] NO
[20:51] He's streaming the demo
[20:51] DEMO IN EFFECT
[20:51] AMMMIIIIIGGGAAAAA
[20:51] Pants off
[20:51] fap fap fap
[20:53] That demo is awesome
[20:53] They're using an emulator, points off
[20:53] ^
[20:53] * PatC fires up his c64 and 1541 drive
[20:53] C64 - VGA in... NOT A PROBLEM
[20:53] Come in early
[20:54] At blockparty, we did what we needed to do to bring stuff off a Colecovision
[20:54] I think I still have the disk with the C64 port of the Second Reality demo lying around here somewhere... I'm curious how it would look on my big TV :)
[20:54] SketchCow, I have a coleco telstar colortron
[20:55] Man, I am so fucking glad we've cleaned this NYC apartment
[20:55] btw. there's a demoscene party at the moment here in Germany as well... "The Ultimate Meeting". I guess there will be some interesting releases from there as well
[20:55] We have so much space
[20:55] Yeah, I cruise pouet
[20:56] what happens if you're running multiple instances of the mobileme wget-warc and then run touch stop?
[20:57] Someone needs to answer you who are not me.
[20:57] Ha ha "Only a few illegal opcodes are useful, but they should be used."
[20:57] gui77: it stops eventually
[20:57] Breaking news, there.
[20:57] i'm in no hurry for an answer :p
[20:57] :p
[20:57] yeah, self-modifying code
[20:57] This guy's going to quickly go from A to a B
[20:58] ...evil to debug
[20:58] chronomex: but does it stop all instances? or just 1? if 1, which?
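[Editorial aside on the `touch stop` question: the Archive Team download scripts follow a stop-file convention. The exact implementation may differ, but the usual shape is a loop that checks for ./stop before claiming the next item, which is why every instance running from the same directory winds down gracefully. A hedged sketch; fetch_next_profile is a hypothetical stand-in for the real wget-warc work:]

```shell
#!/bin/sh
# Stop-file convention: running `touch stop` in the working
# directory makes every loop like this one finish its current
# item and then exit, instead of being killed mid-download.
fetch_next_profile() {      # hypothetical stand-in for the real work
    echo "downloading one profile..."
    sleep 1
}

while :; do
    if [ -f stop ]; then
        echo "stop file found, exiting cleanly"
        break
    fi
    fetch_next_profile
done
```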
[20:58] self-modifying code - can't do without
[20:58] DarkStar2: tricky to write, trickier to fix
[20:58] true
[20:58] gui77: all of em that are running from that directory
[20:58] ah, that reminds me...
[20:58] "if it was hard to write it should be hard to read!"
[20:58] chronomex: ah ok thanks :)
[20:59] 3 registers, 256 bytes stack, one interrupt... amazing what you can still create with such limited resources
[20:59] that's tiny
[21:00] msp430's tiniest chip is luxury compared to that
[21:00] Oh boy, here comes a spammer on archiveteam.org.
[21:00] uh oh...
[21:00] SketchCow, where on the site?
[21:01] User pages.
[21:01] whoa, that's really eeevil coding :)
[21:02] jumping into I/O space... geez, these guys still come up with really crazy ideas
[21:03] speaking of spam, how come when googling "archiveteam" the text is an ad for viagra hehe?
[21:03] You get half a guess, that I will ignore
[21:04] The 4 Tenets of Archiveteam: Friendship - Unity - Caring - Kindness
[21:34] The other reason a C64 demo might look better now than then is that they can use off-machine coding and development tools, then pour them right into the c64.
[21:34] He's not mentioning that, I don't think he has to, but he's not.
[21:35] yes, that's what I thought too
[21:35] he talked about using LZ packers on his PC which probably was not possible in the 80s on the C64 itself
[21:36] Exactly.
[21:36] Or the girl, who is obviously a convert.
[21:36] I'm FINE with all this, it's just why it seems to look better.
[21:40] wow. wikipedia is currently registered with godaddy. they're apparently transferring away over sopa
[21:47] Yeah
[22:31] if for some reason (like a system crash or a connection drop) i don't finish some downloads, does the tracker know that i didn't finish them (so it can assign them to someone else)?
[22:31] and should i go find incomplete files/folders and try to delete them so they're not accidentally uploaded, or does the upload script only upload complete ones?
[22:34] i don't want to be supplying corrupted/incomplete data :/
[22:41] or should i just, without any downloads running, delete any folder with a .incomplete?
[22:43] anyone want 2500 appleII disk images?
[23:02] bsmith093: of course
[23:02] k then uploading to batcave
[23:03] fyi i grabbed it as a torrent zip, i glanced through it, seems fairly organized
[23:03] no obvious warez, plus its only ~400gb
[23:03] gui77: the upload script only uploads complete profiles. the tracker (macme?) is told when a profile is completed, so it knows a profile was given out but not completed, but it doesn't know that it isn't still being downloaded
[23:03] mostly shk files
[23:04] Coderjoe: how should i correct the situation?
[23:04] splinder STILL, how big is that last profile?
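[Editorial aside on the .incomplete cleanup question above: with every downloader stopped first, something like the following lists unfinished items before removing anything. The data/ path and the ".incomplete" marker name come from this conversation, not from the actual project scripts, so check your project's layout before deleting:]

```shell
# Run only with all downloaders stopped. First, list what would go
# ('*.incomplete' also matches a bare '.incomplete' marker file):
find data/ -name '*.incomplete' -print

# After reviewing the list, remove the unfinished items so the
# tracker can hand those profiles out to someone else:
# find data/ -name '*.incomplete' -exec rm -rf {} +
```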