#archiveteam 2012-01-28,Sat


Time Nickname Message
00:21 balrog SketchCow: shit from whom?
00:21 balrog :/
00:21 balrog that's 100% true though
00:21 balrog from personal experience, even
01:35 DFJustin yeah he's 100% on the money
01:37 DFJustin you look at something like http://www.gamebase64.com/index.php - 22,500 games, you can probably buy some of those on iphone or virtual console or whatever but I doubt it's even 1%, the rest of it is not being sold and realistically will never ever be sold
01:40 DFJustin maybe in 10 years someone will remember they own it and try to sell it, but all their floppies will be gone and/or dead
01:40 DFJustin several of the "arcade classics" packs that have come out for recent consoles have had to use pirate dumps from mame because taito etc. didn't keep that shit
01:41 DFJustin and that's for durable rom chips
01:58 Coderjoe even if you can buy recent ports, it isn't the same game, and may have bugfixes or other changes
01:59 Coderjoe wow
02:00 Coderjoe not to mention (going back to analog tech) pirate copies of Doctor Who episodes helping the beeb recover from their colossal shortsightedness.
02:00 Coderjoe and some "lost" films being found in "pirate" collections
02:01 balrog even "durable" ROM chips only last 20-30 years
02:01 balrog ha, pirate dumps from mame -- you mean officials but "borrowed" from MAME
02:01 balrog that's ... sad
02:01 balrog sad that the companies are so sloppy and lazy
02:01 balrog :<
02:03 Coderjoe well, back at the start of every new form of media (film, tv, video games) there was little known reason to preserve stuff. it was being created to make money, and there was little thought to re-releasing it.
02:04 balrog yeah...
02:04 balrog like they can't learn from past mistakes...
02:10 Coderjoe new people in charge each time
02:10 Coderjoe each with "maximum profit" foremost in mind
02:32 balrog yes :/
02:32 balrog it's sad
02:35 SketchCow This is quite a debate.
02:35 * SketchCow checks the watch.
02:35 SketchCow Is it 1981 already?
02:36 SketchCow root@teamarchive-0:/2/CDS/friendly/MCbx/PC World Komputer Cover CDs/2002/2002# ls
02:36 SketchCow 01 02 03 05 06 07-08 09 10 11 12 i04
02:36 SketchCow root@teamarchive-0:/2/CDS/friendly/MCbx/PC World Komputer Cover CDs/2002/2002# du -sh .
02:36 SketchCow 14G .
02:36 SketchCow Delicious
02:36 SketchCow Look at that, 14gb of CD-ROM.
03:22 DFJustin <balrog> ha, pirate dumps from mame -- you mean officials but "borrowed" from MAME
03:23 DFJustin not even, iirc there have been dumps from bootleg boards because the originals had read-protected mcus
03:35 SketchCow OK, here we go.
03:35 SketchCow I got mail from a guy.
03:35 SketchCow My fledgling company, Temporal, LLC is in the early stages of developing a 4D holographic feed platform for commercial use. We are currently hammering out licensing details with a corporate client as well as re-architecting what was originally constructed in the Unity3D online gaming engine, and now being ported to HTML5 canvas.
03:35 SketchCow OK, so that's in there.
03:35 SketchCow It makes me want to shoot myself in the face.
03:35 SketchCow However, he deserves someone else looking at it.
03:35 SketchCow Who wants it.
03:36 SketchCow First to say it gets it, gets the response as archive team.
03:36 kjdemco not me
03:38 nitro2k01 wtf
03:45 chronomex .... 4d holographic?
03:45 chronomex I want to respond to that.
03:46 chronomex I totally can go somewhere interesting with this.
04:22 SketchCow OK, chronomex. msg me what e-mail gets it.
04:22 SketchCow http://www.youtube.com/watch?v=FWJZa2NvRFU
04:41 don lumarca is pretty cool.
06:21 SketchCow I am happy with it.
09:22 Nemo_bis 3 days before Splinder closes
09:43 DFJustin http://29.media.tumblr.com/tumblr_lx4oocZxum1r7vyy1o1_500.png
09:45 Nemo_bis heh
10:00 Coderjoe omg
10:00 Coderjoe old but new to me
10:00 Coderjoe http://www.youtube.com/watch?v=xs72vl4h_pU
14:56 Nemo_bis https://wikitech.wikimedia.org/view/Dumps/Image_dumps
14:58 SketchCow It's a major technical problem and I do not envy the person trying to make it available.
14:58 SketchCow 200gb per tarball + 100 tarballs
14:59 SketchCow By the way, archive.org can take those tarballs.
14:59 SketchCow And would lovingly.
14:59 SketchCow There was a meeting between wikimedia and archive.org
14:59 SketchCow Some discussion.
15:01 Nemo_bis Yes, they're discussing image dumps specifically too.
15:01 Nemo_bis IIRC
16:04 alard yipdw: The wget chunked WARC patch has been accepted and added to the wget repository. Thanks for your help.
17:19 bsmith093 yipdw: did you ever finish the ffnet grab
17:20 bsmith093 project script
17:30 Coderjoe alard: did they also take the memory leak patch?
17:30 Coderjoe I wonder what other leaks remain in recursive mode
20:39 alard Coderjoe: Yes, the wget maintainer added both patches.
21:55 closure YM recursive mode is not supposed to eat ever-increasing memory until it dies?
21:59 alard It only does that if you're unlucky.
21:59 closure or try to archive geocities, in my experience
22:00 alard Heh, yes, downloading any non-trivial site increases the probability.
22:03 nitro2k01 http://blog.gg8.se/wordpress/2012/01/28/a-modest-copyright-proposal/
22:04 alard (By the way: a while ago I did some work on a wget version that keeps its queue on disk, using berkeleydb. I got half way, so if anyone feels like continuing that I'm happy to share the code.)
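
A disk-backed queue along the lines alard describes might look roughly like the sketch below. This is not alard's code: it only assumes Berkeley DB's classic C API, uses a DB_RECNO database (DB_APPEND to enqueue, a DB_FIRST cursor to dequeue) so the records behave as a FIFO on disk, and the helper names open_queue, queue_push and queue_pop are invented for the example.

    /* Rough sketch, not alard's code: a disk-backed FIFO of URLs using
     * Berkeley DB's DB_RECNO access method, so wget-style recursion state
     * could live on disk instead of in RAM. Helper names are invented.
     * Typical build: cc queue.c -ldb
     */
    #include <db.h>
    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>

    static DB *open_queue(const char *path)
    {
        DB *dbp;
        if (db_create(&dbp, NULL, 0) != 0)
            return NULL;
        /* DB_RENUMBER keeps record 1 pointing at the oldest entry as records
         * are consumed, so DB_FIRST + delete behaves as a FIFO. */
        dbp->set_flags(dbp, DB_RENUMBER);
        if (dbp->open(dbp, NULL, path, NULL, DB_RECNO, DB_CREATE, 0664) != 0) {
            dbp->close(dbp, 0);
            return NULL;
        }
        return dbp;
    }

    static int queue_push(DB *dbp, const char *url)
    {
        DBT key, data;
        db_recno_t recno = 0;
        memset(&key, 0, sizeof key);
        memset(&data, 0, sizeof data);
        key.data = &recno;                 /* DB_APPEND returns the new recno here */
        key.ulen = sizeof recno;
        key.flags = DB_DBT_USERMEM;
        data.data = (void *)url;
        data.size = (u_int32_t)strlen(url) + 1;
        return dbp->put(dbp, NULL, &key, &data, DB_APPEND);
    }

    /* Copies the oldest URL into buf and removes it; returns nonzero when empty. */
    static int queue_pop(DB *dbp, char *buf, size_t buflen)
    {
        DBC *cur;
        DBT key, data;
        int ret;
        if (dbp->cursor(dbp, NULL, &cur, 0) != 0)
            return -1;
        memset(&key, 0, sizeof key);
        memset(&data, 0, sizeof data);
        ret = cur->c_get(cur, &key, &data, DB_FIRST);
        if (ret == 0) {
            snprintf(buf, buflen, "%s", (char *)data.data);
            cur->c_del(cur, 0);            /* consume the record */
        }
        cur->c_close(cur);
        return ret;
    }

    int main(void)
    {
        char url[2048];
        DB *dbp = open_queue("url-queue.db");
        if (dbp == NULL)
            return 1;
        queue_push(dbp, "http://www.example.com/");
        queue_push(dbp, "http://www.example.com/page2");
        while (queue_pop(dbp, url, sizeof url) == 0)
            printf("next: %s\n", url);     /* a crawler would fetch and enqueue links here */
        dbp->close(dbp, 0);
        return 0;
    }

DB_QUEUE with DB_CONSUME would be the more idiomatic FIFO, but it requires fixed-length records, which is awkward for variable-length URLs; hence DB_RECNO here.
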
22:06 arrith alard: you could toss it up on github, slightly more discoverable that way
22:08 alard I might do that, yes.
22:11 Nemo_bis nitro2k01, I think you miss a piece there; the work people could be sent to is for instance a new war
22:13 closure heh, so the other day I wrote something to back up a github repo, by using their API to get all the forks, and all the other info from their database. Tried it on one of the most forked repos and it needed > 500 mb of ram
22:13 closure I need to adapt it to use a disk cache too
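
On the disk-cache point, one way to keep memory flat is to stream each page of the GitHub API response straight to its own file instead of accumulating everything in RAM. The sketch below is not closure's tool: it assumes libcurl and the public v3 forks endpoint, the owner/repo and output file names are placeholders, and it does no authentication, so real use would run into the unauthenticated rate limit quickly.

    /* Sketch only: page through /repos/OWNER/REPO/forks and write each JSON
     * page to disk, keeping memory use constant. OWNER/REPO and the output
     * file names are placeholders. Typical build: cc forks.c -lcurl
     */
    #include <curl/curl.h>
    #include <stdio.h>

    #define OWNER "rails"
    #define REPO  "rails"

    int main(void)
    {
        curl_global_init(CURL_GLOBAL_DEFAULT);
        CURL *curl = curl_easy_init();
        if (curl == NULL)
            return 1;

        for (int page = 1; ; page++) {
            char url[256], path[64];
            snprintf(url, sizeof url,
                     "https://api.github.com/repos/%s/%s/forks?per_page=100&page=%d",
                     OWNER, REPO, page);
            snprintf(path, sizeof path, "forks-page-%03d.json", page);

            FILE *fp = fopen(path, "wb");
            if (fp == NULL)
                break;

            curl_easy_setopt(curl, CURLOPT_URL, url);
            curl_easy_setopt(curl, CURLOPT_USERAGENT, "forks-backup-sketch");
            curl_easy_setopt(curl, CURLOPT_FOLLOWLOCATION, 1L);
            curl_easy_setopt(curl, CURLOPT_WRITEDATA, fp);  /* default callback fwrites to fp */

            CURLcode rc = curl_easy_perform(curl);
            long status = 0;
            curl_easy_getinfo(curl, CURLINFO_RESPONSE_CODE, &status);
            long bytes = ftell(fp);
            fclose(fp);

            /* Stop on errors or on the empty array ("[]") past the last page. */
            if (rc != CURLE_OK || status != 200 || bytes <= 3) {
                remove(path);
                break;
            }
            fprintf(stderr, "saved %s (%ld bytes)\n", path, bytes);
        }

        curl_easy_cleanup(curl);
        curl_global_cleanup();
        return 0;
    }
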
22:25 Coderjoe alard: can you tell me where that free goes again? I want to add it to my existing (old) wget checkout and run valgrind on a recursive download
22:26 Coderjoe before free:
22:26 Coderjoe ==32377== HEAP SUMMARY:
22:26 Coderjoe ==32377== in use at exit: 150,315 bytes in 2,231 blocks
22:26 Coderjoe ==32377== total heap usage: 202,749 allocs, 200,518 frees, 8,065,348 bytes allocated
22:26 Coderjoe for 318 files
22:27 alard https://gist.github.com/fcbd1025f8a439f811c0#file_1_fd_read_body_memory_leak.patch
22:27 Coderjoe wow
22:27 Coderjoe several of them
22:42 Coderjoe yep. there are other leaks. go figure
22:42 Coderjoe ==954== HEAP SUMMARY:
22:42 Coderjoe ==954== in use at exit: 112,427 bytes in 1,935 blocks
22:42 Coderjoe ==954== total heap usage: 202,749 allocs, 200,814 frees, 8,065,348 bytes allocated
22:44 Coderjoe er, wait
22:44 Coderjoe wrong lines
22:47 Coderjoe looks like that fixed all leaks in recursive mode
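
For anyone following along, a run like Coderjoe's is typically something like valgrind --leak-check=full wget -r URL, and the leaks it reports usually have the shape below: a scratch buffer freed on the success path but skipped on an error path. This is a generic illustration only, not the actual wget change from the gist linked above.

    /* Generic illustration, not the wget patch above: the usual shape of a
     * leak valgrind counts as "in use at exit" -- a temporary buffer that
     * must be freed on every exit path, not just the happy one.
     */
    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>

    /* Reads up to outlen bytes from fp through an 8 KB scratch buffer. */
    static long copy_body(FILE *fp, char *out, size_t outlen)
    {
        char *buf = malloc(8192);
        if (buf == NULL)
            return -1;

        size_t n = fread(buf, 1, 8192, fp);
        if (ferror(fp)) {
            free(buf);       /* omit this free and every failed read leaks 8 KB */
            return -1;
        }

        size_t m = n < outlen ? n : outlen;
        memcpy(out, buf, m);
        free(buf);
        return (long)m;
    }

    int main(void)
    {
        char out[4096];
        long n = copy_body(stdin, out, sizeof out);
        fprintf(stderr, "read %ld bytes\n", n);
        return n < 0 ? 1 : 0;
    }
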
