[00:01] Nemo_bis: You need to tell me
[00:01] I can do it
[00:01] Easily.
[00:01] SketchCow, give me permission you mean?
[00:01] Insectoid. Sorry for the thing.
[00:01] Or move hundreds items?
[00:01] I was off seeing a movie, etc.
[00:01] I saw Cabin in the Woods OH MAN CABIN IN THE WOODS SO GOOD ANYONE WITH SPOILERS GETS KICKED
[00:02] * Nemo_bis saw Titanic
[00:02] SHIP SINKS
[00:02] nooooooo
[00:02] DUMBLEDORE MOLESTS POTTER
[00:02] CAMERON GETS RICH
[00:02] OK anyway so
[00:03] Simple offer, Insectoid - first, let me have a copy of the data. We can hold it
[00:03] Then we can find ways to undark obvious things
[00:03] Or, and I am fine with this, it's held for x amount of time.
[00:03] I'd prefer to curate and undark, of course.
[00:03] But please don't delete.
[00:04] ideally also point to archive.org items with 301 redirects
[00:04] but that's step 6
[00:07] Dragan of the geocities torrent curation wants me to come to europe to talk with him at a museum
[00:09] that's awesome
[00:09] EU loves itself its data curation
[00:26] http://i.imgur.com/9M0lk.jpg
[00:26] That's the library/museum
[00:26] SweeeeeeeeeEEEEeeeEEEEt
[00:27] Woah, which is that?
[00:27] Woah, that's badass
[00:28] Lot of free shelf space on those lower levels.
[00:29] Wyatt|Wor: Given the obscene waste of floor space by cutting out the center of the building, I think they're going more for aesthetics than function :)
[00:31] At least in the part shown.
[00:31] Looks like a huge gap between corners
[00:31] Either way, fuck is it sexy
[00:32] SketchCow: Wow. That is *gorgeous*
[00:32] Kind of reminds me of Seattle Public Library's amazing corkscrew stacks
[00:33] holy piss
[00:33] * DFJustin saves to "library porn" folder
[00:36] BTW, SketchCow: I've got most of digiplay.info now (need to clean up my data scraping) - what's the best way to upload it to archive.org? eg, anything special to mark it as an archiveteam project (if it counts?)
[00:37] mistym: corkscrew stacks are good, would be better if there were escalator out as well as in :P
[00:39] You museum folk are making me jealous; I spent two years in a bomb shelter :(
[00:39] DFJustin: Here, another one for you http://upload.wikimedia.org/wikipedia/commons/b/b3/OSU_Thompson_Library_-_west_atrium_and_book_stacks.jpg ;)
[00:39] chronomex: http://theamericanscholar.org/uploads/2011/09/Seattle_Public_Library_4th_floor_2A-e1317923814889.jpg
[00:39] I had this as my dual-monitor wallpaper at work for a while http://upload.wikimedia.org/wikipedia/commons/c/c5/British_Museum_Reading_Room_Panorama_Feb_2006.jpg
[00:40] Nice!
[00:40] mistym: that's the "inside a cyclopean colon" floor
[00:40] mistym: I live in Seattle, btw
[00:41] Natural sunlight :(
[00:41] shaqfu: We can't make our own suns yet. We're working on it!
[00:42] Wyatt|Wor: Those books in the OSU pic would prefer none at all :)
[00:43] wtf is that tagbanwa script at the bottom
[00:44] shaqfu: I was rather curious about that. I think I heard the glass was UV polarised or something, but I couldn't tell from the inside. :/
[00:44] chronomex: Oh, cool! I've only been there once, but you can tell I was p. impressed by the library.
[00:44] mistym: indeed. where are you homed?
[00:45] code4lib conference was there in February. I guess I kind of associate the awesome time with the city + library too :V
[00:45] chronomex: Winnipeg.
[00:45] heh k
[00:45] Wyatt|Wor: I'd imagine it was; there's no way the librarians would let them build stacks like that
[00:45] aka frozen tundra, etc
[00:46] shaqfu: For 108 megadollars, I should damn well hope so. ~_~
[00:50] mistym: ah, yes, "winnipeg is a frozen shithole" etc
[00:54] chronomex: Exactly
[00:55] At least you got hockey back!
[00:55] Yes, it's true!
[00:57] My favourite depiction of Winnipeg is Guy Maddin's movie My Winnipeg, though.
Truest pack of lies I've ever seen
[00:58] I've never been to Winnipeg
[01:53] mistym: Collect up, I'll be able to talk more later
[02:06] SketchCow: OK.
[07:44] http://archive.org/post/417779/cbs-evening-news-airs-news-piece-about-brewster-and-the-archive
[07:48] SketchCow: looks like dl.tv and crankygeeks videos are not link now
[07:48] i got all of crankgeeks but only the first 30 of dl.tv
[07:49] episode 30 not even 2 weeks ago
[07:49] its a dead link
[07:51] shit
[07:51] all media links to episode 92 are down
[07:51] this is DEFCON 1 guys
[07:52] we have to save this
[07:53] dl.tv -> Holy fucknuts that's a lot of scripts.
[07:53] i know
[07:54] looks like i was not fast enough
[07:55] episode 249 is down too
[07:55] this is really bad
[08:02] i'm getting some 260ish episodes of dl.tv
[08:07] looks like everything with old video names are the ones have problems
[08:22] hahaha
[08:22] an oldie: http://archive.org/details/stage6-3156
[08:24] archiveteam reacts to a shutdown notice
[08:24] javascript is weird.
[08:27] mmm
[08:27] http://archive.org/details/stage6-2165705
[08:34] go remix culture!
[08:34] i may get some full episodes of dl.tv from google videos
[08:48] i'm getting dl.tv episode 258
[08:48] more episodes have the new files name and hosted there
[08:58] Morning.
[09:02] Up early. When's your flight?
[09:05] Official time is 7am, the boarding is 6:20am.
[09:05] Delta are extraordinarily asshole about times.
[09:05] They want bags checked in 75 minutes before flight, they want they want.
[09:06] I am only with them because they were the most not-insane times for the lowest cost.
[09:06] But I do hate Delta - it says something I am doing this like this.
[09:07] No doubt; this is pretty exciting.
[09:09] It is a somewhat insane expense in a life where I don't have much room for it.
[09:10] But the attention and resultant connection to my various causes is why I am doing it.
[09:10] morning
[09:10] P.S. 800 radical zines.
http://archive.org/details/solidarityrevolutionarycenter
[09:10] I wish she had scanned some better, but that's life.
[09:11] SketchCow: Do you at least get an interview with Jordan out of it? (And whomever might also be in the area)
[09:11] awesome
[09:12] Oh yeah, here or there, I get Jordan.
[09:12] I brought a subset of interview equipment.
[09:12] Not sure if I can get a great interview setup with it, I've got only the barest of essentials.
[09:14] i have some good news
[09:14] all dl.tv shows maybe on mevio.com
[09:15] is there a version of that german kid flipping out that is subtitled in english or has no subtitles?
[09:23] To think... I almost went to Wendy's!!
[09:23] I look around the corner... gourmet organic to-go shop.
[09:24] Also, a Todd English restaurant but they don't seem open yet.
[09:32] test
[09:35] test
[09:35] .
[09:35] Hey, it's the most awesomest blind dude in Archive Team
[09:36] Yeah
[09:37] Insectoid: When you are available again, I'd like to know if you'll take my offer to store.
[09:37] So how do I send private messages anyway? Trying out Irssi
[09:37] You use the /msg command.
[09:38] Yeah, figured that. So is it /msg @sketchcow bla bla bla or whatever/
[09:38] For some reason / commands aren't working, no idea why
[09:39] I'd love to find a good Windows IRC client that works with my screen reader. I'll have to ask around
[09:39] jaybird11: without the @ :)
[09:40] No @sketchcow just sketchcow
[09:40] Ah yeah I figured that, but since I'm also familiar with Twitter I figured the @ might be part of his username
[09:41] my ocd is going
[09:41] got to get all dl.tv now
[09:41] jaybird11: In IRC, the @ means channel OP
[09:41] Ah got it
[09:41] jaybird11: You could try http://nightowlproject.freehostia.com/
[09:47] i'm going to start putting up crankygeeks
[09:48] since i maybe one of the few that has all episodes now
[09:55] ,/quit
[09:56] Okay, this is weird. I was using my Linode to connect just then.
It seems maybe my ISP has blocked access to efnet? I can get on other IRC servers but not efnet, from my local machine
[09:57] This airport is now bustling.
[09:57] 6am.
[09:57] Oh SketchCow, I'd like to wish you the best of luck with your source code retrieval efforts.
[09:59] Here's hoping!
[10:13] Okay I'm on using NightOwl
[10:14] Test please ignore
[10:17] test
[10:20] pilot episode of crankygeeks uploaded: http://archive.org/details/crankygeeks_pilot_episode
[10:39] episode 001: http://archive.org/details/crankygeeks_001_episode
[14:57] Who's working on the Flickr project?
[15:17] WHY http://www.atarimuseum.com/robots.txt
[15:18] Clearly they're trying to provoke us.
[15:19] DFJustin: the saddest face :(
[15:23] mistym: what now? :/
[15:24] balrog_: 04/16/12 11:17:52 < DFJustin> WHY http://www.atarimuseum.com/robots.txt
[15:24] balrog_: WHY http://www.atarimuseum.com/robots.txt
[15:24] daamnnnnn
[15:24] dnova: You win
[15:24] that is stupid
[15:24] not as stupid as blocking * though
[15:25] more perverse, though
[15:25] http://www.a1k.org/robots.txt
[15:25] it is more perverse, yes :[
[15:26] why would they do that, does ia_archiver really take much bandwidth?
[15:26] I doubt it's the bandwidth they are worried about
[15:26] balrog_: I remember seeing people block *, then wonder WHY DOES NO ONE FIND OUT SITE :(
[15:27] LOL yeah
[15:27] I didn't know of that site either...
[16:34] well, there is this
[16:34] https://www.google.com/search?q=robots.txt+User-agent+ia_archiver+disallow
[16:34] http://perishablepress.com/wordpress-robots-rules/
[16:34] this is also weird, too
[16:35] like so much else on the Web, it looks like people just cargo cult the Disallow: ia_archiver bit
[16:35] I know that a lot of those are WordPress sites, and so cargo culting is to be expected
[16:35] but I would not be surprised if it bleeds over
[16:36] tl;dr: fuck em
[16:37] yipdw: s/WordPress/PHP/
[16:38] I guess
[16:38] lol "perishable press", apt
[16:38] I just saw a lot of WordPress
[16:38] also
[16:39] http://perldesignpatterns.com/robots.txt <-- this guy has anger issues
[16:39] Whoa. That is a long comment.
[16:39] that said, that is perl, so I guess anger issues are to be expected
[16:40] how many other unfair stereotypes can I propagate today
[16:40] ahahah, Slurp is a "pisswater search engine that no one uses"
[16:42] Isn't it true though. NO ONE is using Yahoo
[16:42] Except for Archive Team
[16:43] if you get my drift
[16:43] zing
[16:43] # Slurp makes more passes through the site than all the other search engines
[16:43] # put together.
[16:44] And
[16:44] # You know what? People create these faster than I can shitlist them, so
[16:44] # you dinks have ruined it for everyone. Unless you're Google, go away.
[16:44] "shitlist"... was expecting that this domain would be owned by Kimmo Alm after seeing that
[16:44] heh
[16:44] http://www.webmasterworld.com/robots_txt/3245173.htm
[16:44] I like how that file just GROWS
[16:44] Is there a random sample of Geocities that's smaller than 700G and easily accessible? Like say 5,000-10,000 accounts?
[16:45] In English, preferably.
[17:13] Or about 10G; whichever
[17:16] Wyatt: i have maybe 20-50 accounts here somewhere
[17:16] Well, it's a start....
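[Editor's note: for context, the cargo-culted pattern being discussed above typically looks like the following. This is a hypothetical composite file, not any specific site's robots.txt; the `ia_archiver` stanza is what blocks the Internet Archive's crawler.]

```text
# Hypothetical robots.txt illustrating the cargo-culted pattern.
# This stanza keeps the Internet Archive's crawler out (and, under the
# policy of the era, also hid already-archived captures in the Wayback
# Machine):
User-agent: ia_archiver
Disallow: /

# By contrast, a sensible file blocks only internal or expensive URLs
# while leaving archiving crawlers alone:
User-agent: *
Disallow: /wp-admin/
Disallow: /reports/
```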
:<
[17:17] some robots.txt files make sense, to block internal URLs and stuff
[17:18] hell, robots.txt tells you right where to look
[17:18] balrog_: that's why we have HTTP 403
[17:19] it doesn't make sense to use a robots file to block that if it's truly internal
[17:20] I can see an argument for expensive operations triggered by GETs, but that is only a shitty band-aid at best
[17:20] yipdw: no, I mean stuff like edit pages for CMSes and all
[17:20] to cut down on unnecessary GETs
[17:21] oh
[17:21] ok
[17:21] yeah, I agree that's ok
[19:20] any of the coders able to take a look at this problem, like alard
[19:20] http://pastie.org/private/deb5yju7ld0botqw4hm6ma
[19:20] a lot of my threads are stopping there
[19:20] then i have to stop and start it again
[19:20] and from what i can see it is wasting a lot of time like this because a whole batch from the previous download set is lost
[19:22] lachlan cranswick's site has a sensible robots.txt file: it only blocks the reports section of the site, which is a bottomless pit of generated-on-demand site usage reports, with links out to files (sometimes incorrect links, causing spider hell)
[19:23] oli: after that 100 continue, it is uploading the actual data.
[19:24] you should check with alard about updates, though. I don't think you're supposed to be going through batcave anymore.
[19:25] im using the latest script alard send me a link to
[19:25] and ok, didnt realise it was actually uploading
[19:25] just sat there forever :|
[19:25] guess i should run nload before killing stuff :p
[19:30] i know curl won't output a progress meter unless either --output is specified or stdout is redirected
[19:30] ok
[19:45] Would it be feasible to make batcave a CNAME for fos?
[19:46] Why?
[19:46] edit your /etc/hosts file if you need to
[19:47] Suppose that's a solution. I'll get to it in four days then.
[19:55] Wyatt: different port number from the s3 endpoint machines, and i don't think fos is running the proxy currently
[19:55] Hi. batcave is still the way to go for the mobileme seesaw-s3 uploads.
[19:55] Oh. Well damn, never mind then.
[19:56] Oh, okay
[19:56] And the scripts get the upload target from the tracker, so it's very easy to switch over.
[19:57] Wow, I like this universal tracker more and more.
[19:57] ok. iirc, SketchCow was mentioning needing to stop going through batcave before too much longer
[19:57] Yeah, he's been trying to get off of it for weeks now.
[19:57] 's good shit
[19:58] So yeah, don't suppose any of you have a random ~10-15G sample of Geocities that could be acquired easily?
[19:59] i currently have no geocities
[20:00] Wyatt: Can't you just download a tar.gz and stop it halfway?
[20:01] It won't be random in a statistical sense, but it will be 10-15G.
[20:02] ^
[20:02] alard: Yeah, that's exactly the issue.
[20:02] so download em all and sample yourself? :P
[20:02] That's gonna take like a month.
[20:02] At best.
[20:03] Move your ass over to someone with nice bandwidth
[20:03] Like a.. university, or something
[20:03] No portable computing.
[20:03] I'd do it at work, but that machine is currently tied up with mobileme
[20:04] Bring a drive of some sort :P
[20:04] Wyatt: Heroku?
[20:04] I get it, you guys don't have any. Thanks anyway.
[20:04] how much geocities data is there?
[20:04] that was rescued
[20:04] alard: Heroku?
[20:04] around a terabyte
[20:04] Wyatt: I'll check with my buddy if he has any left on his drives
[20:05] Wyatt: An ec2 machine, free bandwidth.
[20:05] i really need to get those random-access gz/bz2 and tar tools I've been meaning to write written.
[20:05] alard: Ah. Hmm, might be doable.
[20:05] there is e.g.
this but its randomness is not guaranteed http://archive.org/details/archiveteam-geocities-latecomer
[20:05] i have a box on 100mbit only i can do something with if you need help with geocities stuff
[20:06] "only" 100mbit
[20:06] yes well when we are talking about the volume of shit moving around here its not fast :(
[20:06] Oh! I might have a lead, actually.
[20:08] oli: So your mobileme curls are hanging before the upload?
[20:10] Awesome, I've got a friend who was at Defcon and he got it stashed away.
[20:10] Problem solved.
[20:10] Neat!
[20:14] alard: they just sit at that point i pasted
[20:14] and i do not think they are uploading anything because i left some for ages and no progress, and not using much bandwidth
[20:17] Strange. Looking at the tracker, there are also lots of instances that stopped a bit earlier. http://memac-tamer.heroku.com/ The blue ones didn't even start uploading.
[20:18] I downloaded some shitpile of memac before there was an upload strategy
[20:19] alard: I've noticed that My uploads have been going through a full 100 failures before trying again with a new process and succeeding the first time.
[20:19] Sometimes.
[20:23] What would be a good way to solve this? 1. Setting the number of retries to one or two would help (the script retries too).
[20:24] blopody hell
[20:24] 2. curl should time out. Why doesn't it? I can't find a timeout option in the man page, except for the connect-timeout that only works at the start.
[20:24] so thats a lot of mine that are wasted really from what i understand :/
[20:24] Well, not really wasted. You can restart them if you're careful.
[20:24] i dont understand that tracker page
[20:24] Doh, I actually meant to drop it to twenty or so retries, but forgot.
[20:26] Okay, I've updated the script to have 2 retries.
[20:27] For the text upload I've also set the max-time to 60 seconds. I'm not sure what I should do with the tar upload: how long can that take?
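[Editor's note: the timeout question above is eventually answered later in the log with curl's `--speed-limit`/`--speed-time` pair, which aborts only when the transfer stalls, unlike `--max-time`, which also kills legitimately slow multi-hour uploads. A minimal sketch, with a hypothetical file name and endpoint URL:]

```shell
# Sketch: make curl give up on a *stalled* upload rather than hang.
# --speed-time N / --speed-limit B are standard curl options: if the
# average transfer rate stays below B bytes/sec for N seconds, curl
# exits with code 28. A slow-but-alive upload is never interrupted.
# "to-upload.tar" and the URL are hypothetical placeholders.
cmd="curl --speed-limit 1024 --speed-time 120 \
  --upload-file to-upload.tar http://batcave.example/upload/"
echo "$cmd"
```

A wrapper script can then check for exit code 28 and restart the upload, which is what the seesaw retry loop effectively needs.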
[20:27] Do any of the mobileme files have non-ASCII characters in the URLs?
[20:27] Probably.
[20:29] oli: If you have a failed client, you can restart it as follows: 1. make sure it's stopped. 2. remove the to-upload.tar, if it exists, but keep to-upload.txt; 3. run ./seesaw-s3-repeat.sh again in that directory.
[20:29] Oh, is THAT how you do that?
[20:29] lol already fucked it up a few times mate, i think ive lost what was done because i didnt do what you just said
[20:29] but ive noted it and will do that next time they die
[20:30] You could also try to just kill the curl, I think the script will start a new one.
[20:35] Anyway, I'm not sure what to do about the hangs-while-uploading problem. max-time is the only way to set a timeout in curl, but genuine uploads can take many hours, too.
[20:37] hey guys i've got approx 30gB of data from mobileme that i downloaded a while ago but didn't upload... before the all-in-one seesaw script was in place. do i need to arrange for a rsync slot?
[20:38] gui77: No. There's an upload script you can use. Update the code you have (git clone pull) to get the latest version. Then run ./upload-finished.sh
[20:38] Sorry, that should be git pull :)
[20:39] what's the command for a git pull? :/
[20:39] You've used git clone before to download the scripts?
[20:40] cd to the directory that holds the scripts (and which has a data/ subdirectory with the files). Then just run git pull
[20:40] It will get the latest code from GitHub.
[20:40] i did... but have forgotten. can't find where i found it in the wiki either. thanks mate
[20:41] Run ./upload-finished.sh gui77 to upload. You can kill the script at any time and restart it later.
[20:43] got it, thanks mate. it's doing the git pull now :D
[20:43] Well, thanks for uploading!
[20:47] alard: yaay it's working - and thank you! no need for more than one instance, right?
[20:47] gui77: No, that won't work.
[21:27] curl lets you set a minimum speed and a time for that minimum speed.
perhaps you can use that to detect a stop?
[21:29] alard: ^
[21:30] (see --speed-limit and --speed-time)
[21:31] GREETINGS FROM JORDAN MECHNER'S HOUSE
[21:31] HELLO
[21:31] hi sketchow
[21:31] hello sketchcow and jordan
[21:31] HI
[21:31] !!
[21:31] Heey, how was your flight?
[21:32] Pfft, fuck this Mac and its inane bullshit
[21:32] Argh
[21:32] Wow netsplit almost killed me there!
[21:39] just watched the cbs news segment featuring Brewster Kahle and the physical archives. one shot saddened me a bit: the worker scanning a book's barcode and then carelessly tossing it into a nearby gaylord
[21:43] Coderjoe: Making cases for that many books would be a staggering task
[21:48] I suppose they could shrinkwrap them, but then it's an issue of time
[21:49] or stack them in the gaylord a bit more neatly
[21:49] That too
[21:49] Oh, you meant *literally* toss
[21:49] yes
[21:50] Ouch
[21:50] http://www.cbsnews.com/video/watch/?id=7405466n
[21:52] Ooh, they're scanning bound newspapers also?
[21:52] Sexy
[21:53] good job, you made me look up "gaylord"
[21:53] http://en.wikipedia.org/wiki/Gaylord_Container_Corporation
[21:55] That doesn't look like a Gaylord they're chucking books into
[21:57] i've never seen brewster speak before, but he looks like a dude who actually gives a shit
[21:58] at 24 seconds, those look like gaylords to me
[21:59] They might be; I've never seen huge boxes like that used for book storage
[21:59] What _is_ a gaylord? I take it it's a vessel of some sort?
[21:59] perhaps they get reboxed before going into deep storage
[21:59] Coderjoe: In the shipping containers they're in smaller boxes on pallets
[21:59] Wyatt: large shipping pallet boxes.
[21:59] Wait, fuck; we're using two definitions
[21:59] shaqfu: yeah, I noticed
[22:00] aka a "bulk box"
[22:00] http://a248.e.akamai.net/origin-cdn.volusion.com/bwhe9.dh7dw/v/vspfiles/photos/B-1001-2T.jpg
[22:00] That's what I was thinking of
[22:00] http://en.wikipedia.org/wiki/Bulk_box
[22:01] I associate those as Gaylords since they come from the same company
[22:01] the bulk box is also known as a gaylord box because at one time, the largest producer of the box was the Gaylord Container Company from Gaylord, MI
[22:01] if you want to hear him speak some more, I recommend this talk (1.5 hours) http://longnow.org/seminars/02011/nov/30/universal-access-all-knowledge/
[22:02] Well, learned something new today: Gaylord makes things other than acid-free paper
[22:04] or the first to produce them
[22:17] test
[22:17] hi Jaybird11
[22:17] So I'm watching #popsource now, as if anyone here isn't probably.
[22:17] win
[22:17] (define popsource
[22:17] lol
[22:18] what network?
[22:18] Twitter hashtag.
[22:18] oh
[22:18] Too bad I'm blind, I'd probably love to be seeing those pics
[22:19] Jaybird11: Isn't it #sourcecode?
[22:20] It was supposed to be. I searched for #sourcecode and found out he changed it to #popsource
[22:20] Wyatt: changed by Jordan Mechner because #sourcecode was too popular for something else
[22:20] Aaah, I see.
[22:21] here's the change announcement tweet: https://twitter.com/jmechner/status/191972868479922178
[22:22] wow. some korean tweet just popped up
[22:48] I expect a FUCK YES to show up in here pretty soon
[22:48] or something along those lines
[22:49] Unless the disks are dead :( :(
[22:52] that's true
[22:52] in that case I guess we'll see a "Bad news, gang" or :( or DAMNIT
[22:52] Looks like they're still copying: https://twitter.com/#!/jmechner/status/192020918145515520
[22:53] is there a livestream?
[22:53] just the twitter feed, afaik
[22:54] a live stream would be awesome
[22:54] things are being worked on :)
[22:54] am I missing something important?
[22:54] Knowing me I probably sm
[22:54] am
[22:54] https://twitter.com/#!/search/%23popsource
[22:55] Ohhhh, the PoS sourcecode
[22:55] if you're watching @jmechner, @textfiles, and @y816 you're covered
[22:55] right
[22:55] PoP*
[22:55] lol
[22:55] go me
[22:56] doh. missed others because they didn't have the hashtag :(
[22:58] that is probably one of the coolest disk copy utilities I've seen in a while
[22:58] the only way it could be better is if you could play Metroid while it copies :P
[22:58] damn looks like they got the right man for the job https://twitpic.com/9afois
[23:00] oh he is
[23:00] he's got what's probably the largest collection of interesting Apple stuff anywhere
[23:00] tons of proto stuff
[23:01] (for Jaybird11 - picture is of a room festooned with apple posters, floppy drives, opened Apple II chassis, monitors, etc.)
[23:06] I was wondering if that was a stock system or something fancy
[23:06] Looks like the answer is "something very fancy" :)
[23:06] stock system :p
[23:07] with an ethernet card
[23:07] Ah, gotcha
[23:09] btw
[23:09] today is the 35th anniversary of the Apple II
[23:09] which makes it more significant :)
[23:09] Yeah, how much of a coincidence is THAT
[23:09] auspicious
[23:09] Hah
[23:10] I didn't even think of it.
[23:10] maybe SketchCow did XD
[23:12] all hail the mighty Cow
[23:17] キタ━━━━━━(゚∀゚)━━━━━━!!!!!
[23:19] DFJustin: So here's a stupid question. I recognize the kana letters, but what're the other things?
[23:19] YAY
[23:20] https://twitter.com/jmechner/status/192028978549235712/photo/1
[23:21] victory!
[23:21] \o/
[23:21] Victoly!!
[23:21] CTRLSUBS.S exists outside the flow of time ;)
[23:22] Was that the only disk they needed to recover?
[23:23] there were several disks, based on the photos SketchCow posted
[23:23] maybe they were the SFX/GFX dissk
[23:23] dissk
[23:23] DISKS
[23:23] and the copy protection disks
[23:24] those photos being http://www.flickr.com/photos/textfiles/sets/72157629727983887/with/6925540534/
[23:25] "POP source code disk: Read and copied with no errors!"
[23:25] That's what threw me off
[23:26] bayleef` it's a japanese forum thing, basically it's a happy guy's face with the ー from キターー ("it came!"/"yesss!") going through it
[23:27] this is why he is called flippy disk sometimes
[23:28] It was supposed to be. I searched for #sourcecode and found out he changed it to #popsource
[23:28] Um never mind that
[23:28] I will need some help.
[23:28] We're going to put karateka and prince of persia source into github
[23:28] Tonight
[23:29] Yes indeedy
[23:29] holy shit!
[23:29] oh my
[23:29] wow
[23:29] Impressive
[23:30] SketchCow, did you choose this date deliberately?
[23:30] awesome!
[23:30] DFJustin: ah. わかりました ("understood"). The reader I'm using doesn't recognize kanji, so who knows if the conversion would come out right
[23:32] This can't go anywhere
[23:32] That we're doing this.
[23:32] so where's the help needed SketchCow?
[23:32] We're going to do it and then maybe someone might freak at ubisoft
[23:32] SO
[23:33] Wait, I thought JM owned the rights to the original POP?
[23:33] Yes
[23:33] Oh, yes.
[23:34] But he wants as he puts it the genie out of the bottle and then we're done
[23:34] So what does the guy need help with
[23:35] Github, he doesn't use it
[23:35] How does he make a repo without installing git - can that be done?
[23:36] He can't just send the code to you and have you handle it?
Or would he rather do it himself
[23:38] I imagine git without using the actual git software is like using Linux without a terminal in general; you just don't do that
[23:39] yes, he needs git
[23:39] if he's not a *nix guy then install tortoisegit
[23:40] install msysgit and tortoisegit
[23:41] really, he could just put up a ZIP or whatever
[23:42] it'll make its way into a git repo at some point
[23:42] yipdw has a point; not like he's going to try and mod his own decades-old code straight away
[23:42] well, git is good for distribution and integrity checking
[23:42] but for the purpose of getting it out there a zip + sha256sum is also fine
[23:43] unless he's got revision history on those disks
[23:43] (I don't know)
[23:43] and an otherwise-unused github account would keep ubisoft from being able to claim some other person was illegally distributing the code
[23:43] "stolen" code
[23:44] I dunno why Ubisoft would care unless people started to distribute paid versions off this code
[23:44] Given the unique circumstances, that'd be a hard claim...
[23:44] because Ubisoft is dicks
[23:46] "here it is now on github" is cooler than "here's a zip file"
[23:46] something else about github (and gitorious/other code hosting services for that matter): they'll respond to DMCA requests very quickly, and at least Sony has learned to watch github
[23:46] so if Ubisoft is actually a concern, and if the point is to take the genie out of the bottle
[23:46] I can't really recommend github
[23:47] not sure if this made it to your IRC channel yet. Art Knowledge News needs rescuing. They're reporting that the new owners of their site will clean their servers of 7+ years of content
[23:47] cmccormic: link?
[23:47] http://www.artknowledgenews.com/16_04_2012_01_59_52_sadly_this_is_the_very_last_art_knowledge_news_forever.html
[23:48] yup
[23:48] I guess it's time to wget -r www.artknowledgenews.com
[23:51] oh wait, they have a sitemap
[23:51] excellent
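[Editor's note: the "zip + sha256sum" release route suggested above can be sketched as follows. All file and directory names are hypothetical, and tar/gzip stand in for zip since they are universally available; the point is the checksum published alongside the archive, which lets anyone verify integrity without touching git.]

```shell
# Sketch of the "archive + checksum" distribution route discussed
# above: bundle the source tree, then publish a SHA-256 digest next to
# it. Names and contents are hypothetical placeholders.
mkdir -p pop-src
echo '; boot loader (placeholder)' > pop-src/BOOT.S

# Bundle the tree and record its digest.
tar -czf pop-source.tar.gz pop-src
sha256sum pop-source.tar.gz > pop-source.tar.gz.sha256

# A downloader re-runs the check against the published digest:
sha256sum -c pop-source.tar.gz.sha256
```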