[00:40] Going to be heading out to dinner
[00:41] (If you're in SF, come to the dinner!)
[04:29] http://www.youtube.com/watch?v=wQKKj_qeOBQ
[04:38] test
[08:37] guys, I have a question: may I run two "upload-finished" at the same time, for the same project, each getting its data to upload from two different directories?
[08:37] I mean, is the tracker ok with it?
[08:38] sorry, I forgot to mention the project I'm talking about is MobileMe
[08:45] fact is I now have a remote machine that's executing "seesaw.sh", but I also have 400GB of MobileMe data I previously collected with "dld-client.sh" which I was unable to upload (computer died)
[08:46] so now I'd like to upload that 400GB of data on my remote machine (in another directory), and run "upload-finished"
[08:46] is that ok for the tracker?
[08:49] altlabel: Yes, you may run more than one upload-finished at a time, *if* you run them in different directories.
[08:50] (You shouldn't run more than one instance on the same set of files, but different files are OK.)
[09:11] alard: thanks
[09:50] ia s3 is super slow :(
[11:16] http://www.underground-gamer.com/imagebucket/boot_001.png "We have M.U.L.E! If no one beats me to it, I'll package up the .IMG, teledisk image, box scans and a couple screenshots and make a torrent of it here. That may take a couple days."
[11:16] nice
[11:54] Good morning team. I'm having some server issues myself this morning, but could someone snag a mirror of GOOD Magazine's website at http://www.good.is
[11:54] They laid off almost their entire LA editorial staff, and I quote: "GOOD appears to be exploring a community-based publishing system with a public beta site described as “a platform for 21st century citizenship” that includes aggregation (GOOD Finder) and a tool for mobilizing locally (GOOD Maker)"
[11:58] Also wandered across this and it got me thinking - http://www.good.is/post/michaelangelo-matos-on-magazine-archives/ - more mags going digital in some fashion, with digital-exclusive content. Anyone archiving these digital conversions, and digital exclusives? Seems like something the traditional archiving space should (might) catch
[11:58] Off to the town-wide yard sales for some urban preservation, free-market style... catch you later
[14:31] archiving videos of the black bloc
[14:31] got close to 12 min of video of them smashing stuff
[16:31] ugh, http://thearcadeboneyard.com/librarysite/index.php is charging $26 per year now :(
[16:31] for pdfs of useful manuals
[16:32] so they're basically selling other people's shit
[16:32] charming
[16:35] It's OK.
[16:35] I duped that shit some time ago.
[16:35] :)
[16:36] figures, and good morning!
[16:37] you did? :)
[16:37] great. also seems with some google-fu you can get any file in there anyway.
[16:37] since they left the php-generated file indices up
[16:40] http://picplz.com/ is our new target.
[16:40] By the way.
[16:40] I need to head out but we should focus on them.
[16:42] wow, 1 month notice :(
[16:44] http://picplz.com/user/chula1925/pic/17tcp/
[16:44] what's interesting is that you can swap "17tcp" for any other picture's ID, even if it's from another user
[16:45] and i make that 60466176 possible IDs
[16:46] https://picplz.com/pic/3572/ and without the user link, things behave strangely.
[16:48] looks like incremental IDs
[16:48] partyhard.gif
[16:50] I get ~ 11181084 ids.
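A minimal sketch of the two-directory arrangement alard approves in the MobileMe upload-finished exchange above (08:37-08:50): one instance per directory, never two instances on the same files. The directory names are made up, and any arguments upload-finished.sh actually takes are omitted; check the project's README for the real invocation.

    # Two upload-finished instances, each confined to its own directory
    # so they never operate on the same set of files.
    # Directory names are hypothetical.
    (cd ~/mobileme-seesaw   && ./upload-finished.sh) &
    (cd ~/mobileme-dld-data && ./upload-finished.sh) &
    wait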
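On the ID arithmetic in this stretch of the log: picplz short IDs such as "17tcp" appear to draw on a 36-character alphabet (0-9, a-z), so both figures quoted here and just below fall out of the same calculation; 36^5 counts exactly-five-character IDs, and the sum over lengths 1 through 5 counts everything up to five characters. A bash sketch, assuming that alphabet:

    # ID-space arithmetic for a 36-character (0-9, a-z) alphabet.
    echo $(( 36**5 ))              # exactly 5 chars: 60466176
    total=0
    for i in 1 2 3 4 5; do
        total=$(( total + 36**i )) # add IDs of length i
    done
    echo "$total"                  # 1 to 5 chars: 62193780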
[16:51] There's an API that works with numerical ids: https://sites.google.com/site/picplzapi/api-methods
[16:51] sum 36^i for i from 1 to 5 = 62193780
[16:51] ah, yes
[16:51] Take a recent picture, here: https://picplz.com/city/san-francisco-ca/
[16:52] 'long url id': 17g07
[16:52] http://jsonviewer.stack.hu/#http://api.picplz.com/api/v2/pic.json?longurl_ids=17g07
[16:52] numerical id: 11181084
[16:52] http://jsonviewer.stack.hu/#http://api.picplz.com/api/v2/pic.json?ids=11181084
[16:53] the latest ID via the URL shortener is z9PXm
[16:53] if they're incremental, what's that converted from base 62?
[16:53] They could be random?
[16:54] This is the highest id I can find: http://api.picplz.com/api/v2/pic.json?ids=11183550
[16:54] That's numerical id 11182860
[16:55] http://api.picplz.com/api/v2/pic.json?ids=11183559 is the highest consecutive one after that that still works
[16:55] so somewhere north of 11 million
[16:55] I nominate #piczzz
[16:55] There are gaps in the numeric IDs, though
[16:55] Could be deleted pictures I guess
[16:56] warning: this project will result in you having an unbelievably huge collection of dong self-shots
[16:56] warning, or reward
[16:56] Users end somewhere around here: http://api.picplz.com/api/v2/user.json?id=1515537
[16:56] E.g. 11183541-11183542 don't work, but 11183540 and 11183543 work.
[16:59] Going to drive a few hundred miles and do a couple interviews.
[17:00] Does it make sense to start using the picplz API, or does that provide data we don't need?
[17:00] I say let's spend a day or two considering all the options
[17:00] Before turning on warriors next week.
[17:07] 34 likes: http://blog.picplz.com/
[17:52] i knew it :) http://forums.gamespy.com/gamespy_site_feedback/b67609/21181554/p1
[17:53] i'll do another mirror of them
[18:00] wow, that's awful
[18:09] it's IGN, it was expected
[18:10] http://archive.org/details/Forumplanet.gamespy.comArchive ;)
[18:10] i was off by 3 months
[18:17] ah, i forgot that those assholes changed urls already
[18:18] oh, np
[20:43] 30-day warning for mobileme
[21:23] Ops, anyone?
[22:13] http://gigaom.com/2012/06/02/photo-sharing-app-picplz-calling-it-quits-on-july-3/?utm_source=social&utm_medium=twitter&utm_campaign=gigaom
[22:14] On July 3, 2012, picplz will shut down permanently, and all photos and user data will be deleted. Until then, users may download their own photos by clicking on the download link next to each photo in their photo feed.
[22:15] I'm not familiar with the service, but not sure there's anything that looks scrapable
[22:17] Ok, here's an example user
[22:17] http://picplz.com/user/andrewsayer/
[22:17] DrainLbry: There is, http://archiveteam.org/index.php?title=Picplz , https://github.com/ArchiveTeam/picplz-grab
[22:17] sweetness
[22:18] Also, #piczzz
[22:18] on the fucking ball man :)
[22:18] SketchCow mentioned it earlier.
[22:18] Feel free to add your thoughts, and to do a test run of the script.
[22:30] really?! one DAY!? wow they really care about the entirety of their business model
[22:30] sarc/
[22:30] july, not june
[22:31] oh, well ok then
[22:31] so how do i run these, the lua or the dld with a list
[22:32] bsmith093: ./get-warc-wget-lua.sh ; ./dld-picplz-user.sh {USERID}
[22:32] k thanks
[22:32] Where USERID should be a number.
[22:32] where's the list of numbers
[22:32] But please note that this is just for testing.
[22:32] There are a few at the top of dld-picplz-user.sh.
[22:33] But you can also try a random one.
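A rough bash sketch of the ID probing done earlier in the afternoon (around 16:54-16:56), using the pic.json endpoint quoted there: walk a range of numeric ids and see which still resolve, which surfaces the gaps noted above. Treating a response body that contains an "id" field as "this pic exists" is an assumption about the JSON format, not documented behavior; inspect a real response before relying on it.

    # Probe a range of numeric picplz ids via the API and report which
    # ones still resolve. The '"id"' substring test is an assumed
    # success marker.
    for id in $(seq 11183540 11183560); do
        body=$(curl -s "http://api.picplz.com/api/v2/pic.json?ids=$id")
        case "$body" in
            *'"id"'*) echo "$id exists" ;;
            *)        echo "$id missing or deleted" ;;
        esac
    done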
[22:36] seems to run beautifully, but why not just run through all the links in seq, whether or not they exist?
[22:37] Which links?
[22:37] oh, i meant the userids
[22:37] still being tested...
[22:37] Ah, we probably will do something like that. Run the script for every user id.
[22:38] But first the script should work, and we should check if it saves everything we want to save, and in the right form.
[22:38] on a related note, is efnet ok? I've been dumped like 3 times today; usually it's more like once every 4 days or so
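If the "run it for every user id" plan alard mentions pans out, the naive version is a sequential loop like the sketch below. The upper bound 1515537 comes from the user.json probe earlier in the log; that user ids start at 1, and the error handling, are assumptions, and a real run would add concurrency and tracker coordination rather than a single-threaded pass.

    # Naive sequential pass over all picplz user ids, per the plan above.
    # Build wget-lua once, then grab each user; log failures rather than
    # stopping on them.
    ./get-warc-wget-lua.sh
    for userid in $(seq 1 1515537); do
        ./dld-picplz-user.sh "$userid" || echo "$userid failed" >> failed.txt
    done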