[01:04] Anybody around? I just heard about this: http://www.oldpunks.com/Analog%20CyberPunk%20Page%201.html and i'm wondering if i can do anything.
[01:05] you can certainly do something- download them all and then re-upload to the Internet Archive (do check first whether they're already on there, to save yourself a little work)
[01:06] it looks like they're not on there
[01:06] i may not have the bandwidth, but a friend is doing that. maybe i could email the host there to see if they'll do it. what on archive.org would i point them to?
[01:07] if you need to download a whole bunch, this tool was mentioned in the past: https://code.google.com/p/plowshare/ - don't know how well it works, but it's worth a shot
[01:08] if you need a person, ask SketchCow- he's the person to ask
[01:36] Bike: are you familiar with the person hosting them at all?
[01:38] No. I sent them an email anyway.
[01:43] mostly because July 6th is nearly over, and tomorrow is July 7th- I'd really like to know how much time there is before the person lets the files expire
[01:44] yeah, i have no idea, unfortunately.
[01:44] if you care about these files, then why are you sitting here hemming and hawing over them?
[01:45] because my internet connection is bad. i'll see if i can grab a few i guess.
[01:59] trying to grab a few right now as well
[02:00] hmm, downloading these through wget is hard
[02:01] since the rapidshare.com urls are small javascript redirects to the actual host
[02:02] tabs tabs tabs!
[02:04] there's a much better option in this case: there's a link to divshare at the top, and the downloads there are ridiculously fast
[02:05] at the bottom of the page here: http://www.divshare.com/download/5127561-942
[03:32] so, I think I've got all 12 of the original files from http://www.oldpunks.com/Analog%20CyberPunk%20Page%201.html (I used the links at the bottom here: http://www.divshare.com/download/5127561-942 as they were much faster than rapidshare)
[03:32] I haven't tried to grab anything else on the page
[03:32] I'll upload mine tomorrow sometime
[03:33] my connection cut out. so uh, thanks.
[04:48] Setting up the books to be scanned tonight and this week.
[11:59] http://macintoshgarden.org/forum/retro-mac-books-magazines-etc-digitization-bob-kiwi - magazines!
[14:13] I don't know how big http://www.thegridto.com/ is, but could it be archived through the bot?
[14:13] (the magazine shut down a couple of days ago)
[14:16] Smith: archivebot started grabbing it 5 days ago. about 90000 pages fetched so far
[14:53] oh nice
[14:53] where can you see what the bot is archiving?
[14:55] Smith: check here: http://archivebot.at.ninjawedding.org:4567/
[14:55] thanks
[14:57] I'm fairly new to the AT… what do you use to code custom projects (Warrior etc.), Python?
[15:00] Smith: for specific projects, it's usually wget+lua scripting. details are documented at http://archiveteam.org/index.php?title=Dev/Source_Code
[15:03] the warrior uses python 2.6 for managing the wget processes
[15:04] i've been wanting to learn how to code for a while (complete n00b here), so if I learn Python, can I understand Lua's syntax too?
[15:05] they're different, but learning any programming language will help you understand programs in another.
[15:08] ok
[17:15] all my tabs are in the wrong tab groups
[17:28] Does anyone want to write a warrior task?
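A minimal sketch of the batch grab discussed above, assuming the direct divshare links have first been collected into a plain text file (links.txt is a hypothetical name, and the retry/wait values are polite guesses, not what was actually used):

  # resume partial files (-c), read URLs from a list (-i), keep the
  # server-supplied filenames, and wait between requests to go easy on the host
  wget -c -i links.txt --content-disposition --tries=3 --wait=2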
[17:29] I wasn't able to get to the Pixorial task yesterday, and we only have two weeks left
[17:29] I have managed to produce a list of things that need to be backed up
[17:29] 308k urls
[17:35] db48x: if you've got a list of URLs, we can work with that
[17:36] * Smiley needs to learn how
[17:36] http://db48x.net/pixorial.urls.2014-06-26.bz2
[17:36] channel is #pixofail
[18:20] we should find some people to janitor the wiki :)
[18:20] what needs fixing?
[18:20] just look at the frontpage
[18:20] i mean keeping it up to date
[18:21] oh god yeah
[18:21] i did an update but that was months ago
[18:24] could someone add "June, 2014: [[Earbits]] shut down and came back again. We archived it." to Archive Team News
[18:25] and "[[Earbits]] - Saved 130k MP3s from threatened deletion. [https://archive.org/search.php?query=earbits Archives]" to Recently Ended Projects?
[18:53] http://ourincrediblejourney.tumblr.com/
[20:37] https://www.youtube.com/user/ROCKETBOOM/videos <- Allegedly, Rocketboom will delete their video archive soon:
[20:37] "**Update May 20, 2014:
[20:37] This video will be removed from YouTube in the near future. Would you like to host a copy on your channel? Soon, Rocketboom will be starting anew and all archives from 2004 through 2013 will be deleted. If this is an episode you would like to see remain free and openly available to the public, we'll consider sending you a free copy to host on your YouTube channel with a simple agreement. Please contact us at licensing@rocketboom.com for more information."
[20:38] !
[20:38] OK, one moment
[20:40] Ok, hero medal to anyone who downloads all the videos.
[20:41] * DFJustin pokes ivan`
[20:42] youtube-dl -citw ytuser:ROCKETBOOM
[20:42] grabbing now
[20:42] Would this also grab metadata?
[20:42] 1354 vids
[20:44] Grab them and we'll figure out metadata, or it will grab metadata.
[20:45] i'm grabbing them
[20:45] And on that note, YT stopped showing the number of videos users have the other day, for some reason.
[20:45] You can't avoid taxes, death, and Google making weird UI decisions.
[20:45] with --all-subs --write-json-info --write-description
[20:46] i'll add that godane
[20:50] seems that youtube-dl is missing that flag if installed from pip, so i'll grab the videos first
[20:53] SketchCow: i will ftp you the ROCKETBOOM videos when done
[20:54] ok, then i'll cancel my download
[20:54] it's --write-info-json
[21:02] midas: keep the download going
[21:02] i keep getting errors
[21:03] also, is there a way for youtube-dl to make a list of youtube urls?
[21:05] what's the point of keeping the name if you're going to start over?
[21:07] godane, --get-id will get the youtube id, -g will get the url of the video
[21:22] no worries, just 900 videos to go.
[21:32] i'm still grabbing too
[21:32] looks like some videos only give me the json and desc files
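A sketch of working with the Pixorial list posted above, assuming a stock wget 1.14+ with WARC support; the flag choices here are illustrative, not what the project actually ran (warrior tasks drive wget from a seesaw pipeline instead):

  # decompress the list and spot-check it before handing it to a downloader
  bzcat pixorial.urls.2014-06-26.bz2 > pixorial.urls
  wc -l pixorial.urls                     # should be roughly 308k lines
  # fetch everything into a WARC so it can go straight to the Internet Archive
  wget -i pixorial.urls --tries=3 --wait=1 --warc-file=pixorial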
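Pieced together from the messages above, the Rocketboom grab would look roughly like this; the metadata flags youtube-dl actually accepts are --write-info-json and --write-description, and rocketboom.urls is a hypothetical filename:

  # -c resumes partial downloads, -i ignores per-video errors, -t puts the
  # title in the filename, -w never overwrites files that already exist
  youtube-dl -citw --all-subs --write-info-json --write-description ytuser:ROCKETBOOM
  # one way to get a plain list of watch URLs (as asked at 21:03):
  # --get-id prints only the video ids, which can be prefixed into full URLs
  youtube-dl -i --get-id ytuser:ROCKETBOOM | sed 's|^|https://www.youtube.com/watch?v=|' > rocketboom.urls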