[00:06] how do you scan them?
[00:06] throw them on your flatbed scanner and press scan?
[01:07] that's what I've been doing for cds
[01:07] not the fastest process though
[02:51] can't you do several at once?
[03:03] dunno if this has come up here, but oh, the irony... "WebCite will stop accepting new submissions end of 2013, unless we reach our fundraising goals to modernize and expand this service."
[03:03] Dear Stickam member,
[03:03] We are very sad to announce that Stickam has closed down, effective February 1. The site will remain active until February 28, so that you can log in and download any of your live recordings or other media you wish to save.
[03:04] (email went out today)
[03:05] I wonder how much of that is porn
[03:37] filer, I noticed that on WebCite earlier today - rather disappointing.
[03:38] "WebCite®, which used to be a member of the International Internet Preservation Consortium", now apparently run by the "Centre for Global eHealth Innovation"
[07:52] http://i0.kym-cdn.com/photos/images/newsfeed/000/493/968/57a.jpg
[07:52] ROFL
[07:54] ah crap, posted to the wrong channel, my bad
[08:07] I wonder how much it would cost to run a WebCite competitor
[08:08] ha, we could have a combined shortener + page archiver thing
[08:09] a shortener that archives whatever url you shorten?
[08:10] doesn't seem like that or WebCite would cost very much
[08:10] put the archived copy of the page in if the link is down, that way links are never broken?
[08:12] Aranje: I can't immediately see how to implement that
[08:13] oh, I take that back
[08:13] I'd have to look up the details, but there's a way to do multiple HTTP responses to a request
[08:13] where the later replaces the earlier
[08:14] your first response is a small message saying that you're checking the page
[08:14] now the link server tries to connect to the web page
[08:14] if it can, it sends a response with a redirect
[08:14] otherwise it sends a response with the archived copy
[08:16] hm, possible
[08:16] but yeah
[12:09] db48x: Ya mean, HTTP 100 Continue?
[12:10] I never went and looked it up
[12:15] aight
[13:17] The White House "We The People" thing should be archived, I guess - I wouldn't be surprised if things start to disappear from there
[17:00] Yeah, at the very least a subsequent administration may not keep it running
[18:23] any current projects? I've had the urlteam scraper running for months now
[18:23] xanga-grab is going strong
[18:29] if you set your warrior to 'ArchiveTeam's Choice' it'll jump into new projects as they become available
[18:34] looks like I have a completely different version of the bits and bits encoded version
[18:35] looks like these new ones are captured/encoded at a higher bitrate and are all at 640x480
[18:36] the older ones have different resolutions but look sort of cleaner than these
[18:37] I may use the same item names but add -v2 to them
[18:42] sensible
[18:43] the older ones were like 80 MB as MP4s
[18:43] these are AVIs at about 300+ MB
[20:42] bsmith095: the urlteam scraper is a wonderful idea... glad you are doing this...
[22:05] https://github.com/jplehmann/coursera or https://github.com/siddharthasahu/coursera-downloader?
[22:06] * ivan` goes with coursera, which looks like more serious business
[22:11] I use the former. Works fine.
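A minimal sketch of the dead-link fallback discussed around [08:10]-[08:14], not anything the channel actually built: the resolver probes the original URL and answers once, either with a redirect or with the archived copy. The LINKS table, short codes, and port are hypothetical. (The "multiple responses where the later replaces the earlier" mechanism mentioned at [08:13] is likely multipart/x-mixed-replace rather than HTTP 100 Continue, which is an interim response sent before a request body; the single-answer version below sidesteps it.)

```python
# Hypothetical shortener + archiver resolver, stdlib only.
import urllib.request
import urllib.error
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical storage: short code -> (original URL, archived HTML snapshot).
LINKS = {
    "abc123": ("http://example.com/some/page",
               b"<html><body>archived copy of the page</body></html>"),
}

def is_alive(url, timeout=5):
    """Probe the original URL; any non-error response counts as alive."""
    try:
        req = urllib.request.Request(url, method="HEAD")
        with urllib.request.urlopen(req, timeout=timeout):
            return True
    except (urllib.error.URLError, OSError):
        return False

class Resolver(BaseHTTPRequestHandler):
    def do_GET(self):
        entry = LINKS.get(self.path.lstrip("/"))
        if entry is None:
            self.send_error(404)
            return
        url, snapshot = entry
        if is_alive(url):
            # Original page still up: behave like a normal shortener.
            self.send_response(302)
            self.send_header("Location", url)
            self.end_headers()
        else:
            # Link is dead: serve the archived copy instead, so the
            # short link never breaks.
            self.send_response(200)
            self.send_header("Content-Type", "text/html")
            self.end_headers()
            self.wfile.write(snapshot)

if __name__ == "__main__":
    HTTPServer(("localhost", 8000), Resolver).serve_forever()
```

Probing with HEAD before every answer adds latency to each hit; caching liveness results for a while would be the obvious next step for a real service.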