[00:26] do you guys backup slashdot?
[00:31] Not well enough
[01:05] SketchCow: isn't that specifically against most cc licenses?
[01:09] CC-NC only
[02:17] hi all, i'm trying to do a little tiny personal archive project, and have a question.
[02:18] i had an old Lycos HTMLGear guestbook, and want to archive it. Of course Lycos doesn't give any d/l option.
[02:20] it displays five entries per page, and you navigate between pages by clicking a form submit button.
[02:20] my question is: what kind of tools could anyone recommend for grabbing data from a site set up this way?
[02:35] :-\
[02:36] Hmm, have you tried wget on it? I'm still pretty low level at this archival stuff, but it's kind of the go-to tool.
[02:37] yeah, i'm a total newbie in this regard. i figured if anyone could tell me what to do, it'd be archiveteam. :P
[02:47] ok, so i'm looking through wget's docs right now... but it seems like it only follows links. problem for me is this guestbook doesn't use links to go from page to page. it uses a form submit button
[02:47] Uhm, hang on. I know I dealt with something like this a couple weeks ago...
[02:48] maybe the URLs are easy?
[02:48] e.g. just some number that increments
[02:48] i should go to bed
[02:48] ugh
[02:48] 4am
[02:48] good night =D
[02:48] and happy new year
[02:48] night
[02:49] There is that.
[02:50] And if not, look into using --post-file or --post-data
[02:50] since it uses form submits, the url never actually changes
[02:53] (I was hacking a module into wgetpaste for our internal pastebin and I learned of those from that)
[05:16] woop woop woop HAPPY NEW YEAR!!!11!! but seriously, yay one year left till the end
[05:18] one can only hope
[05:18] i think i may have found something for a twitter clone
[05:18] called bup
[05:19] https://github.com/apenwarr/bup
[05:21] godane: Don't like status.net?
[05:22] never seen that one
[05:22] It's what identi.ca uses, IIRC.
[05:23] OH!
[05:23] Twitter backup, not twitter-like
[05:23] yes
[05:24] I misinterpreted "Twitter clone"
[05:24] i was thinking of a twitter that's like git/bup
[05:25] where there is no central server
[05:26] bup looks pretty neat, I'll give you that!
[05:27] https://www.youtube.com/watch?v=u_rOi2OVvwU&feature=channel_video_title
[07:08] go out and party, guys
[07:10] Can't. Have qmail to battle.
[07:11] Happppy neeewww yeearhrfkjdfdhf
[07:12] mmm
[07:12] a tenth of capn 100 proof
[07:13] on an empty stomach over 4 hours. (with 1L of coke)
[07:13] in other news, I can still spell?
[07:15] Seems to be going well for you in that regard.
[07:18] well, I had to call in reinforcements to get home
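(A minimal sketch of the --post-data approach suggested at [02:50]: since the guestbook pages are reached by a form submit rather than a link, each page can be fetched by POSTing the form fields yourself and saving the result. The endpoint URL and the "page" field name below are assumptions; the real names have to be read out of the guestbook's <form> markup.)

    #!/usr/bin/env python3
    # Sketch: paging through a form-driven guestbook by driving wget's
    # --post-data option, as suggested at [02:50]. URL and field name are
    # hypothetical -- inspect the guestbook's HTML form for the real ones.
    import subprocess

    GUESTBOOK_URL = "http://htmlgear.lycos.com/guest/view"  # hypothetical endpoint

    for page in range(1, 21):  # stop once a page comes back empty
        subprocess.run([
            "wget",
            "--post-data", f"page={page}",              # the form submit becomes a POST body
            "-O", f"guestbook-page-{page:03d}.html",    # one saved HTML file per page
            GUESTBOOK_URL,
        ], check=True)

(If the form uses GET instead of POST, the page number usually shows up in the URL's query string and plain wget with an incrementing URL is enough, as suggested at [02:48].)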
[07:24] btw, zip (without zip64 extensions) is limited to 4 GiB, iirc
[07:27] But is that compressed or uncompressed size?
[07:28] compressed
[07:29] well, as long as you try following the central directory at the end of the file, rather than scanning for zip file-record headers
[07:30] the file offset field is, iirc, a 32-bit unsigned integer
[07:30] i suppose if you haxed up a multi-volume archive you could get around the problem, as long as no part was over 4 GB
[07:31] there is a separate field of the central directory for which "disk" the file starts on
[09:30] wonder if that zipview.php can be hooked up to the unarchiver http://wakaba.c3.cx/s/apps/unarchiver
[09:30] would allow browsing the shareware isos and such too
[11:40] SketchCow: http://www.oclc.org/worldcat/newgrow.htm
[15:15] 1L of coke, whoa
[18:26] for mobileme, do you guys download, then upload, then download, then upload, and just repeat the process? or do you do both at once?
[18:38] gui77, when I asked (for Splinder) I was told that it's better to keep everything on your disk until data is not published on archive.org
[18:38] otherwise, you should be able to do both at the same time
[18:38] if it makes sense because you have enough bandwidth
[19:33] Nemo_bis: do you mean until it IS published?
[19:33] I suppose so, "until" in English always confuses me
[19:33] aha :) what's your native language?
[19:34] Italian
[19:34] i have enough bandwidth (home connection, not very fast but it's not capped) - my main problem is disk space
[19:34] "finché" or "fintantoché" etc. can be used for both
[19:34] i'm portuguese!
[19:34] so just rsync with delete option while you keep downloading
[19:34] :)
[19:35] i'll run it the second time (to check) with the delete option, good idea
[19:35] bbl
[19:42] no, do not just rsync with delete option
[19:42] use the upload script, as it skips incompletes
[19:43] (which includes profiles you are currently downloading)
[20:59] Coderjoe: ok then
[20:59] but then i have to erase everything manually, assuming i want to download and re-up more, right?
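(For reference on the zip limit discussed at [07:24]-[07:31]: pre-zip64 records store sizes and the central-directory file offset as 32-bit unsigned integers, so roughly 4 GiB is the ceiling unless zip64 extensions are used. A small sketch with Python's zipfile module; the file names here are made up.)

    # Sketch of the 4 GiB point: 32-bit size/offset fields top out at 0xFFFFFFFF,
    # and anything beyond that needs zip64 extensions.
    import zipfile

    LIMIT = 0xFFFFFFFF  # 2**32 - 1, the largest value a classic 32-bit zip field can hold
    print(f"classic zip limit: {LIMIT} bytes (~4 GiB)")

    # Python's zipfile emits zip64 records automatically when needed; with
    # allowZip64=False an oversized archive raises zipfile.LargeZipFile instead.
    with zipfile.ZipFile("small.zip", "w", zipfile.ZIP_DEFLATED, allowZip64=False) as zf:
        zf.writestr("hello.txt", "fits easily under the 32-bit limit\n")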
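(And a sketch of the upload advice at [19:42]: rather than rsyncing the whole data directory with a delete option while downloads are still running, only ship profiles that are actually finished. The directory layout, the destination, and the ".incomplete" marker file below are assumptions for illustration; the project's own upload script is what actually knows how to skip in-progress profiles.)

    # Sketch: upload only finished profile directories, skipping anything still
    # being downloaded. Marker name and paths are hypothetical.
    import pathlib
    import subprocess

    DATA_DIR = pathlib.Path("data")                   # hypothetical local download directory
    DEST = "user@example.org:/uploads/mobileme/"      # hypothetical rsync destination

    for profile in sorted(p for p in DATA_DIR.iterdir() if p.is_dir()):
        if (profile / ".incomplete").exists():
            print(f"skipping {profile.name}: still downloading")
            continue
        subprocess.run(["rsync", "-av", str(profile), DEST], check=True)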