[01:57] so i scanned 2 new magazines tonight
[02:07] Thank you!
[02:10] i have 4 magazines that i can upload in total
[02:11] i found out there was a setting to repeat the scan
[10:30] http://i.imgur.com/BnJSXsr.jpg Jason, is that you?
[10:42] yes.
[10:42] In the beginning of episode 3 of Lois & Clark: The New Adventures of Superman, Clark asks Lois if she backed up her work to floppy disk. Even Superman is an archivist
[12:16] uploaded: https://archive.org/details/g4tv-video-xml-20130131
[13:05] :D
[13:19] first new scan: https://archive.org/details/pc-novice-1995-04
[13:19] I'm using the magazine media setting this time
[13:20] the images look a lot better now
[13:20] fewer dots in them
[13:26] so i was just looking at zdnet.com today
[13:26] i may have figured out a quick way to back up articles
[13:27] the number at the end of the story title can be used like this: www.zdnet.com/7000012360/
[13:27] it will redirect to the story
[13:29] ( •_•)σ Smiley or ( •_•)σ GLaDOS
[13:29] Something about anarchive?
[13:30] kinda
[13:30] Go on..
[13:31] uhoh
[13:31] Smiley: nothing bad.
[13:31] ah cool
[13:47] godane, why do you think zdnet needs a backup? The IA has hundreds of grabs of it every year going back at least 13 years
[13:50] i was just thinking about how i would grab it in a panic download
[13:50] that's all
[13:51] I need a bigger scanner
[13:51] or one where the scanning element actually goes to all the borders of the glass
[13:55] so it looks like i got that american pages ad right in the 1995-05 issue of pc novice
[13:56] there was a copy in i think 1996-06 or 1996-07 where i couldn't get all of it cause the text was right at the binder of the magazine
[13:57] uploaded: https://archive.org/details/pc-novice-1995-05
[14:01] my item derive says 97 pages when there are only 96
[14:02] weird
[14:08] ok now ftp is not working
[14:09] getting a 421 too many connections (10) from the IP
[14:11] it looks like i can't check if my pc-novice-1995-06 item is fully uploaded with ftp right now
[14:30] can anyone tell me why i'm getting this error?
[19:27] Anyone around who could help me with some SQL?
[19:28] Depends on the type.
[19:28] How would I replace the statistics.infohash with the torrents.name in this query? SELECT DISTINCT statistics.infohash, MAX(statistics.uploaded_bytes) FROM statistics GROUP BY statistics.infohash ORDER BY uploaded_bytes DESC; -- schema at http://pastie.org/private/hqrpad4a1irzq1zvpk546g
[19:30] Doesn't the GROUP BY already make it DISTINCT?
[19:30] Hmmm, maybe. It doesn't complain about it at least
[19:30] SELECT torrents.name, MAX(statistics.uploaded_bytes) FROM statistics, torrents GROUP BY statistics.infohash WHERE statistics.infohash = torrents.infohash
[19:30] perhaps?
[19:31] Yeah, GROUP BY will make it DISTINCT already (just tried)
[19:31] select torrents.name FROM statistics JOIN torrents on torrents.infohash = statistics.infohash GROUP BY torrents.name ORDER BY whateverfieldyouwant ?
[19:32] depends what other fields you wanted and whether the group or distinct were necessary
[19:32] I think.
[19:33] no need for the group if you don't need/want a sum/max/min of another field - e.g. uploaded_bytes
[19:34] Well, what I want is the highest uploaded_bytes per infohash/name
[19:34] I think both of these queries should work.
[19:35] select torrents.name, max(uploaded_bytes) FROM statistics JOIN torrents on torrents.infohash = statistics.infohash GROUP BY torrents.name ORDER BY name
[19:36] Seems to work
[19:38] Thanks :)
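A minimal sketch of the story-ID trick from 13:27, assuming the 2013-era zdnet.com still redirects numeric short URLs; the ID 7000012360 comes from the chat, while grab_story, the user agent, and the output path are made up for illustration:

    # Hypothetical "panic download" helper: resolve www.zdnet.com/<id>/ to the
    # story it redirects to and save the page. Stdlib only.
    import urllib.request

    def grab_story(story_id, out_path):
        short_url = "http://www.zdnet.com/%s/" % story_id
        req = urllib.request.Request(short_url,
                                     headers={"User-Agent": "Mozilla/5.0"})
        with urllib.request.urlopen(req) as resp:  # urlopen follows redirects
            html = resp.read()
            final_url = resp.geturl()              # the resolved story URL
        with open(out_path, "wb") as f:
            f.write(html)
        return final_url

    # e.g. grab_story("7000012360", "zdnet-7000012360.html")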
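For the 421 "too many connections" error at 14:09, backing off and retrying is the usual workaround. A sketch with placeholder host, credentials, and retry parameters (list_item is an assumed helper, not anything archive.org documents):

    # Retry an FTP listing when the server replies 421 (too many connections).
    import ftplib
    import time

    def list_item(host, user, password, item, tries=5, delay=60):
        for _ in range(tries):
            try:
                with ftplib.FTP(host) as ftp:
                    ftp.login(user, password)
                    return ftp.nlst(item)      # files uploaded under the item
            except ftplib.error_temp as err:   # 4xx replies such as 421
                if not str(err).startswith("421"):
                    raise
                time.sleep(delay)              # back off, then try again
        raise RuntimeError("still getting 421 after %d tries" % tries)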
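The working query from 19:35, runnable against a toy sqlite3 database. The pastie with the real schema is private, so the two tables below are a guessed minimal version with only the columns the query touches:

    # Demo of "highest uploaded_bytes per torrent name" via JOIN + GROUP BY.
    import sqlite3

    con = sqlite3.connect(":memory:")
    con.executescript("""
        CREATE TABLE torrents   (infohash TEXT PRIMARY KEY, name TEXT);
        CREATE TABLE statistics (infohash TEXT, uploaded_bytes INTEGER);
        INSERT INTO torrents VALUES ('aa', 'debian.iso'), ('bb', 'mint.iso');
        INSERT INTO statistics VALUES ('aa', 100), ('aa', 400), ('bb', 250);
    """)
    rows = con.execute("""
        SELECT torrents.name, MAX(statistics.uploaded_bytes)
        FROM statistics
        JOIN torrents ON torrents.infohash = statistics.infohash
        GROUP BY torrents.name
        ORDER BY torrents.name
    """).fetchall()
    print(rows)   # [('debian.iso', 400), ('mint.iso', 250)]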
[19:38] does anyone know of a mirror of cbc podcats?
[19:38] *podcasts
[19:39] i ask cause even though the wayback machine did a good job it doesn't have all of them
[19:39] example of dead link: http://podcast.cbc.ca/mp3/podcasts/spark_20081118_9315.mp3
[19:41] very funny that spark talks a lot about backup stuff and then they lose like everything past 2011-09
[19:44] hah :|
[19:45] i now wish i had downloaded all of it when jason scott first came on
[19:46] now my crazy archiving is not crazy
[20:39] Load average: 24.88 24.76 24.70 for my server ;)
[21:48] writing parsing tools
[21:49] for generating a big url list is not fun
[21:50] (destroyed some part of my encrypted fs, need to remake a local copy of some comics)
[21:50] o_O
[21:51] yeah
[21:51] never trust the debian installer to not fuck up luks
[21:51] :D
[21:51] the oh sh* moment
[21:51] Interesting... Debian Installer was so far the only tool with which I could build up a PROPER LUKS-based LVM layout
[21:52] then the "how old are my backups" moment
[21:52] 2 months ago ?!?
[21:52] :O
[21:52] rebuilding is time-consuming
[21:52] drwxr-xr-x 4 root root 4.0K Apr 26 03:16 daily.10
[21:52] drwxr-xr-x 4 root root 4.0K May  5 03:16 daily.1
[21:52] drwxr-xr-x 4 root root 4.0K May  6 03:16 daily.0
[21:53] 30 daily backups ftw
[21:53] what fs are you using
[21:53] my backup =~ 4GByte
[21:53] Differential backups, as I see
[21:53] norbert79: i wanted to import an existing lvm over luks layout
[21:54] nico_: Oh, I see... I saw some sort of description on that regarding Mint
[21:54] only 4gb, lucky guy
[21:54] as Linux Mint doesn't support LUKS
[21:54] by default
[21:54] norbert79: i was running the debian installer from debian 7.0
[21:54] fedora 18 & fedora 19 are a no-go on this machine
[21:54] instant crash in some pango libs
[21:55] looks like debian 7.0 is working
[21:55] (but i really miss the updated graphics stack)
[21:55] rsnapshot -c /etc/rsnapshot.d/daily.conf daily || echo "Backup failure"
[21:55] bang
[21:56] However I do need to fix up my router with rsnapshot too
[21:56] Smiley: i store my backup on a computer that is powered off when i am not backing up to it
[21:56] Ah
[21:56] you cannot hack something that is not running
[21:56] wakeonlan ftw.
[21:56] I've not been hacked yet.
[21:56] :/
[21:57] I miss ZFS. Having the filesystem take care of backups is better
[21:57] (looks like my buggy script worked)
[21:57] nico@Gallifrey:~/Images$ wc -l experiment/image_to_dl
[21:57] 529 experiment/image_to_dl
[21:57] omf_: someone I know is working on bringing it to linux properly.
[21:57] * nico_ is happy
[21:58] I really need to get my QoS working on my router however D:
[21:58] * Smiley ponders @just loading it@
[21:58] wtf is up with my locale :<
[21:58] wget -e robots=off --user-agent="Mozilla/5.0 (Photon; U; QNX x86pc; en-US; rv:1.6) Gecko/20040429" -i ./image_to_dl
[21:58] (-‸ლ)
[21:59] let's do it :)
[21:59] yeah I facepalmed you
[21:59] omf_: ZFS is ready for Linux use...
[21:59] but utf works again :D
[21:59] omf_: Not in 'experimental' anymore
[21:59] norbert79: i didn't find out how to monitor the state of the hard drive
[21:59] aka knowing when the hdd is going to die
[22:00] nico_: You mean smarttools?
[22:00] smartctl
[22:00] that
[22:00] smartmontools - control and monitor storage systems using S.M.A.R.T.
[22:02] nevermind it is in zpool status
[22:03] looks like it needs a cronjob to do a zpool scrub mypool
[22:08] yes, zpool scrub will verify checksums (and repair), and zpool status will show health. smartctl is still useful for checking the raw drive info, however
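One way to check whether a dead podcast URL like the one at 19:39 has a Wayback Machine copy is the public availability API; a sketch, assuming the API's documented archived_snapshots/closest JSON layout:

    # Query archive.org's availability API for the closest snapshot of a URL.
    import json
    import urllib.parse
    import urllib.request

    def closest_snapshot(url):
        api = ("https://archive.org/wayback/available?url="
               + urllib.parse.quote(url, safe=""))
        with urllib.request.urlopen(api) as resp:
            data = json.load(resp)
        snap = data.get("archived_snapshots", {}).get("closest")
        return snap["url"] if snap else None   # None means no Wayback copy

    print(closest_snapshot(
        "http://podcast.cbc.ca/mp3/podcasts/spark_20081118_9315.mp3"))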
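The wakeonlan trick from 21:56 is just a UDP broadcast of a "magic packet": six 0xFF bytes followed by the target's MAC address repeated sixteen times. A stdlib sketch with a placeholder MAC:

    # Send a wake-on-LAN magic packet to the LAN broadcast address (port 9
    # is the conventional discard port used for WOL).
    import socket

    def wake(mac="00:11:22:33:44:55"):   # placeholder MAC
        payload = bytes.fromhex("FF" * 6 + mac.replace(":", "") * 16)
        with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
            s.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
            s.sendto(payload, ("255.255.255.255", 9))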
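A sketch of the cronjob discussed at 22:03, assuming a pool actually named mypool; note that zpool scrub returns immediately, so the health check mostly reports problems found by earlier scrubs:

    # Kick off a scrub, then check overall health. `zpool status -x` prints
    # "all pools are healthy" when there is nothing to report.
    import subprocess

    POOL = "mypool"   # placeholder pool name from the chat

    def scrub_and_check():
        subprocess.run(["zpool", "scrub", POOL], check=True)
        status = subprocess.run(["zpool", "status", "-x"],
                                capture_output=True, text=True, check=True)
        if "all pools are healthy" not in status.stdout:
            # swap in real alerting (mail, IRC bot, ...) as needed
            print("ZFS health problem:\n" + status.stdout)

    if __name__ == "__main__":
        scrub_and_check()   # cron entry, e.g.: 0 3 * * * python3 /root/zfs_check.py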
[22:34] Hmmm no godane
[23:24] "Creative Clown Went Down" - to the tune of "The Merry-Go-Round Broke Down"?
[23:27] what is creative clown anyway...
[23:27] joke on cloud?
[23:27] s/cloud/clown/
[23:27] It is this shitastic product - https://www.adobe.com/products/creativecloud.html
[23:28] Adobe is clowning up
[23:34] Ooooh just saw that on slashdot, heh
[23:34] didn't connect the dots
[23:34] We have now seen how Adobe will fail
[23:38] omf_: you mean the many flash and reader vulns were not enough of a fail?