[00:02] youtube-dl rocks, autodownloads the best format available, numbers playlists optionally, it's like wget for youtube
[00:04] we need a youtube-dl bot? :)
[00:09] hey, I'm recently out of storage sleeves for my massively expanded pile of stuff on DVD, anyone know of a good-quality storage case that isn't insanely pricey?
[00:16] wow, youtube-dl just freaking works
[00:16] very cool
[01:14] anyone know of a tool I can use to download an entire SoundCloud profile?
[01:14] I use youtube-dl for individual songs
[01:38] FYI, there is currently an issue with youtube-dl where it only downloads 720p for 1080p videos
[01:39] so you will want to double-check depending on what you're downloading
[02:01] uploaded: https://archive.org/details/Electronique_et_Loisirs_001__1999-05
[02:02] youtube-dl DOES work.
[02:02] I feel we should almost consider mirroring it under archiveteam
[02:02] on GitHub
[02:03] just to have it there.
[02:07] we have riptube, I think, under archiveteam on GitHub
[02:07] SketchCow: you getting another magazine collection?
[02:08] a French electronics magazine
[02:08] I think the publisher is JMJ Editions
[02:08] but I'm not sure
[04:30] is there a fast, relatively IO-cheap way of counting files recursively? not folders, just files
[04:47] find . -type f
[05:24] | wc -l
[05:24] cheap as it gets
[05:26] if you have locatedb indexing it, then you can avoid that by using its cache
[12:49] Are there any projects going on atm?
[13:13] Howlin1: There doesn't seem to be anything major.
[13:14] do you want something to do, Howlin1?
[13:43] SmileyG: I could use something myself
[13:43] I have a small VPS sitting idle
[13:48] chazchaz: well, there was a request to get anything we can about a photographer
[13:49] [16:34:00] <@SketchCow> A beloved artist, Zina Nicole Lahr, has died from a fall during climbing
[13:49] [16:34:25] <@SketchCow> I've saved her main website, but she's everywhere. If someone feels like throwing all her stuff at archivebot, that would be nice.
[13:49] so I was going to suggest getting everything you can about her
[13:53] like a list of pages or something?
[13:53] I can wget something or run a script, but I don't really have the time to google for things.
[13:54] at least not in depth
[13:55] I'm downloading some big sites right now, but I think I can handle some more
[13:55] if you have some sites I can download them also
[14:17] Yeah sure, SmileyG, but it needs to be something that will work away by itself without much input/interaction from me.
[14:27] yeah
[14:36] sure, np then
[18:51] WHAT FORSOOTH, PRITHEE TELL ME THE SECRET WORD
[18:53] yahoosucks
[19:11] that's just rude
[19:19] We might have a serious issue, guys: http://www.businessinsider.com/imgur-and-yahoo-acquisition-talks-2013-12
[19:20] just imagine all the warriors you could get from reddit users if that happened
[19:20] (answer: 6)
[19:21] maybe 5
[19:21] but then again, it is a big project
[19:43] PANIC
[19:44] OKAY
[20:04] lock
[20:04] oops
[20:07] Yes, we do need to get data on lock
[21:53] Grabbing imgur....
ouch
[21:54] but seriously, if Yahoo gets it, we will start grabbing it :D
[21:54] ImportError: No module named 'setuptools'
[21:54] hmmm, why did that say Python 2.7 o_O
[21:55] SmileyG, your new best friend:
[21:55] apt-get install -y python python-dev && wget cryto.net/~joepie91/pipfix.sh && chmod +x pipfix.sh && ./pipfix.sh
[21:55] :)
[22:00] lol nein
[22:02] SmileyG: the python-pip package in Debian is sorta kinda broken
[22:02] I now run this on all my servers to have pip set up correctly
[22:02] along with the setuptools stuff
[22:02] works beautifully :D
[22:19] joepie91: in unstable you need python-pip & python3-pip to make things work
[22:19] I don't know why
[22:25] apparently imgur was only 3TB last year, so it might not be so bad: http://www.reddit.com/r/IAmA/comments/y81ju/i_created_imgur_ama/c5tazfk
[22:29] That might be true
[22:34] We can do anything ;)
[22:43] SmileyG: According to their AMA, they only have about 3TB of imagery
[22:43] I think that's because they actually.. wait for it.. expire images
[22:43] Damn it, read all of the backlog but missed DFJustin's comment.
[22:43] 3TB??
[22:43] that's all?
[22:44] the commenters have some mathematical objections, and in any case there's been a year of huge growth
[22:44] but it's probably not MobileMe
[22:45] I've read somewhere that images expire after either inactivity or a set number of days (90 days? or something)
[22:48] arrrgh, my ISP is mad at me or something
[22:58] ersi: inactivity, yes
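[Editor's note] The 720p issue mentioned at [01:38] matches known youtube-dl behaviour of the era: the default format selection prefers the best single muxed file, which YouTube caps at 720p, while 1080p is only served as separate DASH video and audio streams. A hedged sketch of the usual workaround (the playlist URL is a placeholder, not one from the log):

```shell
# Ask for best DASH video plus best audio and merge them (requires ffmpeg
# or avconv on PATH); fall back to the best muxed file otherwise.
# The playlist URL is a placeholder for illustration only.
youtube-dl -f 'bestvideo+bestaudio/best' \
  -o '%(playlist_index)s - %(title)s.%(ext)s' \
  'https://www.youtube.com/playlist?list=PLACEHOLDER'
```

The `-o` template also gives the playlist numbering mentioned at [00:02].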
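[Editor's note] The file-counting exchange at [04:30]–[05:24] assembles into a single pipeline. A self-contained sketch, using a throwaway demo tree (the directory and file names are made up for illustration):

```shell
# Build a small demo tree: two files at the top, one in a subdirectory.
mkdir -p demo/sub
touch demo/a.txt demo/b.txt demo/sub/c.txt

# Count regular files only (directories are excluded by -type f).
find demo -type f | wc -l    # prints 3
```

Note that `wc -l` counts newlines, so a filename containing a newline would be counted twice; for typical trees the one-liner is as cheap as it gets, as the log says.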
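[Editor's note] The pipfix.sh at [21:55] is joepie91's own script and its contents aren't in the log. A rough equivalent (an assumption, not the actual script) is to skip Debian's then-broken python-pip package and bootstrap pip with the official get-pip.py, which also pulls in setuptools:

```shell
# Sketch only: stands in for pipfix.sh, whose contents aren't in the log.
# Needs root and network access.
apt-get install -y python python-dev     # interpreter plus build headers
wget https://bootstrap.pypa.io/get-pip.py
python get-pip.py                        # installs pip and setuptools
pip --version                            # sanity check
```

This would also resolve the `ImportError: No module named 'setuptools'` seen at [21:54].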