#archiveteam 2013-03-14,Thu

Time Nickname Message
00:01 πŸ”— ivan` Google Reader contains feeds for thousands of deleted blogs
00:06 πŸ”— Andres_ akkuhn,
00:06 πŸ”— Andres_ apparently
00:06 πŸ”— Andres_ you cannot run rm -rf /
00:06 πŸ”— Andres_ successfully in that emu
00:06 πŸ”— Andres_ it's not a good emo
00:06 πŸ”— Andres_ *u
00:07 πŸ”— sep332_ lol
00:12 πŸ”— ivan` I think we should start collecting everyone's .opml files so that the obscure dead blogs can get backed up
00:13 πŸ”— ivan` is there a convenient HTTP server for collecting uploads?
00:47 πŸ”— viseratop ivan`: It's true, all those entries I starred years ago on blogs that no longer exist are still available in Google Reader.
00:48 πŸ”— balrog_ well remember, if we put a ton of pressure on google, they might just backpedal
00:48 πŸ”— balrog_ they did for google video (though they eventually killed it anyway)
00:51 πŸ”— ivan` viseratop: I have some ideas for getting as many feed URLs as possible
00:51 πŸ”— ivan` the best is to search some giant web crawl for *.blogspot.com *.tumblr.com etc and infer the RSS URL
00:52 πŸ”— ivan` another thing is a bookmarklet or userscript that uploads your feed list somewhere
00:52 πŸ”— ivan` should be more convenient than Google Takeout
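A minimal sketch of the feed-URL inference ivan` describes above, not taken from the log itself: the host-to-feed patterns are the usual conventions those hosted-blog platforms expose, and the function name and example URL are made up for illustration.

    from urllib.parse import urlparse

    # Conventional feed locations for the big hosted-blog platforms
    # (assumed patterns, not something stated in the discussion above).
    FEED_PATTERNS = {
        "blogspot.com": "http://{host}/feeds/posts/default?alt=rss",
        "tumblr.com": "http://{host}/rss",
        "wordpress.com": "http://{host}/feed/",
        "livejournal.com": "http://{host}/data/rss",
    }

    def infer_feed_url(page_url):
        """Guess the feed URL for a blog page URL, or return None."""
        host = urlparse(page_url).netloc.lower()
        for suffix, pattern in FEED_PATTERNS.items():
            if host == suffix or host.endswith("." + suffix):
                return pattern.format(host=host)
        return None

    print(infer_feed_url("http://example.blogspot.com/2013/03/some-post.html"))
    # -> http://example.blogspot.com/feeds/posts/default?alt=rss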
01:00 πŸ”— Smokinn I'd like to run the Posterous backup
01:00 πŸ”— Smokinn Anything special I need to do?
01:01 πŸ”— balrog_ run warrior, click posterous
01:01 πŸ”— Smokinn "NOTE: Posterous will ban you. Ask on EFnet IRC, #archiveteam, before running this."
01:02 πŸ”— balrog_ yeah, they will ban you after a while for a bit
01:02 πŸ”— Smokinn ok
01:02 πŸ”— balrog_ unless you browse posterous regularly you shouldn't have to worry
01:02 πŸ”— Smokinn I don't
01:02 πŸ”— Smokinn Just thought there might be a special workaround like routing your requests through tor or something
01:03 πŸ”— balrog_ the bans aren't really slowing us down; rather the posterous infrastructure is very fragile
01:04 πŸ”— Smokinn ok thanks
01:06 πŸ”— balrog_ if you need further help, please join #preposterus
01:14 πŸ”— robbiet4- so uh
01:14 πŸ”— robbiet4- do we save google reader?
01:19 πŸ”— ivan` robbiet4-: get every single feed URL first
01:20 πŸ”— robbiet4- heh
01:20 πŸ”— robbiet4- heh
01:20 πŸ”— robbiet4- uh
01:20 πŸ”— * robbiet4- /parts
01:20 πŸ”— ivan` grabbing the text content from those is probably easy
01:20 πŸ”— * ivan` misread; thought there was a "how" in there
01:23 πŸ”— SketchCow Wow, I love running around punching the living shit out of people about this RSS thing.
01:23 πŸ”— SketchCow That's how I want to spend the time
01:33 πŸ”— Carson_ what's this place about, is there a website?
01:34 πŸ”— Carson_ I know it's for archiving information, maybe databases, open source media?
01:34 πŸ”— ivan` http://www.archiveteam.org/index.php?title=Main_Page
01:42 πŸ”— Carson_ thanks
01:58 πŸ”— akkuhn SketchCow: I sense the thud of thousands of twitter insects against glass this evening.
02:05 πŸ”— SketchCow yeah
02:05 πŸ”— SketchCow Well, just a few.
02:06 πŸ”— SketchCow So, now that I've been living the Mascot of Archive Team lifestyle for a few years, and thanks for that, by the way....
02:06 πŸ”— SketchCow ... there's a lot of "oh look, THAT again" horseshit that gets in my face.
02:06 πŸ”— SketchCow The question that comes to mind is whether I want to go ahead and engage, or let it roll off like rain. It's a debate.
02:06 πŸ”— SketchCow So we get the "who cares about this old shit" people.
02:07 πŸ”— SketchCow We get the "well, of course I won't PAY for this service, fuck that" people.
02:07 πŸ”— SketchCow You know, the people who use their shopping bags six times and have a drawer of mustard packets
02:08 πŸ”— wp494 requesting permission to run posterous project on my warrior client
02:09 πŸ”— akkuhn i used to be one of those "well of course i won't PAY for this service" people. then i realized i work for a software company and people paying for services is what lets me eat.
02:15 πŸ”— SketchCow Well, look.
02:15 πŸ”— SketchCow Outside of the "why pay for it" situation where I realize you don't want to get wallet-raped, i.e. pay $60 for a game you pwn in 5 hours...
02:16 πŸ”— SketchCow ....at some point, you realize if you're spending days, endless days, using an item that has a back infrastructure you benefit from, a few bucks makes sense.
02:16 πŸ”— SketchCow Then at least you can complain.
02:16 πŸ”— akkuhn agreed. there's a pervasive sense of entitlement. motherfuckers took away my free service!
02:16 πŸ”— SketchCow So I have, for example, Flickr, GMail for Business, Namecheap/EasyDNS, Newsblur, and a few others I pay for.
02:16 πŸ”— akkuhn HOW DARE YOU take away my free service. i doth protest!
02:17 πŸ”— SketchCow Outside of the DNS ones that are notably more expensive, I bet I probably spend $200 a year.
02:17 πŸ”— SketchCow With DNS and remembering a few other things I pay for, I can't be paying more than $500 in total.
02:17 πŸ”— SketchCow $500!
02:17 πŸ”— akkuhn somehow $200 of network infrastructure seems absurd, but folks will gladly piss that away on 40 coffees.
02:18 πŸ”— dashcloud and then there's Pinboard, which is definitely worth every dollar I paid for it
02:18 πŸ”— SketchCow For shit I use and depend on 24/7. And as you ALL know I am one outlier motherfucker, I use these services until the disks actually make Paranormal Activity Quality wailing sounds
02:19 πŸ”— akkuhn partly a perception of value at scale too. it's easier for someone to process $12 a year for an RSS feeder when it's all a company does, much less for them to see the value in Google Toiletpaper because "dude, that costs them like nothing to run"
02:20 πŸ”— akkuhn coincidentally, i will laugh if google reader becomes a google apps-paid only feature. oh the uproar.
02:21 πŸ”— SketchCow "@textfiles I've got 4TB of 356K of github repos downloaded, I'm quickly running out of harddrive space. Whats the best way to get you a copy"
02:21 πŸ”— * SketchCow flashes the horns
02:21 πŸ”— SketchCow ROCK. ON.
02:22 πŸ”— toomuchto NICE
02:22 πŸ”— SketchCow Told him to come here, he's officially a member.
02:24 πŸ”— wp494 reposting a request for info on the posterous project, since the warrior prompts people to ask for info here
02:24 πŸ”— chronomex permission granted, go ahead
02:24 πŸ”— wp494 ty
02:24 πŸ”— chronomex perhaps we should remove the warning
02:25 πŸ”— WiK helllloooooooooo nurse
02:26 πŸ”— akkuhn the hero github deserves, but not the one it needs right now.
02:26 πŸ”— akkuhn greetings.
02:26 πŸ”— akkuhn (actually i don't know what the hell they need right now)
02:27 πŸ”— SketchCow They need a hug
02:27 πŸ”— WiK so ive got about 4TB of github repos (369734) downloaded for a project im working on
02:27 πŸ”— SketchCow I should point out their offices are great.
02:27 πŸ”— SketchCow What's the archive format, WiK.
02:28 πŸ”— WiK right now its all just git clones, none compressed
02:28 πŸ”— WiK i was gona use bitcasa to dump them all too, but it keeps crashing when i try to robocopy
02:28 πŸ”— chronomex 'git bundle' is a good storage format
02:28 πŸ”— ivan` WiK: I hope you are using git --mirror (implies --bare) to save space
02:28 πŸ”— WiK ya, but i dont want them bundled, otherwise its more of a pita for me to process
02:28 πŸ”— WiK ivan`: im not, as i want all the branches
02:29 πŸ”— WiK https://github.com/wick2o/gitDIgger
02:29 πŸ”— ivan` WiK: you still get all of the branches
02:29 πŸ”— WiK ivan: then whats the difference
02:29 πŸ”— ivan` you don't waste a ton of space on a checkout of HEAD
02:30 πŸ”— WiK ahhh, well ill modify my script to use that now..brb
02:30 πŸ”— ivan` you can convert existing repos to bare repos but try not to accidentally rm 4 TB ;)
02:30 πŸ”— chronomex :P
02:31 πŸ”— WiK well, pythongit wont let me use --mirror with their .clone()
02:32 πŸ”— WiK ivan`: thats where find / -iname HEAD -exec would come in handy
02:35 πŸ”— ivan` I just use os.system
02:35 πŸ”— WiK I just submitted a CFP to defcon for this little project
02:35 πŸ”— ivan` actually often subprocess.call et al
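A minimal sketch of the bare-mirror cloning ivan` recommends, driven through subprocess (as he mentions) rather than a python git library; the destination path and helper name are just illustrative.

    import subprocess

    def mirror_clone(clone_url, dest_dir):
        # "git clone --mirror" implies --bare: every branch and tag is kept,
        # but there is no working-tree checkout of HEAD eating disk space.
        subprocess.check_call(["git", "clone", "--mirror", clone_url, dest_dir])

    mirror_clone("https://github.com/wick2o/gitDIgger.git",
                 "/archive/github/wick2o/gitDIgger.git")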
02:36 πŸ”— WiK had someone today fulfill a wishlist 4TB harddrive since i was almost outta space
02:36 πŸ”— chronomex nice.
02:36 πŸ”— WiK ya, ive got 5 external drives connected to my machine right now :)
02:37 πŸ”— balrog_ how much sense would it make to bundle and upload?
02:37 πŸ”— balrog_ since bundles seem to be good for archiving...
02:38 πŸ”— WiK idk, i posted to jscott on twitter and he told me to join here (didnt know there was an irc channel)
02:38 πŸ”— balrog_ also you're probably aware that we have a list of GH users from late last year...
02:38 πŸ”— balrog_ SketchCow is jscott
02:38 πŸ”— WiK ahhhh
02:38 πŸ”— WiK didnt know that, im just using their api and started with repo id # 1 and going from there
02:39 πŸ”— chronomex nice
02:39 πŸ”— balrog_ aaah :P
02:39 πŸ”— balrog_ how does their API deal with private repos? just throws a 403?
02:39 πŸ”— WiK i had the anonymous api limit lowered from 1k request an hour to 60 :)
02:39 πŸ”— balrog_ oh you were using the anonymous API?
02:39 πŸ”— WiK i was at first :)
02:40 πŸ”— balrog_ heh it probably could be done without the API
02:40 πŸ”— WiK when i was just testing my spider
02:40 πŸ”— balrog_ ah :D
02:40 πŸ”— SketchCow People, talk to him, and based on what the result is, I can either have a drive that goes to IA, or some other solution.
02:40 πŸ”— WiK well you get 5k requests an hour, and with a threaded cloner using 10 threads at a time, i RARELY hit anything close to that hourly limit
02:41 πŸ”— chronomex I'd suggest an item per user, with each repo being a single-file bundle in that item
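chronomex's per-repo single-file bundle could look roughly like the sketch below; the layout, paths, and helper name are illustrative, not an agreed format.

    import os
    import subprocess

    def bundle_repo(bare_repo_dir, out_dir):
        name = os.path.basename(bare_repo_dir.rstrip("/")).replace(".git", "")
        bundle_path = os.path.join(out_dir, name + ".bundle")
        # --all packs every branch and tag into the one bundle file;
        # it can be restored later with: git clone repo.bundle repo
        subprocess.check_call(["git", "bundle", "create", bundle_path, "--all"],
                              cwd=bare_repo_dir)
        return bundle_path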
02:41 πŸ”— balrog_ remember our userlist is out of date :(
02:41 πŸ”— balrog_ it's on IA though
02:41 πŸ”— ivan` I still feel like a dummy for not backing up bugs.sun.com
02:42 πŸ”— WiK SketchCow: i was trying to get a sponsor to build a 20TB nas and then hand the thing over to you after my talk at defcon (if the CFP gets accepted) or firetalk
02:42 πŸ”— balrog_ ivan`: :(
02:42 πŸ”— balrog_ did you back up any of sun?
02:42 πŸ”— WiK i have a database that dumps the user/project name
02:43 πŸ”— balrog_ I'm quite annoyed that Oracle paywalled the whole thing
02:43 πŸ”— WiK so i can spit out a user list for all users ive seen so far
02:43 πŸ”— balrog_ have old hardware that needs bios updates or restore data? screw you unless you have an expensive contract
02:43 πŸ”— balrog_ WiK: how are you crawling right now?
02:44 πŸ”— WiK via api and a threaded python script i wrote
02:44 πŸ”— balrog_ still via API? :P
02:44 πŸ”— WiK yep, its the best/fastest way to make sure i get everything without getting banned
02:45 πŸ”— WiK i was using a custom webcrawler to get users and projects, but kept getting 'blocked'
02:45 πŸ”— WiK before i knew they had an api
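Walking every public repo by id, the way WiK describes, maps onto GitHub's v3 listing endpoint for public repositories; a rough sketch, with authentication and error handling left out and the pacing only a guess at staying under the hourly limit mentioned above.

    import time
    import requests

    def iter_public_repos(start_id=0):
        since = start_id
        while True:
            r = requests.get("https://api.github.com/repositories",
                             params={"since": since})
            r.raise_for_status()
            page = r.json()
            if not page:
                break
            for repo in page:
                yield repo["id"], repo["full_name"]
            since = page[-1]["id"]
            time.sleep(1)  # stay well clear of the hourly rate limit

    for repo_id, full_name in iter_public_repos():
        print(repo_id, full_name)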
02:45 πŸ”— ivan` balrog_: funny thing is that I would have noticed they were about to JIRA up the thing if I kept my Reader feeds better organized
02:45 πŸ”— WiK in my db i keep track of username, project name, harddrive, and if ive grepped or processed it yet
02:46 πŸ”— ivan` of course they neglected details like "we'll rm all the user comments" but that could have been inferred
02:46 πŸ”— SketchCow The short form is that we can certainly take this item.
02:46 πŸ”— ivan` I don't think I have any of sun, no
02:47 πŸ”— akkuhn anyone have a suggestion for simplest way to get wget 1.14 onto a ec2 instance?
02:47 πŸ”— akkuhn i recall fighting with the SSL compile options under arch a few weeks back, wondering if i'll get the same ...
02:48 πŸ”— WiK SketchCow: https://github.com/wick2o/gitDigger those are my resulting wordlists if you're interested (so far)
02:48 πŸ”— SketchCow -rw-r--r-- 1 root root 24420816 Mar 14 02:22 Electronic Entertainment (December 1995).iso03.cdr
02:48 πŸ”— SketchCow -rw-r--r-- 1 root root 1321824 Mar 14 02:22 Electronic Entertainment (December 1995).iso04.cdr
02:48 πŸ”— SketchCow -rw-r--r-- 1 root root 43180032 Mar 11 2011 Explore the World of Software - Kids Graphics for Windows (1995).iso
02:48 πŸ”— SketchCow -rw-r--r-- 1 root root 676304896 Sep 26 2009 Interactive Entertainment - Episode 13 (1995).iso
02:48 πŸ”— SketchCow -rw-r--r-- 1 root root 656586752 Mar 12 2011 bootDisc 14 (October 1997).iso
02:48 πŸ”— SketchCow -rw-r--r-- 1 root root 683087872 Mar 12 2011 bootDisc 20 (April 1998).iso
02:50 πŸ”— WiK fyi: im in no hurry to offload any of this data. but do want to put it in your hands, and then when you're bored you can run github updates and keep em updated ;)
02:53 πŸ”— SketchCow Well, I'll be at DEFCON, and you can always bring a drive. :)
02:55 πŸ”— WiK 7zip them up per username and see how small of a drive i can get them onto
02:56 πŸ”— ivan` git packs are already compressed
02:57 πŸ”— WiK so you're telling me i can 7zip it up and get no better?
02:57 πŸ”— ivan` you might save 1% or so
02:58 πŸ”— WiK with over 400k repos, that 1% could add up
02:58 πŸ”— ivan` I wouldn't bother
02:58 πŸ”— ivan` an ideal github mirror would have bare repos in a form that can be easily updated with git pull --rebase
02:59 πŸ”— WiK ivan so you want me to remove the HEAD's from all of them?
02:59 πŸ”— ivan` there's no reason to have a checkout of the HEAD
03:00 πŸ”— WiK thats easy enough to fix
03:00 πŸ”— ivan` (did I say git pull --rebase? I meant git fetch)
03:01 πŸ”— ivan` also, `git` will annoyingly prompt you for a username and password when you `git fetch` a repo that has been deleted
03:02 πŸ”— ivan` you can work around this by changing the remote to https://dummyuser:dummypass@github.com/user/repo.git
03:02 πŸ”— WiK you can work around this by changing the remote to https://dummyuser:dummypass@github.com/user/repo.git
03:02 πŸ”— WiK you can work around this by changing the remote to https://dummyuser:dummypass@github.com/user/repo.git
03:02 πŸ”— WiK oops
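Putting ivan`'s two suggestions together, a mirror-refresh step might look like this sketch: point the remote at the dummy-credential URL (exactly the workaround pasted above) so deleted repos fail outright instead of prompting, then run git fetch. The helper name and print message are hypothetical.

    import subprocess

    def refresh_mirror(bare_repo_dir, user, repo):
        # Dummy credentials baked into the URL make git fail on deleted
        # repos instead of stopping to ask for a username and password.
        remote = "https://dummyuser:dummypass@github.com/%s/%s.git" % (user, repo)
        subprocess.check_call(["git", "remote", "set-url", "origin", remote],
                              cwd=bare_repo_dir)
        try:
            subprocess.check_call(["git", "fetch", "--prune", "origin"],
                                  cwd=bare_repo_dir)
        except subprocess.CalledProcessError:
            print("fetch failed (repo gone or private):", user, repo)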
03:05 πŸ”— WiK so what do you guys use for storage of all this archiveal stuff?
03:06 πŸ”— chronomex massive drive arrays
03:07 πŸ”— chronomex you know, same way anyone does ;)
03:08 πŸ”— SketchCow We use depressed arrays
03:08 πŸ”— SketchCow http://www.flickr.com/photos/textfiles/8273584676/
03:09 πŸ”— WiK hahaha
03:09 πŸ”— WiK ya, thats a bit more funds than i have for my project :)
03:10 πŸ”— chronomex :P
03:10 πŸ”— chronomex I have 5T of ~idle space, happy to store a copy of something
03:20 πŸ”— akkuhn that's one sad petabox
03:25 πŸ”— akkuhn Request denied: source address is sending an excessive volume of requests ... ok then, i'm doing something right :)
03:44 πŸ”— hdevalenc SketchCow: is there an Internet Archive Archive?
03:46 πŸ”— SketchCow Only somewhat
03:46 πŸ”— SketchCow We've tried to make export utilities for the archive, we should do more
03:48 πŸ”— hdevalenc how much does 10 PB (?) cost to build, anyways?
03:52 πŸ”— ivan` http://blog.backblaze.com/2013/02/20/180tb-of-good-vibrations-storage-pod-3-0/
03:54 πŸ”— ivan` about $600K plus humans to put everything together
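A quick back-of-the-envelope using only the numbers already in the chat (180 TB per Backblaze pod from the linked post, ivan`'s ~$600K for 10 PB):

    pods = 10000.0 / 180        # ~56 pods to hold 10 PB
    per_pod = 600000 / pods     # ~$10,800 per pod, consistent with the estimate
    print(round(pods), round(per_pod))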
03:55 πŸ”— hdevalenc that's pretty doable for some agency like a national library / archives
06:04 πŸ”— viseratop hdevalenc: we had SO MANY wasted PBs at San Diego Supercomputer Center, it's never stopped bothering me--there are people who use small grants well and there are people who use large grants terribly--full on bureaucratic hell, and so much barely used storage. </soapbox>
06:06 πŸ”— hdevalenc viseratop: do you know of any efforts to get large archives to mirror archive.org? universities, national archives, etc
06:08 πŸ”— viseratop hdevalenc: I'm not all that up to date, but I'm now having the urge to check in with friends still at the SC centers. I also have some friends who do digi-preservation at Library of Congress. Worth investigating, though I'm sure SketchCow already has a pretty good feel on this.
06:08 πŸ”— viseratop hdevalenc: I'll ask my network just for kicks, never know what stirring up some dust will do.
06:10 πŸ”— hdevalenc make a petition, lol
06:10 πŸ”— hdevalenc that will be very effective
06:10 πŸ”— SketchCow yeah petition the shit out of that
06:11 πŸ”— hdevalenc dead-tree petitions are actually fun
06:11 πŸ”— hdevalenc you can get your MP to read them, they go in the records, etc
06:12 πŸ”— hdevalenc online petitions really piss me off; they manage to eliminate the one thing that petitions are actually useful for
06:15 πŸ”— viseratop SketchCow: I'm fairly strapped in with UCSD, Calit2.net in particular. Let me know if it's helpful to stir up some dust on this, always happy to--also have won a few NSF grants, but those are bristly as hell (as I'm sure you know). More dog-and-pony shows and less actual achievements.
06:16 πŸ”— viseratop hdevalenc: Interesting socially though how whitehouse.gov has to keep upping their threshold due to pranksters.
06:17 πŸ”— hdevalenc indeed
06:18 πŸ”— hdevalenc see, the thing about deadtree petitions is that in order to make them you have to go up to people and convince them to sign on
06:18 πŸ”— hdevalenc best case, they're now on-side since you persuaded them
06:19 πŸ”— hdevalenc worst case, they know that it's an issue, because someone cares enough to run about with a clipboard
06:19 πŸ”— hdevalenc I don't really think that online petitions do that as well
06:25 πŸ”— chronomex woop woop woop off-topic siren
06:25 πŸ”— hdevalenc is there a convenient way to set the bind ip address of the seesaw script?
06:26 πŸ”— hdevalenc on the wiki it says I should use --bind-
06:26 πŸ”— hdevalenc address
06:27 πŸ”— hdevalenc but that option doesn't seem to exist in any of the scripts -- should I add that as a param passed to wget-lua in pipeline.py?
06:39 πŸ”— Cameron_D I think it was changed to --address
06:40 πŸ”— hdevalenc afaik that changes the ip of the web interface
06:40 πŸ”— hdevalenc (also)
06:40 πŸ”— hdevalenc does it do both?
06:41 πŸ”— Cameron_D er, not sure
06:41 πŸ”— hdevalenc also, is posterous accessible over ipv6?
06:41 πŸ”— Cameron_D sadly, no
07:00 πŸ”— SketchCow Dude, posterous is barely available over ipv4
07:04 πŸ”— SketchCow I just got tapped by someone with 900 radio shows going 20 years back, 3-4 hours a show, to host an archive of them.
07:04 πŸ”— SketchCow Going to happen.
07:04 πŸ”— SketchCow Very exciting.
07:04 πŸ”— SketchCow He's like "it's half a terabyte"
07:04 πŸ”— SketchCow I'm like PFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFT
07:04 πŸ”— ersi PFFFFFFf indeed
07:05 πŸ”— ersi &awat away
07:05 πŸ”— hdevalenc I know a radio station that doesn't keep more than 3 months of recordings due to cost
07:05 πŸ”— hdevalenc it's really depressing
07:15 πŸ”— pinwale hey all! This is Ismail from World Backup Day...
07:16 πŸ”— pinwale err...I mean that I'm working on that.
07:16 πŸ”— pinwale not that I'm from the future.
07:18 πŸ”— pinwale I was hoping to ask if it'll be alright to have a callout to this Posterous project on the webpage on ~ March 31st?
07:19 πŸ”— SketchCow Sure, but the fact is that I don't know if that's the source of the problem.
07:20 πŸ”— SketchCow Check #preposterus and ask, and bear in mind it's 12-3am in the US
07:20 πŸ”— pinwale Yeah, it's 3 am here as well. i live in ohio. :)
07:21 πŸ”— pinwale pardon the typos
07:24 πŸ”— pinwale I was also looking for comments on an effort to create some sort of manifesto for startups to have some sort of end-of-life procedures.
07:25 πŸ”— pinwale but it's late and probably better to bring this up tomorrow.
09:45 πŸ”— arrith omgomgomg google reader
09:45 πŸ”— arrith little known thing, they're a huge archiver of rss feeds
09:46 πŸ”— arrith "Google Reader is more than a feed reader: it's also a platform for feed caching and archiving. That means Google Reader stores all the posts from the subscribed feeds and they're available if you keep scrolling down in the interface."
09:46 πŸ”— arrith that's from http://googlesystem.blogspot.com/2007/06/reconstruct-feeds-history-using-google.html
09:46 πŸ”— arrith shutdown blogposts are here: http://googlereader.blogspot.com/2013/03/powering-down-google-reader.html
09:46 πŸ”— arrith and here: http://googleblog.blogspot.com/2013/03/a-second-spring-of-cleaning.html
12:09 πŸ”— arrith need a google reader wikipage
12:11 πŸ”— arrith i'm thinking of having a simple webpage that allows people to upload their subscriptions (i think the files have the extension opml), then a thing for people to submit throwaway/dummy google accounts (since google reader doesn't display anything if one isn't logged in) then use the accounts to grab the archived copies of the blogs submitted
12:11 πŸ”— Smiley o_O
12:11 πŸ”— Smiley hmmmmmm that doesn't really make sense to me.
12:11 πŸ”— arrith then periodically check the blog pages, until it goes down
12:12 πŸ”— Smiley Oh wait I see
12:12 πŸ”— Smiley you want to grab the archives.
12:12 πŸ”— arrith pretty sure archiveteam warrior would do fine for the downloading
12:13 πŸ”— arrith Smiley: yep!
12:13 πŸ”— arrith ""
12:13 πŸ”— arrith "Google Reader is more than a feed reader: it's also a platform for feed caching and archiving. That means Google Reader stores all the posts from the subscribed feeds and they're available if you keep scrolling down in the interface.
12:13 πŸ”— arrith "
12:14 πŸ”— arrith if there's not one already i'll start on a wikipage tomorrow and i guess look into how to make that simple webpage
12:14 πŸ”— arrith the effort needs a name though also
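The ingest side of the OPML-upload page arrith proposes could be as small as the sketch below. OPML stores each subscribed feed as an <outline> element with an xmlUrl attribute (the format Google Takeout exports as subscriptions.xml); the filename and helper name here are just examples.

    import xml.etree.ElementTree as ET

    def feed_urls_from_opml(path):
        # each subscribed feed is an <outline> element carrying an xmlUrl attribute
        tree = ET.parse(path)
        return [node.attrib["xmlUrl"]
                for node in tree.iter("outline")
                if "xmlUrl" in node.attrib]

    for url in feed_urls_from_opml("subscriptions.xml"):
        print(url)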
12:54 πŸ”— ersi arrith: Create a wikipage with the info you got
12:55 πŸ”— wp494_afk is there not already a section for reader on the google page?
12:55 πŸ”— ersi That sentence makes no sense
12:55 πŸ”— ersi Oh, you mean this? http://www.archiveteam.org/index.php?title=Google#Google_Reader
12:55 πŸ”— Smiley ersi: append "on the wiki"
12:56 πŸ”— ersi That's not a project page though.
12:56 πŸ”— Samuel_Mi its called "backup tools"… the irony
14:45 πŸ”— byter concord and county sheriffs each spent over in years 150 TRILLION. This is jut trying to find a way to put people in jail as dispatchers are listening to government fres. they open mail and refuiser to topand wring return letters in a negative way. this is YOU TAX DOLALRS AT WORK. many things stolen outof mail distribution every large ticket and pesent gifts and military and government
14:45 πŸ”— byter back. theydispatchers used, given way and sold the dispatcdher go ina stolen police get up to get citizens mail 3x a day. dispatchers want to be the 1st to stop anyone they dont like fora sucessful life. also if no black mail paid they label you like they did the our friend as a child molesfer and regtisterede ass the victim was in his opwn home.
14:45 πŸ”— byter orders. they also f0000 over my friens life as since dispatchdr signed a quiet contract with government the decidded to keep going against p3eoeple Concord Ca believes to get anything they waqnt out of us is to make uip worlds first over 1 million chargds and assignemt of jail time for not showing resopece then taledabout withohut illegal we are askinb begging for help ingetting as much
14:47 πŸ”— Samuel_Mi Did we just get spammed?
14:54 πŸ”— thasmudya yup
14:58 πŸ”— omf_ ┌∩┐(◣_◢)┌∩┐
14:59 πŸ”— omf_ This is what we do to spammers
14:59 πŸ”— omf_ (╯°□°)╯︵ ɹǝʇʎq
15:18 πŸ”— byter When dispatch been ask as to why no hlep for ;peole and a new person" we wnat to be the 1st to get a eprson out of government that we dont like. this is in reverse of the letter form white hose. theywant to retre with heads high adn if jew then goes to jail. dispatrcher admtiited making juphargdes and on comomputer. disaptc h admitte that studdents using stolencomps weregivden jobs paind
15:18 πŸ”— byter in cash orocaine to reun atarget. THIS GJUY IS DYING PLESE FOR GOD AND OUR SAVOR HELP
15:26 πŸ”— chazchaz What's the point of running a gibberish spam bot?
15:27 πŸ”— Smiley where are ze ops?
15:27 πŸ”— Smiley FIRE THE CANNONS
15:27 πŸ”— Smiley we could fire warrior pings at him
15:27 πŸ”— Smiley :D
15:27 πŸ”— Smiley with messages in the UA :D
16:51 πŸ”— SketchCow OH MY GOD GUYS DID YOU HEAR ABOUT WHAT THE CONCORD COUNTY SHERIFFS DID
16:52 πŸ”— underscor WHAT DID THEY DO
16:52 πŸ”— SketchCow SOMETHING SOMETHING 150 TRILLION
16:52 πŸ”— SketchCow MY TAX DOLALRS AT WORK
16:53 πŸ”— SketchCow IRC spam is always best spam - it's like hobos who blow other hobos for crack money.
16:53 πŸ”— SketchCow It's like, way to go downmarket
16:54 πŸ”— underscor hahahaha
16:56 πŸ”— SketchCow So, I don't know if anyone else wants to help write capsule summaries of computer platforms, but I could really use more help with more of them.
16:56 πŸ”— SketchCow I want to spiff up TOSEC.
16:57 πŸ”— mistym SketchCow: What platforms do you need?
16:58 πŸ”— SketchCow A lot. Come to #iatosec
17:15 πŸ”— ivan` arrith: the best way to get more feed URLs is a web crawl and also inferring based on blogspot, tumblr, etc. domains
17:39 πŸ”— WiK sup gents
17:52 πŸ”— WiK arrith: check out https://github.com/wick2o/stPepper
17:52 πŸ”— WiK its distributed software where you can divide up the ipv4 address space and spider the internet for links
17:53 πŸ”— WiK i started doing it to generate a good seed file for a spider, got a lot of the space done
17:53 πŸ”— WiK but then the ppl who were helping me kinda lost interest
18:07 πŸ”— WiK i also pulled all the urls outta the wikipedia data which was a good start as well
19:56 πŸ”— chronomex btw. a friend reports he had a CVS project on sourceforge a long time ago, and it has kind of disappeared
20:09 πŸ”— omf_ SketchCow, Do you want a pull down of what is left of jmanga
20:12 πŸ”— SketchCow Not sure
20:17 πŸ”— omf_ Nothing to be saved from here but I thought I would mention the fuck you Adobe just handed out. http://developers.slashdot.org/story/13/03/14/189204/adobe-shuts-down-browser-testing-service-browserlab
21:16 πŸ”— omf_ I lay good money MS is going to kill that IE shit with no warning like Adobe
21:19 πŸ”— Smiley ie shit?
21:19 πŸ”— omf_ The online multiple IE tester
21:19 πŸ”— omf_ and those self expiring VM images too
21:49 πŸ”— ivan` didn't another browser testing service die a few weeks ago too?
21:49 πŸ”— ivan` getting serious deja vu with that adobe announcement
22:51 πŸ”— arrith ersi: getting on that today
22:52 πŸ”— arrith ivan`, WiK: hm, spidering might be something to look into but i'm at least hoping to get the top 90% from various 'top blog' lists, and ideally some stuff like the recommendations internal to google reader itself
