#archiveteam-bs 2012-07-04,Wed


Time Nickname Message
00:16 ๐Ÿ”— * Coderjoe binks
00:16 ๐Ÿ”— Coderjoe BLINKS
00:17 ๐Ÿ”— Coderjoe review of a taco bell on google's store review system: Kind of a flamer for a cashier. But whatev's it was good food.
00:17 ๐Ÿ”— Coderjoe wtf does having a "flamer for a cashier" have to do with anything?
00:23 ๐Ÿ”— underscor haha
00:23 ๐Ÿ”— underscor obviously it's something to take into consideration when deciding to frequent said establishment
00:23 ๐Ÿ”— Coderjoe well, this is a crazy conservative area. perhaps someone is afraid he'll put The Gay in their food
00:23 ๐Ÿ”— underscor :D
00:24 ๐Ÿ”— underscor dat gay powder, recruitin' all them y'ung folks to their craaaazy buttfuckin' ways
01:10 ๐Ÿ”— mistym Wow, Twitter loves the Montreal Mirror.
01:10 ๐Ÿ”— DFJustin you mean the montreal mirror mirror?
01:11 ๐Ÿ”— mistym mirror^2
01:20 ๐Ÿ”— Coderjoe crazy germans
01:20 ๐Ÿ”— Coderjoe http://www.liveleak.com/view?i=4d6_1341254855
01:20 ๐Ÿ”— Coderjoe and http://www.youtube.com/watch?v=RobaJKGMMiE
01:25 ๐Ÿ”— joepie92 underscor: it's interesting how all those people claiming that 'the gays are trying to infect others' never think about a reasoning as to why 'the gays' would want to do tghat
01:25 ๐Ÿ”— joepie92 that *
01:26 ๐Ÿ”— pzuraq OH SHIT
01:26 ๐Ÿ”— pzuraq joepie92!
01:26 ๐Ÿ”— joepie92 ohai :P
01:26 ๐Ÿ”— * joepie91 ninjas into discussion
01:26 ๐Ÿ”— arrith1 i wonder how much ground the just do it summer project will cover. like i guess different kinds of media formats, but that's basically a separate category from file/archive/etc formats
01:26 ๐Ÿ”— Coderjoe because they're EVIL and want to DESTROY HUMANITY
01:27 ๐Ÿ”— pzuraq oh hai joepie91
01:27 ๐Ÿ”— pzuraq I thought there were 2 joepie's for a minute
01:27 ๐Ÿ”— pzuraq I didn't know which one to shoot
01:27 ๐Ÿ”— joepie91 hah
01:27 ๐Ÿ”— joepie91 haha *
01:27 ๐Ÿ”— arrith1 joepie91: the gay agenda is as mysterious as it is nonsensical
01:28 ๐Ÿ”— joepie91 lol
01:29 ๐Ÿ”— arrith1 hm "FileTeam", maybe. for the summer project thing
01:38 ๐Ÿ”— Coderjoe O_O
01:38 ๐Ÿ”— Coderjoe http://boingboing.net/2012/07/03/cisco-locks-customers-out-of-t.html
01:47 ๐Ÿ”— joepie91 jesus
03:33 ๐Ÿ”— dashcloud in case you thought it was just their consumer division that had it out for people, think again: http://arstechnica.com/tech-policy/2011/07/a-pound-of-flesh-how-ciscos-unmitigated-gall-derailed-one-mans-life/
05:16 ๐Ÿ”— BlueMax SketchCow, those BBS textfiles gone up yet?
05:16 ๐Ÿ”— joepie91 ummmm
05:16 ๐Ÿ”— joepie91 Google Video stopped taking uploads in May 2009. Later this summer we'll be moving the remaining hosted content to YouTube. Google Video users have until August 20 to migrate, delete or download their content. We'll then move all remaining Google Video content to YouTube as private videos that users can access in the YouTube video manager. For more details, please see our post on the YouTube blog.
05:29 ๐Ÿ”— omf_ yeah I never figured out why they didn't kill google video on time
05:29 ๐Ÿ”— omf_ every time I see a video on it I think "This relic is still around?"
05:32 ๐Ÿ”— omf_ I am all for competition in terms of product offerings but gv was never good
05:39 ๐Ÿ”— Coderjoe because of us
05:40 ๐Ÿ”— Coderjoe and complaints about there not being an easy migration to youtube
05:52 ๐Ÿ”— omf_ really? I wrote a Perl script to do it for a friend. It fetched the video from gv and then posted it to youtube via the api
05:52 ๐Ÿ”— omf_ not even 50 lines
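(omf_'s Perl script isn't in the log; the sketch below is only the same fetch-then-upload idea, redone in Python against the current YouTube Data API v3 since the 2012-era GData upload API is gone. The authorized `youtube` client, the source URL, and the title are assumed inputs.)

```python
# Hypothetical sketch, not omf_'s script: download a video, then re-upload it
# with the YouTube Data API v3. `youtube` is an already-authorized
# googleapiclient client (OAuth setup omitted).
import requests
from googleapiclient.http import MediaFileUpload

def migrate_video(source_url, title, youtube, tmp_path="video.flv"):
    # Stream the source file to disk
    with requests.get(source_url, stream=True, timeout=60) as resp:
        resp.raise_for_status()
        with open(tmp_path, "wb") as fh:
            for chunk in resp.iter_content(chunk_size=1 << 20):
                fh.write(chunk)
    # Re-upload it, keeping it unlisted until it has been checked
    request = youtube.videos().insert(
        part="snippet,status",
        body={
            "snippet": {"title": title, "description": "Migrated from Google Video"},
            "status": {"privacyStatus": "unlisted"},
        },
        media_body=MediaFileUpload(tmp_path, resumable=True),
    )
    return request.execute()
```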
05:52 ๐Ÿ”— Coderjoe mmm
05:52 ๐Ÿ”— Coderjoe postmortem of the AWS us-east-1 outage
05:53 ๐Ÿ”— Coderjoe http://www.theregister.co.uk/2012/07/03/amazon_outage_post_mortem/
05:53 ๐Ÿ”— Coderjoe omf_: complaints about google not providing an easy means for non-techie people to do so
05:53 ๐Ÿ”— arrith1 omf_: well i'd rather have GV up than not have those videos be available. and they're going to be private on youtube i guess, so as good as gone unless AT/archive.org puts their stuff in action
05:54 ๐Ÿ”— BlueMax ^ what he said. if they go private, we may never see them again
05:56 ๐Ÿ”— omf_ I would rather have google give us a straight dump of all the content and then close the site down. Running "competing" services in a company can cause serious problems. Yahoo always being a great example of this
05:56 ๐Ÿ”— arrith1 a bunch are apparently grabbed already, so hopefully it's almost done
05:56 ๐Ÿ”— omf_ I always worry about companies messing up already working products
05:56 ๐Ÿ”— arrith1 omf_: i think the only sites to give dumps have been url shortener places after they've shut down
05:56 ๐Ÿ”— omf_ AIM / ICQ and how that worked so well
05:57 ๐Ÿ”— omf_ arrith1, that is a shame
05:57 ๐Ÿ”— arrith1 omf_: yeah. but also probably some liability google lawyers wouldn't want to worry about
05:57 ๐Ÿ”— omf_ this reminds me again of when the jQuery admins borked their whole plugin system
05:58 ๐Ÿ”— omf_ lost everything on the site
05:58 ๐Ÿ”— omf_ no backups
05:58 ๐Ÿ”— omf_ and then tried to play it off as a good thing(tm)
05:58 ๐Ÿ”— omf_ and this was last year I believe
05:58 ๐Ÿ”— arrith1 wow
05:59 ๐Ÿ”— arrith1 i'd hope using some kind of dvcs they'd have backups 'for free'
05:59 ๐Ÿ”— arrith1 lots of sites mirror even just site code/html/etc to like github
05:59 ๐Ÿ”— omf_ they had a content management system that has an automated backup option
06:00 ๐Ÿ”— omf_ it kills me
06:00 ๐Ÿ”— omf_ more than half of what they had has not reappeared in the wild yet
06:00 ๐Ÿ”— arrith1 sites like that are probably good candidates for the Deathwatch page on the wiki
06:01 ๐Ÿ”— arrith1 for people to get ideas of what sites to preemptively archive
06:01 ๐Ÿ”— arrith1 like ff.net is being backed up currently
06:03 ๐Ÿ”— omf_ yeah I have added a few things to deathwatch
06:04 ๐Ÿ”— godane SketchCow: I understand now why you didn't upload bbs interviews
06:04 ๐Ÿ”— omf_ I have also been looking back through my bookmarks for sites that are candidates as well
06:04 ๐Ÿ”— arrith1 omf_: nice
06:04 ๐Ÿ”— arrith1 the more of that the better
06:05 ๐Ÿ”— omf_ yep. I have 4 more entries I am working on.
06:05 ๐Ÿ”— omf_ eventually I am going to join the ranks of archiving sites that are alive
06:05 ๐Ÿ”— omf_ I do some of reddit now
06:06 ๐Ÿ”— arrith1 yeah reddit sure could use it, it's not that big too
06:06 ๐Ÿ”— omf_ hackernews is next for me
06:06 ๐Ÿ”— arrith1 omf_: looking into the archiveteam warrior project might be good
06:06 ๐Ÿ”— omf_ reddit has a massive amount of new content per day
06:07 ๐Ÿ”— arrith1 the archive team warrior is making it easier for lots of people to spring into action, even preemptively
06:07 ๐Ÿ”— omf_ oh yeah and that is where it would have to go
06:07 ๐Ÿ”— omf_ the problem is also mapping out all the subreddits. I have been working with a stats tracker on that
06:07 ๐Ÿ”— arrith1 omf_: well at least compared to like video sites, reddit is only a few hundred GB iirc, unlike other places
06:07 ๐Ÿ”— arrith1 omf_: might want to put what you've got so far on a page on the wiki for reddit
06:08 ๐Ÿ”— omf_ one thing about reddit is the posts and associated comments get moved to a cold storage after a certain date
06:08 ๐Ÿ”— arrith1 people rallied together to get fanfiction.net preemptive archiving that way
06:08 ๐Ÿ”— arrith1 ah yeah
06:08 ๐Ÿ”— omf_ you can only go 1000 entries back in a subreddit
06:08 ๐Ÿ”— arrith1 also hard to get past 1k items
06:08 ๐Ÿ”— arrith1 yeah
06:08 ๐Ÿ”— arrith1 that i've yet to figure out how to get around
06:08 ๐Ÿ”— arrith1 the admins have said publicly "instead of scraping, if you want a copy of reddit just ask" though i'm not sure how they feel about AT
06:08 ๐Ÿ”— arrith1 so i haven't tried
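(For context, the 1000-entry ceiling is a property of reddit's listing endpoints: you page with the `after` cursor and the listing simply stops returning results somewhere past a thousand items. A minimal sketch of that walk, with a placeholder user-agent and a polite delay:)

```python
# Sketch of paging a subreddit listing via the public JSON API; the `after`
# cursor runs dry around the 1000-item cap mentioned above.
import time
import requests

HEADERS = {"User-Agent": "archive-sketch/0.1 (example only)"}

def walk_subreddit(sub):
    after = None
    while True:
        params = {"limit": 100}
        if after:
            params["after"] = after
        resp = requests.get(f"https://www.reddit.com/r/{sub}/new.json",
                            params=params, headers=HEADERS, timeout=30)
        resp.raise_for_status()
        listing = resp.json()["data"]
        for child in listing["children"]:
            yield child["data"]          # one post's JSON dict
        after = listing["after"]
        if not after:                    # exhausted, or hit the cap
            break
        time.sleep(2)                    # don't hammer the API
```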
06:08 ๐Ÿ”— omf_ I have been talking one of the reddit devs into making me data dumps on certain sections
06:09 ๐Ÿ”— omf_ I want to see how much I can get out of them
06:09 ๐Ÿ”— arrith1 that would be pretty good
06:09 ๐Ÿ”— arrith1 omf_: if you get anything, it would be good to get that backed up onto archive.org servers
06:10 ๐Ÿ”— arrith1 omf_: dunno if you're familiar with the AT methods but it usually goes bunch of users scramble to dl a site / sites, then upload to reserved space generally on archive.org servers, then it gets inserted into the archive.org system
06:11 ๐Ÿ”— omf_ 570,770 is the amount of unique posts I got so far
06:12 ๐Ÿ”— omf_ not counting links in posts, comments, or reddit pages
06:13 ๐Ÿ”— arrith1 omf_: yeah i'd be curious about your methodology, for storing, verifying if you already have a post or not, etc
06:15 ๐Ÿ”— omf_ well I kinda hacked it initially but left the door open
06:16 ๐Ÿ”— omf_ the fields I wanted to track are mapped to columns in a table
06:16 ๐Ÿ”— omf_ then there is a column that holds the entire fetched json
06:16 ๐Ÿ”— omf_ so I can get any piece of data at a time
06:16 ๐Ÿ”— omf_ I use a unique key against the url and subreddit
06:17 ๐Ÿ”— omf_ because I want to see cross posts
06:17 ๐Ÿ”— omf_ the script runs on a cron job doing different sections at different times of the day
06:18 ๐Ÿ”— omf_ I also create log files on runs. The log file contains all the post urls only and a little metadata. If needed I could rebuild the database using the log file
06:21 ๐Ÿ”— arrith1 omf_: using files or a db like mysql?
06:21 ๐Ÿ”— omf_ mariadb
06:21 ๐Ÿ”— omf_ the logs are flat text files
06:22 ๐Ÿ”— omf_ I have another table that tracks reddit votes over time
06:22 ๐Ÿ”— omf_ and the comment count
06:22 ๐Ÿ”— omf_ so I can see which kinds of stories in which subreddits get the most attention
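(omf_'s actual schema isn't shown anywhere in the log; the sketch below is only a guess at the shape he describes: selected fields mapped to columns, the whole fetched JSON kept alongside, and a unique key over (url, subreddit) so cross-posts remain visible. Table and column names are invented; it assumes mysql-connector-python against MariaDB.)

```python
# Hypothetical storage sketch, not omf_'s real schema.
import json
import mysql.connector

# Run once at setup, e.g. conn.cursor().execute(DDL)
DDL = """
CREATE TABLE IF NOT EXISTS posts (
    id          BIGINT UNSIGNED AUTO_INCREMENT PRIMARY KEY,
    subreddit   VARCHAR(64)   NOT NULL,
    url         VARCHAR(2048) NOT NULL,
    title       TEXT,
    author      VARCHAR(64),
    created_utc INT,
    raw_json    MEDIUMTEXT,
    UNIQUE KEY uniq_post (url(255), subreddit)
)
"""

def store_post(conn, post):
    # `post` is one post's JSON dict; the full blob goes into raw_json so any
    # field can be recovered later without re-fetching.
    cur = conn.cursor()
    cur.execute(
        "INSERT INTO posts (subreddit, url, title, author, created_utc, raw_json) "
        "VALUES (%s, %s, %s, %s, %s, %s) "
        "ON DUPLICATE KEY UPDATE raw_json = VALUES(raw_json)",
        (post["subreddit"], post["url"], post.get("title"),
         post.get("author"), int(post["created_utc"]), json.dumps(post)),
    )
    conn.commit()
```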
06:23 ๐Ÿ”— arrith1 hm if you tweaked that into an archiveteam warrior compatible setup you might be able to get some help downloading
06:23 ๐Ÿ”— omf_ oh that is easy to do
06:23 ๐Ÿ”— omf_ I am going to wait to see if they will just give me the data dump
06:23 ๐Ÿ”— arrith1 i'm not sure what people call it but there's also this heroku tracker thing, where it has a full status page displayed. they've had that for other projects
06:23 ๐Ÿ”— omf_ so far they have been friendly and helpful
06:23 ๐Ÿ”— arrith1 hmm yeah
06:23 ๐Ÿ”— arrith1 have they said 'yes'?
06:24 ๐Ÿ”— omf_ They said it would depend on subreddit size and how long to pull off cold storage
06:24 ๐Ÿ”— omf_ if I can get them at a lull in work they said it would be more feasible.
06:25 ๐Ÿ”— arrith1 if you can get enough of those, maintaining a full mirror would get a lot easier
06:25 ๐Ÿ”— arrith1 one would just have to maintain it by grabbing the latest stuff
06:29 ๐Ÿ”— omf_ maintaining is easy so far
06:29 ๐Ÿ”— omf_ some subreddits I poll 4 times a day others 1 time a week
06:30 ๐Ÿ”— omf_ figuring out the frequency is one of the things I would have to do for at warrior
06:30 ๐Ÿ”— omf_ slamming everything on their site violates their TOS
06:30 ๐Ÿ”— arrith1 yeah, that sounds like a fun statistics problem
06:30 ๐Ÿ”— omf_ nah it is just getting a list of all the subreddits
06:30 ๐Ÿ”— arrith1 well it'd be a bunch of different people
06:31 ๐Ÿ”— omf_ I looked at the wikiteam
06:31 ๐Ÿ”— omf_ they have it divided into lists
06:32 ๐Ÿ”— omf_ I would do the same but based on update frequency. Some users would have to leave warrior running for 4 cron jobs or more a day. Not so much a hit and run like normal AT actions
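(A hypothetical illustration of the frequency-based split: each list of subreddits carries its own polling interval, and a run only touches the lists that are due. The names and intervals here are invented, not omf_'s lists.)

```python
# Sketch of a frequency-tiered polling schedule; not an actual AT warrior task.
import time

SCHEDULE = {
    6 * 3600:  ["pics", "worldnews"],       # busy subs: 4 times a day
    7 * 86400: ["somequietsub"],            # quiet subs: once a week
}

def due_lists(last_run, now=None):
    """Return subreddit lists whose interval has elapsed since their last run.

    `last_run` maps interval -> unix timestamp of the previous run."""
    now = now or time.time()
    return [subs for interval, subs in SCHEDULE.items()
            if now - last_run.get(interval, 0) >= interval]
```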
06:32 ๐Ÿ”— arrith1 i think the preemptive archiving stuff will be very different from the dash and grab AT stuff
06:33 ๐Ÿ”— arrith1 since yeah people will have to leave it running. but people have unused server capacity and stuff
06:33 ๐Ÿ”— arrith1 plus they can always drop out if they need that capacity, and resume if/when they can
06:34 ๐Ÿ”— omf_ true
06:34 ๐Ÿ”— arrith1 hm that should be more prominent on the wiki. making it clear that there are dash and grab efforts and other preemptive efforts
06:35 ๐Ÿ”— arrith1 i think preemptive stuff is just so new. i haven't looked recently but i think the only preemptive AT effort currently is the fanfiction.net one
06:37 ๐Ÿ”— omf_ yep
06:38 ๐Ÿ”— omf_ that is why I am updating deathwatch
06:38 ๐Ÿ”— omf_ so people can see the types of sites we are looking for
06:39 ๐Ÿ”— arrith1 might be good to put together a 'candidates for preemptive archiving' page, if some sites look extra dire, or they're especially popular like reddit
06:39 ๐Ÿ”— arrith1 hm more for my userpage todo list
07:16 ๐Ÿ”— SmileyG hmmm
07:16 ๐Ÿ”— SmileyG google killing more stuff but nothing of import afaik
07:16 ๐Ÿ”— SmileyG all presentation stuff + google video, which is moved to youtube.
07:18 ๐Ÿ”— omf_ the best product google canceled was google squared
07:18 ๐Ÿ”— omf_ there is nothing out there like it now
07:18 ๐Ÿ”— SmileyG google squared o_O?
07:18 ๐Ÿ”— omf_ dude
07:18 ๐Ÿ”— omf_ you would run a search
07:19 ๐Ÿ”— omf_ and it would return this interactive spreadsheet
07:19 ๐Ÿ”— omf_ on the left one range of the search and the right another range
07:19 ๐Ÿ”— omf_ you could then add refining terms or select squares
07:19 ๐Ÿ”— omf_ that square would become the focus and then the results would change
07:20 ๐Ÿ”— omf_ it allowed you to visualize, sort and use desperate data
07:20 ๐Ÿ”— omf_ web pages, fact data, images, video, it did it all
07:22 ๐Ÿ”— Coderjoe there have been other preemptive panic grabs
07:23 ๐Ÿ”— omf_ here is a 37 second intro https://www.youtube.com/watch?v=__INtIXNLmI
07:24 ๐Ÿ”— omf_ and then this google talk that has it https://www.youtube.com/watch?feature=player_detailpage&v=5lCSDOuqv1A#t=1658s
07:25 ๐Ÿ”— omf_ also each use of google squared ran around 200 searches
07:26 ๐Ÿ”— Coderjoe i think i might see why they killed it :P
07:26 ๐Ÿ”— omf_ they killed all of google labs for no good reason
07:26 ๐Ÿ”— omf_ they had it open for 2 years
07:27 ๐Ÿ”— omf_ and gave a tech talk about it last year and then bam closed
07:27 ๐Ÿ”— Coderjoe $$$$
07:27 ๐Ÿ”— omf_ fucking Larry Page did that
07:27 ๐Ÿ”— omf_ he cut a ton of shit and focused everyone on g+
07:33 ๐Ÿ”— Coderjoe well, i think this kickstarter will fail. 40 hours to go and less than .6% pledged
07:33 ๐Ÿ”— omf_ kickstarter for what?
07:34 ๐Ÿ”— SmileyG ah I do remember that
07:34 ๐Ÿ”— Coderjoe an indie action/adventure hero-rescues-the-girl feature film
07:34 ๐Ÿ”— SmileyG o_O
07:35 ๐Ÿ”— SmileyG sounds inspiring.
07:35 ๐Ÿ”— Coderjoe http://www.kickstarter.com/projects/1431718519/lady-in-distress-feature-film
07:35 ๐Ÿ”— SmileyG hmmm
07:35 ๐Ÿ”— Coderjoe I "bookmarked" it on kickstarter, and they sent me the 48hr message today
07:36 ๐Ÿ”— omf_ Anyone use a remote server to upload to IA
07:36 ๐Ÿ”— Coderjoe it looked mildly interesting, but i wasn't sure i would have any funds to chip in
07:36 ๐Ÿ”— omf_ I was thinking of using my webserver to fetch and then upload a collection I have been working on
07:37 ๐Ÿ”— omf_ or know where I could rent a seed box or the like
07:37 ๐Ÿ”— Coderjoe omf_: I sorta do. it isn't really different than locally via commandline using the s3api interface
07:37 ๐Ÿ”— omf_ my local internet is super slow, that being the difference
07:38 ๐Ÿ”— omf_ I am talking about a server with speed
07:38 ๐Ÿ”— omf_ and I am refactoring that s3api script
07:38 ๐Ÿ”— omf_ as part of it
07:38 ๐Ÿ”— Coderjoe the box is on my lan, but i just ssh'd in and wrote a python wrapper script around curl
07:39 ๐Ÿ”— Coderjoe the only thing to watch for is an error telling you to slow down, but i think you can only get that if you're on the ia lan
07:40 ๐Ÿ”— omf_ I am only going to upload 1 file at a time
07:40 ๐Ÿ”— omf_ they are cd and dvd isos
07:41 ๐Ÿ”— omf_ I have been writing scrapers for years. I am polite about file transfers because it does not draw attention
07:41 ๐Ÿ”— omf_ reddit for example bans bots all the time for hitting their service too fast.
07:41 ๐Ÿ”— omf_ How hard is it really to add a sleep statement to your code to fix it?
07:42 ๐Ÿ”— Coderjoe sure. SketchCow was able to overwhelm it with one at a time uploads while uploading to the s3api from within IA's network
07:43 ๐Ÿ”— Coderjoe for s3 insertion you don't generally need a sleep statement
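(Coderjoe's wrapper isn't shown; the sketch below is only the general shape of a one-file-at-a-time push through the IA S3-like API via curl. The item name, metadata, and keys are placeholders.)

```python
# Hypothetical curl wrapper for the Internet Archive S3-like API: one PUT per
# file, auto-creating the item on first upload.
import os
import subprocess

def ia_upload(path, item, access_key, secret_key, title, mediatype="software"):
    filename = os.path.basename(path)
    cmd = [
        "curl", "--location", "--fail",
        "--header", f"authorization: LOW {access_key}:{secret_key}",
        "--header", "x-archive-auto-make-bucket:1",        # create the item if needed
        "--header", f"x-archive-meta-title:{title}",
        "--header", f"x-archive-meta-mediatype:{mediatype}",
        "--upload-file", path,                              # --upload-file makes curl PUT
        f"https://s3.us.archive.org/{item}/{filename}",
    ]
    subprocess.run(cmd, check=True)  # raises CalledProcessError on a non-zero exit
```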
07:48 ๐Ÿ”— Coderjoe https://sphotos.xx.fbcdn.net/hphotos-ash3/529310_337414343002615_1271453806_n.jpg
07:50 ๐Ÿ”— omf_ ha
08:03 ๐Ÿ”— joepie91 lol @ running across a page with anonnews links on the archiveteam wiki
08:37 ๐Ÿ”— arrith1 omf_: disparate*
08:41 ๐Ÿ”— ersi joepie91: nice haha
08:42 ๐Ÿ”— joepie91 it's still a bit odd to browse a site and run across a link to a site of my own, lol
08:42 ๐Ÿ”— joepie91 not sure if I'll ever get used to that
10:34 ๐Ÿ”— SmileyG So teh higgs was behind the sofa after all...
11:09 ๐Ÿ”— Soojin https://www.youtube.com/watch?v=bl_1OybdteY
13:11 ๐Ÿ”— godane uploading episode 123 of dl.tv
13:11 ๐Ÿ”— godane :-D
13:23 ๐Ÿ”— godane uploading episode 124 of dl.tv
14:24 ๐Ÿ”— winr4r godane: :D
15:21 ๐Ÿ”— DFJustin http://cardiganponi.tumblr.com/post/26444462697/important-message-about-the-bronycon-orgy
15:23 ๐Ÿ”— winr4r ...the orgy?
15:27 ๐Ÿ”— balrog Sounds like trolling to me
15:39 ๐Ÿ”— winr4r SketchCow: thanks for the ops *bows head as sword is tapped on his shoulder*
15:53 ๐Ÿ”— SketchCow Well, -bs is quite the bush leagues
15:53 ๐Ÿ”— winr4r good morning jason :)
15:53 ๐Ÿ”— winr4r (bush leagues?)
16:06 ๐Ÿ”— winr4r (never mind, looked it up)
16:09 ๐Ÿ”— SketchCow Oh, damn, forgot you're english
16:10 ๐Ÿ”— SketchCow I'm, like, the idiom-o-matic
16:10 ๐Ÿ”— SketchCow When Sockington used Spastic as a term, his UK fans flipped
16:10 ๐Ÿ”— BlueMax Oh hey I got ops too :D
16:11 ๐Ÿ”— SketchCow yeah, everyone gets to party in -bs
16:11 ๐Ÿ”— BlueMax SketchCow, did you upload those BBS textfiles from 1987 upped somewhere?
16:11 ๐Ÿ”— BlueMax I wouldn't mind having a look
16:11 ๐Ÿ”— SketchCow those won't be up for a tad
16:11 ๐Ÿ”— SketchCow I have to go on my next trip
16:12 ๐Ÿ”— BlueMax OK
16:12 ๐Ÿ”— BlueMax Let me know when you do
16:12 ๐Ÿ”— BlueMax If you remember
16:12 ๐Ÿ”— winr4r SketchCow: haha
16:12 ๐Ÿ”— SketchCow Oh, wait
16:12 ๐Ÿ”— SketchCow http://www.textfiles.com/F/
16:12 ๐Ÿ”— SketchCow that's some of them
16:12 ๐Ÿ”— winr4r yeah, in the 1980s "spastic" was a standard awful playground insult
16:12 ๐Ÿ”— SketchCow But there's over 100, they'll all be put there before splitting off into their permanent home
16:13 ๐Ÿ”— mistym I remember hearing about a Nintendo game that got recalled because the translators had used the word without knowing what it meant in the UK.
16:14 ๐Ÿ”— BlueMax Fair enough SketchCow, cheers
16:15 ๐Ÿ”— Schbirid mistym: pooper mario?
16:17 ๐Ÿ”— winr4r SketchCow: so about this new project, the basic idea is document how to get stuff out of old formats (digital or otherwise) into new ones?
16:17 ๐Ÿ”— winr4r i could probably spend some time working on that, at least gathering what information is available
16:19 ๐Ÿ”— BlueMax winr4r, that does appear to be the point
16:20 ๐Ÿ”— SketchCow I'll be writing what I'm planning in more detail on the wiki page.
16:20 ๐Ÿ”— SketchCow http://www.archiveteam.org/index.php?title=Just_Solve_the_Problem_2012
16:23 ๐Ÿ”— winr4r 'This is not a "sprung from the forehead of Zeus" attempt to completely re-boot the process of enumerating the many formats out there. Much work has been done and there is much to share.'
16:23 ๐Ÿ”— winr4r hm
16:24 ๐Ÿ”— winr4r a lot of work has been done there, but all the sites that are out there that give information on lots of file formats, are also pretty sketchy
16:25 ๐Ÿ”— winr4r like, they'll have some brief documentation on it then it's like "if you can't open .WPS files, you MAY HAVE ERRORS IN YOUR REGISTRY!!! click to download REGISTRY FUCKFACE v13.9 to correct these errors!!!"
16:25 ๐Ÿ”— SketchCow Well, not QUITE true.
16:26 ๐Ÿ”— SketchCow Realize, one of the sad things to happen here with archive team is I'm constantly exposed to all the sort of insider crapola in the archiving biz.
16:26 ๐Ÿ”— SketchCow And there's actually a pile of initiatives
16:26 ๐Ÿ”— winr4r i can imagine
16:26 ๐Ÿ”— SketchCow Some are very good, in fact.
16:30 ๐Ÿ”— winr4r so, what is the goal behind the new project that these other projects aren't doing?
16:30 ๐Ÿ”— SketchCow Be unencumbered by funding, politics and justification.
16:31 ๐Ÿ”— winr4r so it's still about documenting various formats and how to get stuff out of them? :)
16:31 ๐Ÿ”— SketchCow Yes
16:32 ๐Ÿ”— winr4r but unencumbered by anyone saying "well, do we *really* need to know how to read raw files from a canon EOS D30?"
16:32 ๐Ÿ”— SketchCow Not quite the problem.
16:32 ๐Ÿ”— SketchCow The problems are usually
16:32 ๐Ÿ”— SketchCow - "Look, we only have these interns until Sept. 13, do NOT send them off to document non-critical formats"
16:33 ๐Ÿ”— SketchCow - "Why the fuck are we including BBS texts as canonical documents"
16:33 ๐Ÿ”— SketchCow - "Oh, we're just doing DIGITAL formats. Punch cards are self-evident"
16:33 ๐Ÿ”— winr4r ah!
16:33 ๐Ÿ”— SketchCow I want a boundary-less hothouse
16:34 ๐Ÿ”— winr4r or even, "books are self-evident so we don't need to digit-LOLJK pulping the stuff that isn't important to us now"?
16:35 ๐Ÿ”— winr4r either way, i'm all for throwing some time into it
16:35 ๐Ÿ”— BlueMax I'd help if I had a way to
16:35 ๐Ÿ”— SketchCow I think we can stand on the picplz thing.
16:35 ๐Ÿ”— SketchCow 36 hours, we duped that bitch
16:36 ๐Ÿ”— winr4r i was more than impressed by you guys doing that
16:37 ๐Ÿ”— BlueMax The picplz team have bigger e-peens than most people on the net right now
16:38 ๐Ÿ”— Schbirid <- just received his Internet Archive sweater. e-peen++
16:38 ๐Ÿ”— winr4r Schbirid: more than jealous
16:38 ๐Ÿ”— Schbirid kinda hoped for a shirt tbh but still nice
16:38 ๐Ÿ”— winr4r pft :P
16:38 ๐Ÿ”— BlueMax I will only wear a shirt that advertises IA if it has SketchCow in a tutu on the back saying "There are things we need to save. This is not one of them."
16:39 ๐Ÿ”— Schbirid i dont think underscor would lend SketchCow his tutu
16:41 ๐Ÿ”— DFJustin that reminds me I have a few 90s text files I was gonna put together and send in
16:42 ๐Ÿ”— BlueMax Just, if we could all chip in $50 and got Jason to do that for an IA t-shirt...I would be in heaven :P
16:49 ๐Ÿ”— underscor oh man
16:49 ๐Ÿ”— underscor that definitely needs to be a reward
16:50 ๐Ÿ”— Schbirid !
16:52 ๐Ÿ”— underscor they could be part of reg for archiveteamcon 2013
16:52 ๐Ÿ”— underscor too
16:52 ๐Ÿ”— underscor :D
17:02 ๐Ÿ”— BlueMax I'd come to that conference
17:25 ๐Ÿ”— winr4r pft
17:25 ๐Ÿ”— winr4r jason would be like the only extrovert there
17:55 ๐Ÿ”— winr4r the dangerous trend we've seen in afghanistan and iraq, btw, is that our militaries are getting very good at subduing insurrections
17:55 ๐Ÿ”— winr4r (i have a brother in afghanistan, btw)
17:56 ๐Ÿ”— winr4r wrong window
17:56 ๐Ÿ”— winr4r sorry, i should sleep more than two hours a night :/
18:57 ๐Ÿ”— Schbirid comodo <3 http://isc.sans.edu/diary.html?storyid=13606
20:38 ๐Ÿ”— Coderjoe ugh. this ocsp thing seems like an information leak, on the privacy front
20:41 ๐Ÿ”— balrog ugh where?
21:08 ๐Ÿ”— Coderjoe eh?
21:19 ๐Ÿ”— omf_ what irc clients are people using. I am thinking about trying out a new one
21:20 ๐Ÿ”— omf_ I still use xchat
21:20 ๐Ÿ”— DFJustin you should be able to do /VERSION #archiveteam-bs
21:20 ๐Ÿ”— omf_ god xchat is 14 years old
21:21 ๐Ÿ”— DFJustin heck I'm still on mIRC
21:22 ๐Ÿ”— omf_ ugh that should have shown up in the server tab
21:22 ๐Ÿ”— omf_ it is a few little things that are starting to annoy me
21:23 ๐Ÿ”— mistym omf_: I've been enjoying Limechat, but I've only tried it on OS X.
21:24 ๐Ÿ”— balrog try Textual (unfortunately it's free only if you compile it yourself)
21:26 ๐Ÿ”— omf_ I am a Linux user
21:28 ๐Ÿ”— mistym Ah, okay. Not sure I can help then :( (I see there's a Limechat Windows, but I don't know about Linux...)
21:28 ๐Ÿ”— omf_ I see irssi representing
21:28 ๐Ÿ”— omf_ why do Apple users assume everyone uses a Mac?
21:28 ๐Ÿ”— mistym omf_: I don't, I just mentioned what IRC client I was using.
21:28 ๐Ÿ”— mistym I said I
21:29 ๐Ÿ”— omf_ sorry. I was just reading about more fighting between apple and everyone else
21:29 ๐Ÿ”— mistym 'd only tried the OS X version because I don't know what the other platforms are like.
21:29 ๐Ÿ”— balrog I was saying that to mistym
21:29 ๐Ÿ”— omf_ this patent shit really sucks
21:29 ๐Ÿ”— omf_ the cell phone market is going to implode
21:30 ๐Ÿ”— mistym It's like MAD where someone pressed the button anyway.
21:32 ๐Ÿ”— mistym It's crazy :/
21:33 ๐Ÿ”— mistym balrog: Yeah, I've been meaning to give it a try.
21:36 ๐Ÿ”— Paradoks DFJustin: I'm on mIRC, too. I occasionally think about switching to something newer, but I'm used to it and it works. And, unlike the web, I don't imagine IRC has changed much since 1999.
21:39 ๐Ÿ”— joepie91 omf_: I'm using nettalk.
21:39 ๐Ÿ”— joepie91 it's <3.
21:39 ๐Ÿ”— joepie91 runs under WINE quite nicely
21:57 ๐Ÿ”— omf_ I remember the first time I built ytalk
21:57 ๐Ÿ”— omf_ I dialed into a friend's computer and we chatted for hours
21:58 ๐Ÿ”— omf_ ytalk, to mirc when on windows, to xchat
