[00:00] pfff.. do you even hexchat??
[00:00] :D
[00:00] irssi ftw.
[00:00] :D
[00:02] *** Arctic_ has quit IRC (Ping timeout: 260 seconds)
[00:03] just 12 hours of #netneutrality tweets are fucking ~250MB ... how's that even 'micro blogging'
[00:07] *** vitzli has joined #archiveteam-bs
[00:09] *** vitzli has quit IRC (Client Quit)
[00:13] what annoys me greatly about webrecorder.io is that their 'sessions' are limited to 3 hours. It takes 3 hours of autoscroll to get past 17h of tweets :/
[00:15] so it's a great tool, but kind of useless for infinite scrolling
[00:17] It's open source though, so you could set it up on your own machine and remove the 3 hour limit there.
[00:17] (I'm sure there's a reason for that limit though; might be memory usage or something like that.)
[00:17] aye
[00:18] i noticed the browser (probably viewed through some vnc thingy) got slower and slower
[00:20] if they could run it without a view, and present all URLs in a file, that would be enough for me
[00:25] Yeah, that's what I want to make, a script that retrieves all tweet URLs for an account or hashtag.
[00:25] I'm sure I'll get around to it eventually. Let's see if Twitter's still around by then. ;-)
[00:26] i found something on github that does 3000+ 'latest tweets on any account', but webrecorder running locally seems to be the best option for such
[00:29] can you link me to it ola_norsk :)?
[00:29] trying to find it. one sec
[00:39] is there a way to filter already visited 'purple' google links when searching?
[00:39] The one I came across was tweep: https://github.com/haccer/tweep
[00:39] I haven't tried it yet though.
[00:39] And not sure if it can do hashtags.
[00:40] There are a bunch of other tools, too. Search for "twitter scraper" on GitHub, for example.
[00:40] If you want to grab everything, make sure to use a scraper, not something that uses the API. The API is limited to a little over 3000 results.
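The API cap mentioned above is why tools like tweep scrape the public web pages instead. As a rough illustration (not how tweep actually works), tweet IDs can be pulled straight out of the profile-page HTML; this sketch assumes a 2017-era markup where each tweet container carried a `data-tweet-id` attribute, and the sample HTML and screen name are made up.

```python
import re

# Hypothetical, simplified snippet of a profile page; the real markup is
# messier, but each tweet container had a data-tweet-id attribute at the time.
SAMPLE_HTML = """
<li class="stream-item">
  <div class="tweet" data-tweet-id="936045324728229888" data-screen-name="example"></div>
</li>
<li class="stream-item">
  <div class="tweet" data-tweet-id="936042000000000000" data-screen-name="example"></div>
</li>
"""

def extract_tweet_urls(html, screen_name):
    """Pull tweet IDs out of raw profile-page HTML and build status URLs."""
    ids = re.findall(r'data-tweet-id="(\d+)"', html)
    return ["https://twitter.com/%s/status/%s" % (screen_name, i) for i in ids]

for url in extract_tweet_urls(SAMPLE_HTML, "example"):
    print(url)
```

Scrolling/pagination is the hard part that webrecorder was doing by hand; a real scraper would fetch successive pages and feed each response through an extractor like this.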
[00:40] i was specifically looking for things that did not need the twitter API. i don't think that's the one i found, but maybe it will do
[00:41] oh
[00:41] Tweep does not use the API IIRC.
[00:41] Just yet another "feature" of Twitter to make things harder.
[00:42] im not registering shit on twitter ever again
[00:42] i wrote 'lewd' tweets, just to get myself banned. Because apparently it takes 30 days to do it otherwise.
[00:44] i think i canceled the disabling twice before deciding to do that
[00:44] accidentally
[00:47] "Hey! Welcome back ol' buddy ol' pal!" .. just by clicking on some damn link, forgetting to clear the browser
[00:51] JAA: i think this is what i was referring to https://github.com/PhantomKidding/GotchaTwitter
[00:55] JAA: but seeing 'register pushbullet' seems even shittier. But i think that was the one i meant, since it's one of the "user specific" ones in my browser log
[00:57] JAA: it also makes sense seeing it does use the twitter api, since i think i found it through another website describing that ~3000 tweets limit
[00:59] 1000 tweets is like 2 months for an active twitter user.. :/
[01:02] *** kristian_ has joined #archiveteam-bs
[01:07] *** PurpleSym has quit IRC (Ping timeout: 245 seconds)
[01:09] *** ez has quit IRC (Read error: Operation timed out)
[01:09] *** ez has joined #archiveteam-bs
[01:09] *** PurpleSym has joined #archiveteam-bs
[01:24] *** ZexaronS has quit IRC (Quit: Leaving)
[01:53] *** Arctic has joined #archiveteam-bs
[01:58] *** Arctic has quit IRC (Ping timeout: 260 seconds)
[02:25] *** nyaomi has quit IRC (Ping timeout: 250 seconds)
[02:26] *** ola_norsk has quit IRC (it's all fun and games until someone loses their storage cloud)
[02:32] *** kristian_ has quit IRC (Quit: Leaving)
[02:35] *** du_ has joined #archiveteam-bs
[03:45] *** nyaomi has joined #archiveteam-bs
[03:54] *** Arctic has joined #archiveteam-bs
[04:10] *** qw3rty111 has joined #archiveteam-bs
[04:16] *** qw3rty119 has quit IRC (Ping timeout: 600 seconds)
[04:24] *** Arctic has quit IRC (Ping timeout: 260 seconds)
[04:26] *** Arctic_ has joined #archiveteam-bs
[04:27] Sorry I left. What did you say (if anything)?
[04:57] *** Arctic_ has quit IRC (Ping timeout: 260 seconds)
[05:30] *** Jusque has quit IRC (Read error: Operation timed out)
[05:35] *** dashcloud has quit IRC (Read error: Operation timed out)
[05:44] *** Jusque has joined #archiveteam-bs
[05:47] *** dashcloud has joined #archiveteam-bs
[06:26] *** Rai-chan has quit IRC (Ping timeout: 248 seconds)
[06:33] *** Rai-chan has joined #archiveteam-bs
[06:49] *** Rai-chan has quit IRC (Ping timeout: 248 seconds)
[06:53] *** Rai-chan has joined #archiveteam-bs
[07:52] *** CoolCanuk has quit IRC (Quit: Connection closed for inactivity)
[08:38] SketchCow: the image in the Linux Journal description is a cover of Linux Format magazine
[08:42] *** schbirid has joined #archiveteam-bs
[09:04] *** BlueMaxim has quit IRC (Ping timeout: 633 seconds)
[09:05] *** BlueMaxim has joined #archiveteam-bs
[09:19] *** schbirid has quit IRC (Quit: Leaving)
[09:44] *** pizzaiolo has joined #archiveteam-bs
[09:44] *** Odd0002 has quit IRC (Ping timeout: 260 seconds)
[11:42] *** BlueMaxim has quit IRC (Read error: Connection reset by peer)
[11:52] anyone a SNES ROM addict?
[11:52] looking for what i believe is homebrew
[11:52] spc_dsp6.sfc by blargg
[11:53] First search result is a Reddit thread discussing how nobody has that file. https://redd.it/6vd3wk
[11:53] RIP
[11:54] seems like this community would like their ROM collections
[11:54] Looks like blargg replied to that thread, by the way.
[11:54] have a few SNES collections on HDDs that i need a computer to hook up to, once i can afford a spare
[11:55] for me to go thru
[11:57] oh
[11:57] "However, the test ROM wasn't widely shared because the people who had it didn't like non-devs passing around test ROMs that those non-devs didn't understand. Then, over a period of years and various hardware failures, etc., the people that had it all lost it, but nobody realized that they had all lost it until ... they had all lost it."
[12:01] *** wabu has joined #archiveteam-bs
[12:21] *** dashcloud has quit IRC (Read error: Operation timed out)
[12:24] *** dashcloud has joined #archiveteam-bs
[12:34] I just found a 20 pack of LTO 7 tapes (new) for $15. Seems legit.
[13:42] *** CoolCanuk has joined #archiveteam-bs
[14:52] godane: hahah
[15:20] *** dashcloud has quit IRC (Read error: Operation timed out)
[15:25] *** dashcloud has joined #archiveteam-bs
[15:36] *** K4k has quit IRC (Read error: Operation timed out)
[15:37] ranma: yeah, I don't have it, and if it had circulated at all outside a couple of dudes, I would
[15:43] Lol, I just noticed that the new(ish) Adblock Plus doesn't even offer any way to list the blocked elements. You can only get the number of blocked elements.
[15:44] And their blog post and comment replies are essentially "Please blame Mozilla, they forced us to switch to a WebExtension!".
[15:45] I just don't understand how/why they'd decide to do it like this. The switch was announced a long time ago, so they would've had time to prepare.
[15:49] This is hilarious: https://issues.adblockplus.org/ticket/6
[15:55] *** ZexaronS has joined #archiveteam-bs
[16:15] *** RichardG has quit IRC (Ping timeout: 250 seconds)
[16:17] AdBlock has gotten a little.... corrupt-y over the years
[16:17] Bear in mind, that's to be expected, and they had a great run
[16:18] And they don't OWN the market - I think a bundle of crazy archiveteamy kids could eat their lunch now
[16:18] But they did do something you gotta hand to them - they charged certain places to overcome the blocks
[16:18] Protection racket!
[16:19] Yeah, "acceptable ads".
[16:20] Also, I just discovered (by mistake) another insane mofo who has been uploading piles of awesome stuff, bless them
[16:20] https://archive.org/details/@archiver849271
[16:21] I'm running the "cover fixer" on them and then I'm going to make a nice collection of some of the subsets of the materials.
[16:28] yeah, archivist is the man :)
[16:29] https://archive.org/details/cbmagazine created
[16:29] (Covers will be fixed over time)
[16:45] Holy cow, that's a ton of CB magazines
[16:51] *** RichardG has joined #archiveteam-bs
[17:02] *** CoolCanuk has quit IRC (Quit: Connection closed for inactivity)
[17:38] *** dashcloud has quit IRC (Read error: Connection reset by peer)
[17:39] *** dashcloud has joined #archiveteam-bs
[17:51] *** K4k has joined #archiveteam-bs
[17:53] *** CoolCanuk has joined #archiveteam-bs
[18:02] *** Pixi has quit IRC (Quit: Pixi)
[18:07] *** Pixi has joined #archiveteam-bs
[18:12] Laverne, that's not -Archivist, that's @archiver849271. -Archivist is https://archive.org/details/@ohhdemgirls on IA
[18:27] apparently Canuck is sometimes offensive? RIP
[18:32] hm... citi seems secure (sent to me by a friend) https://usercontent.irccloud-cdn.com/file/YxOlVwkv/image.png
[18:33] possibly storing it in plain text.... since the 'hash' is the same length. Nice.
[18:42] *** Odd0002 has joined #archiveteam-bs
[18:42] *** kristian_ has joined #archiveteam-bs
[18:46] *** Odd0002 has quit IRC (Client Quit)
[18:48] *** Odd0002 has joined #archiveteam-bs
[19:03] *** vantec has joined #archiveteam-bs
[19:04] *** Dimtree has joined #archiveteam-bs
[19:16] *** Mateon1 has quit IRC (Read error: Operation timed out)
[19:17] *** Mateon1 has joined #archiveteam-bs
[19:19] *** Dimtree has quit IRC (Quit: Peace)
[19:22] *** Dimtree has joined #archiveteam-bs
[19:44] *** antomati_ is now known as antomatic
[19:58] Hey... can I ask some advice?
[19:58] So early last year I started getting the feeling that the world was kind of going to hell, and that it might just be useful/interesting/important to start recording and keeping TV news shows.
[19:59] I'm in the UK, so the domestic nonsense with Brexit was top of mind, but since then there's been Trump and something worse pretty much every day.
[19:59] In 2016 I was mainly grabbing national bulletins. In 2017 I kicked it up a notch and have almost full-spectrum archiving of local and regional news too.
[19:59] Most nights I'm scraping at least 40 channels off satellite simultaneously, at least while the local news is on.
[19:59] Yeah, I have a problem.
[20:00] Obviously the TV stations /probably/ have some kind of archives of this stuff themselves, but they definitely don't make it available to the public. That doesn't seem useful.
[20:00] But nearly two years and 30 terabytes later, I've got to ask… is this useful? Or just masturbation?
[20:00] I'm kind of out of disk space, so it seems like a good time to decide if it's time to wipe it and stop auditioning for /r/datahoarder, or start uploading it somewhere useful.
[20:01] Sounds silly saying it out loud, really.
[20:01] *** BartoCH has quit IRC (Quit: WeeChat 1.9.1)
[20:01] (End question) :)
[20:02] hm
[20:02] good question though
[20:03] i'd ask archive.org if they want to add it to their TV news archive
[20:03] (nods)
[20:03] uploading it would be a bunch of work and take a while, and there's no sense in duplicating effort if they already have it
[20:05] true. I started recording BBC News and Sky News 24 hours a day, but I think IA may grab those already (or at least whatever versions are available in the USA)
[20:06] *** BartoCH has joined #archiveteam-bs
[20:06] only been doing the 24 hour channels since June this year, though. Main focus was on 'network' news and local/regional stuff.
[20:06] I tried to focus mainly on stuff that was subtitled (captioned) so that it might at least be possible to search it to find stuff more easily.
[20:08] I could probably upload it somewhere - I considered YouTube (and might still do that) but I know how easily stuff suddenly gets retrospectively flagged when a new company decides they own certain footage, etc.
[20:09] sports footage during bulletins especially seems to get hit like that (I sometimes upload really old bulletins to YT when I find them)
[20:11] local news sounds fantastic, please share that
[20:11] and the cycle of upload it - see what's copyrighted - blur it out - upload it again doesn't work at scale
[20:14] I can attest that the BBC does archive everything, not daily, but they do archive into WARCs
[20:14] FYI: OVH EU is having more network issues tonight
[20:16] Cool. I know the BBC have an internal off-air archive of everything since about 2009 (I think), but it's not news focused and of course isn't publicly available, which is what gets me.
[20:18] oh, my mistake, 2007.
[20:19] they don't record the regions, either
[20:24] *** jschwart has joined #archiveteam-bs
[20:31] *** midas3 has quit IRC (Ping timeout: 250 seconds)
[20:34] JAA, https://twitter.com/AugustAmesxxx - archive her stuff, news is just hitting that she killed herself, seemingly due to online abuse.
[20:34] *** ZexaronS has quit IRC (Read error: Connection reset by peer)
[20:35] *** ZexaronS has joined #archiveteam-bs
[20:35] I'll throw it into ArchiveBot, but I don't have any tools yet to do more than that.
[20:35] It will only grab the most recent tweets.
[20:45] *** midas3 has joined #archiveteam-bs
[20:49] *** Valentine has quit IRC (Ping timeout: 506 seconds)
[20:49] *** kristian_ has quit IRC (Quit: Leaving)
[21:19] *** schbirid has joined #archiveteam-bs
[21:23] https://usercontent.irccloud-cdn.com/file/HlWFbf2A/AugustAmesxxx_Last%203000%20Tweets.zip
[21:23] there you go JAA
[21:24] Cheers!
[21:24] That's 3193 tweets, at least.
[21:25] it's in a shitty format, but it covers a ton of what went down
[21:27] I'm throwing the individual statuses (stati?) into ArchiveBot now.
[21:28] that's fucking teamwork right there
[21:30] Highfive!
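"Throwing the individual statuses into ArchiveBot" means turning the dump into one canonical status URL per tweet. The zip's actual format isn't described above (only that it's "shitty"), so this sketch assumes API-style JSON objects with `id_str` and `user.screen_name` fields; the sample IDs are made up.

```python
import json

# Assumed dump format: a JSON array of API-style tweet objects.
# The IDs below are fabricated for illustration.
sample_dump = json.dumps([
    {"id_str": "938123456789012480", "user": {"screen_name": "AugustAmesxxx"}},
    {"id_str": "938120000000000000", "user": {"screen_name": "AugustAmesxxx"}},
])

def status_urls(dump_json):
    """Map each tweet object to its canonical status URL, one per line for ArchiveBot."""
    tweets = json.loads(dump_json)
    return ["https://twitter.com/%s/status/%s" % (t["user"]["screen_name"], t["id_str"])
            for t in tweets]

for url in status_urls(sample_dump):
    print(url)
```

The canonical `/<screen_name>/status/<id>` form is worth using because it is what the live site serves and what the Wayback Machine indexes under.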
[21:30] http://cdn.smosh.com/sites/default/files/2016/01/high-five-into-hug.gif
[21:33] *** tuluu has quit IRC (Read error: Operation timed out)
[21:33] *** tuluu has joined #archiveteam-bs
[21:33] *** RichardG has quit IRC (Ping timeout: 250 seconds)
[21:48] https://www.youtube.com/watch?v=hHC1ju_cL3M
[21:54] *** schbirid has quit IRC (Quit: Leaving)
[22:02] *** Valentine has joined #archiveteam-bs
[22:10] *** BlueMaxim has joined #archiveteam-bs
[23:02] *** jschwart has quit IRC (Quit: Konversation terminated!)
[23:17] *** ola_norsk has joined #archiveteam-bs
[23:19] *** dashcloud has quit IRC (Read error: Connection reset by peer)
[23:20] hello. So, i've got a 500gb machine that i want to set up as a warrior. Problem is, there's no screen for it. I'm making an install usb stick now. But, without having to google how to move that onto the pc from the usb stick; anyone have experience with such?
[23:21] plug it into your tv temporarily to access the BIOS. Boot from USB
[23:21] "cp -p"-ing the contents of the stick and updating the fstab with the real 'blkid' should make it boot, right?
[23:22] CoolCanuk: i have no tv
[23:22] CoolCanuk: i have an old VGA screen that i tried using; problem is, i have no adapter. So i have to do it all by ssh
[23:23] right
[23:23] (the only output on the pc is hdmi)
[23:23] *** dashcloud has joined #archiveteam-bs
[23:24] im not sure. I've never done it without a screen
[23:24] it depends on the pc as well. eg: press arrow down 4 times to select the 4th menu item; it might be the 8th menu item on a different kind of PC or BIOS version
[23:25] CoolCanuk: but partitioning the drive, copying the system and its files onto it, and updating the fstab should technically make it boot off the harddrive, or?
[23:25] (and installing grub onto it)
[23:25] it's a crap mini-itx machine that's been sitting collecting dust
[23:26] im not sure
[23:27] it's so shit it's not usable for anything else really.
[23:27] you might be able to use Etcher to easily create a bootable usb
[23:29] im making a system by using virtualbox, and then hopefully i will be able to write that harddrive image onto a usb stick
[23:29] problem is how to move all that onto the actual harddrive of the pc :/
[23:30] *** Dimtree has quit IRC (Peace)
[23:30] as a working system that will boot from the harddrive instead of the usb stick, i mean
[23:30] you need to boot to usb
[23:31] then install linux to the hard drive
[23:31] use the usb as a live image
[23:32] it will be, but i'd need to move that usb system onto the physical drive.
[23:33] sorry, i'm not explaining the problem accurately
[23:35] SketchCow, what does IA do with 4K video?
[23:36] ola_norsk: stick the drive in another PC / cradle?
[23:37] Igloo: yeah. But then, how to 'clone' that usb linux system onto the disk of the pc?
[23:38] Put the disk from the PC in another PC with a working screen
[23:38] Install to it?
[23:38] Put it back & boot it?
[23:39] ok. ill try that if all else fails
[23:56] odemg: without knowing for sure, i'd think it would get a scaled-down ogv conversion as normal. I don't think IA touches the original files.
[23:57] odemg: so the 4K video uploaded would remain unchanged, i mean
[23:57] that was my thinking, i just wondered if they're set up to handle dealing with 4k(+) files
[23:57] *** Dimtree has joined #archiveteam-bs
[23:58] probably
[23:59] i uploaded a video once that had no 'ftl' (faster than light) in it, it worked, but it did buffer sometimes
[23:59] might've been at my end though
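ola_norsk's copy-the-files plan (partition, copy, fix fstab with the real blkid, install grub) is workable over ssh; the fiddly step is rewriting the copied `/etc/fstab` so the root entry points at the new disk's UUID instead of the stick's. A minimal sketch of just that rewrite, assuming blkid's usual `NAME="value"` output format; the device name and UUIDs here are made up.

```python
import re

# Example output of `blkid /dev/sda1` for the target disk (values fabricated).
BLKID_LINE = '/dev/sda1: UUID="3f0a1c2d-1111-2222-3333-444455556666" TYPE="ext4"'

# fstab as copied from the USB stick; its root entry still names the stick's UUID.
OLD_FSTAB = 'UUID=aaaabbbb-cccc-dddd-eeee-ffff00001111 / ext4 errors=remount-ro 0 1\n'

def uuid_from_blkid(line):
    """Extract the UUID field from one line of blkid output."""
    m = re.search(r'UUID="([^"]+)"', line)
    if m is None:
        raise ValueError("no UUID in blkid output")
    return m.group(1)

def retarget_root(fstab, new_uuid):
    """Point the root ('/') filesystem entry at the real disk's UUID."""
    return re.sub(r'^UUID=\S+(?=\s+/\s)', 'UUID=' + new_uuid, fstab, flags=re.M)

new_fstab = retarget_root(OLD_FSTAB, uuid_from_blkid(BLKID_LINE))
print(new_fstab)
```

Two practical notes on the rest of the procedure: `cp -a` (archive mode) is closer to what's needed than `cp -p`, since it recurses and preserves symlinks, ownership, and device nodes; and after the copy you'd still need `grub-install` against the target disk (typically from a chroot into the copied system) before it will boot on its own.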