[00:19] *** ZexaronS has joined #archiveteam-bs
[00:35] *** Mateon1 has quit IRC (Remote host closed the connection)
[00:36] *** Mateon1 has joined #archiveteam-bs
[00:39] JAA: oh yeah, I was going to point out re: your earlier remark about my code missing some cases, that CF occasionally changes up their code a bit
[00:44] *** dd0a13f37 has quit IRC (Quit: Connection closed for inactivity)
[00:50] *** icedice has joined #archiveteam-bs
[00:58] Yeah. I just implemented all valid 1-3 character combinations of + and ! to be safe.
[00:59] joepie91: I also implemented generic addition, subtraction, and multiplication rules for all combinations between bools, ints, and strings according to JS's rules.
[01:00] I just hope they won't switch to real JSFuck code. But that would probably be counterproductive as the code they'd have to send out would become huge.
[01:53] *** svchost03 has joined #archiveteam-bs
[01:53] *** svchfoo1 sets mode: +o svchost03
[02:08] *** schbirid has quit IRC (Ping timeout: 255 seconds)
[02:09] *** MrDignity has quit IRC (Read error: Connection reset by peer)
[02:12] *** MrDignity has joined #archiveteam-bs
[02:16] *** MrDignity has quit IRC (Read error: Connection reset by peer)
[02:20] *** schbirid has joined #archiveteam-bs
[02:33] *** godane has quit IRC (Quit: Leaving.)
[02:45] *** godane has joined #archiveteam-bs
[03:17] *** bwn has quit IRC (Quit)
[03:40] *** bwn has joined #archiveteam-bs
[03:50] *** tomatokin has joined #archiveteam-bs
[03:57] *** MrDignity has joined #archiveteam-bs
[04:16] *** icedice has quit IRC (Quit: Leaving)
[04:48] *** qw3rty113 has joined #archiveteam-bs
[04:54] *** qw3rty112 has quit IRC (Read error: Operation timed out)
[05:14] *** MrDignity has quit IRC (Read error: Connection reset by peer)
[05:47] *** tomatokin has quit IRC (Ping timeout: 360 seconds)
[07:27] *** dashcloud has quit IRC (Read error: Operation timed out)
[07:35] *** dashcloud has joined #archiveteam-bs
[07:37] *** jacketcha has quit IRC (Read error: Connection reset by peer)
[07:38] *** jacketcha has joined #archiveteam-bs
[08:33] *** kimmer1 has quit IRC (Ping timeout: 633 seconds)
[09:12] *** schbirid has quit IRC (Quit: Leaving)
[09:27] *** jacketcha has quit IRC (Read error: Connection reset by peer)
[09:28] *** jacketcha has joined #archiveteam-bs
[09:35] *** schbirid has joined #archiveteam-bs
[10:42] *** BlueMaxim has quit IRC (Quit: Leaving)
[10:58] *** tar-xvf has joined #archiveteam-bs
[11:02] *** odemg_ has quit IRC (Read error: Operation timed out)
[11:03] *** REiN^ has quit IRC (Ping timeout: 600 seconds)
[11:28] *** REiN^ has joined #archiveteam-bs
[11:29] *** jacketcha has quit IRC (Remote host closed the connection)
[11:37] *** ZexaronS has quit IRC (Quit: Leaving)
[11:37] *** jacketcha has joined #archiveteam-bs
[11:38] *** jacketcha has quit IRC (Remote host closed the connection)
[11:39] *** drumstick has quit IRC (Ping timeout: 248 seconds)
[11:40] *** jacketcha has joined #archiveteam-bs
[11:40] *** jacketcha has quit IRC (Remote host closed the connection)
[11:42] *** tomatokin has joined #archiveteam-bs
[11:43] *** jacketcha has joined #archiveteam-bs
[11:44] *** jacketcha has quit IRC (Read error: Connection reset by peer)
[11:47] *** jacketcha has joined #archiveteam-bs
[11:48] *** jacketcha has quit IRC (Remote host closed the connection)
[11:48] *** jacketcha has joined #archiveteam-bs
[11:49] *** jacketcha has quit IRC (Remote host closed the connection)
[11:51] *** jacketcha has joined #archiveteam-bs
[11:52] *** jacketcha has quit IRC (Remote host closed the connection)
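For context on the 00:58-01:00 messages above: JAA is describing a solver for Cloudflare's JavaScript anti-bot challenge, whose arithmetic is built out of JSFuck-style + and ! expressions. The sketch below is only an illustration of the coercion rules involved, not JAA's actual code: it emulates JavaScript's ToNumber/ToString behaviour and the + and ! operators for booleans, numbers and strings in Python.

    # Rough Python emulation of JS coercion for the + and ! operators,
    # covering only bools, numbers and strings (an illustration, not a
    # reimplementation of the solver discussed above).

    def js_to_number(x):
        """Approximate JS ToNumber for bools, ints/floats and strings."""
        if isinstance(x, bool):
            return 1 if x else 0
        if isinstance(x, (int, float)):
            return x
        if isinstance(x, str):
            s = x.strip()
            if s == "":
                return 0                  # Number("") === 0
            try:
                return float(s)
            except ValueError:
                return float("nan")       # Number("abc") is NaN
        return float("nan")

    def js_to_string(x):
        """Approximate JS ToString for the same value types."""
        if isinstance(x, bool):
            return "true" if x else "false"
        if isinstance(x, float) and x.is_integer():
            return str(int(x))            # String(2.0) === "2"
        return str(x)

    def js_not(x):
        """JS logical NOT (!x): only "", 0, NaN and false are falsy here."""
        if isinstance(x, str):
            return x == ""                # any non-empty string is truthy
        n = js_to_number(x)
        return n == 0 or n != n           # n != n catches NaN

    def js_add(a, b):
        """JS binary +: concatenation if either side is a string, else numeric."""
        if isinstance(a, str) or isinstance(b, str):
            return js_to_string(a) + js_to_string(b)
        return js_to_number(a) + js_to_number(b)

    # !"" + !"" evaluates to 2 in JS; "2" + 3 evaluates to "23".
    print(js_add(js_not(""), js_not("")))   # 2
    print(js_add("2", 3))                   # 23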
[11:52] *** tomatokin has quit IRC (Quit: Leaving)
[11:53] *** jacketcha has joined #archiveteam-bs
[11:56] *** jacketcha has quit IRC (Read error: Connection reset by peer)
[11:57] *** jacketcha has joined #archiveteam-bs
[11:58] *** jacketcha has quit IRC (Remote host closed the connection)
[12:00] *** jacketcha has joined #archiveteam-bs
[12:00] *** jacketcha has quit IRC (Remote host closed the connection)
[12:01] *** jacketcha has joined #archiveteam-bs
[12:02] *** pizzaiolo has joined #archiveteam-bs
[12:02] *** jacketcha has quit IRC (Read error: Connection reset by peer)
[12:02] *** jacketcha has joined #archiveteam-bs
[12:04] *** jacketcha has quit IRC (Remote host closed the connection)
[12:06] *** jacketcha has joined #archiveteam-bs
[12:08] *** jacketcha has quit IRC (Remote host closed the connection)
[12:09] *** ZexaronS has joined #archiveteam-bs
[12:10] *** jacketcha has joined #archiveteam-bs
[12:11] *** jacketcha has quit IRC (Read error: Connection reset by peer)
[12:11] *** jacketcha has joined #archiveteam-bs
[12:13] *** jacketcha has quit IRC (Remote host closed the connection)
[12:16] *** jacketcha has joined #archiveteam-bs
[12:17] *** jacketcha has quit IRC (Remote host closed the connection)
[12:18] *** jacketcha has joined #archiveteam-bs
[12:19] *** pizzaiolo has quit IRC (Read error: Operation timed out)
[12:19] *** pizzaiolo has joined #archiveteam-bs
[12:24] *** jacketcha has quit IRC (Remote host closed the connection)
[12:26] *** jacketcha has joined #archiveteam-bs
[12:29] *** jacketcha has quit IRC (Read error: Connection reset by peer)
[12:30] *** jacketcha has joined #archiveteam-bs
[13:25] jacketcha: Fix your connection please.
[14:19] JAA: lol
[14:19] *** dashcloud has quit IRC (Read error: Operation timed out)
[14:38] *** dashcloud has joined #archiveteam-bs
[15:17] *** tar-xvf has quit IRC (Read error: Operation timed out)
[15:33] *** tar-xvf has joined #archiveteam-bs
[15:51] SketchCow: if you want to add this to any notes you have for the wiki https://www.archiveteam.org/index.php?title=User:Jrwr
[15:52] I put some comments on my userpage for if anyone comes in behind me
[16:09] *** schbirid has quit IRC (Ping timeout: 255 seconds)
[16:21] *** schbirid has joined #archiveteam-bs
[16:58] *** zino has quit IRC (Remote host closed the connection)
[17:36] *** jschwart has joined #archiveteam-bs
[18:09] *** Stiletto has quit IRC (Ping timeout: 260 seconds)
[18:12] *** kristian_ has joined #archiveteam-bs
[18:12] so some good news on my new rpi3
[18:13] 1 the case is alot better and not falling apart when i move it
[18:13] 2 i found a way to maybe port slax linux-live scripts to rpi
[18:26] *** Stilett0 has joined #archiveteam-bs
[18:26] *** Stilett0 is now known as Stiletto
[18:31] *** ola_norsk has joined #archiveteam-bs
[18:33] jrwr: are there any statements made or pages that show IA's diskusage?
[18:33] Sketch was complaining about it yesterday
[18:33] Ill see if I can dig one up
[18:33] We budgeted for 1pb of disk space last year, We used 2pb
[18:35] jrwr: In my case, i think that could be a powerful 'punch point' towards getting some other organizations/museums etc to opens their eyes (and their wallets and datacenters)
[18:39] so
[18:39] I know its more then 16PB
[18:39] in use
[18:45] Somebody2: sorry if i remember incorrectly, but it was you who also sent email to that guy in DND/NCS ? If so, did you recieve a response yet?
[18:45] Disk space graphs: https://archive.org/~tracey/mrtg/df.html
[18:47] ty
[18:56] NCS are holding the 'Yggdrasil Conference' at Sandefjord 23. og 24. april 2018 . And it would be cool if the importance of preserving IA became a topic there.
[18:57] sadly there no english version of this: https://no.wikipedia.org/wiki/Yggdrasil_(konferanse)
[19:05] i do not even know how conferences work, since i've never been to one. But, the contact person is Elisabeth Kras of Den norske dataforening (NCS), at elisabeth.kras@dataforeningen.no
[19:06] https://yggdrasilkonferansen.no/kontakt-oss/
[19:12] *** bitspill has quit IRC (Read error: Connection reset by peer)
[19:12] *** ThisAsYou has quit IRC (Read error: Connection reset by peer)
[19:12] *** jiphex has quit IRC (Read error: Connection reset by peer)
[19:12] *** c0mpass has quit IRC (Read error: Connection reset by peer)
[19:12] *** DrasticAc has quit IRC (Read error: Connection reset by peer)
[19:12] *** hook54321 has quit IRC (Ping timeout: 245 seconds)
[19:13] *** JSharp has quit IRC (Read error: Connection reset by peer)
[19:13] *** HCross2 has quit IRC (Read error: Connection reset by peer)
[19:13] *** tklk has quit IRC (Read error: Connection reset by peer)
[19:13] *** johtso has quit IRC (Read error: Connection reset by peer)
[19:14] *** jrwr has quit IRC (Ping timeout: 260 seconds)
[19:16] *** Mateon1 has quit IRC (Read error: Operation timed out)
[19:16] *** Mateon1 has joined #archiveteam-bs
[19:19] btw, since someone mentioned that the mirror in Alexandria, Egypt is outdated/behind IA by ~10 years..Was it political climate in Egypt that caused that?
[19:20] or did it simply not manage to keep up with hardware?
[19:22] FWIW I asked if Dropbox would sponsor a mirror (even in part), they said no
[19:23] nah, to get shit done these days it needs at least some "twitter rage" i guess :D
[19:23] oh, i work there
[19:24] for context :)
[19:24] then some carefully crafted hashtag, and you putting into "trending" should do it ;)
[19:24] hehe
[19:25] at dropbox, not at twitter
[19:25] ah
[19:26] could you make it so that they offer less than 1TB solutions? (I already asked them, and they said no to that as well)..
[19:27] anyway, it needs some outrage
[19:27] 1GB i meant
[19:30] ola_norsk: why do you want that? and i can't do anything personally but i can advocate :)
[19:31] kisspunch: i'm guessing it will take Associations or government departments. Private companies, unless there's PR to be garnered, is less likely to be willing i think
[19:32] *** DrasticAc has joined #archiveteam-bs
[19:33] *** ThisAsYou has joined #archiveteam-bs
[19:35] *** jrwr has joined #archiveteam-bs
[19:36] *** bitspill has joined #archiveteam-bs
[19:36] *** hook54321 has joined #archiveteam-bs
[19:36] *** JSharp has joined #archiveteam-bs
[19:39] *** xarph has joined #archiveteam-bs
[19:39] *** riking has joined #archiveteam-bs
[19:40] kisspunch: the reason i'd prefer dropbox to have a smaller solution than 1TB, is that my harddrive is 100GB (i know, it's small). And price of their 1TB.
[19:41] get google drive suite
[19:41] gsuite
[19:41] their 10$ plan gets you unlimited right now
[19:42] i can not afford $10/month :/
[19:42] and i like the niceness of dropbox integrated into my filemanager
[19:43] google drive stream does the same as dropbox + the files don't exist on your machine anymore, they get streamed in on access
[19:46] jrwr: the dropbox 1TB is ~9-10USD, and, the _main_ reason it would be nice if dropbox were to offer smaller plans is that i can simply not afford a monthly charge of 10usd
[19:46] ah
[19:48] i could, in a pinch if i really really needed it, but a smaller plan would both meet my need and wallet :)
[19:49] i'm well aware that 1TB is small these days though :D
[19:52] $10/month for unlimited is... nothing o.o
[19:53] *** zyphlar has joined #archiveteam-bs
[19:59] Frogging: "All things are relative"..sadly, it's something when already at strained economy for various reason :/
[20:00] oh I meant relative to the usual prices for such things, not the value of $10
[20:01] aye indeed. How long will be they offering unlimited though? In the spring i will have no problem affording it
[20:02] it's definitely not sustainable, so that's a good question
[20:02] and, when that offering ends, then what? :/
[20:03] Frogging: nether is youtube
[20:03] with the PBs of videos per minute being uploaded
[20:03] indeed
[20:03] if they don't plan to suddenly increase price, or delete files, couldn't people just upload dummyfiles like madmen?
[20:04] just to pre-secure storage space, i mean
[20:10] more specifically, using e.g 'dd' to generate huge files, upload them to that google solution at hearts content (pun required), to set the drivespace if/when the unlimited offer should end
[20:14] i'm thinking (fearing) people might want to read the small print in the usage of that google solution :/
[20:15] im storing about 600TB of Gdrive right now
[20:15] I know a pair that are storing 2PB on it
[20:16] for $10 a month?
[20:17] and you are certain those 600TB will always be accesible to you for ~10USD ?
[20:17] yeah that's the rub
[20:17] Its "linux isos"
[20:18] so Its not a loss if I lose it
[20:19] my first thought is that _if_ that happens, you're unlikely to be only one :/
[20:23] but just in case it's google being nice, offering a 'first come first serve' ..there's 'dd' that could easily generate 'iso' files ;)
[20:29] *** kristian_ has quit IRC (Quit: Leaving)
[20:33] *** jschwart has quit IRC (Quit: Konversation terminated!)
[20:37] jrwr: by 'a pair' do mean a couple of indiviuals having 2PT stored each?
[20:37] you mean*
[20:45] anyway. Someone with twitter might want to ask like (https://youtu.be/w7e1Z0PL1CY) review the terms of use of that google service in worst case scenario.
[20:45] *** dashcloud has quit IRC (Read error: Connection reset by peer)
[20:46] *** dashcloud has joined #archiveteam-bs
[21:00] *** kimmer1 has joined #archiveteam-bs
[21:01] maybe google is aiming to compete with yahoo of being the ones that deleted the most amount of history? :D
[21:06] *** kimmer1 has quit IRC (Quit: Yaaic - Yet another Android IRC client - http://www.yaaic.org)
[21:07] *** kimmer1 has joined #archiveteam-bs
[21:08] *** kimmer1 has quit IRC (Client Quit)
[21:09] *** kimmer1 has joined #archiveteam-bs
[21:21] *** dashcloud has quit IRC (Read error: Operation timed out)
[21:21] ola_norsk: Backblaze B2 is $5 per TB and month. If you store just a few GB, you'll pay some cents per month only. There is a fee for downloading though.
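A quick back-of-the-envelope reading of that B2 quote ($5 per TB-month of storage): the sketch below just turns the rate into monthly costs for a few sizes. The egress price used here is a placeholder assumption, since the log only says there is a download fee, not how much it is.

    # Cost sketch for the Backblaze B2 pricing quoted above.
    # STORAGE rate comes from the log; EGRESS rate is an assumed value
    # for illustration only.

    STORAGE_USD_PER_TB_MONTH = 5.00
    EGRESS_USD_PER_GB = 0.02          # placeholder assumption

    def monthly_storage_cost(gigabytes):
        return gigabytes / 1000 * STORAGE_USD_PER_TB_MONTH

    def one_off_download_cost(gigabytes):
        return gigabytes * EGRESS_USD_PER_GB

    for gb in (5, 50, 100, 1000):
        print(f"{gb:>5} GB: ${monthly_storage_cost(gb):.2f}/month storage, "
              f"${one_off_download_cost(gb):.2f} to download it all once")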
[21:24] *** dashcloud has joined #archiveteam-bs
[21:25] JAA: that sounds like a more realistic offer than google indeed. But the downloading fee though.. :/ One of the reasons i like dropbox is the syncing
[21:32] *** BlueMaxim has joined #archiveteam-bs
[21:37] JAA: i kind of think of cloudstorage in ways of "Don't risk it if you can't afford to loose it".. I wouldn't store in the cloud what i couldn't realisticly, if i absolutely had to, be able to keep a copy of by myself
[21:38] kind of cruel to say it, but i think Vidme might agree :d
[21:38] Oh I agree, cloud storage is just a neat way of doing an off-site backup, but it should never, ever be the only copy of the data.
[21:39] aye
[21:42] ola_norsk: You might want to look at Online.net's C14 as well. There is an "Intensive" service level for €5 per TB and month and no download fees. It's not as straight-forward though.
[21:42] It's really more of an archival service.
[21:53] JAA: hehe, yeah, it looks to more like that and not so much as a 'drop or delete a file or folder' kind of thing :D "Create an electronic safe-deposit box, upload up to 40TB of data using FTP, SFTP, Rsync or SCP, and use our simple control panel or API to archive your data in no time."
[21:53] i'm not exactly keeping the cure for cancer in my Dropbox
[21:55] JAA: but damn, that seems to be good service for important stuff
[22:02] It does. I'm not sure how good they are in practice though. And that archival/unarchival could be really annoying.
[22:04] aye "the average time to recover your data is approximately 2 hours."
[22:04] Yep, similar to Amazon Glacier.
[22:05] it's some 'butler' running around on a segway or roller-blades in some tunnels i guess :)
[22:07] at least that means it's inpenetrable to hacks
[22:08] *** HCross2 has joined #archiveteam-bs
[22:09] Apologies for the spam.. IRCcloud was having a meltdown
[22:10] HCross2: https://youtu.be/1VD_pJOFnZ0
[22:11] it's quite amazing that IRC have been around this long :D
[22:12] *** zyphlar has quit IRC (Quit: Connection closed for inactivity)
[22:15] the efnet server i'm using 'irc.homelien.no' , was there in the times of 'Spider' a norwegian tv program in ~94-96 :D
[22:15] *** RichardG_ has joined #archiveteam-bs
[22:16] *** RichardG has quit IRC (Read error: Connection reset by peer)
[22:17] lol..'pentium 133, running netscape'..wow :D https://youtu.be/4yd9khAQCug
[22:18] back in the times when DAT tapes were fancy..
[22:19] Getting quite offtopicky in here.
[22:20] not if i archive it .. https://tv.nrk.no/serie/spider
[22:20] I have a request for help, for me
[22:21] textfiles.com doesn't handle .txt properly
[22:21] I'm sure I fucked up an option
[22:21] the http server?
[22:21] ola_norsk: shh
[22:21] https://stackoverflow.com/questions/10542639/apache-returns-content-type-text-plain-instead-of-text-html helps a bit
[22:21] I'm sure I just have apache configured wrong
[22:22] thats what i meant by http server
[22:22] I know.
[22:22] Shhhh.
[22:22] aye
[22:23] http://www.textfiles.com/drugs/ should have .txt open in browser
[22:23] I am going to have to track down why it treats it as a download
[22:26]
[22:26] ForceType text/plain
[22:26] Header set Content-Disposition inline
[22:26]
[22:26] ^^ SketchCow
[22:26] might this be relevant? https://serverfault.com/questions/82505/how-can-i-set-apache-to-serve-files-as-text
[22:26] what igloo said
[22:26] SketchCow: Just randomly checked 2015.txt, and that starts with a NUL byte. Cf. https://stackoverflow.com/a/7877252
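A small sketch of how one might confirm JAA's diagnosis across the whole directory: scan the served .txt files for a leading (or embedded) NUL byte, which makes content-sniffing browsers treat the response as binary and offer a download no matter what Content-Type header Apache sends. The document root path below is only an example, not the actual path on the textfiles.com server.

    #!/usr/bin/env python3
    # Find .txt files that start with (or contain) NUL bytes, the problem
    # diagnosed with 2015.txt above. The directory is an example path.

    import pathlib

    DOCROOT = pathlib.Path("/var/www/textfiles.com/drugs")  # example only

    for path in sorted(DOCROOT.glob("*.txt")):
        head = path.read_bytes()[:1024]       # first KiB is enough to sniff
        if head.startswith(b"\x00"):
            print(f"{path.name}: starts with a NUL byte")
        elif b"\x00" in head:
            print(f"{path.name}: NUL byte within the first KiB")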
[22:27] amph.txt works, for example.
[22:27] So I doubt it's a configuration error; it's just that the browser finds the NUL byte, thinks it isn't a text file, and therefore offers the download instead.
[22:30] not sure how Apache rules work in priority, but maybe "AddType text/plain .txt" could at least be used to override any config mistakes until the cause is found?
[22:31] The server sends "Content-Type: text/plain" already.
[22:31] There's also no "Content-Disposition: download" header.
[22:32] It might be necessary to also specify the encoding of the files, but I doubt it'll ever work if the file starts with a NUL byte.
[22:32] Every file except 2015.txt seems to work.
[22:36] SketchCow: we are very close to having tagesschau 20 clock evening news up to 1997-03-31
[22:36] *** tragique has joined #archiveteam-bs
[22:36] its doing 1997-03-30 right now
[22:36] *uploading
[22:40] i really hate the way the newer xbox one works
[22:40] its like you always need to download something in order for the game to work
[22:41] godane: Fantastic
[22:41] i'm trying to get up to november of 1997
[22:41] godane: that's the way to circumvent the complaining about the 'always online' requirement :)
[22:43] godane: "ok, your xbox doesn't have to stay online all the time.. but no one said anything about having to download"
[22:56] Turns out that the Azure Postgres preview cant handle transferring 300 millions rows at once with pg_restore
[22:57] Every time I try, the server bombs out. That’s what I get for dogfooding.
[23:00] Trying to get this Miiverse web archive site up by MAGfest, since by then I’m on vacation and I’m gonna do my best not to touch any code.
[23:09] jrwr: any idea if there is a nice and safe way to share huge amounts of files between gdrive accounts?
[23:09] eg i want to let my friend sync my iso collection
[23:12] *** Bond__ has joined #archiveteam-bs
[23:13] hello everyone
[23:15] hello
[23:16] how is everyone
[23:18] Getting my hip hop torrents collection working again
[23:22] [Throttle off/off KB] [Rate 36.6/14265.3 KB] [Port: 6951]
[23:22] Working
[23:22] [Throttle off/off KB] [Rate 113.2/45712.1 KB] [Port: 6951]
[23:22] Really working
[23:22] [Throttle off/off KB] [Rate 156.5/64651.7 KB] [Port: 6951]
[23:23] How much hip hop we talking?
[23:23] Hair is standing up on end for 1/2 mile around
[23:23] Not much, frankly.
[23:23] Probably 40-50 albums.
[23:23] The mixtapes that have come out since I got the FOS machine to the new hardware
[23:23] sketch, have you ever thought about maybe downloading a bunch of newspapers?
[23:24] i feel like its a category thats mostly regected by people
[23:24] 1. Yes. 2. We have
[23:24] 3. No it's not
[23:25] Where is the best place to get them if i might ask?
[23:25] newspapers.com
[23:25] But we have them on the archive in archive.org/details/newspapers
[23:25] With more coming every day
[23:26] would you know of a free account for newspapers.com, to download the whole issue?
[23:27] *** drumstick has joined #archiveteam-bs
[23:28] Not offhand
[23:31] i just singed up for a trial account, what is the best way to download from there?
[23:42] is there any good way to archive instagram yet? even when archiving it by hand it's a nightmare for me
[23:45] yall seen this? https://www.youtube.com/watch?v=U-_-BO99woI
[23:46] jacketcha: Not really. I'm usually clicking the button, put something on the "page down" key to let it scroll to the bottom, extract the posts of the individual URLs, and throw those into ArchiveBot.
[23:47] I usually click*
[23:47] There's definitely a better way than this, but at least it works.
[23:47] yeah
[23:48] is there anyway to simulate cookies in http requests? because there are many pages that I would love to archive without asking them to go public (including mine)
[23:48] What do you mean by "simulate"?
[23:48] wget/wpull can handle cookies just fine.
[23:48] nevermind then
[23:49] i would use the api, but last time I used the api of a site I got kicked off of pastebin
[23:49] kinda
[23:50] love watching this kinda stuff: https://www.youtube.com/watch?v=fyKsNOTIwJk
[23:50] jacketcha: https://github.com/rarcega/instagram-scraper might be a good basis for writing something to archive Instagram users.
[23:51] Bond__: I usually just watch stuff like https://www.youtube.com/watch?v=VoWRSM2_DF0&t=8s
[23:51] JAA: That looks promising, i'll check it out
[23:52] I think im using youtube wrong
[23:54] *** jiphex has joined #archiveteam-bs
[23:55] Didn't see this mentioned in here yet: https://www.theverge.com/2017/12/26/16819748/library-of-congress-twitter-archive-project-stalled
[23:56] *** octarine has joined #archiveteam-bs
[23:56] *** floogulin has joined #archiveteam-bs
[23:56] *** tklk has joined #archiveteam-bs
[23:56] anyone have copies of past archived instagram crap on archive.org?
[23:57] *** Ctrl-S___ has joined #archiveteam-bs
[23:59] *** c0mpass has joined #archiveteam-bs
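To make the "extract the posts of the individual URLs" step of JAA's manual workflow (23:46) concrete, here is a minimal sketch; it is not the instagram-scraper tool linked above. It assumes you save the fully scrolled profile page from the browser and that post permalinks appear in the markup as /p/<shortcode>/, then prints one post URL per line, ready to be queued into ArchiveBot.

    #!/usr/bin/env python3
    # Pull unique Instagram post URLs out of a saved, fully scrolled
    # profile page. Assumes permalinks of the form /p/<shortcode>/ in the
    # saved markup; this is an illustration of the manual workflow only.

    import re
    import sys

    def extract_post_urls(html):
        """Return unique https://www.instagram.com/p/<shortcode>/ URLs in order."""
        seen = set()
        urls = []
        for shortcode in re.findall(r'/p/([A-Za-z0-9_-]+)/', html):
            if shortcode not in seen:
                seen.add(shortcode)
                urls.append(f"https://www.instagram.com/p/{shortcode}/")
        return urls

    if __name__ == "__main__":
        # Usage: python3 extract_instagram_posts.py saved_profile_page.html
        with open(sys.argv[1], encoding="utf-8", errors="replace") as f:
            for url in extract_post_urls(f.read()):
                print(url)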