#archiveteam-bs 2020-07-08,Wed


Time Nickname Message
00:36 πŸ”— nicolas17 lennier1: http://data.nicolas17.xyz/reckful/ I got the reckful clips yesterday, finally got around to uploading
00:36 πŸ”— nicolas17 (and setting up a more generic URL so I'm not tied to S3 :P)
00:38 πŸ”— lennier1 Thanks! What are you using to download clips/VODs?
00:42 πŸ”— nicolas17 I started archiving two friends' VODs like two years ago, my python script for that has been growing since
00:43 πŸ”— nicolas17 for this archive there were also some adhoc bash horrors
00:44 πŸ”— nicolas17 like for f in *.json; do jshon -e video_url -u < $f; done | aria2c -i - --auto-file-renaming=false -R -j5
00:51 πŸ”— JAA "Horrors"? You're using the right tools for the job. That's wonderful.
00:52 πŸ”— JAA I'm guilty of processing WARCs, JSON, and HTML with grep & Co.
00:54 πŸ”— nicolas17 renaming the clips :D for f in *.json; do video_id=$(jshon -e id -u < $f); video_url=$(jshon -e video_url -u < $f | sed 's/%7C/|/g'); filename=${video_url##*/}; ln -v "./$filename" "byid/$video_id.mp4"; done
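For anyone without jshon, the rename pass above can be sketched in Python. This is a sketch of the same idea, not nicolas17's actual script; the `id`/`video_url` field names and the `byid/` layout are taken from the one-liner, and everything else is an assumption:

```python
import json
import os
from urllib.parse import unquote

def byid_target(json_path):
    """Given a clip's metadata file, return (downloaded filename,
    by-id hardlink path), mirroring the shell loop above."""
    with open(json_path) as f:
        meta = json.load(f)
    # aria2c saved the file under the URL's basename; undo the
    # percent-encoding (%7C -> |) just like the sed call did.
    filename = os.path.basename(unquote(meta["video_url"]))
    return filename, os.path.join("byid", meta["id"] + ".mp4")

def link_all(directory="."):
    """Hardlink every clip into byid/<id>.mp4, like `ln -v` above."""
    os.makedirs(os.path.join(directory, "byid"), exist_ok=True)
    for entry in os.listdir(directory):
        if entry.endswith(".json"):
            filename, target = byid_target(os.path.join(directory, entry))
            os.link(os.path.join(directory, filename),
                    os.path.join(directory, target))
```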
00:54 πŸ”— twigfoot has quit IRC (Operation timed out)
00:55 πŸ”— JAA Still wonderful.
00:59 πŸ”— lennier1 I don't know how much trouble it would be, but it would be useful to have it online somewhere. I've definitely found myself spending too much time downloading videos one at a time.
01:00 πŸ”— JAA Surely youtube-dl supports Twitch?
01:01 πŸ”— tobbez has left
01:07 πŸ”— lennier1 Cool, sounds like it probably does (based on the search I just did). Twitch Leecher is the popular program, and does do the job, but gets kind of tedious if you want to download several VODs.
01:18 πŸ”— Arcorann What about chat?
01:20 πŸ”— lennier1 The only program I know for that is Twitch Chat Downloader. https://github.com/PetterKraabol/Twitch-Chat-Downloader
01:41 πŸ”— benjins JAA I noticed that Twitter search is, at least for me, now sending out guest tokens via a script tag again, instead of via a set-cookie header. snscrape might be broken as a result
01:41 πŸ”— JAA benjins: Yep, thanks.
01:53 πŸ”— nicolas17 JAA: youtube-dl supports Twitch and I'm taking advantage of it :)
01:54 πŸ”— nicolas17 I use the Twitch API myself to get the list of VODs, but then I pass "https://www.twitch.tv/videos/640057509" to youtube-dl and get "https://d2nvs31859zcd8.cloudfront.net/b1b5218f5381aa7b04a1_reckful_38454871120_1472326520/chunked/index-muted-9C3480QJI9.m3u8" back
01:55 πŸ”— nicolas17 reimplementing that part seems annoying, and wasteful if youtube-dl can already do it for me :)
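Delegating URL resolution to youtube-dl is one subprocess call: its `-g` option prints the resolved media URL instead of downloading. A minimal sketch (the `resolve_vod_url` wrapper is hypothetical, and running it needs youtube-dl on PATH plus network access; the page URL format is the one quoted in the chat):

```python
import subprocess

def vod_page_url(video_id):
    """Twitch VOD page URL, in the format quoted above."""
    return f"https://www.twitch.tv/videos/{video_id}"

def resolve_vod_url(video_id):
    """Ask youtube-dl for the underlying HLS manifest URL of a VOD.

    `youtube-dl -g URL` prints the direct media URL(s) and exits, so
    there is no need to reimplement Twitch's token/playlist dance.
    """
    out = subprocess.run(["youtube-dl", "-g", vod_page_url(video_id)],
                         capture_output=True, text=True, check=True)
    return out.stdout.strip()
```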
01:59 πŸ”— Ryz !archive https://www.vgcats.com/ --explain "VGCats.com revamped (in temporary form) on 2020 May: https://www.vgcats.com/index.php#MAY-12-2020" --concurrency 1
01:59 πŸ”— Ryz ...Oops
02:01 πŸ”— twigfoot has joined #archiveteam-bs
02:48 πŸ”— ats has quit IRC (Ping timeout: 622 seconds)
02:54 πŸ”— ats has joined #archiveteam-bs
03:21 πŸ”— qw3rty_ has joined #archiveteam-bs
03:29 πŸ”— qw3rty__ has quit IRC (Read error: Operation timed out)
05:04 πŸ”— acridAxid has quit IRC (Quit: marauder)
05:07 πŸ”— acridAxid has joined #archiveteam-bs
05:14 πŸ”— mgrandi has joined #archiveteam-bs
05:27 πŸ”— fuzzy802 has joined #archiveteam-bs
05:32 πŸ”— fuzzy8021 has quit IRC (Read error: Operation timed out)
05:37 πŸ”— fuzzy802 is now known as fuzzy8021
06:29 πŸ”— drcd has quit IRC (Ping timeout: 186 seconds)
06:29 πŸ”— Deewiant has quit IRC (Ping timeout: 186 seconds)
06:30 πŸ”— drcd has joined #archiveteam-bs
06:30 πŸ”— Deewiant has joined #archiveteam-bs
06:46 πŸ”— kiska has quit IRC (Remote host closed the connection)
06:49 πŸ”— kiska has joined #archiveteam-bs
07:21 πŸ”— Raccoon has quit IRC (Ping timeout: 272 seconds)
07:32 πŸ”— Darkstar has quit IRC (Read error: Operation timed out)
07:35 πŸ”— BlueMax has quit IRC (Quit: Leaving)
07:37 πŸ”— Darkstar has joined #archiveteam-bs
08:34 πŸ”— mgrandi has quit IRC (Leaving)
08:39 πŸ”— schbirid has joined #archiveteam-bs
08:51 πŸ”— puppefan has joined #archiveteam-bs
08:55 πŸ”— puppefan has quit IRC (Ping timeout: 252 seconds)
09:17 πŸ”— Gallifrey has joined #archiveteam-bs
09:21 πŸ”— Gallifrey Hi, is this the right place to talk about manually* adding a large number (1000+) of pages to the Wayback Machine?
09:22 πŸ”— Gallifrey (*Or semi-manually, using my script that uses the pastpages/savepagenow library on github)
10:06 πŸ”— OrIdow6 Gallifrey: I'd say here
10:29 πŸ”— Gallifrey Well, I suppose my first question is, at what point should I feel guilty about 'overloading' archive.org? 10k pages? 100k pages?
10:49 πŸ”— Smiley has quit IRC (Ping timeout: 272 seconds)
11:31 πŸ”— ats has quit IRC (se.hub irc.efnet.nl)
11:31 πŸ”— colona_ has quit IRC (se.hub irc.efnet.nl)
11:31 πŸ”— Ctrl has quit IRC (se.hub irc.efnet.nl)
11:31 πŸ”— wessel152 has quit IRC (se.hub irc.efnet.nl)
11:32 πŸ”— Arcorann has quit IRC (Read error: Connection reset by peer)
11:35 πŸ”— ats has joined #archiveteam-bs
11:35 πŸ”— colona_ has joined #archiveteam-bs
11:35 πŸ”— Ctrl has joined #archiveteam-bs
11:35 πŸ”— wessel152 has joined #archiveteam-bs
11:38 πŸ”— OrIdow6 I don't think there's much of a good answer for that
11:55 πŸ”— wp494 has quit IRC (LOUD UNNECESSARY QUIT MESSAGES)
12:00 πŸ”— SmileyG has joined #archiveteam-bs
12:02 πŸ”— Arcorann has joined #archiveteam-bs
12:12 πŸ”— Gallifrey Second question is, are there any more 'official' channels for doing this work (e.g. ArchiveBot), rather than going rogue and doing it with a script? And if so, would it depend on the type of content I'm trying to archive? I don't know if several thousand handpicked Reddit URLs is a suitable use of ArchiveBot's time.
12:36 πŸ”— FlyWalk has joined #archiveteam-bs
12:36 πŸ”— FlyWalk has quit IRC (Client Quit)
12:37 πŸ”— WalkFly has joined #archiveteam-bs
12:42 πŸ”— Arcorann_ has joined #archiveteam-bs
12:45 πŸ”— Arcorann has quit IRC (Ping timeout: 265 seconds)
12:48 πŸ”— Datechnom has quit IRC (Read error: Connection reset by peer)
12:49 πŸ”— Datechnom has joined #archiveteam-bs
12:59 πŸ”— JAA Gallifrey: Are those threads on a particular subject or similar? If it's of general public interest, yes, we can run it through AB. Note though that it won't grab comment pagination and deeply nested comments. Also, it'll have to be old.reddit.com since the new design sucks.
13:04 πŸ”— HP_Archiv has joined #archiveteam-bs
13:17 πŸ”— Gallifrey This would be for soon-to-be-banned subreddits like /r/TumblrInAction, /r/WatchRedditDie + any others I can't think of. Already using old.reddit.com and grabbing the ?limit=500 link as well if there are >200 comments
13:18 πŸ”— HP_Archiv !a https://www.youtube.com/watch?v=pIZrHCXIPkY
13:18 πŸ”— HP_Archiv oops
13:18 πŸ”— HP_Archiv wrong channel
13:20 πŸ”— Gallifrey I also grabbed the www.reddit.com links for a sense of completion, plus the image or the link if there is one. But I ran into a problem when I archived a page every 10 seconds - the dreaded 429 error.
13:22 πŸ”— Gallifrey To be clear, this was not the WB machine rate-limiting my requests. Rather it was WB being rate-limited by Reddit itself.
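When the 429 comes from the target site rather than the Wayback Machine, retrying later is the only fix; a save loop can back off and try again. A sketch of that pattern, where `capture` and `RateLimited` are stand-ins for whatever save call and 429 error your script uses (e.g. savepagenow's), not a specific library's API:

```python
import time

class RateLimited(Exception):
    """Stand-in for the error your save call raises on HTTP 429."""

def save_with_backoff(capture, url, tries=5, base_delay=10, sleep=time.sleep):
    """Call `capture(url)` and back off geometrically on 429s.

    `capture` is whatever submits the URL to the Wayback Machine;
    here it only needs to raise RateLimited when the upstream site
    throttles the crawler.
    """
    for attempt in range(tries):
        try:
            return capture(url)
        except RateLimited:
            # wait 10s, 20s, 40s, ... between retries
            sleep(base_delay * (2 ** attempt))
    raise RateLimited(f"gave up on {url} after {tries} tries")
```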
13:23 πŸ”— HP_Archiv has quit IRC (Quit: Leaving)
13:25 πŸ”— qw3rty_ has quit IRC (se.hub efnet.deic.eu)
13:25 πŸ”— kiskaWee has quit IRC (se.hub efnet.deic.eu)
13:25 πŸ”— Maylay has quit IRC (se.hub efnet.deic.eu)
13:25 πŸ”— maxfan8 has quit IRC (se.hub efnet.deic.eu)
13:25 πŸ”— Arcorann_ has quit IRC (Read error: Connection reset by peer)
13:25 πŸ”— Arcorann_ has joined #archiveteam-bs
13:25 πŸ”— ats has quit IRC (se.hub irc.efnet.nl)
13:25 πŸ”— colona_ has quit IRC (se.hub irc.efnet.nl)
13:25 πŸ”— Ctrl has quit IRC (se.hub irc.efnet.nl)
13:25 πŸ”— wessel152 has quit IRC (se.hub irc.efnet.nl)
13:26 πŸ”— fredgido has joined #archiveteam-bs
13:30 πŸ”— ats has joined #archiveteam-bs
13:30 πŸ”— colona_ has joined #archiveteam-bs
13:30 πŸ”— Ctrl has joined #archiveteam-bs
13:30 πŸ”— wessel152 has joined #archiveteam-bs
13:31 πŸ”— fredgido_ has quit IRC (Read error: Operation timed out)
13:39 πŸ”— VoynichCr is there a project/collection for all Linux distro ISOs?
13:39 πŸ”— VoynichCr look at any distro and you have a download page full of links to different versions, sizes, etc https://alpinelinux.org/downloads/
13:40 πŸ”— VoynichCr is it worth saving all of them?
13:42 πŸ”— VoynichCr the same for their websites
13:45 πŸ”— qw3rty_ has joined #archiveteam-bs
13:45 πŸ”— kiskaWee has joined #archiveteam-bs
13:45 πŸ”— Maylay has joined #archiveteam-bs
13:45 πŸ”— maxfan8 has joined #archiveteam-bs
13:55 πŸ”— Arcorann_ has quit IRC (Read error: Connection reset by peer)
13:55 πŸ”— Arcorann_ has joined #archiveteam-bs
13:59 πŸ”— ats has quit IRC (se.hub irc.efnet.nl)
13:59 πŸ”— colona_ has quit IRC (se.hub irc.efnet.nl)
13:59 πŸ”— Ctrl has quit IRC (se.hub irc.efnet.nl)
13:59 πŸ”— wessel152 has quit IRC (se.hub irc.efnet.nl)
13:59 πŸ”— qw3rty_ has quit IRC (se.hub efnet.deic.eu)
13:59 πŸ”— kiskaWee has quit IRC (se.hub efnet.deic.eu)
13:59 πŸ”— Maylay has quit IRC (se.hub efnet.deic.eu)
13:59 πŸ”— maxfan8 has quit IRC (se.hub efnet.deic.eu)
14:00 πŸ”— Arcorann_ has quit IRC (Read error: Connection reset by peer)
14:00 πŸ”— Arcorann_ has joined #archiveteam-bs
14:02 πŸ”— ats has joined #archiveteam-bs
14:02 πŸ”— colona_ has joined #archiveteam-bs
14:02 πŸ”— Ctrl has joined #archiveteam-bs
14:02 πŸ”— wessel152 has joined #archiveteam-bs
14:02 πŸ”— qw3rty_ has joined #archiveteam-bs
14:02 πŸ”— kiskaWee has joined #archiveteam-bs
14:02 πŸ”— Maylay has joined #archiveteam-bs
14:02 πŸ”— maxfan8 has joined #archiveteam-bs
14:45 πŸ”— Arcorann_ has quit IRC (Read error: Connection reset by peer)
15:01 πŸ”— HP_Archiv has joined #archiveteam-bs
15:05 πŸ”— Ryz has quit IRC (Remote host closed the connection)
15:05 πŸ”— kiska1825 has quit IRC (Remote host closed the connection)
15:05 πŸ”— kiska1825 has joined #archiveteam-bs
15:06 πŸ”— Ryz has joined #archiveteam-bs
15:17 πŸ”— DogsRNice has joined #archiveteam-bs
15:27 πŸ”— nicolas17 wtf
15:28 πŸ”— nicolas17 apparently Twitch's clip API has a limit to how much it will return, even after you follow the pagination cursor stuff to the end
15:28 πŸ”— nicolas17 someone on reddit suggested sending a starting timestamp
15:29 πŸ”— nicolas17 and instead of 1019 clips all-time, I got 814 clips for the first day of the stream's history alone
15:36 πŸ”— lunik13 has quit IRC (Quit: :x)
15:39 πŸ”— lunik13 has joined #archiveteam-bs
16:22 πŸ”— HP_Archiv has quit IRC (Quit: Leaving)
17:02 πŸ”— schbirid has quit IRC (Quit: Leaving)
17:08 πŸ”— igloo25 has joined #archiveteam-bs
17:08 πŸ”— wp494 has joined #archiveteam-bs
17:11 πŸ”— Kaz_ has joined #archiveteam-bs
17:12 πŸ”— Kaz_ being nick squatted is fun, isn't it
17:12 πŸ”— Kaz_ (will confirm identity outside of irc, pls don't ban)
18:08 πŸ”— jodizzle Gallifrey: I think we've already had some people go through and archive various at-risk subreddits.
18:08 πŸ”— jodizzle Though there are various pagination limits.
18:10 πŸ”— jodizzle Ryz might know more.
18:12 πŸ”— Ryz Gallifrey, I usually just archive subreddits via ArchiveBot; I'm assuming you're archiving via WBM itself? Are you logged in? If not, there's a limit of 12 links archived per minute (rather than the 15 they claim)
18:13 πŸ”— Ryz I grabbed both https://old.reddit.com/r/WatchRedditDie/ and https://old.reddit.com/r/TumblrInAction/ several days ago when they did a rumor announcement of banning those sub-Reddits with the updated rules
18:14 πŸ”— jodizzle But there are depth limits basically, right? AB doesn't paginate the entire history of the subreddit?
18:15 πŸ”— Ryz Unfortunately yes, there are pagination limits, even when browsing the sub-Reddits normally; how far it goes back is a bit strange and random
18:15 πŸ”— Ryz Some go back up to 6 months; others a month
18:17 πŸ”— jodizzle Right. So I guess if Gallifrey has particular threads in mind that are old enough, we could still get those separately.
18:18 πŸ”— nicolas17 okay, I don't know if this actually got *everything*, but
18:18 πŸ”— nicolas17 135173 twitch clips
18:35 πŸ”— Kaz_ is now known as Kaz
18:35 πŸ”— Kaz ayy, we back :)
18:50 πŸ”— qw3rty has joined #archiveteam-bs
18:50 πŸ”— SmileyG has quit IRC (Remote host closed the connection)
18:50 πŸ”— qw3rty_ has quit IRC (irc.efnet.nl efnet.deic.eu)
18:50 πŸ”— kiskaWee has quit IRC (irc.efnet.nl efnet.deic.eu)
18:50 πŸ”— Maylay has quit IRC (irc.efnet.nl efnet.deic.eu)
18:50 πŸ”— maxfan8 has quit IRC (irc.efnet.nl efnet.deic.eu)
18:50 πŸ”— Smiley has joined #archiveteam-bs
18:54 πŸ”— kiskaWee has joined #archiveteam-bs
18:54 πŸ”— Maylay has joined #archiveteam-bs
18:54 πŸ”— maxfan8 has joined #archiveteam-bs
19:19 πŸ”— xit has joined #archiveteam-bs
19:19 πŸ”— mgrandi has joined #archiveteam-bs
19:27 πŸ”— horkermon has joined #archiveteam-bs
19:28 πŸ”— mgrytbak has joined #archiveteam-bs
19:29 πŸ”— picklefac has joined #archiveteam-bs
19:29 πŸ”— Kaz whee, here comes the irccloud crowd
19:30 πŸ”— riking_ has joined #archiveteam-bs
19:30 πŸ”— diggan has joined #archiveteam-bs
19:33 πŸ”— DrasticAc has joined #archiveteam-bs
19:34 πŸ”— pnJay has joined #archiveteam-bs
19:36 πŸ”— justcool3 has joined #archiveteam-bs
19:37 πŸ”— Vito` has joined #archiveteam-bs
19:38 πŸ”— jesse-s has joined #archiveteam-bs
19:40 πŸ”— abartov__ has joined #archiveteam-bs
19:41 πŸ”— nicolas17 man wtf is up with this Twitch API
19:42 πŸ”— jrwr has joined #archiveteam-bs
19:42 πŸ”— nicolas17 I get clips from 2018-01-01 to 2018-07-01, then from 2018-07-01 to 2019-01-01... if I then ask for clips from 2018-01-01 to 2019-01-01 I get a few that weren't returned in either of the former two requests
19:44 πŸ”— lenary has joined #archiveteam-bs
19:44 πŸ”— amelia386 has joined #archiveteam-bs
19:45 πŸ”— starlord has joined #archiveteam-bs
19:45 πŸ”— revi has joined #archiveteam-bs
19:45 πŸ”— mgrandi are you using their graphql api thing?
19:47 πŸ”— nicolas17 https://api.twitch.tv/helix/clips
19:47 πŸ”— lennier2 has joined #archiveteam-bs
19:49 πŸ”— Stilett0 has joined #archiveteam-bs
19:50 πŸ”— mgrandi the script i saw when people were freaking out over the dmca stuff was some graphql api
19:51 πŸ”— nicolas17 well, deleting all clips is easier... it doesn't matter if you only get some arbitrary subset of 1000, once you delete them you can search again, until it's empty :P
19:51 πŸ”— nicolas17 but I'll look into that...
19:52 πŸ”— mgrandi oh no, it's using the helix api
19:52 πŸ”— mgrandi but it's using some graphql api to get info about clips i guess
19:52 πŸ”— mgrandi https://github.com/danefairbanks/TwitchClipManager/blob/master/Program.cs#L315
19:53 πŸ”— nicolas17 ah yes, to get the actual .mp4 URL I guess
19:53 πŸ”— Stiletto has quit IRC (Ping timeout: 622 seconds)
19:53 πŸ”— nicolas17 I'm delegating that part to youtube-dl ^^
19:53 πŸ”— lennier1 has quit IRC (Read error: Operation timed out)
19:53 πŸ”— lennier2 is now known as lennier1
19:54 πŸ”— nicolas17 see GetClipsApi
19:54 πŸ”— tchaypo_ has joined #archiveteam-bs
19:55 πŸ”— mgrandi yeah, thats using helix
19:55 πŸ”— mgrandi i've heard...not great things about helix
19:55 πŸ”— mgrandi so maybe it's just some 'eventually consistent' thing
19:56 πŸ”— nicolas17 apparently if you keep using the pagination cursor to get next page, you get n per page (the &first= parameter says how many) but a total of 900-1000
19:56 πŸ”— nicolas17 I'll see if I can do something recursive instead of arbitrarily picking a small time period...
19:57 πŸ”— nicolas17 if I get more than 500 clips, subdivide the time period and try again
19:57 πŸ”— xit has quit IRC ()
19:57 πŸ”— Stiletto has joined #archiveteam-bs
19:58 πŸ”— Stilett0 has quit IRC (Ping timeout: 260 seconds)
19:59 πŸ”— mgrandi there is also TwitchLeecher that has the same functionality you are wanting, maybe look into how it does it
19:59 πŸ”— mgrandi https://github.com/Franiac/TwitchLeecher/releases
20:00 πŸ”— JSharp___ has joined #archiveteam-bs
20:01 πŸ”— Kaz_ has joined #archiveteam-bs
20:02 πŸ”— Kaz has quit IRC (Quit: leaving)
20:02 πŸ”— Kaz_ is now known as Kaz
20:06 πŸ”— hook54321 has joined #archiveteam-bs
20:06 πŸ”— svchfoo1 sets mode: +o hook54321
20:15 πŸ”— SJon___ has joined #archiveteam-bs
20:18 πŸ”— fallenoak has joined #archiveteam-bs
20:18 πŸ”— HCross has joined #archiveteam-bs
20:19 πŸ”— mattl has joined #archiveteam-bs
20:21 πŸ”— ThisAsYou has joined #archiveteam-bs
20:25 πŸ”— Ctrl-S___ has joined #archiveteam-bs
20:26 πŸ”— c0mpass has joined #archiveteam-bs
21:00 πŸ”— kyledrake has joined #archiveteam-bs
21:05 πŸ”— Craigle has quit IRC (Quit: Ping timeout (120 seconds))
21:05 πŸ”— apache2_ has quit IRC (Remote host closed the connection)
21:05 πŸ”— PotcFdk has quit IRC (Quit: ~'o'/)
21:05 πŸ”— coderobe has quit IRC (Quit: Ping timeout (120 seconds))
21:06 πŸ”— DopefishJ has joined #archiveteam-bs
21:06 πŸ”— apache2 has joined #archiveteam-bs
21:06 πŸ”— Craigle has joined #archiveteam-bs
21:06 πŸ”— coderobe has joined #archiveteam-bs
21:06 πŸ”— mtntmnky_ has quit IRC (Remote host closed the connection)
21:07 πŸ”— mtntmnky_ has joined #archiveteam-bs
21:07 πŸ”— PotcFdk has joined #archiveteam-bs
21:12 πŸ”— Yurume_ has joined #archiveteam-bs
21:12 πŸ”— Yurume has quit IRC (Read error: Connection reset by peer)
21:14 πŸ”— HP_Archiv has joined #archiveteam-bs
21:15 πŸ”— DFJustin has quit IRC (Ping timeout: 745 seconds)
21:15 πŸ”— HP_Archiv has quit IRC (Client Quit)
21:16 πŸ”— HP_Archiv has joined #archiveteam-bs
21:20 πŸ”— fredgido has quit IRC (Read error: Operation timed out)
21:47 πŸ”— nicolas17 "pop datetime interval from queue, request clips from Twitch, save any metadata that we don't already have, if the request returned >500 clips, split interval into two and push the two back into the queue"
21:47 πŸ”— nicolas17 my queue currently has 190 5-day intervals and it keeps finding new videos x_x
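The loop nicolas17 quotes can be sketched generically. `fetch` is a stand-in for the helix /clips request (which, as discussed above, returns only ~1000 clips per walk of the pagination cursor); intervals whose result count hits the cap get split in two and re-queued, so only the interval-splitting logic here is from the chat and everything else is an assumption:

```python
from collections import deque
from datetime import timedelta

def collect_clips(fetch, start, end, limit=500):
    """Walk a time range, splitting any interval whose clip count
    suggests the API's ~1000-clip cap was hit (split above `limit`
    to be safe), and dedupe clips by id across requests."""
    seen = {}
    queue = deque([(start, end)])
    while queue:
        a, b = queue.popleft()
        clips = fetch(a, b)
        for c in clips:
            # keep any metadata we don't already have
            seen.setdefault(c["id"], c)
        if len(clips) > limit and b - a > timedelta(seconds=1):
            mid = a + (b - a) / 2
            queue.append((a, mid))
            queue.append((mid, b))
    return list(seen.values())
```

Because results are deduplicated by id, re-fetching overlapping intervals is wasteful but harmless, which is what makes the blind split-and-retry safe.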
21:49 πŸ”— HP_Archiv has quit IRC (Quit: Leaving)
22:42 πŸ”— Ryz I recall someone was talking about archiving Twitch content earlier, is there?
22:42 πŸ”— Ryz There's another person that died via suicide - https://www.twitch.tv/ohlana - https://www.dexerto.com/general/twitch-streamer-ohlana-passes-away-at-26-from-suicide-1389840
22:43 πŸ”— nicolas17 Ryz: apparently she has subscriber-only VODs
22:43 πŸ”— Ryz Well, that's an oof s:
22:43 πŸ”— nicolas17 https://www.reddit.com/r/LivestreamFail/comments/hnfkd5/twitch_streamer_ohlana_has_passed_away_clip_is/fxbqe4x/
22:43 πŸ”— Ryz I didn't check further since there doesn't seem to be an indicator if the content is subscriber only~
22:49 πŸ”— nicolas17 Ryz: apparently there was some kind of "suicide cluster" :/
22:51 πŸ”— nicolas17 Ryz: https://twitter.com/venomous_pyscho/status/1280932651193503747
22:57 πŸ”— Ryz Mm, ran through the stuff I could accordingly~
23:07 πŸ”— mgrandi if you want an existing thing to download the raw videos, you can use TwitchLeecher to just download the videos, won't get any of the other stuff like the web pages or anyhting
23:10 πŸ”— mgrandi has quit IRC (Leaving)
23:15 πŸ”— Arcorann_ has joined #archiveteam-bs
23:53 πŸ”— Gallifrey Jodizzle - that's fantastic news! Do you know how many threads were archived per subreddit?
23:54 πŸ”— JAA Gallifrey: The pagination lists (about) 1000 threads on each of the top, new, etc. lists.
23:54 πŸ”— JAA So depending on how much overlap there is, anywhere between 1k and I think 10k?
23:54 πŸ”— Gallifrey JAA - That was going to be my next question!
23:55 πŸ”— JAA hot + new + rising + controversial + gilded + top hour + top day + top week + top month + top year + top all time, 1k each
23:55 πŸ”— JAA So between 1k and 11k
23:57 πŸ”— JAA There's no way to archive anything further than that except by bruteforcing all possible thread IDs (unless you know the URL, obviously).
23:57 πŸ”— JAA We'll have a project for archiving all of Reddit soonβ„’. The channel for that is #shreddit on hackint.
