[00:02] *** Cameron_D has joined #urlteam
[01:31] *** kiskabak has quit IRC (Remote host closed the connection)
[01:32] *** kiskabak has joined #urlteam
[01:32] *** Fusl sets mode: +o kiskabak
[01:32] *** Fusl_ sets mode: +o kiskabak
[02:55] *** kiska1 has quit IRC (Remote host closed the connection)
[02:56] *** kiska1 has joined #urlteam
[02:56] *** Fusl sets mode: +o kiska1
[02:56] *** Fusl_ sets mode: +o kiska1
[03:20] *** CyberVenu has joined #urlteam
[03:25] *** CyberVenu has quit IRC (Ping timeout: 264 seconds)
[03:50] *** a_ has joined #urlteam
[03:51] *** treora has quit IRC (Remote host closed the connection)
[03:51] *** treora has joined #urlteam
[03:52] *** odemg has quit IRC (Read error: Operation timed out)
[04:06] *** a_ has quit IRC (Quit: Page closed)
[04:08] *** odemg has joined #urlteam
[05:27] *** systwi has quit IRC (Read error: Operation timed out)
[06:14] *** systwi has joined #urlteam
[15:16] *** Wingy has joined #urlteam
[16:29] *** Dj-Wawa has joined #urlteam
[19:01] <Wingy> I know I'm going to regret this, but I can theoretically maybe eventually write a multi-threaded tracker in rust
[19:20] *** slang has joined #urlteam
[19:21] <slang> It looks like the tracker is stopped. I'm getting "507 Server Error: The tracker needs an operator for manual maintenance."
[19:23] <slang> also, @kpcyrd what about making a RabbitMQ based tracker?
[19:24] *** britmob has joined #urlteam
[19:24] <britmob> Are you all aware of this? "Error communicating with tracker: 507 Server Error: The tracker needs an operator for manual maintenance. Try again later. for url: https://tracker.archiveteam.org:1338/api/get."
[19:25] <kpcyrd> slang: I was thinking about that too, but the "only one shortener per IP" enforced on the tracker might make this more complicated to implement
[19:25] <JAA> Yep, looking into it.
[19:25] <kpcyrd> *one task per shortener per ip
[19:27] <JAA> Flashfire: Did you add cort-as? It's been throwing errors all day. (Also, please mention in here when you add a project or change settings.)
[19:28] <JAA> UnexpectedNoResult, examples: http://cort.as/1gB http://cort.as/1Nc http://cort.as/4gl http://cort.as/5O9
[19:29] <JAA> Cleared the cort-as errors and disabled the auto-queueing on that for now.
[19:41] <slang> kpcyrd: oh yeah, I guess that's tricky. I was going to say per-consumer queues, but that wouldn't prevent someone from running multiple instances of the scraper on 1 IP.
[19:48] <slang> you could probably do it with the HTTP API... poll and close any duplicate connections (more than 1 connection per IP address)
[20:05] *** britmob has quit IRC (Quit: Page closed)
[20:05] *** britm0b has joined #urlteam
[22:16] *** slang has quit IRC (Quit: Page closed)
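[Editor's note] The "one task per shortener per IP" rule discussed at 19:25–19:48 could be enforced with a small guard that rejects a second concurrent task from the same (shortener, IP) pair. This is a hypothetical sketch, not the actual tracker's implementation; the class and method names are invented for illustration.

```python
class ConnectionGuard:
    """Allow at most one active task per (shortener, ip) pair.

    A duplicate acquire() is refused, which is the "close any
    duplicate connections" idea from the log, expressed as a
    simple in-memory check.
    """

    def __init__(self):
        # Set of (shortener, ip) pairs with an active task.
        self.active = set()

    def acquire(self, shortener: str, ip: str) -> bool:
        key = (shortener, ip)
        if key in self.active:
            return False  # duplicate task from this IP: reject it
        self.active.add(key)
        return True

    def release(self, shortener: str, ip: str) -> None:
        self.active.discard((shortener, ip))


guard = ConnectionGuard()
print(guard.acquire("cort-as", "203.0.113.5"))  # → True (first task)
print(guard.acquire("cort-as", "203.0.113.5"))  # → False (duplicate)
guard.release("cort-as", "203.0.113.5")
print(guard.acquire("cort-as", "203.0.113.5"))  # → True (slot freed)
```

As slang notes, per-consumer RabbitMQ queues alone would not stop multiple scraper instances on one IP; a server-side check keyed on the client address, like the one sketched above, would still be needed.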