[05:50] I just started running my archive team warrior after a long time
[05:50] urlteam seems like the only active project
[05:50] and there aren't any jobs
[05:51] is there something wrong with my vm or are there actually jo bjobs
[05:51] no*
[05:51] no jobs*
[05:51] sorry,
[06:01] no jobs at the moment
[06:02] any idea when some might pop up
[06:07] not entirely sure
[06:07] there are some jobs not related to urlteam in the works
[06:35] Hey there.
[06:35] We should talk about who wants to work on this and if I need to bring in more people.
[06:39] mmm, I have a little free time, in theory
[06:40] what kind of work needs to be done
[06:41] there's a project to rewrite the tracker
[06:42] the current one was deemed unsatisfactory
[06:42] I see
[06:42] although it obviously works; I'm running it on a server right now
[06:42] is there a project page somewhere for that or something
[06:42] there's a github page for the replacement
[06:44] somewhere
[06:45] Can we just turn it on?
[06:45] I can provide machinery
[06:45] ah, right
[06:45] https://github.com/ArchiveTeam/terroroftinytown
[06:46] I don't know the status of the new one
[06:47] I've got the old one running at argonath.db48x.net
[06:47] SketchCow: we can just keep adding jobs to it
[06:48] OK.
[06:48] I really think we should be going after these now.
[06:48] yea
[06:48] there is one tweak I want to make before I run it again
[06:48] or rather, before I add jobs to it
[06:51] Well, please, whatever support you need from me, you have.
[06:54] actually, a couple of tweaks
[06:54] it doesn't handle warriors that fail very well
[06:54] but really we just need people to sit down and figure out what to download
[06:59] are you referring to specific link shorteners to download links from
[07:00] or the format of links from said sites
[07:00] or both
[07:01] or I guess other archive team projects too, eh?
[07:01] both
[07:03] how do you guys manage to keep maintaining link shorteners that you've already started to archive?
[07:03] I'm assuming they assign the shortened urls on a sequential basis
[07:04] so do you just kinda go until you start getting 403s or 404s
[07:04] and then pick up again from that point later
[07:08] unknown
[07:11] figaro: you could grab the last release or two and compare the data :)
[07:12] I might just do that
[07:14] I poke around with python every once in a while
[07:14] we should make some definite plans
[07:14] thanks for the info
[07:14] I was hoping to put all this vacation bandwidth to some use
[07:14] maybe next time
[07:14] night all
[07:15] good night
[07:15] and thanks for thinking of us :)
[07:18] I have to get back to work, but let's plan for this Saturday
[07:19] the 5th
[07:19] SketchCow: could you put that in the topic?
[07:21] Saturday the 5th for a meeting?
[07:21] Just talk in here.
[07:21] That'll work.
[07:22] yea, in here, the 5th, to do some work
[07:22] that works too
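
The sequential-scan idea raised around 07:03 can be sketched roughly as follows. This is only an illustrative Python sketch of scanning a shortcode space until a long run of 404s, not Archive Team's actual scraper or the terroroftinytown code; the shortener domain, code alphabet, and stop heuristic are all hypothetical placeholders.

```python
# Minimal sketch of sequentially probing a URL shortener's code space and
# recording where each shortcode redirects, stopping after a run of 404s.
# Domain, alphabet, and thresholds below are assumptions for illustration only.
import itertools
import string
import urllib.error
import urllib.request

SHORTENER = "https://example-shortener.invalid/"   # hypothetical shortener
ALPHABET = string.ascii_lowercase + string.digits  # assumed code alphabet


def probe(code):
    """Return the redirect target for a shortcode, or None if it 404s."""
    request = urllib.request.Request(SHORTENER + code, method="HEAD")
    try:
        with urllib.request.urlopen(request) as response:
            return response.geturl()  # final URL after following redirects
    except urllib.error.HTTPError as error:
        if error.code == 404:
            return None
        raise  # 403s, rate limits, etc. would need real handling


def scan(length=3, miss_limit=100):
    """Walk shortcodes of a given length in order, yielding (code, target)."""
    misses = 0
    for chars in itertools.product(ALPHABET, repeat=length):
        code = "".join(chars)
        target = probe(code)
        if target is None:
            misses += 1
            if misses >= miss_limit:  # long run of 404s: assume end of range
                break
        else:
            misses = 0
            yield code, target
```

The "pick up again later" part of the question would just mean persisting the last shortcode reached and resuming the scan from there on the next run, which is the kind of state a tracker hands out to warriors in chunks.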