[05:20] what the urlteam project needs is basically a large disk on a static IP
[05:22] On a scale of 1 to 4T, with 1 being 1G, how large of a disk are we talking about?
[05:25] uh
[05:25] 1T or better
[05:26] latest torrent is 51G but that's rather heavily compressed
[08:32] Do we really need all of the old data, uncompressed?
[10:58] Uncompressed old data?
[11:01] I'm not sure what xmc is hinting at, that's why I asked :)
[11:04] I mean, to just run the tracker you don't need terabytes of storage. For seeding the releases you'll need some storage, however, if I recall correctly
[11:12] Well, you have the tracker, which is responsible for handing out tasks and temporarily storing the results. It needs a couple of GB of storage, depending on how often you move the tasks off of it.
[11:13] And then you have the actual database, which is about 700 GB or so, so better plan for at least 1 TB. The finished tasks from the tracker are imported into the DB, and at release time you create the release from the database
[11:13] I always kept the finished tasks around until I made the next release, in case the DB got lost/corrupted.
[11:14] The DB can be restored/initialized from a release, and then you take all the finished tasks and import them.
[11:20] Ah, alright - wasn't familiar with how the releases were generated - but that makes sense. If I've understood it correctly, one would at minimum have the database available when it's time for making releases. (So a very minimal setup could be just the tracker and a temp store of results)
[11:46] So, we'd need a box with at least 2 TB (for future-proofing)?
[11:50] soultcer: what was the CPU and RAM usage for it like?
[11:55] Damn, 35 EUR to install a 2 TB external HDD, and 10 EUR/month for it
[12:26] GLaDOS: CPU is mostly needed for the initial DB import (due to DB indexing) and then release generation (xz compression), but it is not necessary to have an extremely fast CPU
[12:26] I used an AMD Athlon 64 X2 4800+ at 2×2.5 GHz, but most of the work is single-threaded.
[12:27] I'm wondering whether it'd be better to get a 2T HDD added onto anarchive, or whether it'd be better just getting a whole new dedi
[12:27] You don't necessarily need it on a dedi
[12:27] http://fdcservers.net/dedicated_servers.php is pretty good, their HDD upgrade option is a one-time payment
[12:27] Yeah, but I want something with a large enough HDD to hold it all
[12:28] Guess what? No VPSes support that
[12:28] The generated release (about 1/10th the size of the database) can be uploaded to a seedbox. The database does not really need a fast or stable internet connection
[12:29] my company is about to start offering dedicated hosting, too bad we won't be ready for a few more months
[12:29] More important is to get a good, fast HDD. I used a Velociraptor in the beginning and later switched to a Caviar Black when the database got bigger than the HDD. Both in RAID 1 configs
[12:29] So approximately how much transfer is used for the DB?
[12:29] You need to download the finished tasks, which is a couple of gigabytes per week as far as I remember. And then you need to upload the release once every 6 months, which is at the moment 75 GB
[12:30] But of those 75 GB, most were the same as the 50 GB of the release before, so the upload was only a bit more than 25 GB.
[12:30] In fact, the release creation process tries to leave as many files unchanged as possible, avoiding recompression of the same files and retransmission of the same data
[12:30] I could manage that on my home VPS..
[12:31] The DB, that is.
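[editor's note: a minimal sketch of the tracker-to-database import workflow described above. The SQLite schema, the pipe-delimited task-file format, and all paths are assumptions for illustration; the actual urlteam tooling may differ.]

```python
#!/usr/bin/env python3
"""Sketch: import finished tracker tasks (shortcode -> URL pairs) into
the mapping database. Schema, delimiter, and paths are assumptions."""
import glob
import sqlite3

DB_PATH = "urlteam.db"          # assumed name for the ~700 GB database
TASKS_DIR = "finished_tasks/"   # assumed: results moved off the tracker

def import_finished_tasks(conn: sqlite3.Connection) -> None:
    cur = conn.cursor()
    cur.execute(
        "CREATE TABLE IF NOT EXISTS urls (shortcode TEXT PRIMARY KEY, url TEXT)"
    )
    for path in sorted(glob.glob(TASKS_DIR + "*.txt")):
        with open(path, encoding="utf-8") as f:
            for line in f:
                shortcode, _, url = line.rstrip("\n").partition("|")
                # INSERT OR IGNORE keeps re-imports idempotent, so after
                # restoring the DB from a release you can simply replay
                # every kept task file, as described above.
                cur.execute(
                    "INSERT OR IGNORE INTO urls VALUES (?, ?)",
                    (shortcode, url),
                )
    conn.commit()

if __name__ == "__main__":
    with sqlite3.connect(DB_PATH) as conn:
        import_finished_tasks(conn)
```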
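[editor's note: and a sketch of the incremental-release idea from the [12:30] messages: reuse the previous release's compressed files whenever the underlying data is unchanged, so only new data is recompressed and retransmitted. The sidecar-checksum layout is an assumption, not the actual release script.]

```python
#!/usr/bin/env python3
"""Sketch: build a release, copying unchanged .xz files from the
previous release instead of recompressing them."""
import hashlib
import lzma
import os
import shutil

def file_sha256(path: str) -> str:
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def build_release(export_dir: str, prev_release: str, new_release: str) -> None:
    os.makedirs(new_release, exist_ok=True)
    for name in sorted(os.listdir(export_dir)):
        src = os.path.join(export_dir, name)
        old_xz = os.path.join(prev_release, name + ".xz")
        new_xz = os.path.join(new_release, name + ".xz")
        sidecar = old_xz + ".sha256"  # assumed: checksum of uncompressed data
        # Unchanged content: copy the old .xz byte-for-byte, so seeders
        # and rsync only ever transfer the genuinely new files.
        if os.path.exists(old_xz) and os.path.exists(sidecar):
            with open(sidecar) as f:
                if f.read().strip() == file_sha256(src):
                    shutil.copy2(old_xz, new_xz)
                    shutil.copy2(sidecar, new_xz + ".sha256")
                    continue
        # Changed or new content: recompress with xz (the CPU-heavy step).
        with open(src, "rb") as fin, lzma.open(new_xz, "wb", preset=9) as fout:
            shutil.copyfileobj(fin, fout)
        with open(new_xz + ".sha256", "w") as f:
            f.write(file_sha256(src) + "\n")
```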
[12:31] That's how I did it as well. Home server with some nice HDDs (but otherwise rather old hardware) worked just fine.
[12:32] Difference between you and me is the average internet quality.
[12:34] There is no need for a stable internet connection. You can download the tasks from the tracker using git-annex or rsync, and then do the upload of the release via rsync or BitTorrent.
[12:35] Alright
[12:36] Now I just need to go search my cupboard for some decent-sized HDDs.
[12:37] Wait, I have a 1T mounted in my PC now.
[13:09] SPEAKING OF LACK OF STABLE INTERNET CONNECTIONS
[13:11] lol
[21:24] I could probably run it at my house
[21:24] I have quite stable internets
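[editor's note: a sketch of moving the data over a flaky connection, per the [12:34] message: rsync with --partial resumes interrupted transfers, which is what makes the weekly task download and the ~25 GB release upload workable without a stable line. Host names and paths are hypothetical; git-annex or BitTorrent would serve the same role.]

```python
#!/usr/bin/env python3
"""Sketch: resumable rsync transfers with retries and backoff."""
import subprocess
import time

TRACKER = "tracker.example.org:finished_tasks/"  # assumed remote path
RELEASE = "release/"                             # assumed local release dir
SEEDBOX = "seedbox.example.org:releases/"        # assumed remote path

def rsync_with_retries(src: str, dst: str, attempts: int = 10) -> None:
    """--partial keeps half-transferred files, so each retry resumes
    where the last attempt dropped instead of starting over."""
    for i in range(attempts):
        result = subprocess.run(
            ["rsync", "-av", "--partial", "--timeout=60", src, dst]
        )
        if result.returncode == 0:
            return
        time.sleep(min(60, 2 ** i))  # back off before retrying
    raise RuntimeError(f"rsync {src} -> {dst} failed after {attempts} tries")

if __name__ == "__main__":
    rsync_with_retries(TRACKER, "finished_tasks/")  # a couple of GB per week
    rsync_with_retries(RELEASE, SEEDBOX)            # ~25 GB of new data per release
```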