[00:44] so i got the very first ac360 podcast
[00:44] so looks like i'm maybe able to get all of them
[03:43] SketchCow: I expect a 150M .txt.gz of ip addresses
[03:44] about 26,400,000 addrs
[04:42] now I have a way to check them automatically
[04:42] ncftp is fortunately not shitty
[05:00] https://github.com/ArchiveTeam/ftp-nab
[05:27] blit wanted an asstr archive, already done one recently
[05:42] bsmith093: I can't parse that, say again?
[05:43] i'm scrolling the logs, and what i meant to say was that i've already uploaded a fairly recent asstr archive, as of about a year ago
[05:56] hm, ok. what's asstr?
[07:37] Today I discovered two million+ page wikis we archived just in time: http://wikiindex.org/Wikinfo (2013) http://wikiindex.org/WebsiteWiki (2012)
[07:40] xmc: great! I want to join the party and download some of those FTP sites when you're done nicely listing them :) no more huge ones though, at most 400-500 GB or so
[07:42] I finally wrote a script to deal with this nightmare site.
[07:43] It will: utterly download an entire subdirectory, remove the index.html?*=* files that happen, tar up the directory, delete it, and put an "already got it, don't get again" entry into a list.
[07:47] It's already yanked the machine back from the brink - had one drive full of 8tb of material.
[07:49] always nice if it keeps downloading the same data over and over again
[07:50] yow
[07:58] SketchCow: how big will the uploaded items be?
[07:59] Probably 200gb apiece
[07:59] a script to download FTP sites without worrying about running out of disk or *cough cough* uploading 2 TB items would be wonderful
[07:59] sensible
[07:59] Hmmph.
[08:00] Well, the big issue right now is that a lot of things break on archive.org dealing with that.
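The subdirectory-at-a-time workflow described at [07:43] might look roughly like this. This is a hedged sketch — the actual script was never pasted into the channel, and the host name, directory, and list file here are placeholders:

```shell
#!/bin/sh
# Sketch of the per-subdirectory archiving loop described above.
# HOST and DONE_LIST are placeholders; the real script was not posted.
HOST=ftp.example.org
DONE_LIST=done.txt

fetch_dir() {
    dir="$1"
    # skip anything already recorded in the "don't get again" list
    grep -qxF "$dir" "$DONE_LIST" 2>/dev/null && return 0
    # utterly download the entire subdirectory
    wget -m -np "ftp://$HOST/$dir/"
    # remove the index.html?*=* junk files that happen
    find "$HOST/$dir" -name 'index.html?*' -delete
    # tar up the directory, then delete it to free the disk
    tar -cf "${HOST}_${dir}.tar" "$HOST/$dir"
    rm -rf "$HOST/$dir"
    # record it so it's never fetched twice
    echo "$dir" >> "$DONE_LIST"
}
```

The tar-then-delete step is what keeps one subdirectory's worth of disk in use at a time instead of the whole site.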
[08:01] -rw-r--r-- 1 root root 46146570240 Jan 15 02:35 ftp.icm.edu.pl_amiga.tar
[08:01] -rw-r--r-- 1 root root 1003048960 Jan 15 07:39 ftp.icm.edu.pl_beos.tar
[08:01] -rw-r--r-- 1 root root 1185730560 Jan 15 07:14 ftp.icm.edu.pl_garbo.tar
[08:01] etc
[08:05] I suspect the issue is the FreeBSD and BSD directories.
[08:06] I think they're as big as they get
[08:09] i found flash video of cnn going back to 2008
[08:09] this is website video NOT a podcast
[08:09] look here: http://money.cnn.com/sitemap_videos_0001.xml
[08:10] if you know the image path you can find the video
[08:10] money/video/news becomes money/big/news
[08:11] then change the host domain to ht3.cdn.turner.com and add _576x324_dl.flv to replace the WxH.jpg
[08:49] SketchCow: looks like the older videos may still be around
[12:28] Does anyone have a full backup of this site? http://www.heavensgate.com/
[12:30] if not yet, then we will soon
[12:30] * joepie91 added it to archivebot
[12:32] Thanks!
[12:34] I hear you're going after every FTP site now too?
[12:37] Might I ask you to make sure ftp://ftp.modland.com/ is on that list? It's 81.4GB of Amiga mods and their derivatives (i.e. the music part of the demoscene), including tracker software and related utilities for multiple platforms. I would be grabbing it myself already, but my Raspberry Pi doesn't have that much space, and I'm not leaving my laptop on for several weeks straight... :)
[12:37] Username "anonymous", no password, I believe. Be gentle / slow!
[12:39] 81.4GB of MODs? oh my. having Inertia Player flashbacks now...
[12:46] ZoeB: sure, I'll grab that FTP too
[12:46] I think wget does FTP
[12:46] what
[12:46] er
[12:46] what kind of delay between requests would you recommend *
[12:47] * joepie91 has a 4TB disk now, so that wouldn't be a problem to grab
[12:47] * joepie91 is also on 100mbit
[12:48] 100 both up and down?
[12:49] delay for FTP sites?
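The image-path-to-video recipe from [08:10]-[08:11] can be sketched as a small rewrite function. The paths follow the chat's description; the input URL in the comment and test is a made-up example, and the recipe is untested against the live CDN:

```shell
#!/bin/sh
# Rewrite a CNN Money video thumbnail URL into its FLV URL, per the
# recipe above: point the host at ht3.cdn.turner.com, swap
# money/video/news for money/big/news, and replace the trailing
# WxH.jpg with _576x324_dl.flv.
img_to_flv() {
    printf '%s\n' "$1" | sed \
        -e 's|^http://[^/]*/|http://ht3.cdn.turner.com/|' \
        -e 's|/money/video/news/|/money/big/news/|' \
        -e 's|_[0-9]*x[0-9]*\.jpg$|_576x324_dl.flv|'
}
```

Note the FLV filename always ends in _576x324_dl.flv regardless of the thumbnail's dimensions, matching "add _576x324_dl.flv to replace the WxH.jpg".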
hahahahhahahahahaha
[12:50] Nemo_bis: theoretical, yes
[12:51] practically, it's more like 85/55
[12:51] because my ISP is balls
[12:51] this is FttH, not cable, so such a large difference between theoretical and practical is ridiculous
[12:51] I -very- rarely hit 90mbit up
[12:53] mine is only 10 Mb/s but I always have 100% of it
[12:59] joepie91: he said be gentle
[12:59] put some lube on your fiber.
[12:59] lol
[12:59] that's why I asked about a delay
[13:51] Sorry, was having lunch
[13:51] Back now!
[13:51] To give you an idea of how busy that FTP site usually is, there server_stats.txt file says:
[13:51] Number of bytes downloaded last 24 hours: 1751.2 MB
[13:51] Number of files downloaded last 24 hours: 19740
[13:52] s/there/their
[13:52] So, uh, please don't dwarf that, I guess! :)
[13:54] And yes, 81.4GB of MODs. :) 30GB of Fast Tracker 2 files, 17GB of Impulse Tracker files, etc etc. It's quite the collection!
[13:55] Thank you very much! ^.^
[17:37] isp claims I have 10M up, looks reasonable: http://zeppelin.xrtc.net/corp.xrtc.net/kyat.corp.xrtc.net/if_eth1-day.png
[17:50] had to stop the ftp scan early
[17:50] 17:16:00 83% (3h36m left); send: 3553485314 57.3 Kp/s (57.2 Kp/s avg); recv: 22972016 355 p/s (369 p/s avg); drops: 0 p/s (4 p/s avg); hits: 0.65%
[17:51] output from that run is at http://bl-r.com/trx/ftp.txt.gz
[17:52] (had to stop because the isp sent the hosting provider 5 nastygrams in 10 minutes)
[18:13] zcat | wc -l gives 22,961,651 addresses in that file
[18:36] ding-dong ditch
[18:36] people bitch
[18:40] I've downloaded modland before.
[18:40] All for it being downloaded again.
[18:40] The owner hates me
[18:42] is schemer.com already fully saved?
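For the gentle modland grab being discussed, wget does handle FTP mirroring, and it has knobs for exactly the politeness ZoeB asked for. A plausible invocation — the one-second wait and 500 KB/s cap are guesses for illustration, not values anyone agreed on in channel:

```shell
# Politely mirror the modland FTP site: --wait pauses between
# retrievals and --limit-rate caps bandwidth so the grab doesn't
# dwarf the site's normal ~1.7 GB/day of traffic. Anonymous FTP
# login is wget's default, so no credentials are needed.
wget --mirror --wait=1 --limit-rate=500k ftp://ftp.modland.com/
```

At 500 KB/s, 81.4 GB would take roughly two days — slow enough to stay under the site's usual daily volume by a comfortable margin.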
[19:12] [2:12:01 PM] Hank Bromley (Internet Archive): for anyone keeping score at home, anand has succeeded in changing the size column in the metadata table from integer to bigint, and that monstrous 2.1 TB item has managed to update its row, which now shows a "size" value of 2331388015 (that's in KB) [19:12] For the people not keeping track, that means that the Archive Team just forced Internet Archive to work with 2.1 terabyte items [19:22] http://archive.org/details/http://tectonicablog.com/wp-content/uploads/2010/04/lakata.org-01.jpg [19:22] http://archive.org/details/ftp-ftp.hp.com_pub-2013-10 sorry [19:26] wow... [19:26] that a really big item SketchCow.. great! :) [19:27] are the other directories from ftp://ftp.hp.com/ also going to be done? [19:28] and do you some kind of tutorial on how to create a ftp copy to upload like the other ftp uploads? [19:29] something tells me it'd be funny to write an FTP server that used IA as a backend [19:29] although I guess you can do that now with the IA FUSE module
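As a back-of-envelope check on that bigint value (my arithmetic, not anything from the log, and assuming "KB" means decimal kilobytes): 2331388015 KB is about 2.12 TiB, which lines up with the "2.1 TB item" claim.

```shell
# Convert the reported size column (decimal KB) to TiB:
# multiply by 1000 bytes/KB, divide by 1024^4.
awk 'BEGIN { printf "%.2f TiB\n", 2331388015 * 1000 / (1024 ^ 4) }'
```

So a 32-bit signed integer column (max 2147483647) really would have overflowed on this item, which is why the column had to become a bigint.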