[02:12] woop woop woop, life check siren chronomex ersi soultcer S[h]O[r]T joepie91
[02:13] what hi
[02:13] where are the files generated by the scraper being put, because i see tons of activity, but no file creation
[02:13] im using this screen -SL grab ./run.py --tracker=http://tracker.tinyarchive.org/v1/ --sleep=20 -n 20 --temp-dir=./data --username=bsmith093 -d -c
[02:29] im alive
[02:29] no idea tho
[06:33] bsmith094: I dunno.. I don't even know if it generates any files
[06:34] ersi: so where is the data going, is it just updating a remote file?
[06:35] it's motherfuckin' links, I guess you could keep them in RAM and then just fart something out
[06:35] directly
[06:35] check the source man
[06:36] it's python, should hopefully be readable :)
[08:24] bsmith094: It writes them to a temporary file, but
[08:24] a) the files are very small, since the limit is not bandwidth, but those damn shorteners that keep rate-limiting our efforts
[08:25] b) the file is uploaded and deleted as soon as it is finished (one task takes about 5 minutes)
[08:25] c) The file is unlinked directly after creation, so while it does exist and takes up space, it won't show up in any directory listing
[09:35] Which are all totally fine btw
[22:58] Haha I think kl.am is not able to handle the load
[23:03] UPDATE task SET status = 'free' WHERE id IN (SELECT id FROM task WHERE status = 'paused' AND service_id = 6 LIMIT MIN(5, 5 - (SELECT COUNT(*) FROM task WHERE service_id = 6 AND (status = 'assigned' OR status = 'free'))));
[23:03] Ghetto rate limiting
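
The temp-file behaviour described at [08:24]-[08:25] is the classic unlink-after-create trick, which explains why bsmith094 saw activity but no files appear. Below is a minimal sketch of that technique in Python (the scraper's language); the helper names run_task and upload and the record format are made up for the example and are not taken from the actual run.py.

    import os
    import tempfile

    def upload(data):
        # Stand-in for the real "send the finished file to the tracker" step.
        print("would upload %d bytes" % len(data))

    def run_task(results):
        # Create a temp file and unlink it immediately: the open descriptor
        # keeps the data alive on disk, but the name is gone, so it never
        # shows up in any directory listing (point c above).
        fd, path = tempfile.mkstemp()
        os.unlink(path)

        with os.fdopen(fd, "w+b") as f:
            # Record format here is invented for the example.
            for short_code, long_url in results:
                f.write(("%s|%s\n" % (short_code, long_url)).encode("utf-8"))

            # When the task is done, rewind and upload; closing the descriptor
            # afterwards frees the already-unlinked space (point b above).
            f.seek(0)
            upload(f.read())

    run_task([("abc123", "http://example.com/")])

Because the name is removed right after creation, listing the temp directory shows nothing even though the data still takes up space until the descriptor is closed, which matches what was observed.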
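
The [23:03] statement is best read as pseudo-SQL: LIMIT does not accept an expression like MIN(5, ...), and MySQL also rejects LIMIT inside an IN subquery. The intent is to un-pause only enough tasks that a given shortener (service_id = 6, presumably kl.am here) never has more than five tasks free or assigned at once. A rough sketch of that logic, assuming an invented task table and using Python with sqlite3 so it stays self-contained:

    import sqlite3

    MAX_IN_FLIGHT = 5  # at most this many free/assigned tasks per service

    def release_paused_tasks(conn, service_id):
        # Count how many tasks for this service are already in flight.
        (active,) = conn.execute(
            "SELECT COUNT(*) FROM task "
            "WHERE service_id = ? AND status IN ('assigned', 'free')",
            (service_id,),
        ).fetchone()

        slots = MAX_IN_FLIGHT - active
        if slots <= 0:
            return 0  # already at the limit, release nothing

        # Flip at most `slots` paused tasks back to 'free'.
        cur = conn.execute(
            "UPDATE task SET status = 'free' WHERE id IN ("
            "  SELECT id FROM task"
            "  WHERE status = 'paused' AND service_id = ?"
            "  LIMIT ?)",
            (service_id, slots),
        )
        conn.commit()
        return cur.rowcount

    # Tiny demo against an in-memory database with a made-up schema.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE task (id INTEGER PRIMARY KEY, service_id INTEGER, status TEXT)")
    conn.executemany(
        "INSERT INTO task (service_id, status) VALUES (?, ?)",
        [(6, "paused")] * 10 + [(6, "assigned")] * 2,
    )
    print(release_paused_tasks(conn, service_id=6))  # prints 3 (5 minus 2 already assigned)

Splitting the count and the update into two statements trades atomicity for portability; the original one-liner tries to do both in a single query, which is exactly why it reads as "ghetto" rate limiting.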