#urlteam 2012-10-30, Tue

Time Nickname Message
02:12 🔗 bsmith094 woop woop woop, life check siren chronomex ersi soultcer S[h]O[r]T joepie91
02:13 🔗 chronomex what hi
02:13 🔗 bsmith094 where are the files generated by the scraper being put, because i see tons of activity, but no file creation
02:13 🔗 bsmith094 im using this screen -SL grab ./run.py --tracker=http://tracker.tinyarchive.org/v1/ --sleep=20 -n 20 --temp-dir=./data --username=bsmith093 -d -c
02:29 🔗 S[h]O[r]T im alive
02:29 🔗 S[h]O[r]T no idea tho
06:33 🔗 ersi bsmith094: I dunno.. I don't even know if it generates any files
06:34 🔗 bsmith094 ersi: so where is the data going, is it just updating a remote file?
06:35 🔗 ersi it's motherfuckin' links, I guess you could keep them in RAM and then just fart something out
06:35 🔗 ersi directly
06:35 🔗 ersi check the source man
06:36 🔗 ersi it's python, should hopefully be readable :)
08:24 🔗 soultcer bsmith094: It writes them to a temporary file, but
08:24 🔗 soultcer a) the files are very small, since the limit is not bandwidth, but those damn shorteners that keep rate-limiting our efforts
08:25 🔗 soultcer b) the file is uploaded and deleted as soon as it is finished (one task takes about 5 minutes)
08:25 🔗 soultcer c) The file is unlinked directly after creation, so while it does exist and takes up space, it won't show up in any directory listing
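The unlink-after-create trick soultcer describes in (c) is a common POSIX pattern: delete the file's name right away and keep writing through the still-open handle, so the data occupies disk space but never appears in a directory listing. A minimal Python sketch of the idea (illustrative only, not the scraper's actual code; the sample contents are made up):

```python
# Sketch of the unlink-after-create pattern described above.
import os
import tempfile

# Create a temp file (the scraper's --temp-dir would be passed as dir=...),
# then remove its name immediately. The open file descriptor keeps the
# data alive even though no directory entry points to it.
fd, path = tempfile.mkstemp()
os.unlink(path)

with os.fdopen(fd, "w+b") as results:
    results.write(b"http://short.example/abc|http://long.example/page\n")
    results.seek(0)
    payload = results.read()  # read back, e.g. for upload to the tracker
# Once the handle is closed, the inode is freed and the data is gone.
```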
09:35 🔗 ersi Which are all totally fine btw
22:58 🔗 soultcer Haha I think kl.am is not able to handle the load
23:03 🔗 soultcer UPDATE task SET status = 'free' WHERE id IN (SELECT id FROM task WHERE status = 'paused' AND service_id = 6 LIMIT MIN(5, 5 - (SELECT COUNT(*) FROM task WHERE service_id = 6 AND (status = 'assigned' OR status = 'free'))));
23:03 🔗 soultcer Ghetto rate limiting
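In rough terms, that statement un-pauses at most enough kl.am tasks (service_id = 6) to keep the combined count of 'free' and 'assigned' tasks for that service at five. A Python/sqlite3 sketch of the same throttling idea (the database file name is an assumption, and this is not the tracker's actual code):

```python
# Sketch of the "ghetto rate limiting" logic: only release paused tasks
# while there is headroom below a small cap of active tasks per service.
import sqlite3

MAX_ACTIVE = 5
SERVICE_ID = 6  # kl.am

conn = sqlite3.connect("tracker.db")  # hypothetical database file

# Count tasks currently handed out or waiting to be handed out.
active = conn.execute(
    "SELECT COUNT(*) FROM task WHERE service_id = ? "
    "AND status IN ('assigned', 'free')",
    (SERVICE_ID,),
).fetchone()[0]

headroom = max(0, MAX_ACTIVE - active)
if headroom:
    # Un-pause at most `headroom` tasks, capping active work at MAX_ACTIVE.
    conn.execute(
        "UPDATE task SET status = 'free' WHERE id IN ("
        "  SELECT id FROM task WHERE status = 'paused' AND service_id = ?"
        "  LIMIT ?)",
        (SERVICE_ID, headroom),
    )
    conn.commit()
```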
