13:08 <phuzion> Anyone know if there's a method to handle sites that ban you after downloading a very small handful of pages? I'm getting anywhere between 5 and 15 downloads before getting a 503 error.
13:08 <midas> prolonged waiting time
13:08 <midas> which wiki are you grabbing?
13:08 <phuzion> encyclopedia dramatica
13:09 <phuzion> https://encyclopediadramatica.se
13:09 <midas> yeah prolonged time
13:09 <phuzion> what's the option for that? --delay?
13:09 <midas> yep
13:10 <phuzion> I'm getting stuck on XML dumps. It's not delaying between each page, which I think is what I'm getting 503'd for
13:18 <phuzion> midas: I'm trying a totally new dump with --delay=1 for now, let's see if that does anything.
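The approach being tried here (a polite pause between page requests, plus backing off when the server answers 503) can be sketched in Python. `fetch_with_delay` and its `fetch` callback are hypothetical illustrations of the idea, not wikiteam's actual code:

```python
import time

def fetch_with_delay(urls, fetch, delay=1.0, max_retries=3):
    """Fetch each URL with a polite pause between requests,
    backing off exponentially whenever the server answers 503."""
    results = []
    for url in urls:
        wait = delay
        for attempt in range(max_retries):
            status, body = fetch(url)
            if status != 503:
                results.append(body)
                break
            time.sleep(wait)   # server is throttling us: back off before retrying
            wait *= 2
        time.sleep(delay)      # pause between pages even on success
    return results
```

The key point matching the conversation above is the sleep *between* pages, not just on errors; a dump run that hammers every page back-to-back is what tends to trigger the 503 bans after a handful of requests.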
13:18 <phuzion> Also, getting a new copy of iphonewiki
13:31 <phuzion> midas: Seems to be working. Got the XML of 30 pages so far.
14:39 <Nemo_bis> being nice often helps!