00:10 <Nemo_bis> I've finished reading all the threads in [[Support]] from the beginning of time \o/
00:10 <Nemo_bis> that was about https://translatewiki.net
00:12 <Nemo_bis> Coderjoe: are you sure about it? I'm almost certain that it once worked for me, saving me some GBs of upload
00:56 <chronomex> greetings from HOPE
03:21 <godane> underscor: did you get these: http://www.demonoid.me/files/?uid=3783180&seeded=2
03:22 <godane> imaginefx 49-74 dvds
03:22 <godane> sadly i think it's the contents of the ISO and the ISOs themselves
03:22 <godane> but not sure
03:35 <godane> i'm planning on getting them now
05:36 <Foxhack> Um, hello?
05:37 <Foxhack> Is there anyone who can help me out? I'm the owner of a couple of sites that are listed in the Parodius Networking project.
05:59 <Zod_> channel is a little slow, especially at this time
08:44 <Schbirid> anyone gave planetphillip.com a try? i will try to let it pass
16:33 <godane> uploaded: http://archive.org/details/cdrom-3d-world-118
16:33 <godane> uploaded: http://archive.org/details/cdrom-3d-world-115
17:40 <godane> uploaded: http://archive.org/details/cdrom-3d-world-124
17:43 <ersi> uploaded: http://archive.org/details/cdrom-3d-world-199
17:45 <godane> very funny
17:48 <ersi> :)
18:45 <godane> uploading this now: http://archive.org/details/cdrom-inside-mac-games-jan-feb-1995
18:45 <godane> it's an HFS ISO image
19:39 <godane> finally uploaded: http://archive.org/details/cdrom-inside-mac-games-jan-feb-1995
21:07 <Coderjoe> we don't need the status updates, really
21:08 <underscor> #archiveteam-godane
21:08 <Coderjoe> Nemo_bis: well, I think for it to be able to resume, it would need to do a HEAD request to see what's already been uploaded, and the last time I poked, the s3api did not give the correct size in the HEAD response
21:09 <Coderjoe> (actually, I think it might have been nginx's fault, but I am not sure)
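Coderjoe's HEAD-based resume idea above — ask the server how many bytes it already has, then continue from that offset locally — might be sketched as follows. This is a hypothetical illustration, not code from the channel; the function names and the sanity check on the reported size are assumptions.

```python
import http.client


def reported_size(host, path):
    """HEAD the remote object and return the Content-Length it reports.

    Returns None on a non-200 status or a missing header -- close to the
    failure mode Coderjoe describes, where the s3api (or nginx) did not
    report the correct size.
    """
    conn = http.client.HTTPConnection(host)
    try:
        conn.request('HEAD', path)
        resp = conn.getresponse()
        if resp.status != 200:
            return None
        length = resp.getheader('Content-Length')
        return int(length) if length is not None else None
    finally:
        conn.close()


def resume_offset(remote_size, local_size):
    """Byte offset to resume uploading from.

    Falls back to 0 (full restart) when the reported size is missing or
    not sane, e.g. larger than the local file.
    """
    if remote_size is None or not 0 <= remote_size <= local_size:
        return 0
    return remote_size
```

A caller would then seek to `resume_offset(...)` in the local file and send the remainder — assuming the server can append to a partial object at all, which is exactly what was in doubt here.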
21:09 <underscor> s3 in general doesn't support resume like that, afaik
21:09 <underscor> Anyway, use multipart and send the missing pieces at the end
21:10 <underscor> https://gist.github.com/764224 this is what we use internally for a lot of stuff
21:10 <Coderjoe> well, you have to find out SOMEHOW what the current state is in order to resume.
21:10 <Coderjoe> particularly across multiple runs
21:11 <underscor> Well, yeah. In that case, you have to look and see which pieces are missing, and send those, and then send the multiput finished command
21:11 <underscor> it's not very clean
21:12 <underscor> anyway, if you use that python script, modify line 162 to:
21:12 <underscor> s3 = boto.connect_s3(key, secret, host='s3.us.archive.org', is_secure=False)
21:12 <underscor> and it will handle automatic retries and stuff
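The scheme underscor describes — list which parts the server already has, send only the missing ones, then issue the complete ("multiput finished") call — could look roughly like this with boto 2.x, the library the gist above uses. This is a hypothetical sketch, not the gist's actual code; the part size, the argument names, and the lookup that reuses an interrupted upload are assumptions.

```python
import math
import os

PART_SIZE = 100 * 1024 * 1024  # hypothetical part size; pick what suits the file


def missing_parts(uploaded, total):
    """Part numbers (1-based) that still have to be sent."""
    return sorted(set(range(1, total + 1)) - set(uploaded))


def resume_multipart(key, secret, bucket_name, key_name, path):
    """Resume an interrupted multipart upload to the IA S3 endpoint."""
    import boto  # boto 2.x, as in the gist above

    # underscor's suggested connection line (his "line 162" change)
    s3 = boto.connect_s3(key, secret, host='s3.us.archive.org', is_secure=False)
    bucket = s3.get_bucket(bucket_name)

    total = int(math.ceil(os.path.getsize(path) / float(PART_SIZE)))

    # Reuse the interrupted multipart upload for this key, if one exists
    mp = next((u for u in bucket.get_all_multipart_uploads()
               if u.key_name == key_name), None)
    if mp is None:
        mp = bucket.initiate_multipart_upload(key_name)

    done = [p.part_number for p in mp]  # parts the server already has
    with open(path, 'rb') as fp:
        for n in missing_parts(done, total):
            fp.seek((n - 1) * PART_SIZE)
            size = min(PART_SIZE, os.path.getsize(path) - (n - 1) * PART_SIZE)
            mp.upload_part_from_file(fp, part_num=n, size=size)

    mp.complete_upload()  # the "multiput finished" command
```

Keeping `missing_parts` as a separate pure function makes the resume bookkeeping trivially testable, while everything network-facing stays in `resume_multipart`.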
21:13 <Nemo_bis> Coderjoe: yes, dunno
21:13 <Schbirid> i am retrying planetphillip :(
21:28 <Coderjoe> meh
21:28 <Coderjoe> I was not aware that amazon had to make multipart stupidly more complex
21:36 <Coderjoe> well, I suppose this makes sense for multipart
21:36 <Coderjoe> but for resuming, without parallel uploads, multipart is stupidly complex
21:43 <underscor> yeah
21:43 <underscor> :(
22:42 <omf_> shit I was working on planetphillip too. Duplicate effort. I will stop until Schbirid gets back