12:52 <emijrp> script for downloading all wikipedia pages http://code.google.com/p/wikiteam/source/browse/trunk/wikipediadownloader.py 100GB needed
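A downloader like the one linked above mostly amounts to iterating over every language edition and fetching its full-history XML dump. The sketch below is not the linked wikipediadownloader.py, just a minimal illustration of that idea, assuming the public dumps.wikimedia.org URL layout for `pages-meta-history` dumps:

```python
# Illustrative sketch only -- not the actual wikipediadownloader.py script.
# Builds the canonical Wikimedia dump URL for each language edition; a real
# downloader would then fetch each URL (each file can be many gigabytes).

DUMP_HOST = "https://dumps.wikimedia.org"

def dump_url(lang: str, date: str = "latest") -> str:
    """Return the full-history XML dump URL for one Wikipedia edition."""
    db = f"{lang}wiki"  # e.g. "enwiki" for the English Wikipedia
    return f"{DUMP_HOST}/{db}/{date}/{db}-{date}-pages-meta-history.xml.bz2"

# A bulk downloader iterates over every language code:
for lang in ["en", "de", "es"]:
    print(dump_url(lang))
```

Only the text is this small ("100GB needed" above); the media files discussed next are far larger and have no comparable official dump.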
13:28 <soultcer> I don't think the wikipedia texts are endangered that much
13:28 <soultcer> But the images/media files are
13:29 <soultcer> There is no offsite backup, and the only safety net is a RAID array
13:42 <emijrp> there is no official mirror for the texts
14:00 <emijrp> about the images... they resync the images to an external server
14:00 <emijrp> but yep, the images issue is real
14:01 <emijrp> i think we have to split the images into chunks by upload date
14:02 <emijrp> and make incremental backups
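The chunking idea in the last two messages can be sketched briefly. Grouping files by upload month means past chunks never change, so each backup run only has to fetch the chunks for months with new uploads. This is a minimal illustration with hypothetical names, not an existing wikiteam tool:

```python
# Hypothetical sketch of "split the images into chunks by upload date":
# group (filename, upload_date) pairs into one chunk per calendar month.
# Closed months are immutable, so an incremental backup only re-fetches
# the chunks for months that gained new files since the last run.
from collections import defaultdict
from datetime import date

def chunk_by_upload_month(files):
    """Map (year, month) -> list of filenames uploaded in that month."""
    chunks = defaultdict(list)
    for name, uploaded in files:
        chunks[(uploaded.year, uploaded.month)].append(name)
    return dict(chunks)

uploads = [
    ("Foo.jpg", date(2011, 3, 5)),
    ("Bar.png", date(2011, 3, 20)),
    ("Baz.svg", date(2011, 4, 1)),
]
print(chunk_by_upload_month(uploads))
```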