00:00 -- fie_ has joined #archiveteam-bs
00:02 -- fie has quit IRC (Read error: Operation timed out)
00:08 -- tomwsmf-a has quit IRC (Read error: Operation timed out)
00:16 -- w0rp has quit IRC (Read error: Operation timed out)
00:20 -- fie_ has quit IRC (Quit: Leaving)
00:21 -- fie has joined #archiveteam-bs
00:21 -- w0rp has joined #archiveteam-bs
00:31 -- Stiletto has joined #archiveteam-bs
00:31 -- Fletcher_ has joined #archiveteam-bs
00:50 <SketchCow> http://fos.textfiles.com/ARCHIVETEAM/ has proper timestamps now
00:52 -- FalconK has quit IRC (Ping timeout: 260 seconds)
00:53 -- FalconK has joined #archiveteam-bs
01:00 -- JesseW has joined #archiveteam-bs
02:01 -- Fletcher_ has quit IRC (Ping timeout: 250 seconds)
02:01 -- koon has quit IRC (Ping timeout: 250 seconds)
02:14 <godane> SketchCow: the first 3 links on that page have tarballs that are 0 bytes
02:23 -- Coderjoe_ has joined #archiveteam-bs
02:25 -- Coderjoe has quit IRC (Read error: Operation timed out)
02:28 <SketchCow> Yes
02:28 <SketchCow> That's a thing I will fix soon.
02:28 <SketchCow> The system has always made zero-length tarballs.
02:29 <SketchCow> Now that I've centralized the upload script, I will put a check in.
02:31 <SketchCow> Line added. Thanks for the tip.
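A guard of the kind SketchCow describes can be as small as a size check before the upload step. The sketch below is not the actual FOS upload script; the command-line usage and the idea of bailing out before upload are illustrative assumptions.

```python
import os
import sys

def pack_is_sane(path):
    """Refuse to upload zero-length (or missing) tarballs."""
    try:
        return os.path.getsize(path) > 0
    except OSError:
        return False

if __name__ == "__main__":
    # Hypothetical usage: check every pack passed on the command line.
    bad = [p for p in sys.argv[1:] if not pack_is_sane(p)]
    if bad:
        print("refusing to upload zero-length packs:", ", ".join(bad))
        sys.exit(1)
```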
02:39 <godane> i'm up to 2015-10-31 with kpfa
02:40 <godane> so that weight of kpfa is mostly off my shoulders now
02:40 <godane> and we don't have to make a bot to grab it either
02:51 -- Fletcher_ has joined #archiveteam-bs
02:51 -- koon has joined #archiveteam-bs
02:55 -- fie_ has joined #archiveteam-bs
02:56 -- bwn has quit IRC (Read error: Operation timed out)
03:00 -- fie has quit IRC (Read error: Operation timed out)
03:09 -- bsmith094 has quit IRC (Ping timeout: 190 seconds)
03:10 -- Yoshimura has quit IRC (Ping timeout: 190 seconds)
03:16 -- fie__ has joined #archiveteam-bs
03:16 -- fie_ has quit IRC (Read error: Connection reset by peer)
03:22 <godane> SketchCow: looks like Hard Knock Radio is encoded at 256 kbps from at least 2015-10 on
03:31 -- koon has quit IRC (Ping timeout: 250 seconds)
03:32 -- Fletcher_ has quit IRC (Ping timeout: 250 seconds)
03:43 <dashcloud> Here's an interesting DOS game I enjoyed playing before: https://archive.org/details/SeaRogue (Underwater archeology & treasure hunting)
04:04 -- Sk1d has quit IRC (Ping timeout: 194 seconds)
04:13 -- Sk1d has joined #archiveteam-bs
04:18 -- Yoshimura has joined #archiveteam-bs
04:21 -- koon has joined #archiveteam-bs
04:22 -- Fletcher_ has joined #archiveteam-bs
04:34 -- dashcloud has quit IRC (Read error: Operation timed out)
04:37 -- dashcloud has joined #archiveteam-bs
05:00 -- Sk1d has quit IRC (Ping timeout: 194 seconds)
05:06 -- Sk1d has joined #archiveteam-bs
05:06 -- wyatt8740 has quit IRC (Read error: Operation timed out)
05:22 -- wyatt8740 has joined #archiveteam-bs
05:30 -- decay_ has quit IRC (Read error: Operation timed out)
05:33 -- SN4T14 has quit IRC (Read error: Operation timed out)
05:39 -- decay has joined #archiveteam-bs
05:43 -- SN4T14 has joined #archiveteam-bs
05:47 -- metalcamp has joined #archiveteam-bs
05:57 -- metalcamp has quit IRC (Ping timeout: 244 seconds)
06:01 -- bwn has joined #archiveteam-bs
06:03 <godane> SketchCow: so i'm uploading more dvds to your FOS server
06:04 <Ctrl-S___> anyone here who worked on this? (doing an incremental grab, starting where you guys left off. Had to write things that were missing, like the discovery stuff) https://github.com/ArchiveTeam/furaffinity-grab
06:22 -- GLaDOS has quit IRC (Quit: Oh crap, I died.)
06:27 -- GLaDOS has joined #archiveteam-bs
06:32 -- bsmith094 has joined #archiveteam-bs
06:58 <SketchCow> Great
07:00 -- vitzli has joined #archiveteam-bs
07:01 <vitzli> I saw that article about web scraping, I think I can do a better flowchart: http://i.imgur.com/jB7qlvr.png
07:27 -- Honno has quit IRC (Read error: Operation timed out)
07:32 -- schbirid has joined #archiveteam-bs
07:33 -- metalcamp has joined #archiveteam-bs
07:39 <yipdw_> vitzli: nice
07:39 <yipdw_> missing some e's in the IRC channel names, though :P
07:40 <vitzli> it's not a final version, I can change anything
07:41 <yipdw_> neat
07:42 <vitzli> too much copy-paste :(
07:49 <JesseW> vitzli: I thought your flowchart was going to be sarcastic -- but it's actually useful.
07:51 <vitzli> It's a little bit sarcastic and I could add more
07:52 <JesseW> vitzli: "Check the Deathwatch page" -> "Check (and, if missing, add it to) the Deathwatch page"
07:52 <JesseW> both? Both is good.
07:53 <JesseW> BTW, I'm starting a new IA census, this time with an up-to-date list of about 19 million identifiers.
07:53 <JesseW> Currently done about a million of them.
07:54 <vitzli> Are you going to keep sha1s?
07:54 <davidar> Does the site refuse to provide a useful API? -> Yes -> Scrape it.
07:54 <JesseW> yes, I'm keeping sha1s this time.
07:54 <JesseW> eh, scrape it whether it provides an API or not.
07:55 <JesseW> WARCs are *always* good to have, if at all possible.
07:55 <vitzli> I can keep raw results
07:55 <davidar> JesseW: well, by "scrape" I think they're talking about extracting structured data from unstructured documents
07:55 <JesseW> ah, fair point
07:56 <davidar> obviously archiving raw data is useful no matter what
07:56 <JesseW> But I'd say, grab WARCs *then* think about how to extract structure from them. :-)
07:56 <davidar> agreed
07:57 <JesseW> I'd say the only time *not* to make WARCs is if the site is too large/dynamic, and even in that case, grab WARCs of a representative sample.
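One way to follow the "grab WARCs first, extract structure later" advice from inside a scraper (a sketch of my own, not anything the channel is necessarily running, and assuming the warcio and requests libraries) is warcio's capture_http wrapper, which records the HTTP traffic of the enclosed requests into a WARC file:

```python
from warcio.capture_http import capture_http
import requests  # requests must be imported after capture_http so its traffic is captured

# Everything fetched inside the block is written to the WARC; parse/extract afterwards.
with capture_http("scrape-sample.warc.gz"):
    resp = requests.get("http://example.com/")  # placeholder URL
    # ... feed resp.text to whatever structured-data extraction you like ...
```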
07:57 <vitzli> I hope to upload the hash archive into the IA this week
07:57 <SketchCow> I decided to go ahead and go the whole way, and there's now one directory for FOS pipelines, and therefore one script that runs that says "push out all the new FOS packs".
07:57 <JesseW> Also, if the site is too aggressive with, say, CAPTCHAs or other "we insist that you have a human sitting in front of a browser before our server will talk to you" measures -- in that case, use something like webrecorder.io
07:58 <SketchCow> And that will add to the log, etc.
07:58 <JesseW> great!
07:58 <JesseW> vitzli: which hash archive?
07:58 <vitzli> the one I did
07:58 <JesseW> ah, cool
07:58 <vitzli> (md5, sha1, sha256)
07:58 <vitzli> plus all the other stuff it calculated
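For reference, computing the md5/sha1/sha256 triple vitzli mentions in a single pass over each file looks roughly like this. This is a generic sketch, not vitzli's actual tooling, and the filename is hypothetical.

```python
import hashlib

def file_hashes(path, chunk_size=1 << 20):
    """Return (md5, sha1, sha256) hex digests, reading the file only once."""
    digests = [hashlib.md5(), hashlib.sha1(), hashlib.sha256()]
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            for d in digests:
                d.update(chunk)
    return tuple(d.hexdigest() for d in digests)

print(file_hashes("some-file.iso"))  # hypothetical filename
```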
08:00 <JesseW> do add links to it from http://archiveteam.org/index.php?title=Internet_Archive_Census once you upload it
08:00 <davidar> JesseW: actually, it would be cool if https://morph.io/ (spiritual successor to scraperwiki) proxied everything through something that archived WARCs by default, rather than just throwing away the sources after extracting the data from them
08:01 <vitzli> it's not census or IA-related stuff, just a bunch of hashes I could get my hands on
08:01 <vitzli> some of it got to the IA or was downloaded from IA
08:02 <JesseW> davidar: yes, that would be much better
08:02 <JesseW> vitzli: in that case, pass it along to Ben Trask https://github.com/btrask
08:02 <JesseW> he likes hashes :-)
08:03 <vitzli> cool, will do, thank you
08:07 * davidar used to be involved in scraping stuff many years ago, but not so much anymore
08:10 <bwn> JesseW: do you want some help running through some of those?
08:12 <JesseW> bwn: some of what?
08:12 <bwn> sorry, you mentioned you were starting a census
08:14 <JesseW> Ah, those.
08:14 <JesseW> If you'd like to work on the 600,000 identifiers that were in the last list, but aren't in the current one, that'd be welcome.
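Finding the identifiers that dropped out between two census lists is just a set difference. A minimal sketch, with hypothetical filenames standing in for the old and new identifier lists:

```python
def load_ids(path):
    """One identifier per line; ignore blanks."""
    with open(path) as f:
        return {line.strip() for line in f if line.strip()}

old_ids = load_ids("census-2015-identifiers.txt")  # hypothetical filename
new_ids = load_ids("census-2016-identifiers.txt")  # hypothetical filename

# Identifiers present in the last census but missing from the current list.
missing = sorted(old_ids - new_ids)
print(len(missing), "identifiers to re-check")
```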
08:14 <JesseW> I'll need to walk you through getting set up, though.
08:14 <JesseW> And I should probably head to sleep sooner than later...
08:15 <bwn> absolutely, whenever you get some time
08:15 <vitzli> JesseW, Is Ben Trask on IRC?
08:16 <JesseW> vitzli: IDK -- I haven't seen him on here, no.
08:16 <vitzli> ok, will email him
08:17 <JesseW> bwn: Do you have a unix system available?
08:18 <SketchCow> Are there stairs in your house
08:18 <bwn> yes and yes :)
08:19 <JesseW> You'll need to install GNU parallel, jq and iamine.
08:19 <JesseW> the first two should be available from your distro (and jq is a standalone binary so that's easy)
08:21 <JesseW> iamine you can get from https://archive.org/download/iamine-pex (I used ia-mine-0.5-py3.4.pex because I'm on py3.4)
08:32 <JesseW> oh, you'll also need pv for progress display
08:32 <JesseW> technically optional, but neat
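JesseW's actual setup is the iamine / jq / GNU parallel / pv pipeline described above. As a rough illustration of the same job (fetch each item's metadata and keep the per-file sha1s), here is a sketch using the `internetarchive` Python library instead; that substitution is my own assumption, not the census tooling.

```python
import csv
import sys

from internetarchive import get_item  # pip install internetarchive

def dump_sha1s(identifier_file, out_csv):
    """For each IA identifier, record (identifier, file name, sha1) from its metadata."""
    with open(identifier_file) as ids, open(out_csv, "w", newline="") as out:
        writer = csv.writer(out)
        for identifier in (line.strip() for line in ids):
            if not identifier:
                continue
            item = get_item(identifier)  # one metadata API call per item
            for f in item.item_metadata.get("files", []):  # per-file metadata dicts
                writer.writerow([identifier, f.get("name"), f.get("sha1")])

if __name__ == "__main__":
    dump_sha1s(sys.argv[1], sys.argv[2])
```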
08:37 <JesseW> It's made it to the C's!
08:44 -- JesseW has quit IRC (Ping timeout: 370 seconds)
09:34 -- bwn has quit IRC (Ping timeout: 492 seconds)
09:34 -- metalcamp has quit IRC (Ping timeout: 244 seconds)
09:39 -- metalcamp has joined #archiveteam-bs
09:47 -- bwn has joined #archiveteam-bs
10:01 -- dashcloud has quit IRC (Read error: Operation timed out)
10:05 -- dashcloud has joined #archiveteam-bs
10:27 -- metalcamp has quit IRC (Ping timeout: 244 seconds)
10:54 -- dashcloud has quit IRC (Read error: Operation timed out)
10:58 -- dashcloud has joined #archiveteam-bs
11:17 -- metalcamp has joined #archiveteam-bs
12:03 -- metalcamp has quit IRC (Ping timeout: 244 seconds)
12:33 -- Honno has joined #archiveteam-bs
13:42 -- VADemon has joined #archiveteam-bs
14:13 -- Start has quit IRC (Quit: Disconnected.)
14:29 <SketchCow> http://fos.textfiles.com/ARCHIVETEAM/ is now run automatically; all the uploads happen 24/7 without my intervention, so whenever the item threshold is hit, they get packed up and uploaded into their collections. Automatically.
14:34 <Honno> Hey, I've got a secondary internal HDD now, but it takes like 3 minutes to load a page using wayback (with pywb). I'm sure I'm using the necessary index files and have the directories sorted the correct way, since that was already checked out before regarding a different issue
14:34 <Yoshimura> What are the identifiers? What are they for? Can I help with anything?
14:37 <Yoshimura> Also a second question: how do I work (warrior) on the FTP project?
14:54 -- Start has joined #archiveteam-bs
15:06 -- metalcamp has joined #archiveteam-bs
15:14 <VADemon> Yoshimura: As far as I know the FTP project is NOT for warriors
15:14 <Yoshimura> Oh. Why not? I feel like it's a waste of time, it merely does 5GB a day.
15:15 <Yoshimura> Which is a lot with web pages, but not if half the connections stall.
15:16 <Yoshimura> Q about Archive.org: how do I upload warcs or stuff onto it myself? "Please contribute books, audio, and video files that you have the right to share." ... Does that mean I cannot archive valuable stuff that is publicly on the internet and not used to make money?
15:18 <Yoshimura> I tried to switch to Google Code, but without a complete shutdown it does not switch, and a single task has been running 15 hours. I just do not get how regular people can help, if it's not money, running an (often) sluggish warrior instance, or providing code for something they have no idea how it works and which maybe does not even have a public repository.
15:19 <atrocity> http://www.gearthblog.com/blog/archives/2016/04/big-google-earth-database.html
15:20 <atrocity> found that interesting
15:21 <Yoshimura> tldr: 'bout 3TB
15:22 <Yoshimura> *3PB
15:26 <HCross> Could we archive google earth/maps?
15:26 <Yoshimura> IDK if it would make sense.
15:26 <arkiver> HCross: yes.
15:27 <alfie> HCross: I'm not sure google would be happy with us ;) but yes, it would be a good idea
15:27 <arkiver> If we want to, I'll create a project
15:27 <vitzli> for personal use? There was a program that pulls the maps and stores them in local storage
15:27 <arkiver> But I'm quite sure that's not going to happen anytime soon
15:28 <HCross> I was only asking "if" it was possible
15:29 -- signius has quit IRC (Read error: Operation timed out)
15:29 <HCross> alfie, we already have our fair share of google employees who don't like us :P
15:29 <alfie> HCross: it's only fair, i don't like them either :P
15:29 <Yoshimura> A company has no problem defending its copyright; a person has to give up his personal information in order to defend himself. A company has no problem defending itself against copyright violations by blaming them on its users via the ToS. A person has no way to get out. ... I have no idea how I should proceed in getting information on what I can or cannot
15:30 <Yoshimura> do in terms of preserving knowledge that is currently publicly accessible for free, but might cease to be in the future. If anyone has any related links, they would be appreciated.
15:30 <SketchCow> DO NOT ARCHIVE GOOGLE MAPS
15:31 <SketchCow> DO NOT ARCHIVE GOOGLE MAPS
15:31 <Yoshimura> Q: Is there a way to get into the FTP project or another, higher-bandwidth one?
15:32 <Yoshimura> Btw, anything running on Parse Server might go out of business, unless they migrate by January.
15:33 <vitzli> technically it is possible, for example SAS.Планета / SAS.Planet / SASPlanet downloads chunks from kh.google servers - it can combine them into large .jpg maps, and different zoom levels are possible, from 1 (entire world) to 18 (human shadows are visible)
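Some back-of-the-envelope arithmetic on why a full imagery grab is hopeless, assuming a standard XYZ pyramid of 256x256 tiles (Google's internal tiling may differ, so the numbers are only illustrative):

```python
# At zoom level z, a standard web-map pyramid has 2**z x 2**z tiles per layer.
for z in (1, 10, 18):
    tiles = (2 ** z) ** 2
    print(f"zoom {z}: {tiles:,} tiles")

# Zoom 18 alone is ~6.9e10 tiles; at even ~10 KB per tile that is hundreds of
# terabytes for a single layer, before historical imagery or higher zoom levels.
```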
15:34 <alfie> vitzli: depends if by "archiving google maps" we mean the imagery or the maps data
15:34 <vitzli> http://www.sasgis.org/ - can't find the english version of the website
15:35 <arkiver> SketchCow: don't worry, we won't
15:35 <HCross> ^ Was only being hypothetical
15:35 <arkiver> :P
15:35 <Frogging> I think he doesn't want you to archive google maps
15:36 <arkiver> that's exactly what he said
15:36 <Frogging> :p
15:38 <midas> soo, archive bing maps then?
15:38 * midas hides
15:39 <phuzion> lol
15:39 <HCross> Yahoo maps - if they are a thing
15:40 <SketchCow> By the way, that music bootleg site CONTINUES to download.
15:40 <midas> slow and steady wins the race, again
15:41 <phuzion> I was going to say, as an alternative to archiving Google Maps or Bing Maps, maybe we could look at grabbing OSM's datasets, but it's already on IA, and relatively current at that.
15:41 <phuzion> Granted, OSM doesn't have satellite imagery or anything like that
15:42 -- signius has joined #archiveteam-bs
15:44 <Frogging> Why is fotolog taking so long? Do you need more warriors or is it just the rate limiting?
15:45 <arkiver> They're website can't handle more load
15:45 <arkiver> err
15:45 <arkiver> their website*
15:46 <Frogging> ah
15:46 <Yoshimura> Rate limit, yes, it's nonstop service overload.
15:47 <midas> arkiver: fotolog.com right?
15:47 <arkiver> yes
15:49 <midas> the pictures we are grabbing, are we grabbing them via the cloudflare service or directly?
15:49 <arkiver> just like they are on the page
15:49 <midas> k
15:55 -- jut has joined #archiveteam-bs
16:06 -- Start has quit IRC (Quit: Disconnected.)
16:08 -- VADemon has quit IRC (Quit: left4dead)
16:23 -- JesseW has joined #archiveteam-bs
16:25 -- jut has quit IRC (Ping timeout: 250 seconds)
16:51 -- JesseW has quit IRC (Ping timeout: 370 seconds)
17:06 <JW_work> Yoshimura: URLTeam can always use people investigating new shorteners: see the wiki page for URLTeam for details.
17:07 -- dashcloud has quit IRC (Read error: Operation timed out)
17:07 <xmc> yes, urlteam is an endless sink of only semi-automatable labor
17:11 -- dashcloud has joined #archiveteam-bs
17:13 -- SimpBrain has joined #archiveteam-bs
17:21 <SketchCow> Yes, urlteam could use JesseW's "intense" attention.
17:23 * Yoshimura will look into that
17:24 <SketchCow> Jesse should.
17:33 <schbirid> SketchCow> DO NOT ARCHIVE GOOGLE MAPS <- i would LOVE it if someone did so for selected regions
17:34 <JW_work> SketchCow: I already keep an eye on URLTeam. Were you talking about Yoshimura?
17:34 <xmc> i think it was a typo yes
17:35 <JW_work> hopefully.
18:04 -- Start has joined #archiveteam-bs
18:05 -- vitzli has quit IRC (Leaving)
18:07 -- Start has quit IRC (Read error: Connection reset by peer)
18:08 -- Start has joined #archiveteam-bs
18:10 -- Start has quit IRC (Read error: Connection reset by peer)
18:56 -- dashcloud has quit IRC (Read error: Operation timed out)
19:00 -- dashcloud has joined #archiveteam-bs
19:06 -- Start has joined #archiveteam-bs
19:07 <SketchCow> No, I mean you
19:07 <SketchCow> I want that thing singing
19:10 <JW_work> :-P
19:24 -- ohhdemgir has quit IRC (Remote host closed the connection)
19:33 -- bwn has quit IRC (Ping timeout: 246 seconds)
19:36 <schbirid> http://blog.dshr.org/2016/04/brewster-kahles-distributed-web-proposal.html
19:38 <Yoshimura> If there is something more specific I can do in terms of urlteam, let me know.
19:39 <Yoshimura> I tried finding an API for writing the settings. And my warrior for urlteam does not have enough work
19:40 <JW_work> Yoshimura: come over to #urlteam and I'll try to clarify
19:43 -- Start has quit IRC (Quit: Disconnected.)
19:47 -- SimpBrai1 has joined #archiveteam-bs
19:47 -- SimpBrai1 has quit IRC (Read error: Connection reset by peer)
19:59 -- schbirid has quit IRC (Quit: Leaving)
20:05 -- bwn has joined #archiveteam-bs
20:05 -- dashcloud has quit IRC (Read error: Operation timed out)
20:06 -- dashcloud has joined #archiveteam-bs
20:15 <atrocity> i just wish we could have the warrior join more than one project at once without having to launch another VM
20:15 <atrocity> like make the # of projects = the number of concurrent connections or whatever
20:16 <JW_work> atrocity: I've wanted the same thing -- I just haven't gotten around to learning (and setting up a test environment for) the warrior code enough to implement it.
20:22 <Yoshimura> Btw, a better way to purge the data drive: set the regular disk to Writethrough and use a snapshot
20:22 <Yoshimura> So you have a "Clean start" snapshot.
20:23 <Yoshimura> atrocity: Make a VM and use multiple dockers :P
20:24 <Yoshimura> I might work on the warrior code or something too, but everything takes time, learning how stuff works.
20:50 -- metalcamp has quit IRC (Ping timeout: 244 seconds)
20:50 -- toad2 has joined #archiveteam-bs
20:51 -- toad1 has quit IRC (Read error: Operation timed out)
20:58 <atrocity> yeah, i won't even spend the time, haha! just working on too many other projects atm that require a lot of time/effort
21:00 -- Yoshimura has quit IRC (http://www.kiwiirc.com/ - A hand crafted IRC client)
21:01 -- Yoshimura has joined #archiveteam-bs
21:02 -- Honno has quit IRC (Read error: Operation timed out)
21:13 <godane> so i got Mighty Morphin' Power Rangers Host TMNT 2 on Fox
21:14 <godane> i also figured out the date was 1993-11-26 when it aired
21:14 <godane> Black Friday Night
21:35 -- Yoshimura has quit IRC (http://www.kiwiirc.com/ - A hand crafted IRC client)
21:39 <atrocity> lol
22:07 -- Start has joined #archiveteam-bs
22:41 -- BlueMaxim has joined #archiveteam-bs
22:57 -- Yoshimura has joined #archiveteam-bs
23:18 -- Yoshimura has quit IRC ()
23:34 -- Jonimus has quit IRC (Read error: Operation timed out)
23:37 -- Yoshimura has joined #archiveteam-bs
23:46 -- Mayonaise has quit IRC (Read error: Operation timed out)
23:48 -- SimpBrain has quit IRC (Ping timeout: 633 seconds)
23:48 -- SimpBrain has joined #archiveteam-bs
23:50 -- kvieta has quit IRC (Ping timeout: 633 seconds)
23:50 -- kvieta has joined #archiveteam-bs
23:56 -- Mayonaise has joined #archiveteam-bs