01:19 <chronomex> I'm marshaling data around to create a Symbian torrent
01:28 <DFJustin> heh cool just downloaded a google group with a bunch of occult manuals
01:46 <bsmith093> DFJustin: sharing it?
01:48 <DFJustin> https://groups.google.com/group/enigmatic-occultism
01:48 <bsmith093> cause i have something called the occult collection, which, ( ive looked through it) is apparently every weird theory you could possibly imagine. you guys want it?
01:51 <bsmith093> holy crap is google groups closing down
01:51 <DFJustin> just the files and pages feature
01:51 <DFJustin> you can pitch in here http://www.archiveteam.org/index.php?title=Google_Groups_Files
01:51 <underscor> 0 9:51PM:alex@alex-desktop:/media/temp2/friendster-scrape 4996 Ï ps aux|wc -l
01:51 <underscor> 1341
01:52 <underscor> Way too many processes :(
01:52 <DFJustin> I have about 100gb of google groups files now
01:52 <underscor> 2011-06-26 21:52:25 (21.4 MB/s) - `/tmp/ggroups-grpname.17069' saved [115/115]
01:52 <underscor> Downloading boys-sex-club-6
01:52 <underscor> Downloading pages...
01:52 <underscor> I don't even want to know
01:53 <BlueMax> :<
02:30 <db48x> lol
02:30 <db48x> so the script for downloading Jamendo content has dissapeared
04:20 <underscor> Error downloading wet-my-pants-pages.zip
04:20 <underscor> :<
04:51 <underscor> hahah oh god
04:51 <underscor> http://pastebin.com/yecBUsMx
04:51 <BlueMax> "Let the knowledge flow through you..."
06:52 <godane> hey
06:53 <godane> i hope we can backup the cloud easyer with this: http://en.wikipedia.org/wiki/Holographic_Versatile_Disc
06:54 <godane> 6tb on to a dvd can help save the internet :-D
06:56 <perfinion> underscor: waaat. someone doesnt understand what dns is ...
07:04 <db48x> godane: :)
07:05 <godane> i just read that there not coming out until 2019
07:05 <godane> i was hope at least 2013
07:05 <perfinion> that sucks
07:05 <db48x> ah, well
07:05 <godane> this way the disc would be close to size of the hard drives
07:06 <db48x> there was an interesting article a while back about the possibility of a storage device using carbon nanotubes that they estimated could hold data uncorrupted for a billion years
07:06 <godane> http://ns1758.ca/winch/winchest.html
07:06 <db48x> each nanotube had a tiny iron nanoparticle in side of it
07:07 <db48x> an electric field could shuttle the iron particle back and forth to encode a 1 or a 0
07:07 <godane> cost of hard drives over the year
07:07 <godane> *years
07:08 <godane> if you go down the list hard drives were only about 21gb at the most
07:08 <godane> at the end of 1999
07:08 <godane> dvd recorders started to come out around there
07:09 <db48x> don't really like that graph; too much chartjunk
07:09 <db48x> I like the data it shows though :)
07:11 <db48x> good data collection too
07:12 <db48x> good find there
07:14 <godane> with way hard drives have slow grow it maybe 2019 by the time we have 20tb hard drive to need hvd discs
07:16 <godane> http://www.pcworld.com/article/128400/hitachi_introduces_1terabyte_hard_drive.html
07:16 <godane> it looks we should be getting to 400tb by now
07:17 <godane> thats only cause it took 35 years for 1gb (1991) then 16 years to get to 1tb
07:18 <godane> based on that should we been at 100tb hard drives by now :P
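A back-of-envelope check of the growth rate under discussion, using the log's own figures (1 GB in 1991, 1 TB in 2007); the arithmetic is mine, not from the log. 1 GB to 1 TB is ten doublings in 16 years, so roughly 1.6 years per doubling, which over the next four years projects to only a few terabytes:

```shell
# Ten doublings (1 GB -> 1 TB) over 16 years = 1.6 years per doubling.
# Projecting 4 more years forward from 1 TB: 2^(4/1.6) ~= 5.7x.
awk 'BEGIN { per_doubling = 16 / 10; printf "%.1f TB\n", 2 ^ (4 / per_doubling) }'
```

which prints `5.7 TB` -- consistent with the ~2-3 TB drives actually shipping at the time, and far from the 100 TB the naive "the jump repeats" extrapolation suggests.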
09:11 <Spirit_> wooot, cronjob worked
11:31 <alard> Added a new page to the wiki about MobileMe (with a small script for downloading public.me.com): http://www.archiveteam.org/index.php?title=MobileMe
12:29 <emijrp> Hi hoi yo sup.
13:13 <SketchCow> Hurrah!
13:13 <BlueMax> What's the cheer?
13:14 <emijrp> :B
13:44 <sadcarrot> hello room
21:15 <bsmith093> im trying to backup textfiles.com but i'm almost positive i dont have the space for a full backup, so how would i wget just the zips jason has in every directory?
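One way to do this with stock GNU wget is an accept-list restricted to .zip; a sketch only, assuming the zips are linked from the site's HTML pages:

```shell
# -r  : recursive retrieval (follow links)
# -np : --no-parent, never ascend above the starting directory
# -A  : accept-list of suffixes; HTML is fetched to find links but then deleted,
#       so only the .zip files are kept on disk
wget -r -np -A zip http://www.textfiles.com/
```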
21:18 <emijrp> bsmith093: perhaps you are interested on this http://www.archive.org/details/textfiles-dot-com-2011
21:23 <bsmith093> oh...wow. so these are the zips?]
21:29 <emijrp> a big pack splited in chunks, contains texts in folders
21:30 <emijrp> +100,000 files
21:31 <bsmith093> wow thanks, a-team:-D
21:32 <bsmith093> also one more thing the wget manual is huge, so what does this particular incarnation accomplish? wget -r -l 0 -np -nc http://www.somewebsite.com
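For reference, a flag-by-flag reading of that invocation, per GNU wget's documented options:

```shell
# -r    : recursive retrieval (follow links in downloaded pages)
# -l 0  : recursion depth 0 means unlimited (the default limit is 5 levels)
# -nc   : --no-clobber, skip files that already exist locally
# -np   : --no-parent, never ascend above the starting directory
wget -r -l 0 -np -nc http://www.somewebsite.com
```

In short: mirror the whole site below the given URL, to any depth, without re-downloading files already on disk.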
21:33 <emijrp> wget --help
21:33 <sadcarrot> anyone around to chat about the yahoo project?
21:34 <emijrp> which one of yahoo?
21:34 <sadcarrot> video
21:36 <sadcarrot> i believe i'm done rsyncing, but i just want to be sure before i delete 'em
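One way to check that an rsync is really complete before deleting the local copy is a checksum dry run; a sketch with placeholder paths, using standard rsync flags:

```shell
# -a : archive mode, -v : verbose, -n : --dry-run (transfer nothing),
# -c : --checksum, compare file contents rather than size+mtime.
# If the file list printed is empty, source and destination already match.
rsync -avnc /local/yahoovideo/ rsync://server/module/yahoovideo/
```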
21:42 <zumthing> sadcarrot: i'd love to hear the answer to that, too.
21:43 <sadcarrot> yeah
21:43 <sadcarrot> and, my password stopped working, so there's no way to verify
23:38 <bsmith093> SketchCow: you mentioned a video Pete Shuvani produced in 1978, on an interview with the security justice podcast ( the epic interview).
23:39 <bsmith093> whoops, hit enter too soon. Do you happen to have a copy of that video, or know the title?