#archiveteam 2012-02-29,Wed

Time Nickname Message
00:06 🔗 SketchCow http://thechive.files.wordpress.com/2012/02/a712c3b009a8175cbb751b33ce62573aec0b20ab.gif
00:20 🔗 DFJustin hmm the textfiles bitsavers mirror is rather behind by the looks of it
00:23 🔗 * Coderjoe grumbles about pulling cable
00:24 🔗 Coderjoe even better.... pulling cable through a ceiling after all kinds of furniture is already in place
00:24 🔗 Coderjoe with instructions to leave out some runs to be completed at a later date
00:25 🔗 SketchCow Archive Team loose in the warehouse again. https://lh3.googleusercontent.com/-Z4Xr4BRbg6U/TqfBUmSO08I/AAAAAAAAf0s/R2uUdpFbUN4/s270/1276570199204.gif
00:25 🔗 SketchCow the bitsavers mirror is behind because I do them by hand
00:25 🔗 SketchCow I see them delete documents out
00:25 🔗 SketchCow So I have to set time to move those away
00:32 🔗 chronomex that's an excellent gif.
00:32 🔗 shaqfu SketchCow: Sent again, hopefully
00:37 🔗 SketchCow Working now
00:37 🔗 DFJustin I hate watching gifs like that :(
00:43 🔗 shaqfu Awesome
00:44 🔗 tsp_ I know I'm 3 years too late, but anyone want ~16g of geocities? I've got 7z running on it, but it'll take a while.
00:47 🔗 SketchCow Of course.
00:47 🔗 SketchCow It's NEVER too late to get more geocities.
00:48 🔗 chronomex ^
00:48 🔗 chronomex I keep running into missing sites.
00:48 🔗 tsp_ It'll only extract on unix systems because of weird filenames. Hopefully 7z won't lose anything compared to tar/bz2, but it might give me better compression.
00:48 🔗 chronomex tar/7z is good ;)
00:49 🔗 tsp_ oh, didn't think of that. Won't 7z itself do better?
00:49 🔗 chronomex actually tar/xz is what I've found to work well
00:49 🔗 chronomex xz is lzma like 7zip
00:50 🔗 tsp_ I'll do tar/xz then. My desktop has 8gb ram, that should be enough
00:54 🔗 chronomex of course you remember that lzma is heavily biased towards slow compress/fast decompress ;)
00:55 🔗 tsp_ How long should it take on a netbook? If it's overnight, I'm fine
00:55 🔗 chronomex should be fine then
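The tar + xz pipeline chronomex and tsp_ settle on above can be sketched like this; the paths and sample data are illustrative stand-ins, not the real geocities dump:

```shell
# Create sample data standing in for a geocities dump (path is illustrative)
mkdir -p /tmp/geo_demo/site1
printf 'hello geocities\n' > /tmp/geo_demo/site1/index.html

# Pack with tar and compress with xz (lzma-based, like 7zip, as noted above).
# -J selects xz; higher presets like -9 need more RAM, hence the 8gb remark.
tar -C /tmp -cJf /tmp/geo_demo.tar.xz geo_demo

# Verify archive integrity without extracting
xz -t /tmp/geo_demo.tar.xz && echo OK

# List contents to confirm the files survived
tar -tJf /tmp/geo_demo.tar.xz
```

Keeping tar underneath the compressor preserves the unix permissions and odd filenames that tsp_ mentions, which a bare 7z archive handles less portably.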
01:35 🔗 hybernaut anybody home?
01:35 🔗 hybernaut I would like to help with mobile.me
01:35 🔗 hybernaut or whatever else is on fire
01:38 🔗 dcmorton hybernaut: helping with mobileme is easy.. all you really need is a good amount of free HD space and a linux install
01:38 🔗 dcmorton have you looked at this yet? http://archiveteam.org/index.php?title=MobileMe
01:39 🔗 hybernaut no, that's what I was looking for, thanks
01:51 🔗 DFJustin hmm web.archive.org/http://foo doesn't work anymore
01:55 🔗 Coderjoe DFJustin: http://wayback.archive.org/web/*/http://foo
01:55 🔗 DFJustin yeah I know
01:56 🔗 DFJustin but a) that's more typing and b) it doesn't take you to the newest version automatically
01:56 🔗 hybernaut I am archiving
01:57 🔗 hybernaut kenethre: you here?
01:58 🔗 kennethre hybernaut: yo
01:58 🔗 hybernaut are you responsible for the MeMac Dash?
01:58 🔗 kennethre I am not
01:58 🔗 hybernaut there's more than one heroku-humper around here? excellent
01:58 🔗 kennethre haha, nah
01:58 🔗 hybernaut well I think it's very well done
01:58 🔗 kennethre hybernaut: my scraping is unrelated to the dashboard
01:59 🔗 Coderjoe he humped heroku so much he married it
01:59 🔗 hybernaut compliments to the chef
01:59 🔗 kennethre well i gave them rights to all the intellectual property I didn't think to claim as my own
01:59 🔗 kennethre so i guess that counts as marriage
02:00 🔗 hybernaut marriage is much, much worse
02:02 🔗 kennethre hahahaha
02:02 🔗 Coderjoe who are you kidding. you're their bitch
02:07 🔗 hybernaut re: MeMac, does it make sense to run multiple processes on my one machine?
02:08 🔗 chronomex not usually, unless you have insane bandwidth
02:08 🔗 chronomex 50Mbit+
02:08 🔗 kennethre hybernaut: the bottleneck is typically your box, not mobileme
02:09 🔗 hybernaut ok, this one should suffice, then, thank you
02:09 🔗 chronomex it tends to be a few large files, so network speed is the limiter rather than rtt
03:03 🔗 no2pencil SketchCow: http://outer-court.com/basic/echo/T1084.HTM
03:04 🔗 no2pencil A deceased friend of mine, getting help in 1992
03:04 🔗 no2pencil just throwin that out there.
03:20 🔗 SketchCow You surely must know how much of my time is being called for and taken right now.
03:20 🔗 no2pencil sorry
03:20 🔗 no2pencil I just wanted to share
03:20 🔗 SketchCow Save it and the thing it's a part of, or let me know why I should be checking it out more than the look.
03:20 🔗 no2pencil because it's like your talk
03:20 🔗 no2pencil that something is so meaningless at one time
03:20 🔗 no2pencil & then so important later
03:20 🔗 SketchCow It IS like a few of my talks.
03:20 🔗 SketchCow Right.
03:21 🔗 SketchCow So save that, and write why it's important.
03:21 🔗 SketchCow That has power.
03:21 🔗 no2pencil oh I have
03:21 🔗 no2pencil I didn't mean to waste your time
03:21 🔗 no2pencil just wanted to share
03:21 🔗 SketchCow I'm rendering out Bill Budge talking about why Michael Abrash's Quake assembly language routine is so amazing
03:22 🔗 shaqfu GDC Vault?
03:34 🔗 SketchCow No, this is my documentary
03:34 🔗 SketchCow One of them
03:38 🔗 SketchCow http://cargo.dcurt.is/weird_ipad.png
03:38 🔗 SketchCow ha ha pwn
03:38 🔗 Zuu- What's that?
03:39 🔗 yipdw the world's first WQXGA iPad
03:40 🔗 Zuu- huh
03:40 🔗 yipdw unfortunately it is also 30" diagonal
03:40 🔗 SketchCow It was well within his power to become what in hip-hop is called a "weed carrier." This is the guy who acts as a hype man at concerts, maybe gets a couple solo tracks or an opening set, and depending on the ritziness of the situation may literally be in charge of holding the drugs so the stars don't have to worry about getting busted. It's a good job. There's a guy called Spliff Star who's been doing this for Busta Rhymes for 15 years now. I bet
03:41 🔗 hybernaut that's Bill Budge again?
04:15 🔗 SketchCow Yes
04:15 🔗 SketchCow The weed carrier.
04:21 🔗 shaqfu Phew
04:21 🔗 shaqfu I was feeling good, then I scrolled down the bitsavers list...
04:43 🔗 SketchCow The more you do, the more I have.
04:43 🔗 SketchCow It's not about being The Person Who Did All Of Them
04:48 🔗 shaqfu I know
04:48 🔗 shaqfu But still, sheesh, that's a lot of documents
04:49 🔗 SketchCow Bitsavers has been at it since 1996.
04:49 🔗 SketchCow I think Al scans while he's doing other work, in a corner
04:49 🔗 shaqfu (did my last mail go through?)
04:50 🔗 shaqfu Awesome, it did, okay
05:25 🔗 godane SketchCow: i'm watching your talk at shmoocon 2010
05:26 🔗 godane i'm also backing up shmoocon videos
05:27 🔗 godane also shmoocon 2011 videos links are dead
05:29 🔗 godane the bass in your video is off again
06:30 🔗 godane worst audio recording ever!!!
06:34 🔗 kennethre https://twitter.com/maxfenton/status/174744236699303936
08:03 🔗 Coderjoe_ man. there are some MADs in here that use some kick ass hand-drawn artwork
08:03 🔗 Frigolit ooh~ MAD :3
08:04 🔗 Frigolit i think i have two collections of MAD at my parents' place
08:05 🔗 Coderjoe_ what the hell...
08:18 🔗 Frigolit what
08:28 🔗 Coderjoe_ some old movie clip or something that I'm not sure how to describe
08:28 🔗 Coderjoe_ heh
08:29 🔗 Coderjoe_ footage: Rozen Maiden, audio: Cruel Angel's Thesis (Evangelion opening theme)
08:47 🔗 Coderjoe_ hahah
08:47 🔗 Coderjoe a japanese-subtitled version of the German rage kid
08:57 🔗 Coderjoe heh
08:57 🔗 Frigolit lol
08:57 🔗 Coderjoe video: Lucky Star, audio: Hare Hare Yukai
09:29 🔗 ersi DFJustin: http://liveweb.archive.org/http://domain works fine for me
14:09 🔗 tsp_ 8.5g compressed. Now where to put it without running people out of bandwidth. I think my dropbox can handle it, split up, if someone downloads the pieces as I put them up
14:10 🔗 ersi whatcha got cookin?
14:11 🔗 tsp_ geocities data
15:59 🔗 emijrp freenode has a channel for internet archive, #archive, and there are people talking
16:00 🔗 emijrp i remember joining a long time ago and it was empty, i have it on autojoin, and now, months later, there are people inside
16:01 🔗 emijrp looks like archive.org staff
16:05 🔗 Nemo_bis oh
16:05 🔗 emijrp lol fail
16:05 🔗 emijrp it is archive.vg
16:09 🔗 Nemo_bis sigh
16:10 🔗 emijrp : D
16:12 🔗 emijrp http://www.archive.org/post/38352/who-wants-to-start-an-irc-channel
16:12 🔗 emijrp weird
16:13 🔗 emijrp the founder is msikma, looks like later some people from archive.vg occupied it
16:15 🔗 ersi Nice sending a response on a six year old forum thread
16:15 🔗 ersi It's not that weird actually :)
16:16 🔗 emijrp i often reply to very old messages on wikipedia talk pages, it's common
16:17 🔗 ersi I'm sure that works out nicely
16:21 🔗 emijrp yes, it's not about the people but the article content, so it makes sense
16:41 🔗 Nemo_bis emijrp, you should ask the founder to come back, set the topic, op you and kickban the others :p
16:49 🔗 topaz hmm
16:49 🔗 topaz ha
16:50 🔗 topaz http://archiveteam.org/index.php?title=Main_Page&useskin=monobook gets me around the viagra spam
16:50 🔗 topaz fascinating
17:17 🔗 db48x topaz: ahh. what skin do you normally use?
17:21 🔗 topaz db48x: I am new to the archiveteam wiki, actually
17:21 🔗 topaz but the crux of it is that adding &useskin to the URL parameters seems to work around the spam crud
17:22 🔗 topaz actually changing my skin in my preferences doesn't help
17:39 🔗 yipdw I've never run into a problem with spam on the AT wiki
17:39 🔗 yipdw however, I don't browse it
17:39 🔗 yipdw I just go to various pages
17:40 🔗 yipdw maybe that's why
17:43 🔗 db48x topaz: interesting
17:52 🔗 yipdw argh what the fuck
17:52 🔗 yipdw someone here has a webapp that has 500 MB of private dirty RAM
17:52 🔗 yipdw per instance
17:52 🔗 yipdw Ruby developers
17:52 🔗 * yipdw sighs
17:52 🔗 yipdw oh wait, wrong channel
17:53 🔗 yipdw meh whatever
17:59 🔗 db48x heh
17:59 🔗 tef heh
18:00 🔗 yipdw I don't mind 500 MB of private dirty if it's really justified, but per worker instance is ridiculous
18:00 🔗 yipdw and as we run dozens of applications on just a few servers it's not just an annoyance but rather a real bogarting problem
18:35 🔗 hybernaut topaz are you there?
18:35 🔗 topaz hey
18:36 🔗 hybernaut what address do you get for www.archiveteam.org?
18:42 🔗 topaz $ host www.archiveteam.org
18:42 🔗 topaz www.archiveteam.org has address 69.163.228.28
18:42 🔗 hybernaut I'm still curious where that spam is coming from
18:44 🔗 topaz I'm assuming that it's giving it to me because it's convinced that my IP address belongs to a search engine, or something
18:46 🔗 topaz I'm really not chuffed about it and sorry I kicked up such a fuss, now that I know it's chiefly affecting me for some bizarre reason and now that I know a workaround :-)
18:46 🔗 topaz but it's certainly puzzling
18:56 🔗 hybernaut yea, I don't really understand what's going on, I'm just curious
18:59 🔗 emijrp if you do curl http://www.google.com do you get junk ?
19:02 🔗 topaz emijrp: no
19:03 🔗 topaz I get a big mess of JavaScript as you might expect but I am not sure I'm prepared to say with 100% assurance that it's legit
19:04 🔗 topaz though the returned code includes links to the Gioachino Rossini Google Doodle they're currently running
19:05 🔗 emijrp ok
19:09 🔗 Nemo_bis topaz, try ?foo=bar and you'll probably get the same result
19:09 🔗 Nemo_bis it's just working around the cache I guess
19:10 🔗 topaz correct
19:10 🔗 topaz which cache, though?
19:11 🔗 topaz it's not my browser cache
19:21 🔗 Nemo_bis dunno, whatever cache the wiki uses
19:21 🔗 Nemo_bis you could try ?action=purge , who knows
19:23 🔗 topaz if it's the wiki caching the page on the server side I'd expect someone other than me to be seeing it :-)
19:24 🔗 ersi lol, funny when people start talking with random tech words
19:24 🔗 topaz and you're right, adding &foo=bar or &givemeviagra=yes or anything else disables it. fascinating.
19:24 🔗 Nemo_bis because those URLs are not cached
19:25 🔗 topaz if the wiki is returning me a cached spam page, then why isn't it returning the same cached page to you?
19:25 🔗 Nemo_bis no idea
19:25 🔗 Nemo_bis simplest answer: I'm logged in
19:26 🔗 Nemo_bis anyway, retry the main page
19:27 🔗 topaz $ curl -s 'http://archiveteam.org/index.php?title=Main_Page' | grep -i cialis | head -1
19:27 🔗 topaz <div id="jump-to-nav"><a href="http://www.archiveteam.org/index.php?title=User:Proub">buy generic viagra</a><a href="http://www.archiveteam.org/index.php?title=User:Marceloantonio1">generic cialis</a></div><!-- start content -->
19:38 🔗 hybernaut topaz: I see the same
19:44 🔗 topaz hybernaut: you see the same output from curl? but not in your browser?
19:45 🔗 hybernaut yes, but I believe the engine returns different content based on your user-agent string
19:45 🔗 hybernaut I don't know, tho
19:46 🔗 hybernaut if you do the same curl with '-A Mozilla', you don't get it
19:47 🔗 topaz I do
19:47 🔗 topaz $ curl -A Mozilla -s 'http://archiveteam.org/index.php?title=Main_Page' | grep -i cialis | head -1
19:47 🔗 topaz <div id="jump-to-nav"><a href="http://www.archiveteam.org/index.php?title=User:Proub">buy generic viagra</a><a href="http://www.archiveteam.org/index.php?title=User:Marceloantonio1">generic cialis</a></div><!-- start content -->
20:33 🔗 tsp_ part 1 is up, 2 coming as fast as I can upload. If someone can figure out where there's enough space, I can send links as I go.
20:33 🔗 LordNlptp of what
20:34 🔗 tsp_ ~8.5gb of compressed geocities rips
20:36 🔗 alard Have you asked SketchCow?
20:37 🔗 tsp_ Not yet, I'm just putting it up on my dropbox. Once done, can just wget it from there as the parts complete
20:37 🔗 alard And where should it go, eventually?
20:37 🔗 tsp_ I have no idea
20:37 🔗 alard archive.org, perhaps?
20:38 🔗 tsp_ That's as good of a place as any
20:39 🔗 alard What kind of files do you have? Is there a link to have a look?
20:39 🔗 tsp_ It's just one enormous tar.xz split into 8 parts. I ran wget/some custom python script on a crapload of fanfiction sites
20:40 🔗 alard Well, then it should probably become an item on archive.org, with your 8 parts, for starters.
20:40 🔗 tsp_ the 8 parts are just split with unix split, you'll have to recombine them
20:43 🔗 alard Do you have a link somewhere? Then I'll download it and put it back together.
20:43 🔗 tsp_ I have no idea what's involved in getting something up on archive.org. Would they accept it?
20:43 🔗 alard Yes, as far as I know they accept anything, and they take it down when someone complains.
20:43 🔗 tsp_ Working on that, my upstream is slow. I'm feeding the parts to dropbox now
20:44 🔗 alard You just create an account, create an item and upload the files.
20:44 🔗 alard Okay, let me know when you have a link.
20:44 🔗 alard Or links.
20:45 🔗 * tsp_ nods
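The split-and-recombine flow tsp_ describes (plain unix split, rejoined by concatenation) works like this on a small stand-in file; the names and sizes are hypothetical, the real archive was ~8.5 GB:

```shell
# Make a stand-in for the big tar.xz (content doesn't matter for the demo)
head -c 100000 /dev/urandom > /tmp/geocities_demo.tar.xz.orig

# Split into fixed-size parts, as tsp_ did before uploading to dropbox
split -b 30000 /tmp/geocities_demo.tar.xz.orig /tmp/geocities_demo.part.

# Recombining is just concatenation in order; split's alphabetic
# suffixes (aa, ab, ac, ...) sort correctly under the shell glob
cat /tmp/geocities_demo.part.* > /tmp/geocities_demo.tar.xz.rejoined

# Byte-for-byte identical to the original
cmp /tmp/geocities_demo.tar.xz.orig /tmp/geocities_demo.tar.xz.rejoined && echo identical
```

After rejoining the real parts, `xz -t` on the result would confirm nothing was lost in transit before uploading to archive.org.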
21:05 🔗 hybernaut I've modified the mobileme tool to enforce bandwidth limits
21:05 🔗 hybernaut anyone know if is this worth a pull request?
21:19 🔗 alard Why not. Put your fork on GitHub and perhaps it's useful to someone.
21:21 🔗 ersi tsp_: What is it originally? I mean inside the tar.xz :P Besides "A part of geocities"?
21:28 🔗 Schbirid any idea why the following example a stops right away but the example b works fine?
21:28 🔗 Schbirid a) wget -m -np http://forumplanet.gamespy.com/quake_1_mapping/b50020/
21:28 🔗 Schbirid b) wget -m -np http://forumplanet.gamespy.com/quake_1_mapping/b50022/
21:31 🔗 ersi I guess that depends on a) what the server responds with b) more precisely, what content it responds with
21:33 🔗 Schbirid it looks identical to me, just different links/content inside though
21:33 🔗 Schbirid identical in terms of http and being html etc
21:34 🔗 ersi hmm, I'd make wget output a log and go through that firstly
21:34 🔗 tsp_ ersi: just wgetted/pulled html files
21:34 🔗 tsp_ site rips
21:34 🔗 ersi yeah, but a random selection? or what? :o
21:35 🔗 tsp_ mostly me googling for fanfiction and pulling anything that seemed to be related
21:37 🔗 Schbirid ah
21:37 🔗 Schbirid stupid me
21:37 🔗 Schbirid ersi: thanks for the nudge. --debug showed me how it decides to crawl further
21:38 🔗 Schbirid and i was the culprit
21:39 🔗 ersi awesome :) Good going on finding it ^_^
21:40 🔗 Schbirid heh
21:40 🔗 Schbirid i will try to get forumplanet backed up
21:41 🔗 Schbirid ign stopped giving a fuck
21:41 🔗 Schbirid there are incredible amounts of spam
21:43 🔗 Schbirid still havent found out the proper way to combine --span-hosts and --page-requisites
21:43 🔗 Schbirid i only want page requisites from those other hosts
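wget has no single switch for "requisites from other hosts, recursion only on this one," which is exactly Schbirid's complaint; a common workaround is to enable --span-hosts (-H) together with --page-requisites (-p) but whitelist the extra hosts with --domains (-D). The asset host below is illustrative, so the command is only printed, not executed:

```shell
# Mirror one forum, allowing page requisites (images/CSS) from a known
# asset host by whitelisting it with -D. The media host is a guess;
# you'd read it out of the page source first.
cmd='wget -m -np -p -H -D forumplanet.gamespy.com,media.ign.com http://forumplanet.gamespy.com/quake_1_mapping/b50020/'

# Dry run: show the invocation instead of hitting the live server
printf '%s\n' "$cmd"
```

The -D whitelist keeps -H from recursing across the whole web while still letting -p fetch off-host requisites from the listed domains.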
21:43 🔗 ersi archive.. ALL THE SPAM!
21:49 🔗 topaz .... archive ALL the spam?
21:50 🔗 Nemo_bis topaz, yes. Compresses nicely with 7z due to repetition
21:51 🔗 Schbirid yeah, first step is getting it all.
21:51 🔗 Schbirid it seems to grab 2-3 times the same page in different URLs too
21:51 🔗 ersi You gotta catch 'em all!
21:52 🔗 Schbirid but url structure is nice so this is an easy (but long) job
21:52 🔗 Nemo_bis Is there any way to send a 66 MB long POST request through curl or something?
21:52 🔗 * Nemo_bis whistles
21:52 🔗 ersi Of course there is Nemo_bis
21:53 🔗 Nemo_bis ersi, tell me :D
21:53 🔗 ersi But it depends on the target, if he chooses to accept that kind of long/large data
21:53 🔗 Nemo_bis well, let's try
21:53 🔗 Nemo_bis misconfiguration happens
21:53 🔗 topaz sorry, I thought ersi was making a hyperbole and a half reference
21:53 🔗 topaz I was just doing the call-and-response.
21:53 🔗 ersi "man curl", I'll say. :P curl -X POST for just the method though
21:54 🔗 Nemo_bis :p
21:54 🔗 ersi topaz: I'm always half serious and half a jack ass
21:54 🔗 ersi I guess that makes me fit in here with all you serious jack asses
21:54 🔗 ersi ;)
21:56 🔗 ersi Nemo_bis: Seems like it's just curl -X POST -d <your data> http://target/derp/api/herp?apikey=archiveteaaaaaaaam
21:56 🔗 topaz oh good, then I won't be alone.
21:56 🔗 Nemo_bis ersi, I think -X POST is not needed even
21:57 🔗 ersi that'll be formencoded though. If you want binary you need to --data-binary it seems
21:57 🔗 Nemo_bis at least according to https://www.mediawiki.org/wiki/Manual:Parameters_to_Special:Export
21:57 🔗 alard and with -d @somefile you'll post the content of the file instead.
21:57 🔗 ersi Nemo_bis: Never hurts, hehe
21:57 🔗 ersi alard: neat
21:57 🔗 Nemo_bis but he told me it was too long
21:57 🔗 ersi I do like the manual page for curl though. It's very useful and easy to search in
22:02 🔗 Nemo_bis it's actually a bash error; bash: /usr/bin/curl: Elenco degli argomenti troppo lungo
22:02 🔗 Nemo_bis (list of arguments too long)
22:05 🔗 Schbirid i love bash scripting
22:07 🔗 Nemo_bis hm, yes, what alard said is the solution anyway
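As the exchange above concludes, the "argument list too long" message is bash hitting the kernel's exec argument limit (ARG_MAX), not a curl restriction, and alard's `-d @somefile` sidesteps it by never putting the payload on the command line. A minimal sketch; the URL is a placeholder, so the curl invocation is printed rather than run:

```shell
# Build a small POST body in a file (stand-in for the 66 MB Special:Export request)
printf 'pages=Main_Page&curonly=1' > /tmp/export_body.txt

# The limit that produced the bash error, in bytes:
getconf ARG_MAX

# -d @file makes curl read the body from the file, so the shell never has
# to pass the payload as an argument; --data-binary @file would send the
# bytes with no processing. Placeholder URL below, so we only print it.
cmd='curl -d @/tmp/export_body.txt https://example.org/wiki/Special:Export'
printf '%s\n' "$cmd"
```

With the body in a file, the payload size is bounded only by what the server accepts, not by ARG_MAX.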
22:12 🔗 Schbirid 800 pages with 20MB -> 280KB 7z :)
22:14 🔗 Nemo_bis a MediaWiki page I downloaded once was compressed over 5000 times
22:15 🔗 Nemo_bis I wonder whether this poor server will send me a pony back after this 66 MB request taking ages
22:16 🔗 Schbirid mirroring sites is weird
22:16 🔗 Schbirid "should i go brute force and hammer the server quickly so they do not notice me in time to stop"
22:16 🔗 Schbirid or "should i be nice and slow, hoping they will not notice at all"
22:19 🔗 ersi If they're closing down shortly, rape the fuck out of it
22:19 🔗 Nemo_bis did the latter now trying the former
22:19 🔗 ersi if it's uncertain/you don't know, rape the fuck out of it
22:19 🔗 ersi just kidding, do both in variations :)
22:20 🔗 Schbirid heh
22:20 🔗 Schbirid i am more a guy for consensual
22:21 🔗 ersi Duct tape makes "No no no" to "mmh mmh mmh"
22:23 🔗 ersi Why look, a swede!
22:25 🔗 ersi Too bad emijrp's offline. That dude from the IA forum about the #archive chan just joined #archive
22:25 🔗 ersi Nemo_bis: ^
22:25 🔗 Nemo_bis ersi, oh, good
22:25 🔗 ersi He's gonna come over here
22:25 🔗 ersi there he is, it's dada_ :-)
22:26 🔗 dada_ hello
22:26 🔗 ersi we're a loose bunch of hobby archivists ^_^
22:27 🔗 ersi with contacts at IA/archive.org
22:27 🔗 Nemo_bis hello again dada_ :)
22:28 🔗 Nemo_bis what are you going to do, push the occupiers out? :p
22:29 🔗 ersi I'm gonna go catch some sleep, but hang around - check out our site and chat with people about doing awesome archival stuff~ \o
22:30 🔗 Schbirid hm, looks like (40 million posts / ~8) threads
22:30 🔗 Schbirid but nicely done with ~15 posts per page
22:30 🔗 dada_ I was actually thinking about an archive project not too long ago. although it would be a bit big in scope. an archive of all DOS games ever released (these archives actually exist and are complete for a couple of consoles). it would be hard due to various technical reasons, though
22:30 🔗 DFJustin there are folks doing that
22:31 🔗 DFJustin e.g. DOSCollection
22:31 🔗 Schbirid getting each page 2-3 times but since it compresses so well i dont care
22:31 🔗 Schbirid there is a project for that on underground-gamer
22:31 🔗 Schbirid amazing work
22:31 🔗 Schbirid http://www.underground-gamer.com/wiki/index.php/Projects
22:32 🔗 dada_ yeah, I hear there are a few projects like that underway already, although iirc they don't use lossless disk images to do it?
22:33 🔗 Schbirid i think http://www.underground-gamer.com/wiki/index.php/Redump.org_IBM_PC_Compatible does
22:33 🔗 dada_ looks like they have solid archives free of cruft, though
22:34 🔗 dada_ (one thing I was thinking of: some older games used technical copy protection means, such as bad sectors on the floppy disk. you'd be unable to use those games unless the disk image had that information encoded in it.)
22:34 🔗 DFJustin SPS is theoretically going to do PC floppies eventually but they're kind of glacial
22:35 🔗 dada_ when you get to later years (mid 90s) games start becoming huge due to the FMV game hype
22:37 🔗 DFJustin red book audio is a space hog too
22:39 🔗 DFJustin there's also http://www.demu.org/ which is transitioning to IA
22:40 🔗 dada_ I honestly don't think there even is a disk image format that supports these copy protection tricks. maybe the amiga disk image format does...
22:41 🔗 DFJustin IPF
22:41 🔗 Schbirid hm, can i >/dev/null but get stderr output "like" stdout?
22:42 🔗 Schbirid so if i was running a script and used this inside, i would get that line's stderr as stdout of the script
22:48 🔗 nitro2k01 Schbirid: 2>&1
22:48 🔗 Schbirid i thought that was directing that to the same location like its stdout?
22:49 🔗 ersi emijrp: hey, dada_'s here. He's the one that made that forum post. And registered #archive on freenode ;p
22:49 🔗 nitro2k01 OH! What you want is &>/dev/null
22:49 🔗 ersi Now I'm really off for some sleep, fixed up some boring household chores
22:49 🔗 nitro2k01 Then everything goes down the drain
22:49 🔗 Schbirid nah
22:49 🔗 Schbirid hm, i will try 2>&1
22:50 🔗 nitro2k01 can you explain what you actually want?
22:50 🔗 Schbirid i tried
22:50 🔗 Dark_Star Schbirid: it depends on where the 2>&1 appears. "foo > /dev/null 2>&1" is different from "foo 2>&1 > /dev/null".... this is a bit tricky to get right (I also have to try it anew every time ;-)
22:50 🔗 nitro2k01 Like, "redirect stderr to stdout"
22:51 🔗 Schbirid yes, i want to redirect stderr to stdout AND redirect stdout to devnull. so in the end i want to see stderr as stdout, not being sent to devnull
22:51 🔗 arrith Schbirid, dada_: i'm not sure if they do older stuff but redump.org does stuff similar to that archival project
22:51 🔗 Dark_Star try "foo 2>&1 > /dev/null"
22:51 🔗 arrith that should work
22:52 🔗 Schbirid arrith: that is only metadata afaik
22:52 🔗 nitro2k01 Yeah, 2>&1 must be first for some reason
22:52 🔗 Schbirid ok, cheers
22:52 🔗 nitro2k01 I though it was the other way around
22:52 🔗 Dark_Star as I said, the outcome is different
22:52 🔗 arrith Schbirid: well like the GoodSets or MAME releases, i think people release redump sets
22:52 🔗 Dark_Star if you put it the other way round, everything goes to /dev/null
22:54 🔗 Dark_Star "foo 2>&1 > /dev/null" can be interpreted like "overwrite file descriptor 2 with the value of file descriptor 1, *and then* overwrite file descriptor 1 with a new one going to /dev/null"
22:55 🔗 Dark_Star if you do it the other way round it becomes "overwrite fd #1 with a new one for /dev/null, *and then* overwrite fd #2 with the value of fd #1"
22:55 🔗 Schbirid it is already past my bedtime, you aremaking it worse :P
22:55 🔗 Dark_Star sorry ;-)
22:56 🔗 Schbirid alright, started some big forums, so i better leave this unattended, killing kittens
22:56 🔗 Schbirid good night :D
22:56 🔗 nitro2k01 Yeah. But I too have to rediscover this each time I need to do it
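The ordering rule Dark_Star describes can be checked directly: redirections are evaluated left to right, and each one duplicates whatever the target descriptor points at that moment:

```shell
# A command that writes one line to stdout and one to stderr
emit() {
  echo "to stdout"
  echo "to stderr" >&2
}

# What Schbirid wants: discard stdout, see stderr where stdout used to go.
# 2>&1 first: fd 2 is duplicated from fd 1 (still the capture pipe),
# THEN fd 1 is repointed at /dev/null.
kept=$(emit 2>&1 >/dev/null)
echo "kept: $kept"      # kept: to stderr

# Wrong order: fd 1 goes to /dev/null first, then fd 2 duplicates it,
# so everything is discarded.
lost=$(emit >/dev/null 2>&1)
echo "lost: [$lost]"    # lost: []
```

This is why "2>&1 must be first" here: it has to copy fd 1 before fd 1 is redirected to /dev/null.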
23:39 🔗 SketchCow OKAY BACK
23:41 🔗 hybernaut greetz
23:52 🔗 DFJustin you missed some 6502 action in #messdev :) http://git.redump.net/mess/commit/?id=e262ae8ed2a4438dd05ce633a632f42ac89a3bca
