#urlteam 2017-08-26, Sat


<chfoo> which items are affected? [00:00]
........ (idle for 35mn)
<Somebody2> chfoo: list coming right up [00:35]
<Somebody2> chfoo: http://termbin.com/7yit -- 36 items [00:41]
<chfoo> is there a good way to fix them up in bulk? [00:47]
<Somebody2> chfoo: if you make any change to the metadata, that should trigger a re-creation of the torrent.
<Somebody2> You should be able to script that with the python library. I can try to come up with something you can cut-n-paste if you'd like. [00:49]
<chfoo> oh ok. i was thinking of using the python library too
<chfoo> if only editing the metadata works then it should be quite simple. [00:50]
<Somebody2> cool, thanks for doing it
<Somebody2> let me know when you have and I'll try re-downloading them [00:54]
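A minimal sketch of the bulk fix-up discussed above, assuming "the python library" means the standard `internetarchive` package and that archive.org credentials are already configured (e.g. via `ia configure`); the items.txt filename and the extra metadata field are placeholders:

    import internetarchive as ia

    # Read the affected item identifiers, one per line (e.g. the termbin list of 36 items).
    with open('items.txt') as fh:
        identifiers = [line.strip() for line in fh if line.strip()]

    for identifier in identifiers:
        item = ia.get_item(identifier)
        # Any metadata change should kick off a re-derive, which re-creates the
        # item's torrent; 'last-nudged' is a made-up field used only to force that.
        response = item.modify_metadata({'last-nudged': '2017-08-26'})
        print(identifier, response.status_code)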
***ix has joined #urlteam [01:06]
<ix> hey need any help or something [01:07]
***Odd0002 has joined #urlteam [01:10]
....... (idle for 31mn)
<Somebody2> ix: sure, we can always use help analyzing more shorteners
<Somebody2> The easiest way to help is to look over the unsorted list on the wiki page, and check whether each one is alive or not.
<Somebody2> There are other ways to help, if that doesn't appeal. [01:41]
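A rough sketch of that alive-or-dead check, assuming Python with the `requests` library; the short URL below is a placeholder, and the idea is simply to see whether a known short code still redirects somewhere:

    import requests

    def check_shortener(short_url):
        """Return a rough alive/dead verdict for one known short URL."""
        try:
            resp = requests.head(short_url, allow_redirects=False, timeout=10)
        except requests.RequestException as exc:
            return 'dead or unreachable: {}'.format(exc)
        if resp.status_code in (301, 302, 303, 307, 308):
            return 'alive, redirects to {}'.format(resp.headers.get('Location'))
        return 'no redirect (HTTP {})'.format(resp.status_code)

    # Placeholder short URL; substitute a known code from the shortener being checked.
    print(check_shortener('http://example-shortener.invalid/abc123'))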
..................... (idle for 1h43mn)
***Odd0002 has quit IRC (Remote host closed the connection) [03:25]
.......... (idle for 49mn)
***Sk1d has quit IRC (Ping timeout: 250 seconds) [04:14]
***Sk1d has joined #urlteam [04:21]
................................................. (idle for 4h2mn)
<hook54321> Some of Microsoft's URL shorteners: http://helps.ms/a0WK2h http://msft.social/1bwpHT
<hook54321> http://msft.it/6013BxcB9 [08:23]
.............................. (idle for 2h26mn)
<JAA> hook54321: Are you sure that helps.ms is Microsoft? http://helps.ms/ redirects to https://www.po.st/, which is owned by R1Demand according to the footer, which seems unrelated to Microsoft.
<JAA> Same for msft.social, by the way. [10:52]
<JAA> On the other hand, almost all links seem to go to Microsoft websites.
<JAA> I only found one that doesn't so far: msft.social/5YEy0w
<JAA> helps.ms is registered by Microsoft though.
<JAA> This is weird. [10:57]
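For reference, one way to reproduce that kind of check in Python with `requests`: follow the whole redirect chain and print each hop, which shows whether a helps.ms or msft.social link really ends up on a Microsoft site (the example code is the one quoted above):

    import requests

    def redirect_chain(short_url):
        """Follow a short URL and return every hop, final destination last."""
        resp = requests.get(short_url, allow_redirects=True, timeout=15)
        return [hop.url for hop in resp.history] + [resp.url]

    # The msft.social code mentioned above that did not lead to a Microsoft site.
    for hop in redirect_chain('http://msft.social/5YEy0w'):
        print(hop)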
............... (idle for 1h13mn)
<JAA> Somebody2: I'm thinking about splitting up the warrior project table on the wiki page. I think it would make more sense to have the general information about a shortener (number of URLs, incremental, what the short codes look like, example URL) in the relevant section below (Alive or Dead/Broken), and only keep the information about the warrior project in that table.
<JAA> It would also be nice to have individual rows for each scrape rather than lumping them together, and then replace the date columns with "start date" and "end date".
<JAA> What do you think about that? [12:11]
............................................... (idle for 3h51mn)
<Somebody2> JAA: I like the idea of separating out each scrape. I'm less enthusiastic about duplicating entries between the warrior table and the Alive/Dead table.
<Somebody2> In my conception, the Alive table is a step *before* the warrior table -- ideally, everything in it should eventually have a warrior job.
<Somebody2> And the Dead table is a memorial -- the ones we didn't catch in time.
<Somebody2> Splitting up the info seems like it would just make it harder to update.
<Somebody2> Now, if you wanted to *combine* the three tables, with a column: Warrior/Alive/Dead -- I might support that. [16:02]
<astrid> i mean it makes some sense to track whether a site is alive or dead even when it does have a warrior job
<astrid> but you do whatever you think makes most sense
<astrid> don't let me stop you :)
<astrid> bc you're already KILLING IT
<astrid> fantastic job, urlteam. i'm always super impressed. [16:17]
***astrid sets mode: +oo JAA Somebody2
***astrid sets mode: +o hook54321 [16:18]
<Somebody2> we do track if a site with a warrior job is still generating new short codes (it gets a gray background if not)
<Somebody2> (and a pink background if we are currently scraping it)
<Somebody2> This is likely not the best way to track this stuff, though [16:30]
<astrid> ah kool
<astrid> i mean however that takes the effort out
<astrid> so we dont need to track its fate manually once it's warriored [16:33]
<Somebody2> oh, it's all still manual
<Somebody2> If someone wanted to write code to automatically update the wiki page based on the tracker stats, that would be very welcome [16:41]
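A very rough outline of what that automation could look like, assuming Python with `requests` and the `mwclient` MediaWiki library; the tracker stats URL, wiki host, page title, and the table-rewriting step are all placeholders rather than a description of any existing tool:

    import requests
    import mwclient

    STATS_URL = 'https://tracker.example.org/api/stats'  # hypothetical tracker endpoint
    WIKI_HOST = 'www.archiveteam.org'                     # assumed MediaWiki host

    def fetch_stats():
        # Pull the tracker's per-project statistics as JSON.
        return requests.get(STATS_URL, timeout=30).json()

    def rewrite_project_table(wikitext, stats):
        # Placeholder: real logic would locate the warrior project table and
        # refresh its rows (scraping status, item counts, scrape dates).
        return wikitext

    def update_wiki(stats, username, password):
        site = mwclient.Site(WIKI_HOST, path='/')         # path is an assumption
        site.login(username, password)
        page = site.pages['URLTeam']                      # assumed page title
        text = page.text()
        new_text = rewrite_project_table(text, stats)
        if new_text != text:
            page.save(new_text, summary='Automated update from tracker stats')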
...................... (idle for 1h46mn)
<JAA> Somebody2: I thought about throwing everything into one table as well. That would be quite neat, but the already somewhat crowded table would need several additional columns, so I'm not sure how well it'd work. [18:28]
<Somebody2> Yeah. Feel free to try it out and see if it works better.
<Somebody2> I don't object to any of your suggestions enough to block them -- so if you will take the time to make the update, go for it! [18:29]
<JAA> Yeah, I think I'll take a subsample of the data and try out a few different things on a test page. I'll let you know how that goes. [18:31]
<Somebody2> Sounds good -- I'm glad to look it over if you'd like. [18:31]
......... (idle for 42mn)
<hook54321> JAA: I saw it used on their support twitter page [19:13]
***mls has quit IRC (Ping timeout: 250 seconds) [19:27]
........... (idle for 51mn)
<hook54321> https://nyti.ms/2vrH6Wl [20:18]
................. (idle for 1h21mn)
<JAA> bit.ly alias, added, thanks. [21:39]
..... (idle for 23mn)
***Odd0002 has joined #urlteam [22:02]
............... (idle for 1h10mn)
***Odd0002_ has joined #urlteam
***Odd0002 has quit IRC (Quit: Leaving)
***Odd0002_ is now known as Odd0002 [23:12]
