For some of the newsgroups, and even listservs, I follow, I really want
to read only about 10% or less of the posts. Why fill up my little PC
disk (about 3 gigs) with all that text every few days, when I am just
going to throw 90% of it away, either by automatic filtering or by manual
filtering on subject lines?
ISPs have a lot more disk space than we do. Does it not make sense to
let them handle the mass storage, and let us actually download what we
really intend to read?
In this regard, I have it easy, since I have a shell account. I can use
unix tools (e.g., procmail for mail, and slrn or trn kill files for news)
to do initial filtering on the server. I can quickly delete mail/news
I don't want online, and send to yarn pretty much only what I want to read.
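To give a hypothetical sketch of the kind of server-side rule I mean (the list address, folder name, and subject patterns here are made up, not from any real setup), a procmail recipe can file or discard mail before it is ever downloaded:

```
# ~/.procmailrc -- hypothetical sketch; addresses and folders are placeholders
# File traffic from one high-volume list into its own folder...
:0:
* ^TO_big-list@example\.com
IN.biglist

# ...and silently discard subjects I would never read anyway.
:0
* ^Subject:.*(MAKE MONEY|FREE OFFER)
/dev/null
```

Kill files in slrn or trn do the same job on the news side, so only the surviving articles ever need to reach yarn.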
I tried maintaining a newsbase, with yarn duplicating all the unix stuff --
its own newsrc file, filters, etc. I found it faster and easier to do
this on the server. In most cases, it does not take that much time
to do all my filtering and reading on the server side. I suppose anyone
can do this, if desired, with free services like dejanews.
So why use yarn, or an offline anything to download huge amounts of news?