>ISPs have a lot more disk space than we do. Does it not make sense to
>let them handle the mass storage, and let us actually download what we
>really intend to read?
Unfortunately, the ISPs won't cooperate by saving news
articles until I get around to reading them. Some of my
groups have 30 to 60 day expirations.
>In this regard, I have it easy, since I have a shell account. I can use
>unix tools (e.g., procmail for mail, and slrn or trn kill files for
>news) to do initial filtering on the server. I can quickly delete
>mail/news I don't want online, and send to yarn pretty much only what
>I want to read.
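(For anyone curious, that kind of server-side filter is only a few
lines. A minimal procmail recipe might look something like this --
the pattern is just a made-up example, not from the poster's setup:

    # ~/.procmailrc -- drop anything matching the pattern before
    # it ever reaches the mail spool
    :0:
    * ^Subject:.*MAKE MONEY FAST
    /dev/null

Kill files for slrn/trn work on the same principle, scoring or
discarding articles on the server side before download.)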
Shell accounts are not available at most ISPs these days.
I need a solution that's portable.
>I tried maintaining a newsbase, with yarn duplicating all the unix stuff --
>its own newsrc file, filters, etc. I found it faster and easier to do
>this stuff on the server. In most cases, it does not take that much time
>to do all my filtering and reading on the server side. I suppose anyone
>can do this, if desired, with free services like Dejanews.
Dejanews is very useful for research, but all client/server
setups like this tend to perform like a dog: click and wait.
>So why use yarn, or an offline anything to download huge amounts of news?
Define "huge." :)