Quote:
Originally posted by george_proost
To allow for multiple concurrent runs, I use up to 6 different news services. I provide a hardcoded parm to the newnews module(s) and have a perl module for each news reader.
|
Do all 6 ever run at once? You must have a badass server! If my two sites happen to update their news at the same time, my poor little machine nearly grinds to a halt!

The next version will pull each article and insert it straight away, rather than batching them all up. The slight delay between inserts should reduce the server load considerably.
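The difference between the two approaches could be sketched like this (all names are illustrative stand-ins, not the actual module's code; the point is only where the load lands):

```python
def fetch_articles():
    """Stand-in for pulling articles from the news server."""
    yield from ({"id": i, "body": f"article {i}"} for i in range(3))

def insert(article, db):
    """Stand-in for a single database INSERT."""
    db.append(article)

def batch_run(db):
    """Current behaviour: hold everything, then insert in one burst."""
    articles = list(fetch_articles())  # memory and DB load spike here
    for a in articles:
        insert(a, db)

def stream_run(db):
    """Next version: insert each article as soon as it is pulled,
    spreading the database load over the whole fetch."""
    for a in fetch_articles():
        insert(a, db)
```

Both end up with the same rows; the streaming version just never holds the whole batch at once.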
Quote:
I update PID files and kill the newnews run if it's already running.
Just before termination I do a wait of 'x' minutes and then it relaunches itself. (The 4 cron runs per day make sure there is continuity.)
|
I've been trying to think of a way to continuously 'stream' the news in like an 'IHAVE' feed and run it as a daemon, but I'm pretty sure there's no way with a standard 'suck' account.
Quote:
Another nice feature would be to allow for grouping of newsgroups, e.g. by subject: JOBS, PGMLANG, OS, etc.,
and 'ageing/pruning' of threads based on the group above, or specified per newsgroup in days.
E.g. I would like to prune JOBS at 14 days while keeping PGMLANG forever (or until the MySQL limit).
|
I hadn't thought about grouping, but I do intend to add the per-newsgroup expiry option.
I don't actually use the auto expire option as it takes too long to empty the searchindex on an individual post basis. We'd need a timestamp in the searchindex table, but that will probably more than double the size of the table. Mine is already over 1GB. (Or at least it was until yesterday... I'm experimenting with full text search at the moment, so I emptied it.)
Quote:
If I can help ... i'll be pleased to oblige.
|
Thanks