
[leafnode-list] Re: Downloading old articles



On Tue, Nov 3, 2009 at 3:33 PM, Marcin Dziwnowski
<m.dziwnowski@xxxxxxxxx> wrote:

> # fetchnews -vvvv -x 235000
>  backing up from 1706789 to 1471789
>  considering articles 1471789 - 1706793
>  0 articles fetched, 0 killed
> No information is given about why it does not attempt to download
> the missing posts

As it turns out, the server I am using probably just can't cope with
listing the requested 235,000 posts all at once. The articles are
there, and the XOVER command is recognized; it's simply too much.
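
For anyone who wants to check a server by hand, the same overview
request can be issued in a raw NNTP session (the host and group below
are placeholders; the article numbers are the ones from the log
above):

    $ telnet news.example.com 119
    200 server ready
    GROUP some.group.name
    211 235005 1471789 1706793 some.group.name
    XOVER 1471789-1706793

A healthy server answers 224 and streams one overview line per
article; one that can't cope times out, errors out, or drops the
connection somewhere in that range.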

Is there a way to make fetchnews download posts in batches of, say,
ten or twenty thousand per run, instead of asking the server to list
three, four, five, or six hundred thousand articles up front?
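
To make the idea concrete, here is the kind of windowed loop I have
in mind, sketched directly over NNTP with Python's standard nntplib
module rather than through fetchnews (the server and group names are
placeholders, and the range is the one from the log above):

    import nntplib

    SERVER = "news.example.com"   # placeholder
    GROUP = "some.group.name"     # placeholder
    STEP = 20000                  # articles per batch

    s = nntplib.NNTP(SERVER)
    resp, count, first, last, name = s.group(GROUP)
    lo = max(first, last - 235000)   # back up ~235,000 articles
    while lo <= last:
        hi = min(lo + STEP - 1, last)
        # XOVER for at most STEP articles, never the whole range
        resp, overviews = s.over((lo, hi))
        for art_num, over in overviews:
            # a real run would store the article; this just fetches it
            resp, info = s.article(art_num)
        lo = hi + 1
    s.quit()

Each pass asks the server to list only STEP overview lines, which is
exactly the batching I can't find a way to get out of fetchnews.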

> Or maybe some smarter tool for the job?

The "smarter tool" term was very unfortunate, Matthias, I'm sorry.
-- 
_______________________________________________
leafnode-list mailing list
leafnode-list@xxxxxxxxxxxxxxxxxxxxxxxxxxxx
https://www.dt.e-technik.uni-dortmund.de/mailman/listinfo/leafnode-list
http://leafnode.sourceforge.net/