[leafnode-list] Re: Downloading old articles
On 12.11.2009, 19:12, Marcin Dziwnowski wrote:
> On Fri, Nov 6, 2009 at 10:55 AM, Robert Grimm <lists@xxxxxxxxxxxxxx>
>> Apparently it is "noxover = 1" in leafnode 1.
> ... and it doesn't help. Leafnode still doesn't download more articles.
> Well, the fault is more on the server side than leafnode's, but is
> there another way around it?
I think what you need to do as a workaround is:
1. set maxfetch=100000 in /etc/leafnode/config
2. run fetchnews -x100000
3. repeat steps 1 and 2 with 200000, then 300000, and so forth, until
you have caught up.
You can try to automate things with a Bourne-like shell (bash, ksh and
pdksh qualify) and perl like this:
#!/bin/sh
# WARNING - UNTESTED CODE BELOW
# this script workaround fetches 700000 articles in 100000 increments;
# note that max must be a multiple of inc!
inc=100000
max=700000
i=$inc
while [ $i -le $max ] ; do
    # bump maxfetch in the config, then fetch up to $i articles
    perl -i -ple "s/^maxfetch *=.*/maxfetch=$i/;" /etc/leafnode/config
    fetchnews -nvx $i
    i=$(( $i + $inc ))
done
# end of script
If you don't have perl installed, replace the perl line with:
sed -e "s/^maxfetch *=.*/maxfetch=$i/;" -i /etc/leafnode/config
assuming your sed(1) implementation can do in-place edits (that is what
the -i option requests; GNU sed supports it, for example).
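If your sed(1) has no -i at all, the portable route is to write to a
temporary file and move it back over the original. A minimal sketch
(demonstrated on a scratch copy so it can be tried safely; the CONFIG
variable is mine, point it at /etc/leafnode/config for real use):

```shell
#!/bin/sh
# Portable in-place edit without sed -i: write to a temp file, then move.
CONFIG=$(mktemp)                 # scratch copy; use /etc/leafnode/config for real
printf 'maxfetch = 50\n' > "$CONFIG"
i=100000
sed -e "s/^maxfetch *=.*/maxfetch=$i/" "$CONFIG" > "$CONFIG.tmp" \
    && mv "$CONFIG.tmp" "$CONFIG"
cat "$CONFIG"                    # now reads: maxfetch=100000
rm -f "$CONFIG"
```

The mv only runs if sed succeeded, so a failed edit leaves the config
untouched.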
Save the script to bulkfetch.sh and then run sh bulkfetch.sh as root, or
as whichever user normally runs fetchnews on your system.
leafnode-list mailing list