[Date Prev][Date Next][Thread Prev][Thread Next][Date Index][Thread Index]

[leafnode-list] Re: Fetchnews messages



On Fri, May 22, 2009 at 6:02 PM, clemens fischer <ino-news@xxxxxxxxxxxxxxxxxxxxxxxxx> wrote:

> Enrico Wiki wrote:
>
> > On Tue, May 19, 2009 at 7:10 PM, clemens fischer wrote:
> >
> > You are perfectly right: I just reported the relevant text part of the
> > message, but there were message-ids too.
> >
> > store: duplicate article <2X0He.21565$2U1.1376213@xxxxxxxxxxxx>
> > store: duplicate article <LL9He.23780$B6.641826@xxxxxxxxxxxxxxxxxx>
>
> If you want to anonymize Message-IDs, just put a few '_' or '.'
> characters here and there, e.g. <...@xxxxxxxxxxxxxxxx>.


Ok.


> I really
> thought store.c was severely broken.
>
> > They were not crossposts. Besides I was just retrieving 1 group with
> > the N option and had an empty interesting.groups folder.
> >
> > The command line was like:
> > fetchnews -N [group]
> >
> > At the end of the job, fetchnews said:
> > ...[group] 48262 articles fetched (to 48263), 2 killed
> > fetchnews: 48262 articles and 0 headers fetched, 2 killed, 0 posted, in 4073 seconds
> >
> > Might they just be what they say to be: duplicate mids on the upstream
> > server?
>
> Unlikely.  My guess is you tried "fetchnews" a few times until you were
> satisfied with what happened and the spool really did contain those
> articles.  Besides, "fetchnews" checks for duplicates in its own spool,
> for it cannot know the IDs upstream.
>
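
The local duplicate check described here can be sketched as follows. This is illustrative only, not leafnode's actual store.c: leafnode does track stored Message-IDs under message.id/ in its spool, but the hashed subdirectory layout is simplified away, and a temporary directory stands in for the real spool.

```shell
# Illustrative sketch, not leafnode's store.c: one marker file per
# Message-ID under message.id/; an incoming article is a "duplicate"
# if its marker already exists. A temp dir stands in for the spool.
SPOOL=$(mktemp -d)
mkdir -p "$SPOOL/message.id"

store() {   # $1 = Message-ID of the incoming article
    if [ -e "$SPOOL/message.id/$1" ]; then
        echo "store: duplicate article $1"
        return 1
    fi
    : > "$SPOOL/message.id/$1"   # record the ID as stored
}

store '<example@host>'            # first time: stored silently
store '<example@host>' || true    # second time: reports a duplicate
```

The point of the sketch: the check can only consult IDs already recorded locally, so it says nothing about what the upstream server holds.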

I completely empty the news spool before each test, leaving just an empty
tree, no files.
Then a fetchnews run to create the group list.
Then fetchnews -N to retrieve a group.
Just that, nothing else.
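
For what it's worth, that cycle sketched as a script. The fetchnews invocations are left as comments because they need a configured leafnode installation; the temp directory and the group name are stand-ins (the real spool is usually /var/spool/news).

```shell
# Stand-in for the real spool (usually /var/spool/news)
SPOOL=$(mktemp -d)
mkdir -p "$SPOOL/sci.med.nutrition" "$SPOOL/interesting.groups"
touch "$SPOOL/sci.med.nutrition/167107"   # leftover article from a previous run

# 1. empty the spool, keeping the directory tree
find "$SPOOL" -type f -delete

# 2. first fetchnews run (re)creates the group list
#    sudo fetchnews
# 3. then fetch exactly one group
#    sudo fetchnews -N sci.med.nutrition

find "$SPOOL" -type f | wc -l             # prints 0: no files remain
```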


Here is a new test.

sudo fetchnews -vvvN sci.med.nutrition
fetchnews mode: get articles, get headers, get bodies, post articles
found 0 articles in in.coming.
text.giganews.com: connecting to port nntp
  trying:    address 216.196.97.140 port 119...
  connected: address 216.196.97.140 port 119.
text.giganews.com: connected (200), banner: "200 Text.GigaNews.Com"
text.giganews.com: checking for new newsgroups
text.giganews.com: found 0 new newsgroups
text.giganews.com: not posting, feedtype == none.
sci.med.nutrition: considering 117209 articles 167107 - 284315, using XOVER
sci.med.nutrition: XOVER: 103758 seen, 0 I have, 0 filtered, 103758 to get
sci.med.nutrition: will fetch 103758 articles
store: duplicate article <1__61___75.53___8.110220@xxxxxxxxxxxxxxxxxxxxxxxxxxxx>
store: no valid newsgroups
store: duplicate article <M__d___9LsdsKH___Vn-oA@xxxxxxxxxxx>
store: duplicate article <1__27___02.61___0.293410@xxxxxxxxxxxxxxxxxxxxxxxxxxxx>
store: duplicate article <h__dn___NOHYmXH___n-1A@xxxxxxxxxxx>
store: duplicate article <T__Ge.3___4$aA5.___5@xxxxxxxxxxxxxxxxxxxx>
store: duplicate article <f___e.6___6$oJ.1__44@xxxxxxxxxxxxxxxxxxxxxxxxxx>
store: duplicate article <MPG.1___02eeb__9f___98979f@xxxxxxxxxxxxxxxxxxxxxxx>
store: duplicate article <q6v__199hrv___n4t___80jognouh011v@xxxxxxx>
sci.med.nutrition: 101932 articles fetched (to 101933), 9 killed
fetchnews: 101932 articles and 0 headers fetched, 9 killed, 0 posted, in 9403 seconds




> >> Enrico, did you run any "prepared" articles from news/in.coming by
> >> fetchnews?
> >
> > I don't think so. What do you mean by "prepared" articles? :-)
>
> Well, the normal operating mode is this:  point newsreader at leafnodes
> port, subscribe to some newsgroups and let fetchnews pick up their
> articles.  By that time leafnode will have set up the relevant
> information for fetchnews to act upon.
>
> But sometimes people want to integrate articles from other sources, so
> they massage them into the proper format and inject them somehow.  This
> procedure isn't documented and can fail in a number of ways.


Thanks for the explanation.

>
>
> Maybe you should subscribe to newsgroups the regular way, possibly
> adjusting "initialfetch", "timeout_short", and "timeout_long" for your
> purposes, and only then use fetchnews to acquire missing articles.
>
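
For reference, those three knobs live in leafnode's config file (the path varies by distribution, often /etc/leafnode/config or /etc/news/leafnode/config). The values below are illustrative placeholders, not recommendations:

```
## number of articles to fetch when a group is first subscribed
initialfetch = 100
## days to keep fetching a group after a single access
timeout_short = 2
## days to keep fetching a group after regular reading
timeout_long = 7
```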


I will try subscribing the regular way with a reader and will let you know. I
seem to remember I already did that, with similar results, but I am not sure,
so I'll try again anyway.

-- 
Enrico
-- 
_______________________________________________
leafnode-list mailing list
leafnode-list@xxxxxxxxxxxxxxxxxxxxxxxxxxxx
https://www.dt.e-technik.uni-dortmund.de/mailman/listinfo/leafnode-list
http://leafnode.sourceforge.net/