Subject: Re: Pseudo-streaming live news (was: Re: Monitor Page)
Newsgroups: lugnet.admin.general
Date: Mon, 27 Mar 2000 23:24:53 GMT
Reply-To: jsproat@io./saynotospam/com

Todd Lehman wrote:
> One suggestion: Separate the producer into a separate, totally encapsulated
> daemon which just sits there running 24x7 and slurps new articles whenever
> they appear and spools them into some directory on the local drive. That way,
> it can run while you sleep, even if you've exited the main user interface
> program. Write that first, and get it working 100%, and it won't ever need
> to change. Then allow (via API) any number of consumers to read, filter, and
> display the articles.
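Here's roughly how I picture the producer half. It's a minimal sketch only,
assuming a local spool directory with one JSON file per article;
fetch_new_articles() is a hypothetical stand-in for however the articles
actually get pulled (NNTP, HTTP against avid.cgi, whatever):

  import json
  import time
  from pathlib import Path

  SPOOL_DIR = Path.home() / "lugnet-spool"   # hypothetical local spool directory
  POLL_SECONDS = 300                         # how often to check for new articles

  def fetch_new_articles(since):
      """Hypothetical: return a list of {"number": int, ...} dicts for every
      article newer than `since`.  In real life this would be NNTP, or HTTP
      requests against avid.cgi, or whatever the server actually offers."""
      raise NotImplementedError

  def spool_forever():
      SPOOL_DIR.mkdir(parents=True, exist_ok=True)
      # Resume from whatever is already on disk so a restart doesn't re-fetch.
      last_seen = max((int(f.stem) for f in SPOOL_DIR.glob("*.json")), default=0)
      while True:
          for art in fetch_new_articles(last_seen):
              # One file per article, named by article number.  Write to a
              # temp name and rename so nobody sees a half-written file.
              tmp = SPOOL_DIR / (".%d.tmp" % art["number"])
              tmp.write_text(json.dumps(art))
              tmp.rename(SPOOL_DIR / ("%d.json" % art["number"]))
              last_seen = max(last_seen, art["number"])
          time.sleep(POLL_SECONDS)

  if __name__ == "__main__":
      spool_forever()

The temp-file-and-rename bit is just so a consumer watching the directory
never picks up a half-written article.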
Question to y'all:
What type of API is expected between the spooler and the spooler's client?
That is, how does your spooler serve articles to the client? Is it expected
to imitate avid.cgi over HTTP, or is anyone writing a spooler that keeps a
persistent connection open to the client?
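For the persistent-connection flavor, I'm imagining something like the toy
server below. It handles one client at a time and just pushes each spooled
article down the socket as newline-delimited JSON; the port number and the
five-second poll are made up for the sake of the sketch:

  import socket
  import time
  from pathlib import Path

  SPOOL_DIR = Path.home() / "lugnet-spool"   # same hypothetical spool directory
  PORT = 8119                                # made-up port, just for the sketch

  def serve_client(conn):
      """Send everything already in the spool, then keep the connection open
      and push each new article as the spooler daemon drops it into place."""
      sent = set()
      while True:
          for f in sorted(SPOOL_DIR.glob("*.json")):
              if f.name not in sent:
                  conn.sendall(f.read_bytes() + b"\n")
                  sent.add(f.name)
          time.sleep(5)

  def main():
      srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
      srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
      srv.bind(("", PORT))
      srv.listen(1)
      while True:
          conn, _addr = srv.accept()
          try:
              serve_client(conn)
          except (BrokenPipeError, ConnectionResetError):
              pass
          finally:
              conn.close()

  if __name__ == "__main__":
      main()

A real one would obviously want to handle multiple clients and get told about
new articles instead of polling the directory, but it shows the shape of the
thing.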
A couple o'questions for Todd:
How far back does avid.cgi go? (I'm guessing 1.) How do you feel about
someone downloading, over a period of time, all the messages on LUGNET? It's
the packrat in me...
Cheers,
- jsproat
--
Jeremy H. Sproat <jsproat@io.com> ~~~ http://www.io.com/~jsproat/
Card-carrying member of the Star-Bellied Sneech Preservation Society