Gnus development mailing list
* Request for opinions as to feasibility of a Gnus application.
@ 1999-11-07 20:04 Lloyd Zusman
  1999-11-08  8:37 ` Kai Großjohann
  0 siblings, 1 reply; 9+ messages in thread
From: Lloyd Zusman @ 1999-11-07 20:04 UTC (permalink / raw)


I have a need for some special processing that I'd like to implement
in Gnus, if possible.  I'd like to discuss this here to see if perhaps
there already is a way to do exactly what I want under Gnus that I
somehow missed, or if not, whether any of you can see any pitfalls in
my proposed approach or offer any useful suggestions.

First of all, here's what I'm trying to accomplish:

I manage a server machine that contains a collection of relatively
large files, such as images, PDFs, and so on.  This machine is
accessible over the internet by users who primarily connect via a
relatively slow PPP connection.  These users are capable of using
Gnus (at least as end-users) and are willing to do so.

I can very easily (mostly using Perl) build one or more validly
constructed newsgroups on the server (with associated overview
information) that contain articles of the following form, one for each
large file:

  From: me@myserver.com
  Newsgroups: my.local.newsgroup.name
  Subject: One-line description of the large file
  Date: <date posted to archive>
  Organization: My organization
  Lines: <line count>
  Message-ID: <unique ID that encodes the actual name of the large file>
 
  The body of the article would contain a more detailed text
  description of the large file.

The remote users would be able to use NNTP in a normal fashion to read
all these articles.  So far, this is all straightforward and trivial
and requires no changes or enhancements to Gnus.

However, I'd like to give the users the means within Gnus to download
these large files to their local machines, if desired.  To do so, I'd
like to create a new decoding command within Gnus which, for any
selected articles, would create an asynchronous process which uses FTP
or HTTP or rsync or something similar (yet to be decided and possibly
even configurable at run time) to download the associated large files
to a user-specified location on the local machine.
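
For instance -- just a rough sketch; the external command (wget here)
and the way the URL would be obtained from the article are
placeholders:

  (defun my-download-article-file (url target-dir)
    "Start an asynchronous download of URL into TARGET-DIR (sketch only)."
    ;; The external fetcher is a placeholder; it could just as well be
    ;; an FTP or rsync invocation, chosen at run time.
    (let ((default-directory (file-name-as-directory target-dir)))
      (start-process "article-download" "*article-download*"
                     "wget" "--quiet" url)))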

There are a number of ways to do this, including non-NNTP/non-Gnus
methods involving HTTP or other protocols.  However, I would like to
try to do this under Gnus.

I know how to write the elisp/Gnus code to implement this, but as I
mentioned, before I do so, I'm wondering if there already is something
in Gnus that I missed which could do something like this, or if not,
if anyone sees any obvious pitfalls or "gotchas" in my proposal.
Also, I'm wondering whether anyone can think of one or more different
approaches under Gnus that would provide the functionality I'm looking
for (remote users behind a slow connection being able to view
information about large files within one or more newsgroups, and then
to download any desired large files asynchronously on request).

Thanks in advance for any insights you all might be able to provide.

-- 
 Lloyd Zusman
 ljz@asfast.com



* Re: Request for opinions as to feasibility of a Gnus application.
  1999-11-07 20:04 Request for opinions as to feasibility of a Gnus application Lloyd Zusman
@ 1999-11-08  8:37 ` Kai Großjohann
  1999-11-08 12:04   ` William M. Perry
                     ` (2 more replies)
  0 siblings, 3 replies; 9+ messages in thread
From: Kai Großjohann @ 1999-11-08  8:37 UTC (permalink / raw)


Lloyd Zusman <ljz@asfast.com> writes:

> However, I'd like to give the users the means within Gnus to download
> these large files to their local machines, if desired.  To do so, I'd
> like to create a new decoding command within Gnus which, for any
> selected articles, would create an asynchronous process which uses FTP
> or HTTP or rsync or something similar (yet to be decided and possibly
> even configurable at run time) to download the associated large files
> to a user-specified location on the local machine.

I think I don't grok what you're trying to do, as you can see from the
following stupid question: why don't you put a URL in the article and
then use W3 to download the URL?

You'd have to frob browse-url.el a bit, of course, since it normally
invokes W3 to _display_ the URL, rather than to _download_ it.
Another idea would be to do M-x find-file-at-point RET on an ange-ftp
file name.

Can you try to explain so that I can better understand what you're
trying to achieve?

kai
-- 
This gubblick contains many nonsklarkish English flutzpahs,
but the overall pluggandisp can be glorked from context. -- David Moser



* Re: Request for opinions as to feasibility of a Gnus application.
  1999-11-08  8:37 ` Kai Großjohann
@ 1999-11-08 12:04   ` William M. Perry
  1999-11-13 23:00     ` Lloyd Zusman
  1999-11-11  4:20   ` Lars Magne Ingebrigtsen
  1999-11-13 22:48   ` Lloyd Zusman
  2 siblings, 1 reply; 9+ messages in thread
From: William M. Perry @ 1999-11-08 12:04 UTC (permalink / raw)
  Cc: ding

Kai.Grossjohann@CS.Uni-Dortmund.DE (Kai Großjohann) writes:

> Lloyd Zusman <ljz@asfast.com> writes:
> 
> > However, I'd like to give the users the means within Gnus to download
> > these large files to their local machines, if desired.  To do so, I'd
> > like to create a new decoding command within Gnus which, for any
> > selected articles, would create an asynchronous process which uses FTP
> > or HTTP or rsync or something similar (yet to be decided and possibly
> > even configurable at run time) to download the associated large files
> > to a user-specified location on the local machine.
> 
> I think I don't grok what you're trying to do, as you can see from the
> following stupid question: why don't you put a URL in the article and
> then use W3 to download the URL?

You'd still want to hack something up to start it asynchronously.  By
default Emacs/W3 does not do asynch downloads.  And to get asynch fetches
with efs/ange-ftp, you would need to use copy-file instead of find-file.
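
Something along these lines -- the host, path, and destination names
are made up:

  ;; Illustration only: the remote host and file names are invented.
  ;; With efs/ange-ftp loaded, copy-file on a remote file name goes
  ;; through the remote file handler instead of visiting the file in
  ;; a buffer.
  (copy-file "/anonymous@myserver.com:/pub/files/report.pdf"
             "~/downloads/report.pdf"
             t)   ; ok-if-already-exists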

-bp



* Re: Request for opinions as to feasibility of a Gnus application.
  1999-11-08  8:37 ` Kai Großjohann
  1999-11-08 12:04   ` William M. Perry
@ 1999-11-11  4:20   ` Lars Magne Ingebrigtsen
  1999-11-16 20:34     ` Lloyd Zusman
  1999-11-13 22:48   ` Lloyd Zusman
  2 siblings, 1 reply; 9+ messages in thread
From: Lars Magne Ingebrigtsen @ 1999-11-11  4:20 UTC (permalink / raw)


Kai.Grossjohann@CS.Uni-Dortmund.DE (Kai Großjohann) writes:

> I think I don't grok what you're trying to do, as you can see from the
> following stupid question: why don't you put a URL in the article and
> then use W3 to download the URL?

Or, if it has to be asynchronous, define a viewer.

Like:  Have the nntp server just return normal
<URL:http://that.server.thing/image.jpg> things in the description
articles.  When the user hits `RET' on that link, have
`browse-url-browser-function' set to your own function that would call 
an external, say, Perl script that would download the image to the
local disk and then call xv on it, or whatever.

Or just let the browse-url function call Netscape, and then let it do
the rest.  That would also work asynchronously.
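
Roughly -- where `fetch-and-view' stands in for that external script,
which would do the actual download and then run the viewer:

  ;; Sketch of the "define a viewer" idea.  `fetch-and-view' is a
  ;; made-up external script (Perl, shell, whatever) that downloads
  ;; the URL to local disk and then runs xv (or some other viewer).
  (setq browse-url-browser-function
        (lambda (url &rest _args)
          (start-process "fetch-and-view" nil "fetch-and-view" url)))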

-- 
(domestic pets only, the antidote for overdose, milk.)
   larsi@gnus.org * Lars Magne Ingebrigtsen



* Re: Request for opinions as to feasibility of a Gnus application.
  1999-11-08  8:37 ` Kai Großjohann
  1999-11-08 12:04   ` William M. Perry
  1999-11-11  4:20   ` Lars Magne Ingebrigtsen
@ 1999-11-13 22:48   ` Lloyd Zusman
  2 siblings, 0 replies; 9+ messages in thread
From: Lloyd Zusman @ 1999-11-13 22:48 UTC (permalink / raw)
  Cc: Kai.Grossjohann

I'm sorry I didn't respond to this sooner ...

Kai.Grossjohann@CS.Uni-Dortmund.DE (Kai Großjohann) writes:

> Lloyd Zusman <ljz@asfast.com> writes:
> 
> > However, I'd like to give the users the means within Gnus to download
> > these large files to their local machines, if desired.  To do so, I'd
> > like to create a new decoding command within Gnus which, for any
> > selected articles, would create an asynchronous process which uses FTP
> > or HTTP or rsync or something similar (yet to be decided and possibly
> > even configurable at run time) to download the associated large files
> > to a user-specified location on the local machine.
> 
> I think I don't grok what you're trying to do, as you can see from the
> following stupid question: why don't you put a URL in the article and
> then use W3 to download the URL?

In the simple, one-file-at-a-time case, this would work.  I neglected
to mention in my original query that I also would like the user to be
able to mark a group of articles and to then cause them all to be
downloaded in a single batch.  This is why the decoding commands came
to mind.

Also, I'd like the downloading to take place asynchronously from the
client Gnus session, so that the file or files can be transferred
while the user is using Gnus for other purposes.

> You'd have to frob browse-url.el a bit, of course, since it normally
> invokes W3 to _display_ the URL, rather than to _download_ it.
> Another idea would be to do M-x find-file-at-point RET on an ange-ftp
> file name.

I'm curious: could the W3 code or `find-file-at-point' via an
ange-ftp filename do the downloading asynchronously, so that the user
could be doing other things with Gnus while the probably-several-minute
download is taking place?  If so, perhaps one or the other of these
could be invoked within this proposed new decoding command.

> Can you try to explain so that I can better understand what you're
> trying to achieve?

Have I clarified this better now?

-- 
 Lloyd Zusman
 ljz@asfast.com



* Re: Request for opinions as to feasibility of a Gnus application.
  1999-11-08 12:04   ` William M. Perry
@ 1999-11-13 23:00     ` Lloyd Zusman
  0 siblings, 0 replies; 9+ messages in thread
From: Lloyd Zusman @ 1999-11-13 23:00 UTC (permalink / raw)


I'm sorry I didn't reply to this sooner.  Thanks for your reply ...

wmperry@aventail.com (William M. Perry) writes:

> Kai.Grossjohann@CS.Uni-Dortmund.DE (Kai Großjohann) writes:
> 
> > Lloyd Zusman <ljz@asfast.com> writes:
> > 
> > > [ ... ]
> > 
> > I think I don't grok what you're trying to do, as you can see from the
> > following stupid question: why don't you put a URL in the article and
> > then use W3 to download the URL?
> 
> You'd still want to hack something up to start it asynchronously.  By
> default Emacs/W3 does not do asynch downloads.  And to get asynch fetches
> with efs/ange-ftp, you would need to use copy-file instead of find-file.

Hmmm ... given that I also want to be able to do this by marking a
batch of articles and invoking a mass-download on them (I forgot to
mention this in my original article), perhaps the approach could go
something like this:

Create a command called, for example, `download-for-marked-articles'
that does the following (rough sketch after the list):

* Runs some sort of "for-each-marked-article-do-something-specific"
  function which passes the headers and maybe also the body of each
  marked article one-by-one to my own callback function.  Question: Is
  there already such a "for-each-marked-article..."  function?

* My own callback function looks inside the headers or body of each
  article to determine the file name that would be downloaded, and
  appends each name to some sort of list.

* My `download-for-marked-articles' command then causes `copy-file'
  to get invoked on each file within this list to do the asynchronous
  downloads.
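
Roughly, something like this; `my-message-id-to-remote-name' is a
made-up placeholder for whatever decodes the Message-ID back into the
remote file name:

  (defun download-for-marked-articles (target-dir)
    "Download the file behind each process-marked article."
    (interactive "DDownload to directory: ")
    (let (files)
      ;; Steps 1 and 2: walk the marked articles and collect the
      ;; remote file names.
      (dolist (article gnus-newsgroup-processable)
        (let ((header (gnus-summary-article-header article)))
          ;; Hypothetical decoder: Message-ID -> efs/ange-ftp file name.
          (push (my-message-id-to-remote-name (mail-header-id header))
                files)))
      ;; Step 3: hand each remote name to copy-file (asynchrony would
      ;; depend on the efs setup).
      (dolist (remote (nreverse files))
        (copy-file remote
                   (expand-file-name (file-name-nondirectory remote)
                                     target-dir)
                   t))))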

Does this sound reasonable?

-- 
 Lloyd Zusman
 ljz@asfast.com



* Re: Request for opinions as to feasibility of a Gnus application.
  1999-11-11  4:20   ` Lars Magne Ingebrigtsen
@ 1999-11-16 20:34     ` Lloyd Zusman
  1999-12-01 15:37       ` Lars Magne Ingebrigtsen
  0 siblings, 1 reply; 9+ messages in thread
From: Lloyd Zusman @ 1999-11-16 20:34 UTC (permalink / raw)


Lars Magne Ingebrigtsen <larsi@gnus.org> writes:

> Kai.Grossjohann@CS.Uni-Dortmund.DE (Kai Großjohann) writes:
> 
> > I think I don't grok what you're trying to do, as you can see from the
> > following stupid question: why don't you put a URL in the article and
> > then use W3 to download the URL?
> 
> Or, if it has to be asynchronous, define a viewer.
> 
> Like:  Have the nntp server just return normal
> <URL:http://that.server.thing/image.jpg> things in the description
> articles.  When the user hits `RET' on that link, have
> `browse-url-browser-function' set to your own function that would call 
> an external, say, Perl script that would download the image to the
> local disk and then call xv on it, or whatever.
> 
> Or just let the browse-url function call Netscape, and then let it do
> the rest.  That would also work asynchronously.

Thank you.  This sounds like a good approach for the user to use
within a single article, but I also have a twist to add:

I'd like the user to be able to mark more than one article in some
way, and then have all the remote files corresponding to the selected
articles (not necessarily images; they could also be PDFs, .doc files,
text files, whatever ... even .html files) downloaded asynchronously
en masse and stored in a local directory.

I came up with this structure, but I'm not sure if it's the best way
to do what I want:

  (let ((processable gnus-newsgroup-processable)
        article)
    (while processable
      (setq article (car processable))
      (setq processable (cdr processable))
      ;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
      ;;; Find the <URL:...> item in the given article and ;;;
      ;;; use `browse-url-browser-function' or some such   ;;;
      ;;; thing to queue up an asynchronous download to be ;;;
      ;;; performed by Perl or whatever.                   ;;;
      ;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
      ))

Given my desire to allow a group of articles to be processed, would
this particular <URL:...> and `browse-url-browser-function' approach
be the most efficient?
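
Filling in the commented placeholder, the URL-extraction part might
look something like this (assuming `gnus-summary-select-article'
accepts the article number as its fourth optional argument; how the
downloads are then queued is still left open):

  (defun my-collect-marked-urls ()
    "Return the <URL:...> links found in the process-marked articles."
    (let (urls)
      (dolist (article gnus-newsgroup-processable)
        ;; Select each article so its rendered body can be scanned.
        (gnus-summary-select-article nil nil nil article)
        (with-current-buffer gnus-article-buffer
          (save-excursion
            (goto-char (point-min))
            (when (re-search-forward "<URL:\\([^>]+\\)>" nil t)
              (push (match-string 1) urls)))))
      (nreverse urls)))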

Also, ideally, I'd prefer to avoid Perl or any non-elisp-based
software, because I can't control what the users might have installed
on their machines, other than Gnus.  I know that I could use
`efs-copy-file-internal' with the `nowait' argument set in order to do
the asynchronous downloads, but is there some other way to do this
asynchronously and en masse via URL's so that I could still use the
`browse-url-...' function(s)?

Thanks again, in advance.

-- 
 Lloyd Zusman
 ljz@asfast.com



* Re: Request for opinions as to feasibility of a Gnus application.
  1999-11-16 20:34     ` Lloyd Zusman
@ 1999-12-01 15:37       ` Lars Magne Ingebrigtsen
  1999-12-01 16:07         ` William M. Perry
  0 siblings, 1 reply; 9+ messages in thread
From: Lars Magne Ingebrigtsen @ 1999-12-01 15:37 UTC (permalink / raw)


Lloyd Zusman <ljz@asfast.com> writes:

> Also, ideally, I'd prefer to avoid Perl or any non-elisp-based
> software, because I can't control what the users might have installed
> on their machines, other than Gnus.  I know that I could use
> `efs-copy-file-internal' with the `nowait' argument set in order to do
> the asynchronous downloads, but is there some other way to do this
> asynchronously and en masse via URL's so that I could still use the
> `browse-url-...' function(s)?

No, in that case I think using something like
`nnweb-url-retrieve-asynch' would be a better idea.  But getting the
asynch thing right can be a bit fiddly.

-- 
(domestic pets only, the antidote for overdose, milk.)
   larsi@gnus.org * Lars Magne Ingebrigtsen



* Re: Request for opinions as to feasibility of a Gnus application.
  1999-12-01 15:37       ` Lars Magne Ingebrigtsen
@ 1999-12-01 16:07         ` William M. Perry
  0 siblings, 0 replies; 9+ messages in thread
From: William M. Perry @ 1999-12-01 16:07 UTC (permalink / raw)


Lars Magne Ingebrigtsen <larsi@gnus.org> writes:

> Lloyd Zusman <ljz@asfast.com> writes:
> 
> > Also, ideally, I'd prefer to avoid Perl or any non-elisp-based
> > software, because I can't control what the users might have installed
> > on their machines, other than Gnus.  I know that I could use
> > `efs-copy-file-internal' with the `nowait' argument set in order to do
> > the asynchronous downloads, but is there some other way to do this
> > asynchronously and en masse via URL's so that I could still use the
> > `browse-url-...' function(s)?
> 
> No, in that case I think using something like
> `nnweb-url-retrieve-asynch' would be a better idea.  But getting the
> asynch thing right can be a bit fiddly.

The new URL package has a fully asynchronous interface (only
FTP/FILE/NFS/HTTP are actually asynch right now, but you can use the
asynch interface for everything and it still works).  I'm working on
converting Emacs/W3 to use it, and you can get it via CVS from
:pserver:anoncvs@anoncvs.gnu.org/gd/gnu/anoncvsroot
- module name is 'url'.
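
A sketch of what an asynchronous fetch-to-file could look like with
it, using a URL-plus-callback `url-retrieve' interface (the exact
calling convention here is an assumption, and the header skipping is
deliberately simplistic):

  (require 'url)

  (defun my-url-download-asynch (url filename)
    "Fetch URL asynchronously and save the response body to FILENAME."
    (url-retrieve
     url
     (lambda (_status filename)
       ;; The callback runs in the retrieval buffer: skip the response
       ;; headers and write the rest out to disk.
       (goto-char (point-min))
       (when (re-search-forward "\n\n" nil t)
         (write-region (point) (point-max) filename))
       (kill-buffer (current-buffer)))
     (list filename)))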

-Bill P.


