From: Lloyd Zusman <ljz@asfast.com>
Newsgroups: gmane.emacs.gnus.general
To: ding@gnus.org
Subject: Re: Request for opinions as to feasibility of a Gnus application.
Date: 13 Nov 1999 18:00:09 -0500
Organization: Linux Hippopotamus Preserve
References: <8666zdfav0.fsf@megalith.bp.aventail.com>

> Kai.Grossjohann@CS.Uni-Dortmund.DE (Kai Großjohann) writes:
>
> > Lloyd Zusman writes:
> >
> > > [ ... ]
> >
> > I think I don't grok what you're trying to do, as you can see from the
> > following stupid question: why don't you put a URL in the article and
> > then use W3 to download the URL?
>
> You'd still want to hack something up to start it asynchronously. By
> default Emacs/W3 does not do asynch downloads. And to get asynch fetches
> with efs/ange-ftp, you would need to use copy-file instead of find-file.

Hmmm ... given that I also want to be able to do this by marking a batch
of articles and invoking a mass download on them (I forgot to mention this
in my original article), perhaps the approach could go something like this:

Create a command called, for example, `download-for-marked-articles' that
does the following:

* Runs some sort of "for-each-marked-article-do-something-specific"
  function which passes the headers, and maybe also the body, of each
  marked article one by one to my own callback function. Question: is
  there already such a "for-each-marked-article..." function?

* My own callback function looks inside the headers or body of each
  article to determine the name of the file to be downloaded, and appends
  that name to a list.

* My `download-for-marked-articles' command then invokes `copy-file' on
  each file in that list to do the asynchronous downloads.

Does this sound reasonable? (A rough sketch follows, after my signature.)

-- 
 Lloyd Zusman
 ljz@asfast.com
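
A rough sketch of how the above might fit together, in case it is useful.
The command and helper names (`my-download-marked-articles',
`my-article-file-name') and the "~/downloads/" target directory are made
up for illustration; `gnus-summary-work-articles' is the stock Gnus way to
get the list of process-marked articles, `gnus-summary-select-article' and
`copy-file' are real functions, and whether `copy-file' actually fetches
asynchronously depends on the efs/ange-ftp setup discussed above.

(defun my-article-file-name ()
  "Return the first ange-ftp-style file name found in the current article.
This is only a heuristic: it looks for something like /user@host:/path."
  (with-current-buffer gnus-article-buffer
    (save-excursion
      (goto-char (point-min))
      (when (re-search-forward "\\(/[^ \t\n@:/]+@[^ \t\n:/]+:[^ \t\n]+\\)" nil t)
        (match-string 1)))))

(defun my-download-marked-articles (n)
  "Download the file named in each process-marked article.
With a prefix argument N, operate on the next N articles instead."
  (interactive "P")
  (let ((articles (gnus-summary-work-articles n))
        (files nil))
    ;; First pass: render each marked article and collect one file name
    ;; from its body via the callback above.
    (dolist (article articles)
      (gnus-summary-select-article nil nil nil article)
      (let ((file (my-article-file-name)))
        (when file
          (push file files))))
    ;; Second pass: hand each remote name to `copy-file'; with efs or
    ;; ange-ftp loaded, this is the call that does the actual fetch.
    ;; The target directory is assumed to already exist.
    (dolist (file (nreverse files))
      (copy-file file
                 (expand-file-name (file-name-nondirectory file) "~/downloads/")
                 t))))

Binding the command to a key in the summary buffer would then let you mark
a batch of articles with `#' and fire off all the downloads at once.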