From: Lars Magne Ingebrigtsen <larsi@gnus.org>
Newsgroups: gmane.emacs.gnus.general
Subject: url-retrieve parallelism
Date: Sun, 19 Dec 2010 01:45:50 +0100
Organization: Programmerer Ingebrigtsen
To: ding@gnus.org
X-Now-Playing: Kate Bush's _The Dreaming_: "There Goes A Tenner"
User-Agent: Gnus/5.110011 (No Gnus v0.11) Emacs/24.0.50 (gnu/linux)

shr (and gnus-html, I guess) fire off a call to `url-retrieve' for
every <img> it finds.  If an HTML message has 1000 <img>s, then Emacs
is going to DoS the poor image web server.

We obviously want to have more than a single `url-retrieve' call going
at once, but we want to rate-limit this somewhat.  To perhaps 10 at a
time?

So we need some kind of easy callback-ey interface, I think...  But I'm
wondering whether to make it a totally general library, which would
look something like:

(defun concurrent (concurrency function callback callback-arguments)
  ...)

So FUNCTION would be required to return a process object (and have a
parameter list based on `url-retrieve', which seems quite sensible),
and CONCURRENT would just maintain a queue of processes, and fire off
a new one (if any) when something returns, and so on...

Would this be useful, or is my master's in Over Engineering showing
again?

-- 
(domestic pets only, the antidote for overdose, milk.)
  larsi@gnus.org * Lars Magne Ingebrigtsen
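
To make the idea concrete, here is a minimal sketch of the
`url-retrieve'-specific half of that proposal.  The `my-url-queue-'
names are made up for illustration (they are not an existing Emacs or
Gnus API), it assumes lexical binding (Emacs 24), and it only covers
the queue/rate-limit part rather than the fully general
FUNCTION-returns-a-process interface:

;;; -*- lexical-binding: t -*-
(require 'url)

(defvar my-url-queue--jobs nil
  "Queued jobs, each of the form (URL CALLBACK CBARGS).")

(defvar my-url-queue--in-flight 0
  "Number of `url-retrieve' calls currently running.")

(defvar my-url-queue-concurrency 10
  "Maximum number of simultaneous retrievals.")

(defun my-url-queue-retrieve (url callback &optional cbargs)
  "Like `url-retrieve', but run at most `my-url-queue-concurrency' fetches at once."
  (setq my-url-queue--jobs
        (append my-url-queue--jobs (list (list url callback cbargs))))
  (my-url-queue--maybe-start))

(defun my-url-queue--maybe-start ()
  "Start queued jobs until the concurrency limit is reached."
  (while (and my-url-queue--jobs
              (< my-url-queue--in-flight my-url-queue-concurrency))
    (let ((job (pop my-url-queue--jobs)))
      (setq my-url-queue--in-flight (1+ my-url-queue--in-flight))
      (url-retrieve (nth 0 job)
                    (lambda (status)
                      ;; One fetch finished; make room for the next
                      ;; one, then hand the retrieval buffer over to
                      ;; the user's callback.
                      (setq my-url-queue--in-flight
                            (1- my-url-queue--in-flight))
                      (my-url-queue--maybe-start)
                      (apply (nth 1 job) status (nth 2 job)))))))

With something like this, shr would call `my-url-queue-retrieve'
instead of `url-retrieve' and never have more than ten fetches in
flight; the general CONCURRENT version would simply abstract the
`url-retrieve' call out into FUNCTION.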