From mboxrd@z Thu Jan 1 00:00:00 1970
To: 9fans@cse.psu.edu
From: vic zandy
Message-ID:
Content-Type: text/plain; charset=us-ascii
References: , <200312151534.hBFFYHfq009718@math.Princeton.EDU>
Subject: Re: [9fans] Links for Plan 9
Date: Mon, 15 Dec 2003 17:34:57 +0000
Topicbox-Message-UUID: a4d35ef4-eacc-11e9-9e20-41e7f4b1d025

how about writing a "browse server"?  the plan 9 client sends a url to
the browse server.  the server, running on some other operating system,
calls a real web browser to fetch and render the page.  the server then
scrapes the rendered page and returns it to the client for display.

i can think of dozens of very icky issues -- forms, pop-ups, finding
links, knowing when the page is loaded (or hung or unavailable), a
reload button, cookies, ...  more interesting, i think, is whether their
solutions can be designed independently of the browser -- so that they
can be solved once, perhaps by hideous and extreme means, but then
maintained with light work.

for example, the mechanism for scraping the page might involve accessing
off-screen portions of a bitmap that have not been displayed (or perhaps
even rendered).  i'm sure that someone could corner browser brand X into
coughing up the bits.  but is there a general (perhaps OS-dependent)
technique to coherently "scroll" bits from another program's window?

the incentive for examining this approach is that, if successful, it
would avoid the well-decried maintenance hassles of both a native plan 9
browser and an eternally incomplete port of some other giant browser.

compared to vnc'ing to the browser host, you'd get a client in a real
plan 9 window and process that can communicate in the usual ways (e.g.,
plumbing) without helpers.  it'd be simpler to fire up.  also, perhaps
the browse server could be designed to serve a group of users, which
might be easier on an organization than supporting vnc sessions for
everyone.

alas, this is only a suggestion, not an offer.
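
the client/server exchange itself could be almost trivially simple.  as
a rough sketch (the wire format here is invented for illustration -- one
url per line from the client, a length-prefixed blob of rendered bits
back from the server; the actual render/scrape step, the hard
browser-specific part, is stubbed out):

```python
# minimal sketch of a browse-server exchange, under an assumed protocol:
# the client writes one url per line; the server answers with an 8-byte
# big-endian length followed by the rendered page bits.
import socket
import struct
import threading

def render(url: bytes) -> bytes:
    # stub: a real server would hand the url to a browser, wait for the
    # page to settle, and scrape the rendered bitmap.
    return b"fake-bits-for:" + url

def serve_one(listener: socket.socket) -> None:
    # accept a single connection, read one url, reply, and exit.
    conn, _ = listener.accept()
    with conn:
        url = conn.makefile("rb").readline().strip()
        bits = render(url)
        conn.sendall(struct.pack(">Q", len(bits)) + bits)

def fetch(addr, url: bytes) -> bytes:
    # client side: send the url, read back the length-prefixed bits.
    with socket.create_connection(addr) as c:
        c.sendall(url + b"\n")
        f = c.makefile("rb")
        (n,) = struct.unpack(">Q", f.read(8))
        return f.read(n)

listener = socket.socket()
listener.bind(("127.0.0.1", 0))
listener.listen(1)
t = threading.Thread(target=serve_one, args=(listener,))
t.start()
page = fetch(listener.getsockname(), b"http://example.com/")
t.join()
print(page)
```

on plan 9 the client end of this would sit behind a file interface or
the plumber rather than raw sockets, but the shape of the exchange is
the same.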