On Sat, Apr 09, 2016 at 08:27:54AM -0400, Karl Dahlke wrote:
> > What's most important and
> > who's doing what currently?
>
> I'm afraid I'm doing not a damn thing, as my mother is dying,
> my wife is in a cast and even the slightest household task requires
> the coordinated efforts of the two of us,
> my son's emotional issues never seem to end, my daughter is flirting with
> homelessness and destitution again,
> dragging her son down with her,
> and my wife and I will soon need to move again, as this was my Mom's house
> and is underwater so the bank will glom onto it.

Sorry to hear all that.

> If anyone else wants to work on edbrowse I'd be happy to direct / advise;
> that's probably all I can do for the foreseeable future.
>
> As for priority, I continue to suggest the find&fix method:
> what can't people run, like google groups or github etc, and why?
> Does it really require a complex system like ajax, or is it something silly
> like a missing dom object?
> I say that only because I tend to be user driven, market driven,
> and Adam generally is not; design it all to work as it should
> from the get-go, and that's good, if we all have the time.
> So I don't know.

To a point I want to design things right from the get-go, but if we can fix things easily then we should. We certainly need to keep fixing up our DOM, whatever we end up doing with our js implementation.

> Love you all and glad you're on my team.
>
> Ok, here's a thought.
> I think the next step is separating the curl stuff out into its own curl
> server process:
>
> edbrowse --mode curl
>
> See main.c line 550.
> This would render almost no change from the user's perspective,
> but would seem to be necessary for future things to work right.
> I think we were talking about this when everyone went on hiatus
> for various reasons.
> I proposed a series of messages back and forth between user edbrowse and edbrowse curl,
> to fetch data and coordinate cookies and certificates and the like.
> I'll see if I can find that thread.
> One curl server per user, so that parallel instances of edbrowse
> would not access and clobber the same cookie jar.
> I've done this experiment: edbrowse in two different consoles,
> 1 reads website A, 2 reads website B,
> 1 exits and writes the jar, 2 exits and writes the jar,
> cookies from site A are gone.
> This is a big job, a necessary job,
> a job that lays some ipc groundwork,
> but not as big as asynchronous js and some of the other
> things we're flirting with.

Agreed. From experience with our current IPC, I suggest making it as asynchronous as possible: there are a number of occasions where I have to kill edbrowse-js because it goes into an infinite loop or similar, blocking the whole browser. We need the same property in the curl process, since we have to be able to multiplex downloads in some way.

I'd quite like to look at this if I have time. I may need a hand (Chris?) with the details of curl_multi and curl_share etc. Having said that, if anyone wants to take it they're welcome.

Cheers,
Adam.