Thanks for the suggestions. I had been parsing "typeset -p" output by
splitting words and using a regexp match. Now, thanks to your suggestion,
I use the parameters module.

As for "typeset -T" and "typeset -p" not showing -T associations, I think
that's a misfeature. What's shown now are array assignments, and you can't
reliably tell what goes with what. It would be much better to show the
association. That is, if I run

    typeset -T foo FOO

then "typeset -T" should echo that among the other -T pairings. I can
always get the array and scalar values if I want, but getting the tie
information is, as you say, not easy if it is possible at all. Should I
look into providing a patch for this?

Saving an environment for reloading into another session, whether nested
or not, might be useful in other contexts. Possibly the code you or I
have in zshdb could be turned into a function and put inside the
parameter module?

On Sun, Feb 27, 2011 at 4:01 PM, Bart Schaefer wrote:

> On Feb 27, 6:44am, Rocky Bernstein wrote:
> }
> } A little bit of context of why I am doing this. Recently in the zsh
> } debugger I've added the ability to go into a nested zsh, and often one
> } wants the existing environment of the debugged program preserved in
> } this nested zsh.
>
> You're kind of doomed here from the start. If the debugger is inside
> a shell function, for example, you're never going to get scoping back.

Not true, in a couple of ways. First, even if changes to values don't
persist, it is useful to be able to see the values.

What I'm thinking of now is adding a function save_var which takes the
name of a variable one wants to persist. In an at_exit hook (or whatever
the zsh equivalent is), those variables would then be written out to a
file, similar to what was done on entry to get all of those variables
set. If there is something like a tie function as in Perl, or a
discipline function on a variable as in ksh, that might be used instead,
but this would be on a per-variable basis.
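To make the save_var idea concrete, here is a minimal sketch. The names
save_var, dump_saved_vars, and SAVE_FILE are hypothetical, not an existing
zsh interface; the EXIT trap stands in for whatever exit hook the debugger
would actually use.

```shell
# Hypothetical sketch: collect names of variables to persist, then write
# re-sourceable declarations from an exit trap so a nested shell can
# read them back in with "source".
SAVE_FILE=${SAVE_FILE:-${TMPDIR:-/tmp}/zshdb_env.$$}
saved_vars=""

save_var() {
  # Remember a variable name to be written out at exit.
  saved_vars="$saved_vars $1"
}

dump_saved_vars() {
  : > "$SAVE_FILE"
  for v in $saved_vars; do
    # typeset -p prints a declaration that can be sourced back in;
    # names that are unset at exit are silently skipped.
    typeset -p "$v" >> "$SAVE_FILE" 2>/dev/null
  done
}

# Fire the dump when the (nested) shell exits.
trap dump_saved_vars EXIT
```

A tie/discipline mechanism could update the file on each assignment
instead, but dumping once at exit keeps the per-variable bookkeeping down
to a single name list.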
Other comments are inline below.

> } I would like to save shell variables to a file so that I can run a
> } nested zsh and then read these back in. It is becoming a bit of a
> } challenge because this is in the output:
> }
> }   typeset -i10 -r !=0
>
> Hm, I'm a bit surprised the '!' isn't quoted there; but the real issue
> is that you get "not an identifier" or the like for a number of those,
> and "can't change type" for ones that come from zsh/parameter, plus a
> few "read-only variable" complaints.
>
> }   typeset -ar '*'
>
> Hmm, strange. That one does NOT give "not an identifier" ...
>
> }   *=()
>
> ... but that of course bombs with a globbing error.

So although I can see how this might be useful as an explanation, I think
it would be nice if there were some simple way (i.e. something in the
"typeset -p" command) to filter out those things that can't be source'd
back in.

> } Failing a better solution, I think what I'll have to do is store
> } IFS='' typeset -p into an array and check each item of the array.
>
> The zsh/parameter module $parameters hash already tells you nearly all
> you need to know. Something like this:
>
>     () {
>       local param type
>       for param type in "${(kv@)parameters}"
>       do
>         case "$type" in
>         (*local*) continue;;    # Skip loop variables
>         (*export*) continue;;   # No need to dump/restore if exported?
>         (*special*) continue;;  # Maintained by the shell
>         (*readonly*) continue;;
>         (*) typeset -p "$param";;
>         esac
>       done
>     }
>
> You can avoid zsh/parameter by parsing the output of "typeset +m +":
>
>     () {
>       local param type description
>       typeset +m + | while read -A description
>       do
>         param="${description[-1]}"
>         description[-1]=()
>         if (( ${#description} ))
>         then type="${description[*]}"
>         else type=scalar
>         fi
>         case "$type" in
>         (*local*) continue;;    # Skip loop variables
>         (*export*) continue;;   # No need to dump/restore if exported?
>         (*readonly*) continue;;
>         (*) typeset -p "$param";;
>         esac
>       done
>     }

Thanks again.
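As an interim stand-in for the filtering I wish "typeset -p" did itself,
something like the following could post-process the dump and drop lines
whose parameter name is not a plain identifier (the '!' and '*' cases
above). The function names are illustrative, and the simple word-splitting
parse is a known limitation: it keeps only "typeset ... name=value" lines,
so separate array-assignment lines would need additional handling.

```shell
# Return success only when $1 is a valid shell identifier.
is_identifier() {
  case $1 in
    ''|[0-9]*)        return 1 ;;  # empty or starts with a digit
    *[!A-Za-z0-9_]*)  return 1 ;;  # contains a non-identifier character
    *)                return 0 ;;
  esac
}

# Read "typeset -p"-style lines on stdin; emit only the sourceable ones.
filter_dump() {
  while IFS= read -r line; do
    rest=${line#typeset }
    # Skip over option words such as -i10 or -ar to reach the name.
    while case $rest in -*) true ;; *) false ;; esac; do
      rest=${rest#* }
    done
    name=${rest%%=*}
    if is_identifier "$name"; then
      printf '%s\n' "$line"
    fi
  done
}
```

Usage would be along the lines of `typeset -p | filter_dump > saved_env`,
after which `source saved_env` should not trip over "not an identifier"
errors.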
I have modified the code to do something like this. Note, however, that I
don't want to skip local variables, because in a debugger those are among
the important things we want to examine. And read-only variables I want
to skip only if they are already set. That's why I had the

    typeset -p $var >/dev/null 2>&1 ||

prefix in the code before. (I had erroneously used && instead of ||, and
there was a stray character in there.) But all of this is easily fixed.

> However, that doesn't let you catch "special" parameters, though you
> can still filter the readonly subset.
>
> Note both of these techniques still miss things like:
>
>     typeset -T foo FOO
>
> I.e., there's no way to discover by examination that an array and scalar
> have been tied together.
>
> } But then we come to the typeset -ar '*' line which I guess flows onto
> } the next line.
>
> Not exactly "flows", but for arrays and associative arrays (hashes) the
> value can't be supplied in the typeset command, so an assignment line is
> needed.

Yes, I meant that the semantic description of the variable flows onto the
next line, not that there was one statement split across two lines. I
often write these two statements on one line with a semicolon in between.
But while we are being precise, let me point out that it is wrong to call
a local variable a "loop" variable, as the comments in the code above
suggest.

> Also you may have to be careful with the order of assignments when you
> read the file back in. Some assignments to special variables (like to
> the "options" hash) might change shell behavior in unexpected ways.

OK, thanks, I will keep that in mind. Right now I'm skipping the special
variables as you had in your code. Should I need them and discover that
there is an ordering problem, I think it is a simple matter to put them
in a list which gets run, in list order, after the other variables have
been done first.
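For the restore side, the read-only guard I described can be wrapped up as
a small helper. The name restore_one is illustrative; the point is the
"typeset -p name || assign" pattern, which skips any name that already
exists (covering already-set read-only variables that would otherwise
raise an error), and the same helper could later be called over a deferred
list of special variables after all the ordinary ones.

```shell
# Assign $2 to the variable named $1 only when that name is not already
# set; an existing (possibly readonly) variable is left untouched.
restore_one() {
  name=$1 value=$2
  # typeset -p fails for unset names, so || runs the assignment only
  # when the variable does not yet exist.
  typeset -p "$name" >/dev/null 2>&1 || eval "$name=\$value"
}
```

Quoting the value as `\$value` inside the eval keeps the assigned text
from being re-expanded a second time.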