From mboxrd@z Thu Jan  1 00:00:00 1970
Message-ID: <3D1C2D03.B288CC2B@strakt.com>
From: Boyd Roberts
MIME-Version: 1.0
To: 9fans@cse.psu.edu
Subject: Re: [9fans] dumb question
References: , <20020627115912.S7017@cackle.proxima.alt.za> <3D1B2CCF.1FD49007@null.net> <20020627185720.V7017@cackle.proxima.alt.za>
Content-Type: text/plain; charset=us-ascii
Content-Transfer-Encoding: 7bit
Date: Fri, 28 Jun 2002 11:31:47 +0200
Topicbox-Message-UUID: bd3593b0-eaca-11e9-9e20-41e7f4b1d025

Lucio De Re wrote:

> I keep promising myself I'll write a "stat" modelled on "du" that
> produces a record of stat fields for each entry in the descended
> directory.

I did this once so I could build a magnetic version of /n/dump on
ULTRIX.  I wanted to hang onto files modified in the 'recent past',
so I needed something to get at the stat information and then use it
to decide which files to copy (based on type, mtime, size, name, ...).

So I wrote 'walk' and implemented an /n/dump file tree writer as:

    walk | select | push

walk was a program, as it had to call ftw(3), but the others were
scripts.  select contained the logic for deciding whether a file was
worth copying; if it was, it output the name of the file.  I also had
a plan that select could be user-supplied, but I never implemented it.
push would copy filenames read from stdin to /n/dump/YYYY/MMDD/...

All three ran as the user whose home directory was being copied.
This was imperfect, but it was secure.  The controlling script used
'su' to become the user, avoiding the "yet another program running as
root" syndrome.

Time information output by walk was represented as a numeric value to
avoid re-parsing problems; it was never intended to be read by humans.
Anyway, it's a trivial exercise to write a filter to munge the output
of walk.

As far as quoting went, I think I decided that if you were stupid
enough to name your files with newlines you didn't deserve to get
them copied.  select would catch some of these, or push would get
'No such file or directory'.  The name of the file was output last
by walk, after the stat info.

This version of /n/dump was just a convenience, so loss of data was no
big drama (the standard backup system would catch that).

The hardest part was getting the script to glue 4 RA-90s [1Gb each]
together so that the last month of stuff was kept/recycled.  Teaching
shell scripts about time is no fun.
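
For anyone curious what walk amounted to: the original isn't shown
above, but a minimal sketch of a walk-like tree walker using ftw(3)
might look like the following.  The exact field layout is my guess;
the only things taken from the description are that the stat fields
come first, the file name last, and the mtime is a plain number.

#define _XOPEN_SOURCE 500
#include <ftw.h>
#include <stdio.h>
#include <sys/stat.h>

/* print one line of stat info per entry: type, mtime, size, name last */
static int
visit(const char *path, const struct stat *sb, int flag)
{
	char type;

	if (flag == FTW_NS) {		/* stat failed; nothing to report */
		fprintf(stderr, "walk: cannot stat %s\n", path);
		return 0;
	}
	switch (flag) {
	case FTW_D:	type = 'd'; break;
	case FTW_F:	type = 'f'; break;
	default:	type = '?'; break;
	}
	printf("%c %ld %ld %s\n",
		type, (long)sb->st_mtime, (long)sb->st_size, path);
	return 0;			/* keep walking */
}

int
main(int argc, char **argv)
{
	return ftw(argc > 1 ? argv[1] : ".", visit, 20);
}

With the name last, a select filter only has to peel off the leading
fixed fields and can treat the rest of the line as the file name,
newlines excepted, as noted above.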