Gnus development mailing list
From: Richard Lawrence <richard.lawrence@berkeley.edu>
To: ding@gnus.org
Subject: Automatically processing plain text attachments
Date: Sun, 23 Jan 2011 19:11:41 -0800	[thread overview]
Message-ID: <877hdvf11u.fsf@berkeley.edu> (raw)

Hi all,

[I'm new to the list, so apologies if this is a rather newb-ish question.
I looked in the manual, and tried to search the archives at Gmane, but
didn't come up with much, perhaps because I'm not exactly certain which
terms I should be using.  Pointers to documentation of all kinds would
be greatly appreciated...]

I'm hoping I can solicit a little advice about how to pull plain text
email attachments off of a subset of incoming messages and do some batch
processing on them, without seriously slowing down the rest of my mail
reading.

The background: I am about to begin teaching a writing-intensive course.
Students will email me their papers every week.  I have no desire to
download, print, and read a bunch of .doc files by hand every week.  So
I am considering asking my students to email their papers in plain text.
I would like to then apply some automated processing on my end that
would:

- download each student's paper 
- apply some (hopefully) simple transformations on the text
- save the resulting document in my "teaching" directory

The goal is to have these papers end up in Org mode format, so I can do
further batch processing that will export them using LaTeX, add them to
my to-do list, etc.  Since I will have about 100 papers to read over the
course of this semester, it seems like automating this is the right way
to go.
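
The transformation step itself could be as simple as prepending an Org heading to each saved paper.  A minimal sketch of what I have in mind (the heading format and file layout are just guesses at what might be useful):

```elisp
;; Wrap a saved plain-text paper in an Org TODO heading and write the
;; result into the teaching directory.  The heading text and the
;; destination argument are assumptions, not a settled design.
(defun my-paper-to-org (file dest)
  "Convert plain-text FILE into an Org file at DEST."
  (with-temp-buffer
    (insert-file-contents file)
    (goto-char (point-min))
    (insert (format "* TODO Read %s\n\n" (file-name-nondirectory file)))
    (write-region (point-min) (point-max) dest)))
```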

I currently use a very simple Gnus setup.  I am running Gnus 5.13 in
Emacs 23 on Debian.  I read my mail over nnimap.

Here's what I was thinking might work: 

1) Tell my students that they must indicate in the subject line of their
email that it contains a paper submission, so I can split those emails
to a special group using nnimap-split-rule.
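
Something like the following might do for the split rule; the "[PAPER]" subject tag and the "INBOX.papers" group name are only placeholders for whatever convention I settle on:

```elisp
;; In ~/.gnus.el -- route any message whose Subject carries the agreed
;; tag into a dedicated group.  Tag and group name are illustrative.
(setq nnimap-split-inbox "INBOX"
      nnimap-split-rule
      '(("INBOX.papers" "^Subject:.*\\[PAPER\\]")))
```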

2) Somehow, further process the emails that end up in that group.  For
example:
 - the group could represent a local Maildir, and I could have a cron job
   process new emails found there
 - the group could represent an IMAP folder, and when I read articles in that
   group, Gnus runs a hook that extracts attachments, processes them
   in a temporary buffer, and saves the result to a file (or perhaps
   refiles in an existing Org file)
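
As a rough sketch of the second variant, assuming the built-in gnus-summary-save-parts does what its documentation suggests (the target directory is made up):

```elisp
;; From the summary buffer of the papers group: put the process mark
;; (`#') on the new submissions, then run this command to save every
;; text/plain MIME part of the marked articles into a directory where
;; a later batch job can pick them up.
(defun my-save-paper-parts ()
  "Save text/plain parts of the process-marked articles."
  (interactive)
  (gnus-summary-save-parts "text/plain" "~/teaching/incoming" nil))
```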

Perhaps my question boils down to: what hooks are available in Gnus that
would allow me to batch-process the full content of these paper-submission
emails at the time they are split from my inbox (without having to
download the full content of *every* mail, e.g., by setting
nnimap-split-download-body)?

Thanks so much for any advice you can offer!

Best,
Richard





Thread overview: 4+ messages
2011-01-24  3:11 Richard Lawrence [this message]
2011-01-24 22:00 ` Lars Ingebrigtsen
2011-01-25 17:21   ` Richard Lawrence
2011-01-27  0:53     ` Lars Ingebrigtsen
