Reading a paper is like peeling an onion...
I've been reading a lot of papers lately. I should say I've been reading a lot of papers carefully lately. I usually read a lot of papers. Because what else is a guy to do on a nice winter evening?
So what's different this time? I'm preparing for my preliminary oral exam to progress to candidacy in the PhD program here at Maryland. For the 'exam' (as the name 'oral' indicates, it's really more of a presentation), we ('we' being people in the Applied Mathematics, Statistics, and Scientific Computation program) do a literature review of a particular topic, choosing one or two primary references and a smattering (approximately eight or so) of secondary references. The goal is to 'demonstrate that [I have] a deep enough understanding to carry out research in a proposed area.' No big deal.
The area of research I've chosen to focus on is called computational mechanics. That deserves a post of its own someday soon. And a Wikipedia page. Because this is not what I mean1. (Also, \(\epsilon\)-machines2.) Rather, I mean the study of processes from a 'computation theoretic' framework. That is, studying a given phenomenon as something doing a computation. An obvious example of such a thing is a neuron. But other natural systems 'compute' in much the same way.
And this framework turns out to be useful statistically, in terms of optimally predicting some stochastic process \(X(t)\).
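The sense of 'optimal' here can be made precise. In the standard formulation (following Crutchfield and Shalizi, as I understand it), two histories are declared equivalent when they make the same predictions about the future:

\[
\overleftarrow{x} \sim_\epsilon \overleftarrow{x}' \iff P\left(\overrightarrow{X} \mid \overleftarrow{X} = \overleftarrow{x}\right) = P\left(\overrightarrow{X} \mid \overleftarrow{X} = \overleftarrow{x}'\right),
\]

and the causal states of the \(\epsilon\)-machine are the equivalence classes of this relation. Keeping track of which class your history falls in is as good, for prediction, as keeping track of the entire history.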
But I'm getting carried away. The thing I wanted to comment on is the difference between 'deep reading' and just 'reading.' Everyone who has pursued a degree that involves 'working problems' (i.e. solving some sort of mathematically oriented problem set) knows that you can't read a math (or physics, or chemistry, or...) textbook the same way you read a novel. And yet I forget that sometimes. Just because I think I know something at a given instant of time doesn't mean that (a) I really understand the finer points or (b) the material has any chance of entering my long-term memory.
I have a resolution in mind for the rest of the year: any paper I want to read and internalize, I won't allow myself to read without a pen in hand.
Computational mechanics has no textbook, very few pedagogical references, and no Wikipedia entry. What it does have is a very badass reputation as the optimal nonlinear predictive filter for any stochastic process. So, yeah, that's kind of cool. But at the same time, I wish that one of the fundamental building blocks didn't have the name \(\epsilon\)-machine.↩
Try Googling that, and you'll learn about machine epsilon. A super useful concept, but not at all related to computational mechanics. Fortunately the second entry is by Cosma Shalizi, though it's a little outdated. (I was eleven years old when the entry was last updated...)↩
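(For the curious: machine epsilon is the gap between 1.0 and the next representable floating-point number. A quick sketch in Python, using the classic repeated-halving trick, hypothetical variable names mine:)

```python
import sys

# Halve eps until adding eps/2 to 1.0 no longer changes it;
# what's left is the smallest increment distinguishable from 1.0.
eps = 1.0
while 1.0 + eps / 2 > 1.0:
    eps /= 2

print(eps)  # 2.220446049250313e-16 for IEEE 754 doubles
print(eps == sys.float_info.epsilon)  # True
```

Nothing to do with \(\epsilon\)-machines, which is exactly the problem.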
Fun fact: the first issue of SEED I ever saw was in high school, when Kevin Orlando (my high school chemistry teacher and friend) had a copy lying around. The main topic for that issue: homosexuality in the animal kingdom.↩