I’m an incidental economist for many reasons. One is described on the About page. Another is that while I am an economist by profession I do not have an economics degree. My undergraduate studies were in Applied and Engineering Physics at Cornell and my graduate work was in image and signal processing in the Electrical Engineering and Computer Science Department at MIT (the Stochastic Systems Group).

Maybe I’ll tell the story another time of how one seamlessly transitions from MIT’s Stochastic Systems Group to health policy. For now I’m going to tell the story of what my graduate research was about. Since this has nothing whatsoever to do with economics, health policy, law, or my current life, I would forgive readers for stopping right here. I can’t even promise this will be much fun. My main motivation for writing this post is that it finally brings my blogging about my publications up to date. With this post every single one of my publications has been described and referenced. Whew!

Onward! In graduate school I studied multiscale signal and image processing (in the spirit of, though distinct from, wavelet-based multiresolution analysis). Huh? The quick-and-dirty way to think about this stuff is successive approximation.

Example 1: In an image that looks pretty much the same everywhere (like a close-up picture of tree bark) there is something very different (like a big black blob). How would a machine find it? The multiscale way is to first figure out which quadrant it is in. Then, focusing on that quadrant, figure out which quadrant of that one it is in, and so on. Zoom in by successive quadrant selection. This is what I worked on for my master’s thesis. Only I didn’t have a normal image. My measurements were weirder. They were tomographic (like a CAT scan) [1].
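The zoom-by-quadrants idea can be sketched in a few lines. This is only a toy illustration, not the method of [1]: it scores each quadrant by its mean intensity (a stand-in for a proper hypothesis test) and recurses into the most anomalous one; the function name and parameters are my own invention.

```python
import numpy as np

def locate_anomaly(image, min_size=2):
    """Zoom in on an anomaly by successive quadrant selection.

    At each scale, score the four quadrants of the current window
    (here by mean intensity) and descend into the winner.
    """
    r0, c0 = 0, 0
    rows, cols = image.shape
    while rows > min_size and cols > min_size:
        hr, hc = rows // 2, cols // 2
        # The four quadrants as (row offset, col offset) within the window.
        quadrants = [(0, 0), (0, hc), (hr, 0), (hr, hc)]
        scores = [
            image[r0 + dr : r0 + dr + hr, c0 + dc : c0 + dc + hc].mean()
            for dr, dc in quadrants
        ]
        dr, dc = quadrants[int(np.argmax(scores))]
        r0, c0 = r0 + dr, c0 + dc
        rows, cols = hr, hc
    return r0, c0, rows, cols

# Tree-bark-like noisy background with one bright blob.
rng = np.random.default_rng(0)
img = rng.normal(0.0, 1.0, (64, 64))
img[40:48, 8:16] += 10.0  # the anomalous blob
print(locate_anomaly(img))
```

With tomographic data the per-quadrant test is far subtler, since each measurement mixes information from across the image, but the coarse-to-fine search structure is the same.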

Example 2: You have a one-dimensional series of data (like the S&P 500). You could model it as an autoregressive (AR) process of some order. No way you’re getting a PhD for that. Instead, consider a generalization of AR models, indexed not by the integers but by nodes of a tree (in the graph theoretic, not arboreal, sense). My work was to relate such models to wavelets [2], to develop computationally efficient algorithms for estimating model parameters [3], and to generalize a famous algorithm from the AR modeling framework (Levinson’s algorithm), using the generalization to solve a famous problem (covariance extension) [4].
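To make the tree-indexed idea concrete, here is a toy coarse-to-fine sketch (again my own illustration, not the models of [2]–[4]): the root of a dyadic tree holds one value, each node’s value is a scalar `a` times its parent’s plus independent noise, and the finest scale of the tree plays the role of the observed one-dimensional series.

```python
import numpy as np

def synthesize_multiscale_ar(depth, a=0.9, sigma=1.0, seed=0):
    """Generate a signal from a toy first-order AR model on a dyadic tree.

    Each parent node spawns two children, and every child equals `a`
    times its parent plus Gaussian noise. Returns the 2**depth values
    at the finest scale (the leaves of the tree).
    """
    rng = np.random.default_rng(seed)
    scale = rng.normal(0.0, sigma, 1)  # the root node
    for _ in range(depth):
        parents = np.repeat(scale, 2)  # each parent gets two children
        scale = a * parents + rng.normal(0.0, sigma, parents.size)
    return scale

leaves = synthesize_multiscale_ar(depth=8)
print(leaves.shape)  # (256,)
```

The appeal of this structure is computational: like an ordinary AR recursion in time, the tree recursion is local, so estimation and synthesis scale well even though correlations at the finest scale can be long-range.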

Well, I did a few other things you can read about in my thesis, and many things you cannot read about anywhere. My years as a graduate student were wonderful and fulfilling. The proof is that among my bigger regrets is that I could only find 199 scholarly works to cite in my thesis. I had wanted to break 200 but in the final moments before submission I could not come up with even one more (these were the days before Google Scholar). Pretty trivial regret, no?

I learned a lot in graduate school, including that statistical signal and image processing were not for me. Not enough policy relevance. A month after graduation I began work in health policy and economics. Ten years later I launched this blog. I could not have guessed this trajectory in a thousand tries.

[1] Frakt AB, Karl WC, and Willsky AS, “A Multiscale Hypothesis Testing Approach to Anomaly Detection and Localization From Noisy Tomographic Data,” *IEEE Transactions on Image Processing*, 7(6) (June 1998): 825-837.

[2] Daoudi K, Frakt AB, and Willsky AS, “Multiscale Autoregressive Models and Wavelets,” *IEEE Transactions on Information Theory*, 45(3) (April 1999): 828-845.

[3] Frakt AB and Willsky AS, “Computationally Efficient Stochastic Realization for Internal Multiscale Autoregressive Models,” *Multidimensional Systems and Signal Processing*, 12(2) (April 2001): 109-142.

[4] Frakt AB, Lev-Ari H, and Willsky AS, “A Generalized Levinson Algorithm for Covariance Extension With Application to Multiscale Autoregressive Modeling,” *IEEE Transactions on Information Theory*, 49(2) (February 2003): 411-424.