
Digital Signal Processing

Daniel Gruszczynski
September 11, 2015
This primer explores the mathematical theory of Digital Signal Processing
(DSP). For maximum benefit, I advise having prior exposure to analysis (both
real and complex), ODEs, and experience with algorithms. Nonetheless, I will
try to write so that even the uninitiated can easily follow; this stems from my
belief that rigor and clarity are symbiotic.
My main reference is the book Digital Signal Processing: Principles, Algorithms, and Applications by Proakis and Manolakis [1]. The book is lengthy but thorough, so my plan is to condense the first six chapters (roughly 500 pages) into a hundred. Sounds crazy, right? This will cover: basic terminology, discrete-time signals, the z-transform, frequency analysis, the Discrete Fourier Transform (DFT), and the Fast Fourier Transform (FFT).

Preface: Evolution of the Function

The great irony of mathematics is how long it has taken for its (now) most indispensable concepts to be developed. Among those concepts is the notion of a
function. To appreciate this evolution, recall the set-theoretic definition:
Definition 1. Let A and B be sets. A function, denoted f : A → B, is a subset of A × B such that, for every a ∈ A, there exists a unique b ∈ B with (a, b) ∈ f. We then write f(a) = b.
Under this interpretation, a function relates elements from one set to another
by some rule or assignment. The elements of A can be seen as inputs, f as a
black box, and the elements of B as outputs. Hence, we impose a restriction
on the inputs: no single input can yield two distinct outputs.
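As a concrete illustration, here is a minimal sketch of my own (not from the text, with made-up sets and pairs) showing Definition 1 as a direct check in Python: a subset of A × B is a function exactly when every element of A appears as the first coordinate of exactly one pair.

```python
# Minimal illustrative sketch (made-up sets): Definition 1 as a direct check.

A = {1, 2, 3}
B = {"a", "b"}

f = {(1, "a"), (2, "a"), (3, "b")}            # a function: every input appears exactly once
g = {(1, "a"), (1, "b"), (2, "a"), (3, "b")}  # not a function: input 1 yields two outputs

def is_function(pairs, domain, codomain):
    """True iff `pairs` is a subset of domain x codomain that assigns exactly
    one output to every element of the domain (Definition 1)."""
    subset_of_product = all(a in domain and b in codomain for (a, b) in pairs)
    inputs = sorted(a for (a, _) in pairs)
    return subset_of_product and inputs == sorted(domain)

print(is_function(f, A, B))  # True
print(is_function(g, A, B))  # False
```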
The concept of a function began as a way to describe the relationship between quantities called variables. Prior to Galileo, mathematicians thought
of functions as the literal table of values one would compute relating sets of
numbers. In other words, functions were their representations, and not the abstract relation between sets. That isn't to imply that a table doesn't suggest
that such a relationship exists, but rather that no one thought of them in this
general way. With Galileo, we see notions of forming correspondences between
geometric points, and with Descartes, we see the use of a function for curve construction. Leibniz and Johann Bernoulli employed the word function in their

discussions, but used it in an informal sense. Nonetheless, functions coincided with the development of the calculus of variations, and so started to take on a
more mathematical sense. In 1755, Euler gave us a truly modern conception of
a function:
If some quantities so depend on other quantities that if the latter are changed the former undergo change, then the former quantities are called functions of the latter. This definition applies rather widely and includes all ways in which one quantity could be determined by others. If, therefore, x denotes a variable quantity, then all quantities which depend upon x in any way, or are determined by it, are called functions of x.
Despite this, Euler fixated on analytic expressions, again possibly conflating
the concept of a function with its representation. The debates and development
continued among mathematicians such as Fourier, Dirichlet, Cauchy, etc. and
the function concept came to encompass discontinuous functions, explicit and
implicit functions, and so on. To curtail the history lesson, I end with this: our study of signal processing is intimately linked to the development of the Fourier series, which is central to the analysis of signals. See (?) for more information.
Now, we glossed over some philosophical concerns: what are sets and what
are variables? Mathematically, it is clear what we mean, but how exactly do
these concepts correspond to the physical world? What are the quantities that
Euler refers to? The evasive response is to say that these are placeholders for
things we can observe or measure, but already this definition fails to include all
such uses of functions. Do we take Poincaré's stance and avoid the pathological
abuses of functions? Or do we take the categorical approach and continue to
build our abstract theories? Either way, these concepts are inherently theory-laden and therefore make assumptions about reality. For now, we adopt this
ontology and see where it leads.

Terms and definitions

It is generally the case that our most primitive concepts lack satisfying definitions. This is no exception. We take on faith the existence of a concept known
as a signal. A signal takes on an Eulerian notion: it is a physical quantity that
varies with time, space, or any other independent variable(s). This implies that
a signal has the mathematical representation of a function (which shouldn't
surprise you given the ad hoc nature of our definition).
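To preview where this leads, here is a minimal sketch (with arbitrary amplitude, frequency, and sampling rate of my own choosing) of a signal represented as a function of the independent variable time, evaluated at discrete sample instants in the spirit of the discrete-time signals covered in later chapters.

```python
import math

# Minimal sketch with made-up values: a signal as a function of time,
# s(t) = A * sin(2*pi*F*t), sampled at the discrete instants t = n / Fs.

A = 1.0     # amplitude (arbitrary)
F = 5.0     # frequency in Hz (arbitrary)
Fs = 100.0  # sampling rate in Hz (arbitrary)

def s(t):
    """A physical quantity varying with time, represented as a function."""
    return A * math.sin(2 * math.pi * F * t)

# Sampling s at t = n / Fs produces the kind of discrete-time signal studied later.
samples = [s(n / Fs) for n in range(10)]
print(samples)
```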

References
[1] J. G. Proakis and D. G. Manolakis, Digital Signal Processing: Principles, Algorithms, and Applications, 3rd ed. Prentice-Hall, 1996.
