February 4, 2015
Contents
General Motivation
Introduction to Metric Spaces
Introduction to Vector Spaces, Inner Products and Norms
Hilbert and Banach Spaces
Projections
Geometric Notions
General Motivation I
Geometric Notions
In 3D geometry, we are used to notions of unit-length basis
vectors, dimensionality, the angle between vectors, the dot product,
etc. The need for orthogonal, unit-length basis vectors is
readily apparent (and extends to vectors of length N).
From DSP, we can see that these notions extend to signals: we
use orthogonal basis signals of unit energy to represent signals.
In communications, an optimal receiver projects the received
signal onto the space spanned by the basis of the modulated
signals - the same geometric notions as with vectors are used.
These notions extend to random variables too, and will play a
pivotal role in statistical signal processing, for example.
For example, the familiar 3D dot product is
$$\sum_{k=1}^{3} a_k b_k$$
Metric Spaces I
The notion of metric spaces will be very useful to understand
norms, inner products and other important concepts.
Metric Spaces - Definition
A metric $d: X \times X \to \mathbb{R}$ is a function that measures the
distance between elements in a set X.
Properties of a Metric
1. $d(x, y) = d(y, x)$
2. $d(x, y) \geq 0$
3. $d(x, y) = 0$ iff $x = y$
4. $d(x, z) \leq d(x, y) + d(y, z)$ (triangle inequality)
Metric Spaces II
Commonly Used Metrics
For $M \times 1$ vectors $\mathbf{x}$ and $\mathbf{y}$:
$$d_1(\mathbf{x}, \mathbf{y}) = \sum_{i=1}^{M} |x_i - y_i|$$
$$d_2(\mathbf{x}, \mathbf{y}) = \left( \sum_{i=1}^{M} |x_i - y_i|^2 \right)^{1/2}$$
$$d_p(\mathbf{x}, \mathbf{y}) = \left( \sum_{i=1}^{M} |x_i - y_i|^p \right)^{1/p}$$
$$d_\infty(\mathbf{x}, \mathbf{y}) = \max_{i=1}^{M} |x_i - y_i|$$
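These metrics are easy to sanity-check numerically. A minimal sketch using NumPy; the helper name `d_p` is ours, not from any library:

```python
import numpy as np

def d_p(x, y, p):
    """l_p metric between equal-length vectors; p = np.inf gives the max metric."""
    diff = np.abs(np.asarray(x, dtype=float) - np.asarray(y, dtype=float))
    if np.isinf(p):
        return float(diff.max())
    return float((diff ** p).sum() ** (1.0 / p))

x = [1.0, 2.0, 3.0]
y = [4.0, 0.0, 3.0]

d1 = d_p(x, y, 1)         # |1-4| + |2-0| + |3-3| = 5
d2 = d_p(x, y, 2)         # sqrt(9 + 4 + 0) = sqrt(13)
dinf = d_p(x, y, np.inf)  # max(3, 2, 0) = 3
```

Note that all three are symmetric in x and y and return 0 only when the vectors coincide, matching the metric axioms above.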
Examples
A quantizer that maps vector $\mathbf{x}$ to $\hat{\mathbf{x}}$ uses $d_1(\cdot)$ or $d_2(\cdot)$.
To measure the distance between binary codewords, one based
on the Hamming distance $d_H$ can be used:
$$d_H(\mathbf{x}, \mathbf{y}) = \sum_{i=0}^{M-1} \left( x_i \oplus y_i \right) \quad \text{(modulo-2 sum)} \tag{1}$$
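The modulo-2 sum above is just XOR followed by a count; a minimal sketch (the function name is ours):

```python
def hamming(x, y):
    """Hamming distance: number of positions where binary codewords differ."""
    assert len(x) == len(y), "codewords must have equal length"
    return sum(xi ^ yi for xi, yi in zip(x, y))  # XOR realizes the modulo-2 sum

dH = hamming([1, 0, 1, 1, 0], [1, 1, 1, 0, 0])  # differs at indices 1 and 3 -> 2
```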
Metric Spaces V
Signal Representation
In signal representation, $d_2(\cdot)$ is often used (root mean-squared
error).
The Fourier series representations of
$$x(t) = \sin(2\pi k t/T)$$
and
$$y(t) = \begin{cases} \sin(2\pi k t/T) & t \neq T/3 \\ 3 & t = T/3 \end{cases}$$
will be exactly the same! Convergence is in the mean-square sense.
Convergence issues of this kind arise in the description of the Gibbs phenomenon.
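Mean-square (but not pointwise) convergence can be observed numerically. A sketch, using the standard odd square wave and its well-known Fourier partial sums (all names and grid choices here are ours):

```python
import numpy as np

T = 1.0
t = np.linspace(0.0, T, 4000, endpoint=False)
square = np.sign(np.sin(2 * np.pi * t / T))  # odd square wave of period T

def partial_sum(t, N):
    """Sum of the first N nonzero (odd) harmonics of the square wave's Fourier series."""
    s = np.zeros_like(t)
    for k in range(1, 2 * N, 2):
        s += (4.0 / (np.pi * k)) * np.sin(2 * np.pi * k * t / T)
    return s

# the mean-square error shrinks as more terms are added ...
mse = [np.mean((square - partial_sum(t, N)) ** 2) for N in (5, 50, 500)]
# ... but the worst-case pointwise error near the jumps does not vanish (Gibbs)
peak = [np.max(np.abs(square - partial_sum(t, N))) for N in (5, 50, 500)]
```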
Vector Spaces I
Linear Vector Space
A linear vector space S over a set of scalars R is a collection of
vectors, together with an additive operation + and a scalar
multiplication $\cdot$ such that:
1. If $x, y \in S$, then $x + y \in S$
2. $(x + y) + z = x + (y + z)$ (associativity)
Vector Spaces II
Some Definitions
Let S be a vector space. If $V \subseteq S$ is a subset such that V is itself
a vector space, then V is a subspace of S.
This notion of subspaces will be useful when we deal with Hilbert
Spaces and Projections.
Signals as vectors
Under some simple assumptions, we can treat signals as
vectors.
A signal x(t) can be considered as an infinite-dimensional vector.
Similarly, a sequence x[n] can be considered to be an infinitely
long vector.
Some issues arise with basis signals (convergence, etc.).
No infinite set of basis signals can span every possible
signal x(t) - hence the need for the Dirichlet conditions in Fourier
transforms.
Norms
1. $\|x\|$ is real, and $\|x\| \geq 0$
2. $\|x\| = 0$ iff $x = 0$
3. $\|cx\| = |c| \, \|x\|$
4. $\|x + y\| \leq \|x\| + \|y\|$ (triangle inequality)
$l_p$ norm:
$$\|\mathbf{x}\|_p = \left( \sum_{i=1}^{M} |x_i|^p \right)^{1/p}$$
For signals:
$$\|x(t)\|_1 = \int_a^b |x(t)| \, dt
\qquad
\|x(t)\|_p = \left( \int_a^b |x(t)|^p \, dt \right)^{1/p}$$
$l_\infty$ norm:
$$\|\mathbf{x}\|_\infty = \max_i |x_i|
\qquad
\|x(t)\|_\infty = \sup_{[a,b]} |x(t)|$$
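For vectors, NumPy's `np.linalg.norm` computes these directly through its `ord` parameter; a small sketch with example vectors of our choosing:

```python
import numpy as np

x = np.array([3.0, -4.0, 0.0])

l1 = np.linalg.norm(x, ord=1)         # |3| + |-4| + |0| = 7
l2 = np.linalg.norm(x, ord=2)         # sqrt(9 + 16) = 5
linf = np.linalg.norm(x, ord=np.inf)  # max(|3|, |-4|, |0|) = 4

# the triangle inequality holds for each of them, e.g. for l2:
y = np.array([1.0, 2.0, -2.0])
tri_ok = np.linalg.norm(x + y) <= np.linalg.norm(x) + np.linalg.norm(y)
```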
All these satisfy all conditions of a norm.
The norm used depends on the application.
Inner Product I
Definition and Properties
For a vector space S over the scalars R, the inner product $\langle \cdot, \cdot \rangle :
S \times S \to \mathbb{R}$ satisfies:
1. $\langle x, y \rangle = \langle y, x \rangle$
2. $\langle cx, y \rangle = c \langle x, y \rangle$
3. $\langle x + y, z \rangle = \langle x, z \rangle + \langle y, z \rangle$
4. $\langle x, x \rangle \geq 0$, with equality iff $x = 0$
Inner Product II
Hilbert Space
A complete normed linear space with an inner product (with the
norm being the induced norm) is referred to as a Hilbert Space.
Orthogonal Subspaces
Let S be a vector space, and let V and W be subspaces of S. V and W
are orthogonal if every vector in V is orthogonal to every vector in W.
For signals, the inner product is
$$\langle x, y \rangle = \int_a^b x(t)\, y(t) \, dt$$
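The integral can be approximated on a fine grid; a sketch showing that sin and cos are orthogonal over a full period (the grid size and test signals are our choices):

```python
import numpy as np

T = 1.0
N = 10000
t = np.linspace(0.0, T, N, endpoint=False)
dt = T / N

x = np.sin(2 * np.pi * t / T)
y = np.cos(2 * np.pi * t / T)

# Riemann-sum approximation of <x, y> = integral of x(t) y(t) dt over [0, T]
inner_xy = float(np.sum(x * y) * dt)  # ~ 0: sin and cos are orthogonal
energy_x = float(np.sum(x * x) * dt)  # ~ T/2: the energy <x, x>
```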
Inner Product IV
Projection Theorem
Let S be a Hilbert space, and V a subspace of S. Then for every
vector $x \in S$, there exists a unique vector $v_x \in V$ that is closest to
x. $\|x - v_x\|$ is minimized only when $x - v_x$ is orthogonal to V.
This theorem plays a fundamental role in communications,
statistical signal processing and many other areas.
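For finite-dimensional vectors, the theorem can be demonstrated with least squares; a minimal sketch (the matrix A spanning V is an arbitrary example of ours):

```python
import numpy as np

# columns of A span the subspace V (assumed linearly independent)
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [0.0, 1.0]])
x = np.array([1.0, 2.0, 4.0])

# least squares finds the coefficients of the closest point v_x in V
coeffs, *_ = np.linalg.lstsq(A, x, rcond=None)
v_x = A @ coeffs

# the error x - v_x is orthogonal to V: A^T (x - v_x) ~ 0
residual_dot = A.T @ (x - v_x)
```

The orthogonality of the residual to every basis vector of V is exactly the normal-equations condition that makes v_x the unique closest point.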
Inner Product VI
Cauchy-Schwarz Inequality
$$|\langle x, y \rangle| \leq \|x\| \, \|y\|$$
with equality iff $y = \alpha x$ for some scalar $\alpha$. Defined similarly for signals.
Using this, the angle between real x and y can be understood:
$$\cos(\theta) = \frac{\langle x, y \rangle}{\|x\|_2 \, \|y\|_2}$$
For complex vectors, one uses
$$\cos(\theta) = \frac{\Re\{\langle x, y \rangle\}}{\|x\|_2 \, \|y\|_2}$$
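A quick numerical check of the inequality and the angle formula (the vectors are examples of ours):

```python
import numpy as np

x = np.array([1.0, 2.0, 2.0])
y = 2.0 * x  # a scalar multiple of x, so Cauchy-Schwarz holds with equality

lhs = abs(np.dot(x, y))                       # |<x, y>| = 18
rhs = np.linalg.norm(x) * np.linalg.norm(y)   # ||x|| ||y|| = 3 * 6 = 18

z = np.array([0.0, 1.0, -1.0])  # <x, z> = 0, so x and z are orthogonal
cos_theta = np.dot(x, z) / (np.linalg.norm(x) * np.linalg.norm(z))
```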