
Random vibrations

Structural Dynamics and Stability


School of Mechanical, Aerospace & Civil Engineering

Partha Mandal (P/B19)


A system is vibrating if it is shaking or moving backwards and forwards in some way. If this motion is unpredictable, the system is said to be in random vibration.

Example: the motion of a leaf fluttering in the breeze.

The subject of random vibrations is concerned with finding out how the statistical characteristics of the motion of a randomly excited system, like a leaf, depend on the statistics of the excitation.
Michael Fish is Britain's longest-serving TV weather forecaster, and probably the best known. His broadcasting career began in 1971, when he started forecasting for BBC Radio; in January 1974 he became part of BBC Television's weather team and went on to present across the BBC channels. By his retirement in 2004, Michael was one of the longest-serving broadcast meteorologists in the world.

During a weather forecast in October 1987, Michael Fish told viewers that "a woman rang to say she'd heard there was a hurricane on the way. Well, don't worry," he continued, "there isn't." Brushing aside the amateur's forecast with a chuckle, Fish promised "sea breezes" and a "showery airflow."

Britain was promptly hit by 120 mph winds which ripped up 300 miles of power cables, plunged a quarter of the country into darkness, blocked 200 roads with fallen branches, downed 25 per cent of the trees in Kent and stopped all rail traffic in the south for twenty-four hours. An ambulance at Hayling Island was hit by a yacht floating across the road, and the Meteorological Office called it the worst hurricane since 1703.

[Fish's spokesman later explained: "It's really all a question of detail." In fact, technically, Fish was correct. It was not a hurricane, but an intense North Atlantic depression. "We don't get hurricanes in the West Country," one observer explained. "We get hurricane force winds."]
Probability density function

Non-random: sine wave

Consider the time per cycle for which

$$x \le x(t) \le x + dx$$

For the sine wave

$$x = x_0 \sin \omega t, \qquad dx = x_0 \omega \cos \omega t \, dt$$

so that

$$dt = \frac{dx}{x_0 \omega \cos \omega t} = \frac{dx}{\omega \sqrt{x_0^2 - x^2}}$$

The proportion of time per cycle that x(t) spends in the band x to x + dx is

$$\frac{2\,dt}{T} = \frac{dx}{\pi \sqrt{x_0^2 - x^2}}, \qquad \text{where } T = 2\pi/\omega$$

(the factor of 2 arises because x(t) passes through the band twice per cycle). This proportion equals Prob(x ≤ x(t₀) ≤ x + dx).

The first-order probability density function p(x) is defined so that

$$\mathrm{Prob}\,(x \le x(t_0) \le x + dx) = p(x)\, dx$$

Hence, for the sine wave,

$$p(x) = \frac{1}{\pi \sqrt{x_0^2 - x^2}} \qquad \text{for } -x_0 \le x \le x_0$$

and

$$\mathrm{Prob}\,(-x_0 \le x(t_0) \le x_0) = \int_{-x_0}^{x_0} p(x)\, dx = 1$$

[Figure: probability density function for a sine wave.]
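As an aside, this density is easy to check numerically. The sketch below (Python; the unit amplitude and period are assumed purely for illustration) samples one cycle of a sine wave and histograms the values, which is the same Σdt/T construction used for a random process below:

```python
import numpy as np

# Sample one full cycle of x(t) = x0 sin(w t) at uniformly spaced instants.
x0 = 1.0                       # assumed amplitude
T = 1.0                        # assumed period, so w = 2*pi/T
t = np.linspace(0.0, T, 1_000_000, endpoint=False)
x = x0 * np.sin(2.0 * np.pi * t / T)

# Histogram estimate of p(x): fraction of time per band, divided by band width.
p_est, edges = np.histogram(x, bins=50, density=True)
centres = 0.5 * (edges[:-1] + edges[1:])

# Theoretical density p(x) = 1 / (pi * sqrt(x0^2 - x^2)).
p_theory = 1.0 / (np.pi * np.sqrt(x0**2 - centres**2))

# Agreement is close except in the outermost bins, where p(x) -> infinity.
print(np.max(np.abs(p_est - p_theory)[2:-2]))
```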
Random process

For a random process, the same construction applies to a long sample of duration T: if Σdt is the total time that x(t) spends in the band x to x + dx, then

$$p(x)\, dx = \frac{\sum dt}{T}$$

A probability analyser works on the same principle: if N samples of x(t) are taken and dn of them fall in the band x to x + dx, then

$$p(x)\, dx = \frac{dn}{N}$$
Gaussian distribution

Many natural events follow the Gaussian (normal) distribution:

$$p(x) = \frac{1}{\sigma\sqrt{2\pi}} \exp\left( \frac{-(x-m)^2}{2\sigma^2} \right)$$

Calculation of averages

The mean value of x is normally denoted by E[x], where E stands for "the statistical expectation of".

E[x] T = total area under the x(t) curve during the interval T:

$$E[x]\, T = \int_0^T x(t)\, dt$$

so that

$$E[x] = \int_0^T x(t)\, \frac{dt}{T} = \int_{-\infty}^{\infty} x\, p(x)\, dx$$

The mean square value of x, E[x²], is defined as the average value of x² and can be written as

$$E\left[x^2\right] = \int_0^T x^2(t)\, \frac{dt}{T} = \int_{-\infty}^{\infty} x^2\, p(x)\, dx$$

The standard deviation of x, usually denoted by σ, and the variance σ², are defined by

$$\sigma^2 = E\left[ \left( x - E[x] \right)^2 \right]$$

The variance σ² is the mean of the square of the deviation of x from its mean level E[x]. Expanding,

$$\sigma^2 = E\left[ x^2 - 2x\,E[x] + (E[x])^2 \right] = E\left[x^2\right] - 2E[x]\cdot E[x] + (E[x])^2 = E\left[x^2\right] - (E[x])^2$$

Or,

$$\text{variance} = (\text{standard deviation})^2 = \text{mean square} - (\text{mean})^2$$

For Gaussian processes E[x] = m. Therefore

$$E\left[x^2\right] = \sigma^2 + m^2$$
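A quick numerical sanity check of this identity (a sketch with assumed values m = 2, σ = 0.5):

```python
import numpy as np

rng = np.random.default_rng(0)
m, sigma = 2.0, 0.5                      # assumed mean and standard deviation
x = rng.normal(m, sigma, 1_000_000)      # Gaussian samples

# E[x^2] should equal sigma^2 + m^2 = 4.25 for this choice.
print(np.mean(x**2), sigma**2 + m**2)
```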
Probability distribution function

$$P(x) = \int_{-\infty}^{x} p(x)\, dx$$

$$\frac{dP(x)}{dx} = p(x)$$

$$\mathrm{Prob}\,(-\infty \le x \le \infty) = \int_{-\infty}^{\infty} p(x)\, dx = P(x = \infty) = 1$$
Example

A random variable x is distributed between 0 and 1 so that

$$p(x) = 1 \quad \text{for } 0 \le x \le 1$$

and

$$p(x) = 0 \quad \text{for } x < 0 \text{ and } x > 1$$

Determine $E[x]$, $E[x^2]$ and $\sigma_x$.
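For reference, working directly from the definitions above gives

$$E[x] = \int_0^1 x\, dx = \frac{1}{2}, \qquad E\left[x^2\right] = \int_0^1 x^2\, dx = \frac{1}{3}$$

$$\sigma_x^2 = E\left[x^2\right] - (E[x])^2 = \frac{1}{3} - \frac{1}{4} = \frac{1}{12}, \qquad \sigma_x = \frac{1}{\sqrt{12}} \approx 0.289$$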
Second order or joint probability density function

$$\mathrm{Prob}\,(x \le x(t_0) \le x + dx \ \text{and}\ y \le y(t_0) \le y + dy) = p(x, y)\, dx\, dy$$

$$\mathrm{Prob}\,(x_1 \le x(t_0) \le x_2 \ \text{and}\ y_1 \le y(t_0) \le y_2) = \int_{x_1}^{x_2}\!\int_{y_1}^{y_2} p(x, y)\, dx\, dy$$

$$\mathrm{Prob}\,(-\infty \le x(t_0) \le \infty \ \text{and}\ -\infty \le y(t_0) \le \infty) = \int_{-\infty}^{\infty}\!\int_{-\infty}^{\infty} p(x, y)\, dx\, dy = 1$$

$$\mathrm{Prob}\,(x \le x(t_0) \le x + dx \ \text{and}\ -\infty \le y(t_0) \le \infty) = dx \int_{-\infty}^{\infty} p(x, y)\, dy$$

so that

$$p(x)\, dx = dx \int_{-\infty}^{\infty} p(x, y)\, dy
\qquad\Rightarrow\qquad
p(x) = \int_{-\infty}^{\infty} p(x, y)\, dy$$

Similarly,

$$p(y) = \int_{-\infty}^{\infty} p(x, y)\, dx$$
Conditional Probability

The conditional probability density of x given y is defined by

$$p(x \mid y) = \frac{p(x, y)}{p(y)}$$

so that $p(x \mid y)\,dx = \mathrm{Prob}\,(x \le x(t_0) \le x + dx$ given that $y \le y(t_0) \le y + dy)$.

For statistical independence of x and y, $p(x \mid y) = p(x)$.

Hence $p(x, y) = p(x)\, p(y)$ if x and y are two statistically independent random variables.
Ensemble averaging
• Ensemble or collection of sample functions x1(t), x2(t), x3(t),
etc., which together make up the random process x(t).
• Instead of being measured along a single sample, ensemble
averages are measured across the ensemble.
• For a Gaussian random process, the ensemble probability
distributions must all be Gaussian.
Stationary random process
• A random process is said to be stationary if the probability
distributions obtained for the ensemble do not depend on absolute
time (but only, for second and higher order probability
distributions, on the time separation between measuring points).
• Of course, the term stationary refers to the probability distributions
and not to the samples themselves.
• This implies that all the averages are independent of absolute time
and, specifically, that the mean, mean square, variance and
standard deviation are independent of time altogether.
• The term weakly stationary is sometimes used to describe
processes in which only the first- and second-order probability
distributions are invariant with time.
• A strictly stationary process is one for which all probability
distributions of the ensemble are invariant with time.
Ergodic process

• A stationary random process is called an ergodic process if, in addition to all the ensemble averages being stationary with respect to a change of the time scale, the averages taken along any single sample are the same as the ensemble averages.
• Each sample function is then completely representative of
the ensemble that constitutes the random process.
• If a process is ergodic it must also be stationary.
Correlation

[Figure: scatter diagrams — x and y values uncorrelated; x and y values correlated.]

We wish to express an approximate functional relationship between x and y in the form of a straight line, say. One way to achieve this is to minimise the square of the deviation of the actual values of y from the values predicted by the straight-line approximation.

If the positions of the axes are adjusted so that E[x] = E[y] = 0, i.e. so that the origin lies at the "centre of gravity" of the data points, then the straight-line approximation passes through the origin and can be written as y = mx.

The deviation of any value y from its predicted value mx is

$$\Delta = y - mx$$

The average value of the square of the deviation is

$$E\left[\Delta^2\right] = E\left[ (y - mx)^2 \right] = E\left[y^2\right] + m^2 E\left[x^2\right] - 2m\,E[xy]$$

Differentiating with respect to m, $E[\Delta^2]$ is a minimum when

$$0 = 2m\,E\left[x^2\right] - 2E[xy]$$

or

$$m = \frac{E[xy]}{E\left[x^2\right]}$$

Hence

$$y = \frac{E[xy]}{E\left[x^2\right]}\, x$$

Since for zero-mean processes $E[x^2] = \sigma_x^2$ and $E[y^2] = \sigma_y^2$, we can write

$$\frac{y}{\sigma_y} = \left\{ \frac{E[xy]}{\sigma_x \sigma_y} \right\} \frac{x}{\sigma_x}$$

Line of regression of y on x:

$$\frac{y}{\sigma_y} = \left\{ \frac{E[xy]}{\sigma_x \sigma_y} \right\} \frac{x}{\sigma_x}$$

Line of regression of x on y:

$$\frac{x}{\sigma_x} = \left\{ \frac{E[xy]}{\sigma_x \sigma_y} \right\} \frac{y}{\sigma_y}$$

When x and y have non-zero means,

$$\frac{y - m_y}{\sigma_y} = \left\{ \frac{E[(x - m_x)(y - m_y)]}{\sigma_x \sigma_y} \right\} \frac{x - m_x}{\sigma_x}$$

and

$$\frac{x - m_x}{\sigma_x} = \left\{ \frac{E[(x - m_x)(y - m_y)]}{\sigma_x \sigma_y} \right\} \frac{y - m_y}{\sigma_y}$$

where $m_x$ and $m_y$ are the mean values of x and y respectively.

The parameter

$$\rho_{xy} = \frac{E\left[ (x - m_x)(y - m_y) \right]}{\sigma_x \sigma_y}$$

is called the correlation coefficient.
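The sketch below estimates the correlation coefficient and the regression slope from synthetic data; the 0.8 coupling and the noise level are assumed purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(0.0, 1.0, 100_000)
y = 0.8 * x + rng.normal(0.0, 0.6, 100_000)      # correlated with x by construction

mx, my = x.mean(), y.mean()
sx, sy = x.std(), y.std()
rho = np.mean((x - mx) * (y - my)) / (sx * sy)   # correlation coefficient

# Slope of the regression line of y on x (zero-mean form): m = E[xy] / E[x^2].
m = np.mean((x - mx) * (y - my)) / np.mean((x - mx) ** 2)
print(rho, m)                                    # both close to 0.8 for this data
```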


Autocorrelation

Consider $E[x(t)\, x(t + \tau)]$. For stationary processes, this average is independent of absolute time t and depends only on the time separation τ:

$$E[x(t)\, x(t + \tau)] = f(\tau) = R_x(\tau) \qquad \text{(autocorrelation function)}$$

If x(t) is stationary,

$$E[x(t)] = E[x(t + \tau)] = m \qquad \text{(mean)}$$

$$\sigma_{x(t)} = \sigma_{x(t+\tau)} = \sigma \qquad \text{(standard deviation)}$$

The correlation coefficient for x(t) and x(t + τ) can be written as

$$\rho = \frac{E\left[ \{x(t) - m\}\{x(t + \tau) - m\} \right]}{\sigma^2}
= \frac{E[x(t)x(t+\tau)] - mE[x(t+\tau)] - mE[x(t)] + m^2}{\sigma^2}
= \frac{R_x(\tau) - m^2}{\sigma^2}$$

Hence

$$R_x(\tau) = \sigma^2 \rho + m^2, \qquad \text{where } -1 \le \rho \le 1$$

so that

$$-\sigma^2 + m^2 \le R_x(\tau) \le \sigma^2 + m^2$$

At zero separation,

$$R_x(\tau = 0) = E\left[ x(t)^2 \right] = E\left[ x^2 \right]$$

At very large time intervals, τ → ∞, a random process will be uncorrelated, hence ρ → 0 and

$$R_x(\tau \to \infty) \to m^2$$

Since for a stationary process $R_x(\tau)$ depends only on the separation time τ and not on absolute time t,

$$R_x(\tau) = E[x(t)\, x(t + \tau)] = E[x(t)\, x(t - \tau)] = R_x(-\tau)$$

such that $R_x(\tau)$ is an even function of τ.
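For an ergodic process, $R_x(\tau)$ can be estimated by averaging along a single sample. A minimal sketch, using an assumed first-order autoregressive sequence as the correlated process:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 200_000
x = rng.normal(0.0, 1.0, n)
for i in range(1, n):                    # assumed AR(1) process: x_i = e_i + 0.9 x_{i-1}
    x[i] += 0.9 * x[i - 1]

def autocorr(x, lag):
    """Estimate R_x(tau) = E[x(t) x(t + tau)] by averaging along the sample."""
    return np.mean(x[: len(x) - lag] * x[lag:])

# R_x(0) = E[x^2]; the normalised values decay towards m^2 / sigma^2 = 0 here.
R0 = autocorr(x, 0)
print([round(autocorr(x, lag) / R0, 3) for lag in (0, 1, 10, 50)])
```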


Cross-correlation

For two different stationary random processes x(t) and y(t),

$$R_{xy}(\tau) = E[x(t)\, y(t + \tau)] \qquad \text{and} \qquad R_{yx}(\tau) = E[y(t)\, x(t + \tau)]$$

$$-\sigma_x \sigma_y + m_x m_y \le R_{xy}(\tau) \le \sigma_x \sigma_y + m_x m_y$$

$$R_{xy}(\tau) = R_{yx}(-\tau)$$

$$R_{xy}(\tau \to \infty) \to m_x m_y$$
Fourier analysis

If x(t) is a periodic function of time t, with period T, it can be expressed as the following (Fourier) series:

$$x(t) = a_0 + \sum_{k=1}^{\infty} \left( a_k \cos\frac{2\pi k t}{T} + b_k \sin\frac{2\pi k t}{T} \right)$$

where the Fourier coefficients can be written as

$$a_0 = \frac{1}{T} \int_{-T/2}^{T/2} x(t)\, dt$$

$$a_k = \frac{2}{T} \int_{-T/2}^{T/2} x(t) \cos\frac{2\pi k t}{T}\, dt, \qquad k \ge 1$$

$$b_k = \frac{2}{T} \int_{-T/2}^{T/2} x(t) \sin\frac{2\pi k t}{T}\, dt, \qquad k \ge 1$$

Suppose that the position of the t-axis is adjusted so that the mean value of x(t) is zero, i.e. $a_0 = 0$.

The spacing between adjacent harmonics is $\Delta\omega = 2\pi / T$.

When T becomes large, the spacing Δω becomes small, and the Fourier coefficients become tightly packed. In the limit when T → ∞, they will in fact merge together. Since in this case x(t) no longer represents a periodic phenomenon, we can no longer analyse it into discrete frequency components.
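These coefficient formulas are straightforward to evaluate numerically. A sketch for a square wave, chosen because its coefficients are known in closed form ($a_k = 0$, $b_k = 4/\pi k$ for odd k, zero for even k); the period value is assumed:

```python
import numpy as np

T = 2.0                                          # assumed period
t = np.linspace(-T / 2, T / 2, 1_000_000, endpoint=False)
x = np.sign(np.sin(2.0 * np.pi * t / T))         # square wave test signal

# With uniform sampling over one period, (2/T) * integral f dt = 2 * mean(f).
def a(k):
    return 2.0 * np.mean(x * np.cos(2.0 * np.pi * k * t / T))

def b(k):
    return 2.0 * np.mean(x * np.sin(2.0 * np.pi * k * t / T))

print([round(b(k), 4) for k in (1, 2, 3)])       # ~[1.2732, 0.0, 0.4244]
print(4.0 / np.pi, 4.0 / (3.0 * np.pi))          # exact values for comparison
```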
Fourier integral

Introducing the notations

$$A(\omega) = \frac{1}{2\pi} \int_{-\infty}^{\infty} x(t) \cos \omega t\, dt
\qquad \text{and} \qquad
B(\omega) = \frac{1}{2\pi} \int_{-\infty}^{\infty} x(t) \sin \omega t\, dt$$

we can write the function x(t) as

$$x(t) = 2\int_0^{\infty} A(\omega) \cos \omega t\, d\omega + 2\int_0^{\infty} B(\omega) \sin \omega t\, d\omega$$

This is a representation of x(t) by a so-called Fourier integral, or inverse Fourier transform. The terms A(ω) and B(ω) are the components of the Fourier transform of x(t).
Complex form of the Fourier transform

We know that $e^{i\theta} = \cos\theta + i \sin\theta$.

Define

$$X(\omega) = A(\omega) - i\, B(\omega)$$

Then the Fourier transform of x(t) can be written as

$$X(\omega) = \frac{1}{2\pi} \int_{-\infty}^{\infty} x(t) \left( \cos \omega t - i \sin \omega t \right) dt
= \frac{1}{2\pi} \int_{-\infty}^{\infty} x(t)\, e^{-i\omega t}\, dt$$

And the inverse Fourier transform will be

$$x(t) = \int_{-\infty}^{\infty} X(\omega)\, e^{i\omega t}\, d\omega$$
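As a check on the convention (note the 1/2π sits on the forward transform here), the sketch below transforms a Gaussian pulse $x(t) = e^{-t^2/2}$, for which $X(\omega) = e^{-\omega^2/2}/\sqrt{2\pi}$ under this definition; the truncation limits are assumed wide enough for the pulse to have decayed:

```python
import numpy as np

# Direct numerical version of X(w) = (1/2pi) * integral x(t) e^{-iwt} dt.
t = np.linspace(-20.0, 20.0, 40_001)
dt = t[1] - t[0]
x = np.exp(-t**2 / 2)

def X(w):
    return np.sum(x * np.exp(-1j * w * t)) * dt / (2.0 * np.pi)

for w in (0.0, 1.0, 2.0):
    print(X(w).real, np.exp(-w**2 / 2) / np.sqrt(2.0 * np.pi))   # agree
```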
Spectral density

The time history x(t) of a sample function of a random process is not periodic. Therefore, it cannot be represented by a discrete Fourier series.

Also, for a stationary process, x(t) goes on for ever and the condition

$$\int_{-\infty}^{\infty} |x(t)|\, dt < \infty$$

is not satisfied, so that the classical theory of Fourier analysis cannot be applied to a sample function.

The difficulty can be overcome by analysing, not a sample function of the process itself, but its autocorrelation function $R_x(\tau)$. If the mean m = E[x] = 0, then, provided that x(t) has no periodic components,

$$R_x(\tau \to \infty) = 0$$

and the condition $\int_{-\infty}^{\infty} |R_x(\tau)|\, d\tau < \infty$ is satisfied.

The Fourier transform of $R_x(\tau)$ and its inverse can be written as

$$S_x(\omega) = \frac{1}{2\pi} \int_{-\infty}^{\infty} R_x(\tau)\, e^{-i\omega\tau}\, d\tau$$

$$R_x(\tau) = \int_{-\infty}^{\infty} S_x(\omega)\, e^{i\omega\tau}\, d\omega$$

where $S_x(\omega)$ is called the spectral density of the random process and is a function of angular frequency ω.

Setting τ = 0,

$$R_x(\tau = 0) = \int_{-\infty}^{\infty} S_x(\omega)\, d\omega = E\left[ x^2 \right]$$

The mean square value of a stationary random process is given by the area under a graph of spectral density $S_x(\omega)$ against ω. $S_x(\omega)$ is also called the mean square spectral density.

Since $R_x(\tau)$ is an even function of τ, it can be shown that $S_x(\omega)$ is a real even function of ω.
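A numerical sketch of this transform pair, using an assumed exponential autocorrelation $R_x(\tau) = \sigma^2 e^{-a|\tau|}$ (a standard textbook case, whose spectral density works out to $S_x(\omega) = a\sigma^2 / [\pi(a^2 + \omega^2)]$):

```python
import numpy as np

sigma, a = 1.5, 2.0                      # assumed parameters
tau = np.linspace(-40.0, 40.0, 800_001)
dtau = tau[1] - tau[0]
R = sigma**2 * np.exp(-a * np.abs(tau))

def S(w):
    """S_x(w) = (1/2pi) * integral R_x(tau) e^{-i w tau} dtau, numerically."""
    return (np.sum(R * np.exp(-1j * w * tau)) * dtau / (2.0 * np.pi)).real

print(S(0.0), sigma**2 / (np.pi * a))    # agree with the closed form

# The area under S_x(w) recovers the mean square R_x(0) = sigma^2.
w = np.linspace(-400.0, 400.0, 800_001)
dw = w[1] - w[0]
Sw = a * sigma**2 / (np.pi * (a**2 + w**2))
print(np.sum(Sw) * dw, sigma**2)         # approximately equal (finite w range)
```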
Example

Determine the mean square and autocorrelation function for the stationary random process x(t) whose mean square spectral density is shown in the figure: a constant level $S_0$ over the frequency bands $\omega_1 \le |\omega| \le \omega_2$, and zero elsewhere.

$$E\left[ x^2 \right] = \int_{-\infty}^{\infty} S_x(\omega)\, d\omega = 2S_0(\omega_2 - \omega_1)$$

$$R_x(\tau) = \int_{-\infty}^{\infty} S_x(\omega)\, e^{i\omega\tau}\, d\omega = \int_{-\infty}^{\infty} S_x(\omega) \cos \omega\tau\, d\omega$$

since $S_x(\omega)$ is an even function of ω. Hence

$$R_x(\tau) = 2\int_{\omega_1}^{\omega_2} S_0 \cos \omega\tau\, d\omega
= 2S_0 \left[ \frac{1}{\tau} \sin \omega\tau \right]_{\omega_1}^{\omega_2}
= \frac{2S_0}{\tau} \left( \sin \omega_2\tau - \sin \omega_1\tau \right)
= \frac{4S_0}{\tau} \cos\left( \frac{\omega_1 + \omega_2}{2}\,\tau \right) \sin\left( \frac{\omega_2 - \omega_1}{2}\,\tau \right)$$

As a check, letting τ → 0 gives $R_x(0) = 2S_0(\omega_2 - \omega_1) = E[x^2]$, as required.
Narrow band process

A process whose spectral density has the form shown in the figure is called a narrow band process, because its spectral density occupies only a narrow band of frequencies.
Broad band process

A broad band process is one whose spectral density covers a wide band of frequencies; its time history is made up of the superposition of the whole band of frequencies.

In the limit when the frequency band extends from $\omega_1 = 0$ to $\omega_2 = \infty$, the spectrum is called white noise. The mean square value of a white noise process must be infinite, so white noise is only a theoretical concept; in practical terms a spectrum is called white if it is broad band noise whose bandwidth extends well past all the frequencies of interest.

In the previous example, if we set $\omega_1 = 0$, $R_x(\tau)$ becomes

$$R_x(\tau) = \frac{4S_0}{\tau} \cos\left( \frac{\omega_2\tau}{2} \right) \sin\left( \frac{\omega_2\tau}{2} \right) = \frac{2S_0}{\tau} \sin(\omega_2\tau)$$

When $\omega_2 \to \infty$, adjacent cycles pack together so tightly that they become indistinguishable from a vertical spike of infinite height, zero width, and finite area $2\pi S_0$. Using the Dirac delta function, we can write

$$R_x(\tau) = 2\pi S_0\, \delta(\tau)$$

The delta function δ(τ) is defined so that it is zero everywhere except at τ = 0, where it is infinite in such a way that

$$\int_{-\infty}^{\infty} \delta(\tau)\, d\tau = 1$$

Then

$$S_x(\omega) = \frac{1}{2\pi} \int_{-\infty}^{\infty} 2\pi S_0\, \delta(\tau)\, e^{-i\omega\tau}\, d\tau = S_0$$
Frequency response

Determine the amplitude ratio and phase angle for the transmission of sine wave excitation through the spring-damper system.

Excitation: $x(t) = x_0 \sin \omega t$
Response: $y(t) = y_0 \sin(\omega t - \phi)$

Equation of motion:

$$c\dot{y} + ky = x(t)$$

Substituting,

$$c y_0 \omega \cos(\omega t - \phi) + k y_0 \sin(\omega t - \phi) = x_0 \sin \omega t$$

Or,

$$y_0 \sin \omega t \left\{ c\omega \sin\phi + k \cos\phi - \frac{x_0}{y_0} \right\} + y_0 \cos \omega t \left\{ c\omega \cos\phi - k \sin\phi \right\} = 0$$

Since this must hold for all t, both brackets must vanish, giving

$$\frac{y_0}{x_0} = \frac{1}{\sqrt{c^2\omega^2 + k^2}}
\qquad \text{and} \qquad
\phi = \tan^{-1}\frac{c\omega}{k}$$

Both the amplitude ratio and phase angle could be collectively represented by a complex number H(ω), the (complex) frequency response function:

$$H(\omega) = A(\omega) - i\, B(\omega)$$

so that

$$|H(\omega)| = \sqrt{A^2 + B^2} \qquad \text{and} \qquad \frac{B}{A} = \tan\phi$$

If a constant-amplitude harmonic input

$$x(t) = x_0\, e^{i\omega t}$$

is applied to a linear system, the corresponding output will be

$$y(t) = H(\omega)\, x_0\, e^{i\omega t}$$

In summary,

$$y(t) = H(\omega)\, x(t)$$
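A short numerical sketch of this result for the spring-damper system, with assumed stiffness and damping values; $H(\omega) = 1/(k + ic\omega)$ follows from the working above:

```python
import numpy as np

# Frequency response of c*y' + k*y = x(t):
# |H| = 1 / sqrt(k^2 + c^2 w^2), phase lag = atan(c w / k).
k, c = 100.0, 5.0                  # assumed stiffness and damping
w = np.array([1.0, 10.0, 100.0])   # sample frequencies

H = 1.0 / (k + 1j * c * w)
amp = np.abs(H)                    # amplitude ratio y0 / x0
phase = -np.angle(H)               # phase lag phi of the response

print(amp, 1.0 / np.sqrt(k**2 + c**2 * w**2))   # identical
print(phase, np.arctan(c * w / k))              # identical
```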
Impulse response

The frequency response function H(ω) gives the steady-state response of a system to a sine wave input. Alternatively, the transient response due to a disturbance can be used to define the dynamic characteristics of a system: h(t) is the impulse response function.

Relationship between H(ω) and h(t):

$$H(\omega) = \int_{-\infty}^{\infty} h(t)\, e^{-i\omega t}\, dt$$

and

$$h(t) = \frac{1}{2\pi} \int_{-\infty}^{\infty} H(\omega)\, e^{i\omega t}\, d\omega$$
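A sketch checking this pair for the spring-damper system above, whose impulse response works out to $h(t) = (1/c)\, e^{-kt/c}$ for t ≥ 0 (obtained by solving $c\dot{y} + ky = \delta(t)$; the parameter values are assumed):

```python
import numpy as np

k, c = 100.0, 5.0
t = np.linspace(0.0, 2.0, 200_001)     # h has decayed to ~0 well before t = 2
dt = t[1] - t[0]
h = (1.0 / c) * np.exp(-k * t / c)     # impulse response of c*y' + k*y = x

# H(w) = integral h(t) e^{-iwt} dt should reproduce 1 / (k + i c w).
w = 10.0
H_from_h = np.sum(h * np.exp(-1j * w * t)) * dt
print(H_from_h, 1.0 / (k + 1j * c * w))   # close agreement
```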
Mean square response

Determine the output spectral density $S_y(\omega)$ for the single degree-of-freedom oscillator shown in the figure, when it is excited by a forcing function x(t) whose spectral density is $S_x(\omega) = S_0$.

It can be shown that the output spectral density is

$$S_y(\omega) = \left| H(\omega) \right|^2 S_x(\omega)$$

where H(ω) is the complex frequency response function.

To find H(ω), put $x(t) = e^{i\omega t}$ and $y = H(\omega)\, e^{i\omega t}$ in the equation of motion of the oscillator:

$$m\ddot{y} + c\dot{y} + ky = x(t)$$

which gives

$$\left( -m\omega^2 + ic\omega + k \right) H(\omega) = 1$$

$$H(\omega) = \frac{1}{k - m\omega^2 + ic\omega}$$

Hence the output spectral density is

$$S_y(\omega) = \left| H(\omega) \right|^2 S_x(\omega) = \frac{S_0}{(k - m\omega^2)^2 + c^2\omega^2}$$

and the mean square response is

$$E\left[ y^2 \right] = \int_{-\infty}^{\infty} S_y(\omega)\, d\omega = \frac{\pi S_0}{kc}$$

The peak value of $S_y(\omega)$ occurs, for small damping, when

$$\omega \approx \sqrt{\frac{k}{m}} = \omega_N$$

and the height of the peak is

$$S_y(\omega_N) = \frac{S_0}{c^2 \omega_N^2} = \frac{S_0\, m}{c^2 k}$$

The half-power bandwidth is

$$2\Delta\omega \approx \frac{c}{m}$$
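A numerical check of the mean square result, with assumed oscillator parameters:

```python
import numpy as np

# Check E[y^2] = pi * S0 / (k * c) for the SDOF oscillator under white noise.
m, c, k, S0 = 1.0, 0.2, 100.0, 1.0     # assumed parameter values
w = np.linspace(-2000.0, 2000.0, 4_000_001)
dw = w[1] - w[0]

Sy = S0 / ((k - m * w**2) ** 2 + (c * w) ** 2)
print(np.sum(Sy) * dw, np.pi * S0 / (k * c))   # both approximately 0.1571
```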
References
• Newland, D. E., An Introduction to Random Vibrations,
Spectral & Wavelet Analysis, 3rd Ed., Longman, 1993.
• Clough, R.W., and Penzien, J., Dynamics of Structures,
McGraw-Hill, 1993.
• Gleick, J., Chaos, Vintage, 1998.
• Kreyszig, E., Advanced Engineering Mathematics, 7th Ed.,
John Wiley, 1993, §10.9.
