
I. Discrete Time Random Signals (5/15/2011)

Power versus Energy:

  Energy: Ex = Σ_{n=-∞}^{∞} x²(n)

  Power:  Px = lim_{N→∞} [1/(2N+1)] Σ_{n=-N}^{N} x²(n)

Many signals x(n) are non-periodic, unpredictable, infinite-energy signals with finite power.

Transform techniques are convenient, but infinite-energy signals do not have z- or Fourier transforms in the usual sense. However, special averages of x(n) may have finite energy, just as Px can be finite even when Ex is infinite.

A. Probability Densities and Distributions [1]

1. Continuous Random Variables

fX(x) is a non-negative function such that P(a ≤ X ≤ b) = ∫_a^b fX(u) du for every a ≤ b (so that ∫_{-∞}^{∞} fX(u) du = 1);

then fX(x) is a probability density function (pdf) of the random variable X. FX(x), the probability distribution of X, is defined as

  FX(x) = P(X ≤ x) = ∫_{-∞}^{x} fX(u) du,

which satisfies FX(-∞) = 0 and FX(∞) = 1 and is monotonically non-decreasing. Conversely,

  fX(x) = dFX(x)/dx.

Note: X is a random variable and x denotes a value of X.

2. Discrete Random Variables

Probability Mass Function: Here fX(x) = P(X = x), where X has a countably infinite set of possible values, which may be integer, rational, or real numbers. Denote the possible values of X as x(k). Then

  Σ_k fX(x(k)) = 1   and   FX(x) = Σ_{x(k) ≤ x} fX(x(k)).

3. Relationships Between Two Random Variables

Joint Probability Distribution:

  FXY(x, y) = P(X ≤ x, Y ≤ y),   fXY(x, y) = ∂²FXY(x, y)/∂x∂y

For the discrete random variable case, fXY(x, y) = P(X = x, Y = y). X and Y are statistically independent if FXY(x, y) = FX(x) FY(y), i.e., the joint distribution of X and Y is the product of the distribution of X and the distribution of Y. Equivalently, fXY(x, y) = fX(x) fY(y).

4. Stationarity

Replace X and Y by Xn and Xm, where n and m denote times n and m. Then the joint distribution is F_{Xn,Xm}(xn, xm). Stationarity of the random process {Xn} implies:

  F_{Xn,Xm}(x, y) = F_{Xn+k,Xm+k}(x, y)  for all n, m, k.

In other words, the joint distribution is shift invariant. Generally, {Xn} is stationary iff

  F_{Xn1,Xn2,…,XnM}(x1, x2, …, xM) = F_{Xn1+k,Xn2+k,…,XnM+k}(x1, x2, …, xM)

for all values of M > 0 and all k.

5. Additional Properties [2]

(a) If U, V, …, X are statistically independent and Z = U + V + … + X, then fZ(z) = fU(z) * fV(z) * … * fX(z)  (* = convolution).

(b) The Law of Large Numbers: as n → ∞,

  lim (1/n) Σ_{k=1}^{n} X_k = E[X].

(c) Central Limit Theorem (CLT): Let X1, X2, X3, …, Xn be a sequence of n independent, identically distributed (iid) random variables, each having finite mean mx and variance σx². If

  S_n = Σ_{k=1}^{n} X_k,   Z_n = (S_n − n mx)/(σx √n),

then Z_n converges in distribution to the standard normal distribution N(0, 1) as n approaches infinity.
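The CLT statement above can be checked numerically. The sketch below (Uniform(0, 1) variates are an arbitrary choice for the iid sequence, not from the notes) standardizes sums of iid variables and checks that the resulting Z_n has mean near 0 and variance near 1:

```python
import numpy as np

rng = np.random.default_rng(0)

# CLT sketch: standardized sums of iid Uniform(0, 1) variables (an
# arbitrary choice of iid distribution) should approach N(0, 1).
n, trials = 300, 10_000
mx, var = 0.5, 1.0 / 12.0            # mean and variance of Uniform(0, 1)

S = rng.random((trials, n)).sum(axis=1)    # S_n for each trial
Z = (S - n * mx) / np.sqrt(var * n)        # Z_n = (S_n - n m_x)/(sigma sqrt(n))

print(round(Z.mean(), 2), round(Z.var(), 2))   # ≈ 0 and ≈ 1
```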

B. Averages: Let {Xn} be a random process; Xn is a random variable for each n. The ensemble of {Xn} can be viewed as a 2-D array:

The vertical scale denotes all possible member sequences; the horizontal scale denotes all possible time values n. We can average in the ensemble (vertical) direction:

  mXn = E[xn] = ∫_{-∞}^{∞} x fXn(x) dx,

where E[·] denotes expected value. Let g(xn) be a function of the random variable xn. Then

  E[g(xn)] = ∫_{-∞}^{∞} g(x) fXn(x) dx.

Discrete case:

  mXn = E[xn] = Σ_x x fXn(x),   E[g(xn)] = Σ_x g(x) fXn(x).

For a stationary process xn, fXn(x) = fX(x), i.e., not a function of n.

Example Calculations of Averages:

Ex: Nonstationary process example (pdf fXn(x) given by sketch).
(a) Find E[x1] =
(b) Find E[x5] =
(c) Find E[xn] =

Ex: fXn(x) = 1 − |x|, |x| ≤ 1. Find E[xn] and E[xn²].

  E[xn] = ∫_{-1}^{1} (1 − |x|) x dx = ∫_{-1}^{0} (1 + x) x dx + ∫_{0}^{1} (1 − x) x dx = 0

  E[xn²] = ∫_{-1}^{1} (1 − |x|) x² dx = 1/6
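The moments of this triangular pdf can be checked by simulation; numpy's triangular sampler draws from exactly this density:

```python
import numpy as np

rng = np.random.default_rng(0)

# The triangular pdf f(x) = 1 - |x| on [-1, 1] is what numpy's
# triangular(-1, 0, 1) sampler draws from, so the sample moments
# should approach E[x_n] = 0 and E[x_n^2] = 1/6.
x = rng.triangular(-1.0, 0.0, 1.0, size=200_000)

print(x.mean())           # ≈ 0
print((x**2).mean())      # ≈ 1/6 ≈ 0.1667
```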

Statistical Independence: Two random variables Xn and Ym are statistically independent iff fXnYm(x, y) = fXn(x) fYm(y) for continuous and discrete random variables.

Linear Independence: Two random variables are linearly independent or uncorrelated if E[xnym] = E[xn]E[ym]

Ex: Let xn and ym be statistically independent. Show that they are also linearly independent.


  E[xnym] = ∫∫ x y fXnYm(x, y) dx dy = ∫∫ x y fXn(x) fYm(y) dx dy = [∫ x fXn(x) dx][∫ y fYm(y) dy] = E[xn]E[ym]

Ex: Prove (a) E[xn + ym] = E[xn] + E[ym]

(b) E[axn] = aE[xn]


In general, let f(n) be a deterministic (non-random) function of n, e.g., f(n) = 1 + n + 3n². Then E[f(n) g(xn)] = f(n) E[g(xn)].

Ex: E[n xn] = n E[xn] = n mXn;  E[n³xn²] = n³ E[xn²].

Since f(n) is deterministic, not random, E[f(n)] = f(n), i.e., each member of the f(n) ensemble is identical.

Properties of Expected Value:
(1) E[xn + ym] = E[xn] + E[ym]
(2) E[a xn] = a E[xn]
(3) E[f(n) g(xn)] = f(n) E[g(xn)]
(4) E[f(n)] = f(n)
(5) E[E[g(xn)]] = E[g(xn)]


Variance, Mean Square, and Autocorrelation of xn:

The mean square of a random variable x is E[x²] = ∫ x² fX(x) dx. For xn,

  E[xn²] = ∫ x² fXn(x) dx.

Variance: σ²Xn = E[(xn − mXn)²].

In the general case these are functions of the parameter n, but for a stationary process they are constants.


Autocorrelation:

  rxx(n, m) = E[xn xm*] = ∫∫ x y* fXnXm(x, y) dx dy

Autocovariance:

  cxx(n, m) = E[(xn − mXn)(xm − mXm)*] = rxx(n, m) − mXn m*Xm

Crosscorrelation and Crosscovariance:

  rxy(n, m) = E[xn ym*]
  cxy(n, m) = E[(xn − mXn)(ym − mYm)*]

(1) Note that means are subtracted out for the covariances, forcing the quantities to be zero mean: zn = (xn − mXn) is zero mean.
(2) The time-domain quantities will usually be real, so the conjugate operation (·)* will have no effect.
(3) The random variables we look at will usually be zero mean, so that correlation and covariance will be equal.
(4) The random processes we look at will often be stationary, so:
  fXn(x) = fX(x)
  fXnYm(x, y) = fXn+k,Ym+k(x, y)
  rxx(n, m) = rxx(m − n)
  rxx(n, n + m) = rxx(m) = E[xn x*n+m]


Wide Sense Stationary

A random process xn is wide sense stationary (WSS) if E[xn] is not a function of n and rxx(n, m) = rxx(m − n), i.e., mXn is constant and rxx is a function only of the time difference m − n.

Ex: xn = X cos(ωn + φ), where X and φ are statistically independent, X is zero mean, and φ is uniformly distributed (has a uniform pdf) between −π and π. Show that xn is WSS.
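A quick ensemble simulation of this example (X is taken standard normal here, an assumption, since the notes only require X zero mean) estimates the mean and the lag-m autocorrelation at several absolute times n; both come out independent of n:

```python
import numpy as np

rng = np.random.default_rng(1)

# Ensemble simulation of x_n = X cos(w n + phi). X and phi independent,
# phi uniform on (-pi, pi); X is standard normal here (an assumption,
# since the notes only require X zero mean).
trials, w, m = 200_000, 0.7, 3
X = rng.standard_normal(trials)
phi = rng.uniform(-np.pi, np.pi, trials)

for n in (0, 5, 11):                        # several absolute times n
    xn = X * np.cos(w * n + phi)
    xnm = X * np.cos(w * (n + m) + phi)
    print(n, round(xn.mean(), 3), round((xn * xnm).mean(), 3))

# Theory: E[x_n] = 0 and r_xx(n, n+m) = (E[X^2]/2) cos(w m), independent of n.
print(round(0.5 * np.cos(w * m), 3))
```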


White Noise

  rxx(m) = σ² δ(m)   (or rxx(n − m) = σ² δ(n − m)),

where δ(m) is the Kronecker delta function:

  δ(m) = 1 if m = 0, and δ(m) = 0 otherwise.
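A sketch of this definition: the sample autocorrelation of a long white sequence should approximate σ² at lag 0 and be near 0 at all other lags.

```python
import numpy as np

rng = np.random.default_rng(2)

# Sample autocorrelation of a long white sequence should approximate
# r_xx(m) = sigma^2 * delta(m): sigma^2 at lag 0, near 0 elsewhere.
sigma2, N = 2.0, 100_000
x = rng.normal(0.0, np.sqrt(sigma2), N)

def r_hat(m):
    """Time-average estimate of r_xx(m) = E[x(n) x(n+m)]."""
    return np.dot(x[: N - m], x[m:]) / (N - m)

for m in (0, 1, 5):
    print(m, round(r_hat(m), 3))   # ≈ 2.0 at m = 0, ≈ 0 otherwise
```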


C. Time Averages [1] (horizontal average over an ensemble member function)

  <xn>N = [1/(2N+1)] Σ_{k=-N}^{N} x_{n-k}

xn is a random variable, so <xn>N is a random variable as well.

  <xn> = lim_{N→∞} <xn>N

  <xn x_{n+m}>N = [1/(2N+1)] Σ_{k=-N}^{N} x_{n-k} x*_{n+m-k},   <xn x_{n+m}> = lim_{N→∞} <xn x_{n+m}>N

Ergodicity

A random process xn is ergodic if the time averages equal the ensemble averages as N → ∞, for every member function of the ensemble. Ergodic processes are stationary. Why?

Up to now, xn has denoted a random variable and x(n) has denoted a sample value of xn. From now on we shall not often distinguish them. In an ergodic process {x(n)},

  <x(n)> = E[x(n)] = mx,   <x(n)x(n+m)> = E[xn x*n+m] = rxx(m).

D. Spectrum Representation of Infinite Energy Signals [1]

The autocovariance of a wide sense stationary process can have finite energy, and hence can have a z-transform and a Fourier transform.


1. Covariance and Correlation Functions for Real, Stationary Random Processes

Stationarity implies that the index n drops out; realness implies that the conjugate * drops out.

Property 1: cxx(m) = rxx(m) − mx²;  cxy(m) = rxy(m) − mx my

Property 2: rxx(0) = E[xn²] = mean-square value;  cxx(0) = σx² = variance

Property 3: rxx(m) = rxx(−m);  cxx(m) = cxx(−m);  rxy(m) = ryx(−m);  cxy(m) = cyx(−m)

Property 4: |rxy(m)| ≤ [rxx(0) ryy(0)]^{1/2};  |cxy(m)| ≤ [cxx(0) cyy(0)]^{1/2}

Property 5: |rxx(m)| ≤ rxx(0);  |cxx(m)| ≤ cxx(0)

Property 6: For processes of interest,

  lim_{m→∞} rxx(m) = mx²,   lim_{m→∞} cxx(m) = 0,
  lim_{m→∞} rxy(m) = mx my,  lim_{m→∞} cxy(m) = 0

Observations:
1. The maximum of rxx(m) and cxx(m) is at m = 0.
2. rxx(m) and cxx(m) are even, so they equal rxx(|m|) and cxx(|m|).
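These properties can be exercised on a concrete process. The sketch below uses an assumed zero-mean AR(1) example x(n) = a x(n−1) + e(n), e white with unit variance, whose theoretical autocorrelation is rxx(m) = a^|m|/(1 − a²), and checks Property 2 and the Property 5 bound:

```python
import numpy as np

rng = np.random.default_rng(3)

# Assumed example: zero-mean AR(1) process x(n) = a x(n-1) + e(n), e white
# with variance 1. Theory: r_xx(m) = c_xx(m) = a^|m| / (1 - a^2).
a, N = 0.7, 200_000
e = rng.standard_normal(N)
x = np.empty(N)
x[0] = e[0]
for n in range(1, N):
    x[n] = a * x[n - 1] + e[n]

def r_hat(m):
    """Time-average estimate of r_xx(m)."""
    return np.dot(x[: N - m], x[m:]) / (N - m)

print(r_hat(0), 1 / (1 - a**2))       # Property 2: r_xx(0) = mean square
print(r_hat(3), a**3 / (1 - a**2))
print(abs(r_hat(3)) <= r_hat(0))      # Property 5: |r_xx(m)| <= r_xx(0)
```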

Z-Transform Representation

  Rxx(z) = Σ_{m=-∞}^{∞} rxx(m) z^{-m}

cannot converge unless mx = 0 (since rxx(m) → mx²), but

  Cxx(z) = Σ_{m=-∞}^{∞} cxx(m) z^{-m}

may still converge.

Region of Convergence: Ra < |z| < 1/Ra. Since cxx(m) is even,

  Cxx(z) = Σ_{m=1}^{∞} cxx(m)(z^{-m} + z^{m}) + cxx(0).

With z = R e^{jθ}, Cxx(z) converges iff

  cxx(0) + Σ_{m=1}^{∞} |cxx(m)| |z|^{m} + Σ_{m=1}^{∞} |cxx(m)| |z|^{-m} < ∞.

If Σ |cxx(m)| |z|^{-m} < ∞ for |z| > Ra, then Σ |cxx(m)| |z|^{m} < ∞ for |z| < 1/Ra. (Sketch of the ROC omitted.)

Wiener-Khinchin (W-K) Theorem: The power spectrum is the DTFT of the autocovariance function (or of the autocorrelation function for the zero-mean case):

  Pxx(ω) = Cxx(e^{jω}) = Rxx(e^{jω}) if mx = 0.

Ex: Derive

  σx² = (1/2π) ∫_{-π}^{π} Pxx(ω) dω.

Ex: Work this out for the sketched PSD.

Cross-power spectrum:

  Pxy(ω) = Cxy(e^{jω}),   Pxy(ω) = P*yx(ω) = Pyx(−ω) for real processes.
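A numerical sanity check of the W-K relation, using an assumed example autocovariance cxx(m) = a^|m| (zero-mean case, so Pxx = Cxx = Rxx). Its DTFT has a known closed form, and averaging the spectrum over (−π, π) should recover σx² = cxx(0) = 1:

```python
import numpy as np

# W-K check on an assumed example autocovariance c_xx(m) = a^|m|.
# Closed-form DTFT: P_xx(w) = (1 - a^2) / (1 - 2 a cos(w) + a^2);
# (1/(2 pi)) * integral over (-pi, pi) should recover sigma_x^2 = 1.
a = 0.8
w = np.linspace(-np.pi, np.pi, 2001)

m = np.arange(-200, 201)                 # truncated DTFT sum over lags
P_sum = ((a ** np.abs(m))[None, :] * np.exp(-1j * np.outer(w, m))).sum(axis=1).real
P_closed = (1 - a**2) / (1 - 2 * a * np.cos(w) + a**2)

var = P_closed[:-1].mean()               # rectangle rule for (1/2pi) * integral
print(np.max(np.abs(P_sum - P_closed)))  # ≈ 0 (truncation error is tiny)
print(var)                               # ≈ 1
```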


Response of Linear Systems to Real Stationary Input Signals

Given mx, rxx(m), and the unit-pulse response h(k) of a digital filter, we want to find ryy(m) and my.

  y(n) = Σ_k h(k) x(n − k),   y(n + m) = Σ_i h(i) x(n + m − i)

  my = E[y(n)] = Σ_k h(k) E[x(n − k)] = mx Σ_k h(k)

Now find

  ryy(m) = E[y(n) y(n + m)] = Σ_k Σ_i h(k) h(i) E[x(n − k) x(n + m − i)]
         = Σ_k Σ_i h(k) h(i) rxx(m + k − i) = rxx(m) * v(m),

where v(m) = Σ_k h(k) h(k + m) is the deterministic autocorrelation of h. In the transform domain,

  Ryy(z) = Rxx(z) V(z),  V(z) = H(z) H(z^{-1}),   Pyy(ω) = |H(e^{jω})|² Pxx(ω)

  ryy(0) = (1/2π) ∫_{-π}^{π} Pyy(ω) dω = (1/2π) ∫_{-π}^{π} |H(e^{jω})|² Pxx(ω) dω
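As a check on ryy(m) = rxx(m) * v(m), the sketch below pushes unit-variance zero-mean white noise through an assumed illustrative FIR filter h = [1, 0.5] (not from the notes). Then v(0) = 1.25 and v(1) = 0.5, and with rxx(m) = δ(m) the output should have ryy(0) ≈ 1.25 and ryy(1) ≈ 0.5:

```python
import numpy as np

rng = np.random.default_rng(4)

# Unit-variance zero-mean white noise through an assumed FIR filter
# h = [1, 0.5]. Then v(m) = sum_k h(k) h(k+m) gives v(0) = 1.25 and
# v(1) = 0.5, and with r_xx(m) = delta(m): r_yy(m) = v(m).
h = np.array([1.0, 0.5])
x = rng.standard_normal(200_000)
y = np.convolve(x, h, mode="valid")

r0 = np.mean(y * y)             # estimate of r_yy(0)
r1 = np.mean(y[:-1] * y[1:])    # estimate of r_yy(1)
print(round(r0, 3), round(r1, 3))   # ≈ 1.25 and ≈ 0.5
```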


Ex: For the sketched system, find σx², Pyy(ω), and σy².

Ex: Rounding quantization: y(n) = x(n) + e(n). Find (a) me and (b) σe².


Ex: Evaluate E[(x(n+1) − x(n))²].

Ex:

  h(n) = (2/π) sin²(πn/2)/n,  n ≠ 0
         0,                    n = 0

(a) Find expression for rXiXi(m)

Now find v(m)
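Assuming the π placements lost in reproduction follow the standard ideal Hilbert-transformer response, the h(n) defined above vanishes for even n (where sin²(πn/2) = 0) and equals 2/(πn) for odd n. A small check:

```python
import numpy as np

# Assuming the pi's lost in reproduction follow the standard ideal
# Hilbert-transformer response: h(n) = (2/pi) sin^2(pi n / 2) / n is
# zero for even n and equals 2/(pi n) for odd n.
def h(n):
    if n == 0:
        return 0.0
    return (2.0 / np.pi) * np.sin(np.pi * n / 2.0) ** 2 / n

print([round(h(n), 4) for n in range(1, 6)])   # ≈ [0.6366, 0.0, 0.2122, 0.0, 0.1273]
```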


Ex: x(n) is zero mean, white, with variance σx²; it drives h1(n) to produce y(n), which in turn drives h2(n) to produce w(n).

(a) Is σy² = σx² Σ_k h1²(k)?

(b) Is σw² = σy² Σ_k h2²(k)?


(c) h1(n) = a^n u(n), h2(n) = b^n u(n). Find the overall system impulse response and σw².
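For part (c), a quick numerical sketch (with illustrative values a = 0.6, b = 0.3, which are assumptions, not from the notes) compares the convolved overall impulse response against the closed form h(n) = (a^(n+1) − b^(n+1))/(a − b) for n ≥ 0 (valid for a ≠ b), and sums h² to get σw²/σx² for the white input:

```python
import numpy as np

# Part (c) sketch with assumed values a = 0.6, b = 0.3. Overall impulse
# response h = h1 * h2 has closed form h(n) = (a^(n+1) - b^(n+1))/(a - b)
# for n >= 0 (a != b), and sigma_w^2 / sigma_x^2 = sum_n h^2(n) since x is white.
a, b, N = 0.6, 0.3, 200        # truncation length N; tails are negligible
n = np.arange(N)
h1, h2 = a**n, b**n

h = np.convolve(h1, h2)[:N]                     # numerical h1 * h2
h_closed = (a**(n + 1) - b**(n + 1)) / (a - b)  # closed form

print(np.max(np.abs(h - h_closed)))   # ≈ 0
print((h**2).sum())                   # sigma_w^2 / sigma_x^2
```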


Ex: A continuous-time random process {xa(t)} has the power spectrum shown in the sketch (sketch omitted).

xa(t) is sampled as x(n) = xa(nT).

(a) What is the autocovariance sequence for x(n)?

(b) Given P_{xa xa}(Ω), how should T be chosen so that x(n) is white?

(c) For P_{xa xa}(Ω) as sketched, how should T be chosen so that x(n) is white?


Therefore a band-limited continuous-time random process xa(t) can become (1) a band-limited discrete random process, or (2) a white discrete random process, given the correct T and the right shape for P_{xa xa}(Ω). This is very convenient, since white-noise effects are easier to analyze.


References

[1] A. V. Oppenheim and R. W. Schafer, Digital Signal Processing, Prentice-Hall, 1975.
[2] A. Papoulis, Probability, Random Variables, and Stochastic Processes, McGraw-Hill, 1965.

