A Gaussian random variable with mean m and variance σ² is denoted by N(m, σ²).
The random variable N(0, 1) is usually called standard normal.
The Gaussian random variable is the most important and frequently encountered
random variable in communications. The reason is that thermal noise, which is the
major source of noise in communication systems, has a Gaussian distribution.
Assuming that X is a standard normal random variable, we define the function Q(x) as
P(X > x). The Q-function is given by the relation
Q(x) = \int_x^{\infty} \frac{1}{\sqrt{2\pi}}\, e^{-t^2/2}\, dt.
Table 5.1 gives the values of this function for various values of x.
For an N(m, σ²) random variable, a simple change of variable in the integral that computes P(X > x) results in
P(X > x) = Q\!\left(\frac{x - m}{\sigma}\right).
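As a numerical sanity check, Q(x) can be computed from the complementary error function, since Q(x) = ½ erfc(x/√2). A minimal Python sketch (the values m = 2, σ = 3, and x = 5 are arbitrary illustrations):

import numpy as np
from scipy.special import erfc

def Q(x):
    # Q(x) = P(X > x) for X ~ N(0, 1), via the complementary error function
    return 0.5 * erfc(x / np.sqrt(2.0))

print(Q(1.0))                      # ~ 0.1587, matches the tabulated value
m, sigma, x = 2.0, 3.0, 5.0        # illustrative N(m, sigma^2) parameters
print(Q((x - m) / sigma))          # P(X > 5) for X ~ N(2, 9): again ~ 0.1587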
Example 5.1.7
Assuming X is a Gaussian random variable with m = 0 and σ = 1, find the probability
density function of the random variable Y given by Y = aX + b.
Solution In this case, g(x) = ax + b; therefore, g′(x) = a. The equation ax + b = y
has only one solution, which is given by x₁ = (y − b)/a. Using these results, we obtain
f_Y(y) = \frac{1}{|a|\sqrt{2\pi}}\, e^{-\frac{(y - b)^2}{2a^2}}.
From this example, we arrive at the important conclusion that a linear function of a
Gaussian random variable is itself a Gaussian random variable.
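This conclusion is easy to check by simulation; the sketch below (with arbitrary illustrative constants a = 2, b = 1) compares the empirical moments and CDF of Y = aX + b with those of N(b, a²):

import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
a, b = 2.0, 1.0                    # illustrative constants
x = rng.standard_normal(100_000)   # X ~ N(0, 1)
y = a * x + b                      # Y = aX + b

print(y.mean(), y.var())           # ~ b = 1.0 and ~ a^2 = 4.0
# The empirical CDF agrees with the N(b, a^2) CDF at any point, e.g., y = 3:
print((y < 3.0).mean(), norm.cdf(3.0, loc=b, scale=abs(a)))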
Statistical Averages. The mean, expected value, or expectation of the random variable
X is defined as
m_X = E[X] = \int_{-\infty}^{\infty} x\, f_X(x)\, dx.
It is also easy to verify that the variance has the following properties:
1. Var(c) = 0 for any constant c.
2. Var(cX) = c² Var(X).
3. Var(X + c) = Var(X).
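These properties can be confirmed numerically; a minimal sketch with arbitrary illustrative constants (c = 5, X ~ N(2, 9)):

import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(loc=2.0, scale=3.0, size=200_000)  # X ~ N(2, 9)
c = 5.0

print(x.var())                     # ~ 9,   Var(X)
print(np.full_like(x, c).var())    # 0,     Var(c) = 0
print((c * x).var())               # ~ 225, Var(cX) = c^2 Var(X)
print((x + c).var())               # ~ 9,   Var(X + c) = Var(X)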
Figure 5.11 Sample functions of the random process given in Example 5.2.2.
there exists a deterministic time function x(t; ω_i), which is called a sample function or a
realization of the random process. At each time instant t₀ and for each ω_i, we have
the number x(t₀; ω_i). For the different outcomes (ω_i's) at a fixed time t₀, the numbers
x(t₀; ω_i) constitute a random variable denoted by X(t₀). In other words, at any time
instant, the value of a random process is a random variable.
Example 5.2.4: In Example 5.2.1, determine the values of the random variable X(0.001).
Solution The possible values are cos(0.2), cos(0.4), . . . , cos(1.2), and each has probability 1/6.
Example 5.2.5: Let Ω denote the sample space corresponding to the random experiment of
throwing a die. Obviously, in this case Ω = {1, 2, 3, 4, 5, 6}. For all ω_i, let x(t; ω_i) = ω_i e^{-t} u_{-1}(t)
denote a random process. Then X(1) is a random variable taking the values e^{-1}, 2e^{-1}, . . . , 6e^{-1},
each with probability 1/6. Sample functions of this random process are shown in Figure 5.14.
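For illustration, these sample functions could be generated numerically as follows; a minimal sketch (the time grid is an arbitrary choice, and u_{-1}(t) denotes the unit step):

import numpy as np

t = np.linspace(-1.0, 5.0, 601)
step = (t >= 0).astype(float)      # the unit step u_{-1}(t)

# one sample function per die outcome omega_i in {1, ..., 6}
samples = {i: i * np.exp(-t) * step for i in range(1, 7)}

# at t = 1, X(1) takes the values i * e^{-1}, each with probability 1/6
idx = np.searchsorted(t, 1.0)
print(sorted(s[idx] for s in samples.values()))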
Since at any t0 the random variable X(t0) is well defined with a probability density
function fX(t₀)(x), we have
m_X(t_0) = E[X(t_0)] = \int_{-\infty}^{\infty} x\, f_{X(t_0)}(x)\, dx.
Example 5.2.7
The mean of the random process in Example 5.2.2 is obtained by noting that Θ is uniformly
distributed on [0, 2π]. Hence,
m_X(t) = E[A\cos(2\pi f_0 t + \Theta)] = \int_0^{2\pi} A\cos(2\pi f_0 t + \theta)\,\frac{1}{2\pi}\, d\theta = 0.
Example 5.2.8
The autocorrelation function of the random process in Example 5.2.2 is
R_X(t_1, t_2) = E\big[A^2\cos(2\pi f_0 t_1 + \Theta)\cos(2\pi f_0 t_2 + \Theta)\big] = \frac{A^2}{2}\cos\big(2\pi f_0 (t_1 - t_2)\big),
which depends only on τ = t₁ − t₂, so the process is stationary.
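Both the zero mean found in Example 5.2.7 and this autocorrelation can be checked by Monte Carlo simulation over the random phase Θ. A minimal sketch (A = 1, f₀ = 10 Hz, and the time instants are arbitrary illustrative choices):

import numpy as np

rng = np.random.default_rng(2)
A, f0 = 1.0, 10.0                  # illustrative amplitude and frequency (Hz)
theta = rng.uniform(0.0, 2.0 * np.pi, size=200_000)

t1, t2 = 0.013, 0.027              # two arbitrary time instants
x1 = A * np.cos(2 * np.pi * f0 * t1 + theta)
x2 = A * np.cos(2 * np.pi * f0 * t2 + theta)

print(x1.mean())                   # ~ 0: the mean from Example 5.2.7
print((x1 * x2).mean())            # Monte Carlo estimate of E[X(t1) X(t2)]
print(0.5 * A**2 * np.cos(2 * np.pi * f0 * (t1 - t2)))  # closed form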
Example 5.2.11
Let the random process Y(t) be similar to the random process X(t) defined in Example
5.2.2, but assume that Θ is uniformly distributed between 0 and π. In this case,
m_Y(t) = E[A\cos(2\pi f_0 t + \Theta)] = \int_0^{\pi} A\cos(2\pi f_0 t + \theta)\,\frac{1}{\pi}\, d\theta = -\frac{2A}{\pi}\sin(2\pi f_0 t).
Since the mean depends on t, the process Y(t) is not stationary.
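A short simulation confirms the time-varying mean (same illustrative A and f₀ as above):

import numpy as np

rng = np.random.default_rng(3)
A, f0 = 1.0, 10.0
theta = rng.uniform(0.0, np.pi, size=200_000)   # Theta ~ U[0, pi]

for t in (0.0, 0.0125, 0.025):
    emp = (A * np.cos(2 * np.pi * f0 * t + theta)).mean()
    thy = -2 * A / np.pi * np.sin(2 * np.pi * f0 * t)
    print(t, emp, thy)             # the mean changes with t: not stationary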
By using the convolution integral to relate the output Y(t) to the input X(t), i.e.,
Y(t) = \int_{-\infty}^{\infty} X(\tau)\, h(t - \tau)\, d\tau,
we have
m_Y = E[Y(t)] = \int_{-\infty}^{\infty} E[X(\tau)]\, h(t - \tau)\, d\tau = m_X \int_{-\infty}^{\infty} h(u)\, du.
Example 5.2.15
For the stationary random process in Example 5.2.2, we had
R_X(\tau) = \frac{A^2}{2}\cos(2\pi f_0 \tau).
Hence,
S_X(f) = \mathcal{F}\big[R_X(\tau)\big] = \frac{A^2}{4}\big[\delta(f - f_0) + \delta(f + f_0)\big].
The power spectral density is shown in Figure 5.17. All the power content of the
process is located at f₀ and −f₀. This is expected because the sample functions of this
process are sinusoids with their power concentrated at those frequencies.
The power content, or simply the power, of a random process is the sum of the powers
at all frequencies in that random process. In order to find the total power, we have to
integrate the power spectral density over all frequencies.
Since SX(f) is the Fourier transform of RX(τ), RX(τ) is the inverse Fourier
transform of SX(f). Therefore, we can write
R_X(\tau) = \int_{-\infty}^{\infty} S_X(f)\, e^{j2\pi f\tau}\, df,
and, in particular, the total power is
P_X = R_X(0) = \int_{-\infty}^{\infty} S_X(f)\, df.
Example 5.2.17
Find the power in the process given in Example 5.2.15.
Solution We can use either the relation
P_X = \int_{-\infty}^{\infty} S_X(f)\, df = \frac{A^2}{4} + \frac{A^2}{4} = \frac{A^2}{2},
or the relation
P_X = R_X(0) = \frac{A^2}{2}\cos(2\pi f_0 \cdot 0) = \frac{A^2}{2}.
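Numerically, the same power emerges as the time-average power of a sample function, which can be compared with the closed form RX(0) = A²/2. A minimal sketch (A = 2 and f₀ = 10 Hz are illustrative):

import numpy as np

rng = np.random.default_rng(4)
A, f0 = 2.0, 10.0                  # illustrative values

# time-average power of one sample function over many periods
t = np.linspace(0.0, 10.0, 100_001)
x = A * np.cos(2 * np.pi * f0 * t + rng.uniform(0.0, 2.0 * np.pi))
print(np.mean(x**2))               # ~ A^2 / 2 = 2.0

print(0.5 * A**2)                  # R_X(0) = A^2 / 2, the closed form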
Power Spectra in LTI Systems. We have seen that when a stationary random process
with mean m_X and autocorrelation function RX(τ) passes through a linear time-invariant
system with the impulse response h(t), the output process is also stationary, with mean
m_Y = m_X \int_{-\infty}^{\infty} h(t)\, dt = m_X H(0)
and autocorrelation
R_Y(\tau) = R_X(\tau) \star h(\tau) \star h(-\tau).
Taking Fourier transforms of these relations, the power spectral density and the cross
spectral density of the output are
S_Y(f) = |H(f)|^2\, S_X(f)
and
S_{XY}(f) = S_X(f)\, H^*(f),
as illustrated in Figure 5.18.
Figure 5.18 Input-output relations for the power spectral density and the cross spectral
density.
where, in the last step, we have used the fact that the process is stationary and hence
E[X(t1)X(t2)] = RX(t1 − t2). But from Equation (5.2.12), we have
If we find the power content of a white process using SX(f) = C, a constant, we will
have
P_X = \int_{-\infty}^{\infty} C\, df = \infty.
Obviously, no real physical process can have infinite power; a white process is only a
useful idealization. Since RX(τ) is the inverse Fourier transform of SX(f), the
autocorrelation of a white process is
R_X(\tau) = C\,\delta(\tau).
This shows that for all τ ≠ 0, we have RX(τ) = 0. Thus, if we sample a white process
at two points t1 and t2 (t1 ≠ t2), the resulting random variables will be uncorrelated.
If the random process is white and also Gaussian, any pair of random
variables X(t1), X(t2), where t1 ≠ t2, will also be independent (recall that for jointly
Gaussian random variables, uncorrelatedness and independence are equivalent).
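In discrete time, i.i.d. Gaussian samples serve as a stand-in for a sampled white Gaussian process; their sample autocorrelation vanishes at every nonzero lag. A minimal sketch:

import numpy as np

rng = np.random.default_rng(5)
n = rng.standard_normal(500_000)   # i.i.d. N(0, 1) samples

for lag in (0, 1, 5):              # sample autocorrelation at a few lags
    r = np.mean(n[: -lag or None] * n[lag:])
    print(lag, r)                  # ~ 1 at lag 0, ~ 0 at nonzero lags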
Properties of the Thermal Noise. The thermal noise that we will use in
subsequent chapters is assumed to have the following properties:
1. Thermal noise is a stationary process.
2. Thermal noise is a zero-mean process.
3. Thermal noise is a Gaussian process.
For example, one such filter can have a transfer function of the form
H(f) = \begin{cases} 1, & |f| \le B \\ 0, & \text{otherwise}, \end{cases}
where B is the filter bandwidth. Since thermal noise is white and Gaussian, the filtered
thermal noise will be Gaussian but not white. The power spectral density of the filtered
noise will be
S_Y(f) = |H(f)|^2\, \frac{N_0}{2} = \frac{N_0}{2}\, H(f),
where we have used the fact that for ideal filters |H(f)|² = H(f).
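The effect can be illustrated in discrete time with SciPy. A realizable Butterworth lowpass filter stands in for the ideal filter above (the sampling rate, cutoff, and filter order are arbitrary illustrative choices); the filtered noise remains Gaussian, but its estimated spectrum follows |H(f)|² instead of staying flat:

import numpy as np
from scipy import signal

rng = np.random.default_rng(6)
fs = 1000.0                            # sampling rate (Hz), illustrative
noise = rng.standard_normal(200_000)   # white Gaussian noise

# 6th-order Butterworth lowpass, 100 Hz cutoff: a realizable stand-in
b, a = signal.butter(6, 100.0, fs=fs)
filtered = signal.lfilter(b, a, noise)

f, s_in = signal.welch(noise, fs=fs)      # roughly flat spectrum
_, s_out = signal.welch(filtered, fs=fs)  # shaped by |H(f)|^2
print(s_in.mean(), s_out[f < 100.0].mean(), s_out[f > 200.0].mean())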