Unit-I
Lecture Notes-1
Introduction to Probability
Engineering systems are designed to operate well in the face of uncertainty in the characteristics of components and operating conditions. In some cases, uncertainty is even introduced deliberately into the operation of a system. Understanding how to model uncertainty and how to analyze its effects is an essential part of an engineer's education. Randomness is a key element of all the systems we design: communication systems are designed to compensate for noise; Internet routers are built to absorb traffic fluctuations; buildings must resist the unpredictable vibrations of an earthquake; the power distribution grid carries an unpredictable load; integrated circuit manufacturing steps are subject to unpredictable variations; searching for genes means looking for patterns among unknown strings. What should you understand about probability? It is a rich subject that has been constructed over decades by pure and applied mathematicians, and thousands of books explore various aspects of the theory. How much do you really need to know, and where do you start?
To start with, we must define what is meant by a random signal. A random signal is a time waveform that can be characterized only in some probabilistic manner. In general, it can be either a desired or an undesired waveform. Undesired waveforms appear in almost all communication systems, where such an undesired signal is termed noise. Examples of the effects of random signals include the background hiss in a radio receiver, "snow" on a television screen, and randomly generated sea sounds in SONAR systems. Because of their random nature, these signals must be studied in a probabilistic manner. Since all random signals found in nature vary with time, this leads to the concept of random processes. In this course we first introduce the mathematical tool of probability and describe how it helps to characterize random signals.
Contributors to Probability:
Adrien Marie LEGENDRE, 1752-1833
Best use of inaccurate measurements: Method of Least Squares.
Jacob BERNOULLI, 1654-1705
Making sense of uncertainty and chance: Law of Large Numbers.
Abraham DE MOIVRE, 1667-1754
Bounding the probability of deviation: Normal distribution.
Thomas SIMPSON, 1710-1761
A first attempt at posterior probability.
Thomas BAYES, 1701-1761
The importance of the prior distribution: Bayes' rule.
Pierre Simon LAPLACE, 1749-1827
Posterior distribution: Analytical methods.
Carl Friedrich GAUSS, 1777-1855
Least Squares Estimation with Gaussian errors.
Andrei Andreyevich MARKOV, 1856-1922
Markov chains.
Andrei Nikolaevich KOLMOGOROV, 1903-1987
Kolmogorov was one of the most prolific mathematicians of the 20th century. He made fundamental contributions to dynamical systems, ergodic theory, the theory of functions and functional analysis, probability theory and mathematical statistics, the analysis of turbulence and hydrodynamics, mathematical logic, the theory of complexity, geometry, and topology. In probability theory, he formulated probability as part of measure theory and established essential properties such as the extension theorem, among many other fundamental results.
[Figure: portraits of Thomas Bayes, Jacob Bernoulli, Abraham De Moivre, and A. A. Markov]
Set Theory
Prior to introducing probability, it is necessary to develop certain concepts of set theory.
Set definitions:
A set is a collection of objects, which are called the elements of the set. If A is a set and x is an element of A, we write x ∈ A. If x is not an element of A, we write x ∉ A. A set can have no elements, in which case it is called the empty set or null set, denoted by ∅. A set can contain a finite or infinite number of elements. If a set A contains an infinite number of elements that can be enumerated in a list (i.e., there exists a bijection between the elements of A and the set of natural numbers), we say that A is countably infinite. If the elements of A cannot be enumerated in a list, we say that A is uncountable. If every element of a set A is also an element of a set B, we say that A is a subset of B, and we write A ⊂ B or B ⊃ A. If A ⊂ B and B ⊂ A, the two sets are equal, and we write A = B. We denote by S the universal set, which contains all objects of interest in a particular context. All sets under consideration are subsets of the universal set.
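As an aside, these definitions can be tried out directly with Python's built-in set type. This is an illustrative sketch, not part of the notes; the sets A and B are example values chosen for the demonstration.

```python
# Membership, subsets, and the empty set using Python's built-in sets.
A = {1, 2, 3}
B = {1, 2, 3, 4, 5}

print(2 in A)             # membership test: 2 ∈ A -> True
print(6 in A)             # 6 ∉ A -> False
print(A <= B)             # subset test: A ⊂ B -> True
print(A <= B and B <= A)  # mutual inclusion would mean A = B -> False
print(len(set()))         # the empty set ∅ has no elements -> 0
```

Note that equality of sets is exactly mutual inclusion, matching the definition A = B when A ⊂ B and B ⊂ A.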
Set Operations:
Venn diagram
A Venn diagram is a geometric representation of sets by closed plane figures. The universal set S is represented by a rectangle; in Fig 1.1, B is a subset of A, and C is disjoint from both A and B.
Two sets A and B are equal if all elements of A are present in B and vice versa, i.e., if A ⊂ B and B ⊂ A; for equal sets we write A = B. The difference of two sets A and B, denoted by A − B, is the set containing all elements of A that are not present in B.
Union and Intersection
The union of two sets A and B, denoted by A ∪ B, is the set of all elements that belong to A or to B (or to both). The intersection of A and B, denoted by A ∩ B, is the set of all elements that belong to both A and B. Thus we may write, for example, C = A ∪ B.
Complement
The complement of a set A, denoted by Ā, is the set of all elements of S that are not in A.
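These operations map directly onto Python's set operators; the following sketch uses a small example universal set S (the particular values are chosen only for this demonstration).

```python
# Union, intersection, difference, and complement relative to S.
S = set(range(10))      # universal set {0, 1, ..., 9}
A = {0, 1, 2, 3}
B = {2, 3, 4, 5}

print(sorted(A | B))    # union A ∪ B -> [0, 1, 2, 3, 4, 5]
print(sorted(A & B))    # intersection A ∩ B -> [2, 3]
print(sorted(A - B))    # difference A − B -> [0, 1]
print(sorted(S - A))    # complement Ā = S − A -> [4, 5, 6, 7, 8, 9]
```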
Algebra of sets
Commutative law:
A ∪ B = B ∪ A
A ∩ B = B ∩ A
Associative law:
(A ∪ B) ∪ C = A ∪ (B ∪ C) = A ∪ B ∪ C
(A ∩ B) ∩ C = A ∩ (B ∩ C) = A ∩ B ∩ C
Distributive law:
A ∩ (B ∪ C) = (A ∩ B) ∪ (A ∩ C)
A ∪ (B ∩ C) = (A ∪ B) ∩ (A ∪ C)
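These identities are easy to check numerically. The following sketch verifies both distributive laws on three example sets (the values are arbitrary choices for the demonstration, not a proof).

```python
# Checking the distributive laws on sample sets.
A, B, C = {1, 2}, {2, 3}, {3, 4}

print(A & (B | C) == (A & B) | (A & C))  # A ∩ (B ∪ C) = (A ∩ B) ∪ (A ∩ C) -> True
print(A | (B & C) == (A | B) & (A | C))  # A ∪ (B ∩ C) = (A ∪ B) ∩ (A ∪ C) -> True
```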
De Morgan's Laws
These state that the complement of the union (intersection) of two sets A and B equals the intersection (union) of the complements Ā and B̄. Thus
complement of (A ∪ B) = Ā ∩ B̄
complement of (A ∩ B) = Ā ∪ B̄
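De Morgan's laws can likewise be verified on concrete sets, taking complements relative to an example universal set S (again an illustrative sketch, with values chosen for this example).

```python
# De Morgan's laws, with complements taken relative to S.
S = set(range(8))
A, B = {0, 1, 2}, {2, 3, 4}

print(S - (A | B) == (S - A) & (S - B))  # complement of union = intersection of complements -> True
print(S - (A & B) == (S - A) | (S - B))  # complement of intersection = union of complements -> True
```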
Duality principle
Any valid set identity remains valid if, throughout, we replace ∪ by ∩, ∩ by ∪, S by ∅, and ∅ by S. For example, applying this to A ∪ S = S yields A ∩ ∅ = ∅.
Two events A and B are said to be mutually exclusive if they have no common elements (or outcomes). Hence, if A and B are mutually exclusive, they cannot occur together.
Def: Occurrence of an event
An event A of a random experiment is said to have occurred if the experiment terminates in an
outcome that belongs to A.
Probability of an event
The probability of an event has been defined in several ways. Some of the most popular
definitions are: i) the relative frequency definition ii) the classical definition and iii) Axiomatic
definition.
nA
)
n
Unit-I
NA
N
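The relative frequency definition can be illustrated by simulation. The sketch below estimates P(heads) for a fair coin as n_A / n; the seed and the number of trials are arbitrary choices made for reproducibility.

```python
import random

# Relative frequency n_A / n of the event "heads" in n coin flips.
random.seed(1)
n = 100_000
n_A = sum(random.random() < 0.5 for _ in range(n))
print(n_A / n)  # settles near the true probability 0.5 as n grows
```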
Axiomatic definition: to each event A we assign a number P(A), called the probability of A, satisfying the following axioms.
Axiom 1: P(A) ≥ 0.
Axiom 2: P(S) = 1.
Axiom 3: P(A₁ ∪ A₂ ∪ … ∪ A_N) = P(A₁) + P(A₂) + … + P(A_N) whenever A_m ∩ A_n = ∅ for all m ≠ n, where N may be infinite.
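As a concrete check, the sketch below builds a probability assignment for a fair six-sided die and verifies the three axioms on it (the names P and prob are chosen for this example; exact arithmetic via fractions avoids floating-point issues).

```python
from fractions import Fraction

# A probability assignment for a fair die, checked against the axioms.
S = {1, 2, 3, 4, 5, 6}
P = {outcome: Fraction(1, 6) for outcome in S}

def prob(event):
    # P(A) as the sum of outcome probabilities (finite sample space)
    return sum(P[x] for x in event)

print(all(prob({x}) >= 0 for x in S))    # Axiom 1: non-negativity -> True
print(prob(S) == 1)                      # Axiom 2: P(S) = 1 -> True
A, B = {1, 2}, {5, 6}                    # mutually exclusive events
print(prob(A | B) == prob(A) + prob(B))  # Axiom 3: additivity -> True
```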