
Mathematics (maths) - Random Processes - Important Short Objective Questions and Answers: Random Processes

**Markov Processes and Markov chains**

**1. Define Random processes and give an example of a random process.**

Sol: A random process is a collection of random variables {*X*(*s*, *t*)} that are functions of a real variable, namely time *t*, where *s* ∈ *S* and *t* ∈ *T*.

Example: *X*(*t*) = *A* cos(*ωt* + *θ*), where *θ* is uniformly distributed in (0, 2*π*) and *A* and *ω* are constants.

**2. State the four classifications of Random processes.**

Sol: Random processes are classified into four types:

**(i) Discrete random sequence**: If both *T* and *S* are discrete, the random process is called a discrete random sequence.

**(ii) Discrete random process**: If *T* is continuous and *S* is discrete, the random process is called a discrete random process.

**(iii) Continuous random sequence**: If *T* is discrete and *S* is continuous, the random process is called a continuous random sequence.

**(iv) Continuous random process**: If both *T* and *S* are continuous, the random process is called a continuous random process.

**3. Define stationary Random processes.**

Sol: If certain probability distributions or averages do not depend on *t*, then the random process {*X*(*t*)} is called stationary.

**4. Define first order stationary Random processes.**

Sol: A random process {*X*(*t*)} is said to be a first order stationary process if *f*(*x*_{1}, *t*_{1} + *δ*) = *f*(*x*_{1}, *t*_{1}), i.e. the first order density of a stationary process {*X*(*t*)} is independent of time *t*.

**5. Define second order stationary Random processes.**

Sol: A random process {*X*(*t*)} is said to be second order stationary if *f*(*x*_{1}, *x*_{2}, *t*_{1}, *t*_{2}) = *f*(*x*_{1}, *x*_{2}, *t*_{1} + *h*, *t*_{2} + *h*), where *f*(*x*_{1}, *x*_{2}, *t*_{1}, *t*_{2}) is the joint PDF of {*X*(*t*_{1}), *X*(*t*_{2})}.

**6. Define strict sense stationary Random processes.**

Sol: A random process {*X*(*t*)} is called an SSS process if the joint distribution of *X*(*t*_{1}), *X*(*t*_{2}), ..., *X*(*t_{n}*) is the same as that of *X*(*t*_{1} + *h*), *X*(*t*_{2} + *h*), ..., *X*(*t_{n}* + *h*) for every *h* and every *n*.

**7. Define wide sense stationary Random processes.**

Sol: A random process {*X*(*t*)} is called WSS if *E*{*X*(*t*)} is constant and *E*[*X*(*t*)*X*(*t* + *τ*)] = *R_{XX}*(*τ*), i.e. the autocorrelation is a function of *τ* alone.

**8. Define jointly strict sense stationary Random processes.**

Sol: Two real valued random processes {*X*(*t*)} and {*Y*(*t*)} are said to be jointly stationary in the strict sense if the joint distributions of {*X*(*t*)} and {*Y*(*t*)} are invariant under translation of time.

**9. Define jointly wide sense stationary Random processes.**

Sol: Two real valued random processes {*X*(*t*)} and {*Y*(*t*)} are said to be jointly stationary in the wide sense if each process is individually a WSS process and the cross-correlation *R_{XY}*(*t*, *t* + *τ*) is a function of *τ* alone.

**10. Define Evolutionary Random processes and give an example.**

Sol: A random process that is not stationary in any sense is called an evolutionary process. Example: Poisson process.

**11. If {*X*(*t*)} is a WSS process with autocorrelation *R*(*τ*) = *Ae*^{−*α*|*τ*|}, determine the second order moment of the random variable X(8) − X(5).**

Sol: *E*(*X*^{2}(*t*)) = *E*(*X*(*t*)*X*(*t*)) = *R_{XX}*(0) = *A*

∴ *E*(*X*^{2}(8)) = *A* and *E*(*X*^{2}(5)) = *A*; also *E*(*X*(8)*X*(5)) = *R_{XX}*(8, 5) = *R*(3) = *Ae*^{−3*α*}

Now the second order moment of {*X*(8) − *X*(5)} is given by

*E*(*X*(8) − *X*(5))^{2} = *E*(*X*^{2}(8) + *X*^{2}(5) − 2*X*(8)*X*(5))

= *E*(*X*^{2}(8)) + *E*(*X*^{2}(5)) − 2*E*(*X*(8)*X*(5))

= *A* + *A* − 2*Ae*^{−3*α*} = 2*A*(1 − *e*^{−3*α*})
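The closed form 2*A*(1 − *e*^{−3α}) is easy to evaluate numerically; a minimal sketch (the values of A and α below are illustrative choices, not from the question):

```python
import math

def second_moment_of_difference(A, alpha):
    """E[(X(8) - X(5))^2] for a WSS process with R(tau) = A*exp(-alpha*|tau|).

    Uses E[X^2(t)] = R(0) = A and E[X(8)X(5)] = R(3) = A*exp(-3*alpha).
    """
    return 2 * A * (1 - math.exp(-3 * alpha))

print(second_moment_of_difference(A=1.0, alpha=0.5))
```

As α grows the correlation R(3) vanishes and the moment approaches 2A, the value for two uncorrelated samples.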

**12. Verify whether the sine wave process {*X*(*t*)}, where *X*(*t*) = *Y* cos *ωt* and Y is uniformly distributed in (0, 1), is an SSS process.**

Sol: *F*(*x*) = *P*(*X*(*t*) ≤ *x*) = *P*(*Y* cos *ωt* ≤ *x*), which depends on *t*. If {X(t)} were an SSS process, its first order density would have to be independent of *t*. Therefore {X(t)} is not an SSS process.

**13. Consider a random variable *Z*(*t*) = *X*_{1} cos *ω*_{0}*t* − *X*_{2} sin *ω*_{0}*t*, where *X*_{1} and *X*_{2} are independent Gaussian random variables with zero mean and variance *σ*^{2}. Find E(Z) and E(Z^{2}).**

Sol: Given *E*(*X*_{1}) = 0 = *E*(*X*_{2}) and *Var*(*X*_{1}) = *σ*^{2} = *Var*(*X*_{2})

⇒ *E*(*X*_{1}^{2}) = *σ*^{2} = *E*(*X*_{2}^{2})

*E*(*Z*) = *E*(*X*_{1}) cos *ω*_{0}*t* − *E*(*X*_{2}) sin *ω*_{0}*t* = 0

*E*(*Z*^{2}) = *E*(*X*_{1} cos *ω*_{0}*t* − *X*_{2} sin *ω*_{0}*t*)^{2} = cos^{2} *ω*_{0}*t* *E*(*X*_{1}^{2}) + sin^{2} *ω*_{0}*t* *E*(*X*_{2}^{2}) − 2 sin *ω*_{0}*t* cos *ω*_{0}*t* *E*(*X*_{1})*E*(*X*_{2}) = *σ*^{2}(cos^{2} *ω*_{0}*t* + sin^{2} *ω*_{0}*t*) = *σ*^{2}
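The result E(Z) = 0 and E(Z²) = σ² can be checked by simulation; a sketch with arbitrary illustrative parameters (σ, ω₀, t and the sample size are my choices, not from the question):

```python
import math
import random

random.seed(1)

sigma, w0, t = 2.0, 3.0, 0.7   # illustrative values
n = 200_000

# Z(t) = X1*cos(w0*t) - X2*sin(w0*t), X1, X2 ~ N(0, sigma^2) independent
z = [random.gauss(0, sigma) * math.cos(w0 * t)
     - random.gauss(0, sigma) * math.sin(w0 * t)
     for _ in range(n)]

mean_z = sum(z) / n
mean_z2 = sum(v * v for v in z) / n
print(mean_z, mean_z2)  # approximately 0 and sigma^2 = 4
```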

**14. Consider the random process *X*(*t*) = cos(*ω*_{0}*t* + *θ*), where *θ* is uniformly distributed in (−*π*, *π*). Check whether X(t) is stationary or not.**

Answer: *E*[*X*(*t*)] = (1/2*π*) ∫_{−π}^{π} cos(*ω*_{0}*t* + *θ*) d*θ* = 0, a constant, and *E*[*X*(*t*)*X*(*t* + *τ*)] = ½ cos *ω*_{0}*τ*, a function of *τ* alone. Hence X(t) is a WSS process.

**15. Define Markov Process.**

Sol: If, for *t*_{1} < *t*_{2} < *t*_{3} < ... < *t_{n}* < *t*,

*P*(*X*(*t*) ≤ *x* | *X*(*t*_{1}) = *x*_{1}, *X*(*t*_{2}) = *x*_{2}, ..., *X*(*t_{n}*) = *x_{n}*) = *P*(*X*(*t*) ≤ *x* | *X*(*t_{n}*) = *x_{n}*),

then the process {*X*(*t*)} is called a Markov process.

**16. Define Markov chain.**

Sol: A discrete parameter Markov process is called a Markov chain.

**17. Define one step transition probability.**

Sol: The one step transition probability is the conditional probability *P_{ij}* = *P*[*X_{n}* = *j* | *X*_{n−1} = *i*] that the chain moves from state *i* to state *j* in one step.

Yes, it is irreducible, since each state can be reached from every other state; therefore the chain is irreducible.

**20. State the postulates of a Poisson process.**

Let {*X*(*t*)} = number of times an event A, say, has occurred up to time *t*, so that the sequence {*X*(*t*)}, *t* ≥ 0 forms a Poisson process with parameter *λ*. The postulates are:

(i) P[1 occurrence in (*t*, *t* + ∆*t*)] = *λ*∆*t* + o(∆*t*)

(ii) P[0 occurrences in (*t*, *t* + ∆*t*)] = 1 − *λ*∆*t* + o(∆*t*)

(iii) P[2 or more occurrences in (*t*, *t* + ∆*t*)] = o(∆*t*)

(iv) X(t) is independent of the number of occurrences of the event in any interval prior to and after the interval (0, t).

(v) The probability that the event occurs a specified number of times in (t_{0}, t_{0} + t) depends only on t, not on t_{0}.
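These postulates imply that the count in (0, t] is Poisson distributed with mean and variance λt, which can be checked by simulating i.i.d. Exponential(λ) interarrival gaps (λ, t and the sample size below are illustrative choices):

```python
import random

random.seed(7)

lam, t, runs = 2.0, 3.0, 100_000  # illustrative rate, horizon, sample size

def count_events(lam, t):
    """Number of events in (0, t] when gaps are i.i.d. Exponential(lam)."""
    n, clock = 0, random.expovariate(lam)
    while clock <= t:
        n += 1
        clock += random.expovariate(lam)
    return n

counts = [count_events(lam, t) for _ in range(runs)]
mean = sum(counts) / runs
var = sum((c - mean) ** 2 for c in counts) / runs
print(mean, var)  # both approximately lam * t = 6
```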

**20. State any two properties of Poisson process.**

Sol: (i) The Poisson process is a Markov process.

(ii) The sum of two independent Poisson processes is a Poisson process.

(iii) The difference of two independent Poisson processes is not a Poisson process.

**21. If customers arrive at a counter in accordance with a Poisson process with a mean rate of 2 per minute, find the probability that the interval between two consecutive arrivals is more than one minute.**

Sol: The interval T between 2 consecutive arrivals follows an exponential distribution with parameter *λ* = 2, so *P*(*T* > 1) = *e*^{−2} ≈ 0.1353.
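The exponential tail probability used here is a one-liner to compute:

```python
import math

rate = 2.0  # arrivals per minute, from the question
# Interarrival time T ~ Exponential(rate), so P(T > t) = exp(-rate * t)
p_gap_over_one_min = math.exp(-rate * 1.0)
print(p_gap_over_one_min)  # about 0.1353
```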

**22. A bank receives on an average *λ* = 6 bad checks per day. What are the probabilities that it will receive (i) 4 bad checks on any given day, (ii) 10 bad checks over any 2 consecutive days?**

**23.Suppose the customers arrive at a
bank according to a Poisson process with a mean rate of 3 per minute. Find the
probability that during a time interval of 2 minutes exactly 4 customers arrive**

**25. Customers arrive at a large store randomly at an average rate of 240 per hour. What is the probability that during a two-minute interval no one will arrive?**

**26. The number of arrivals at the regional computer centre express service counter between 12 noon and 3 p.m. has a Poisson distribution with a mean of 1.2 per minute. Find the probability of no arrivals during a given 1-minute interval.**
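Questions 22, 23, 25 and 26 are all direct applications of the Poisson pmf P(N = k) = e^{−m} m^k / k! with mean m = λt; a sketch that evaluates each one (the numeric answers are my own computation from the stated rates, not values given in the source):

```python
from math import exp, factorial

def poisson_pmf(k, mean):
    """P(N = k) for N ~ Poisson(mean)."""
    return exp(-mean) * mean ** k / factorial(k)

# Q22: lam = 6 per day -> (i) 4 bad checks in one day; (ii) 10 over 2 days (mean 12)
print(poisson_pmf(4, 6))    # ~0.1339
print(poisson_pmf(10, 12))  # ~0.1048
# Q23: 3 per minute over 2 minutes -> mean 6, exactly 4 arrivals
print(poisson_pmf(4, 6))    # ~0.1339
# Q25: 240 per hour = 4 per minute, 2-minute interval -> mean 8, no arrivals
print(poisson_pmf(0, 8))    # ~0.000335
# Q26: mean 1.2 per minute, no arrivals in 1 minute
print(poisson_pmf(0, 1.2))  # ~0.3012
```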

**27. For the sine wave process *X*(*t*) = *Y* cos *ωt*, −∞ < *t* < ∞, where *ω* is a constant and the amplitude Y is a random variable with uniform distribution in the interval (0, 1), check whether the process is stationary or not.**

Sol: *E*[*X*(*t*)] = *E*(*Y*) cos *ωt* = ½ cos *ωt*, which depends on *t*. Therefore the process is not stationary.

**28. Derive the Auto Correlation of Poisson Process.**

Sol: *R_{XX}*(*t*_{1}, *t*_{2}) = *E*[*X*(*t*_{1})*X*(*t*_{2})]

= *E*[*X*(*t*_{1}){*X*(*t*_{2}) − *X*(*t*_{1})}] + *E*[*X*^{2}(*t*_{1})]

= *E*[*X*(*t*_{1})]*E*[*X*(*t*_{2}) − *X*(*t*_{1})] + *E*[*X*^{2}(*t*_{1})],

since *X*(*t*) is a Poisson process, a process of independent increments.

∴ *R_{XX}*(*t*_{1}, *t*_{2}) = *λt*_{1} · *λ*(*t*_{2} − *t*_{1}) + (*λ*^{2}*t*_{1}^{2} + *λt*_{1})

⇒ *R_{XX}*(*t*_{1}, *t*_{2}) = *λ*^{2}*t*_{1}*t*_{2} + *λt*_{1} if *t*_{2} ≥ *t*_{1}

(or) ⇒ *R_{XX}*(*t*_{1}, *t*_{2}) = *λ*^{2}*t*_{1}*t*_{2} + *λ* min{*t*_{1}, *t*_{2}}

**29. Derive the Auto Covariance of Poisson process.**

Sol: *C*(*t*_{1}, *t*_{2}) = *R*(*t*_{1}, *t*_{2}) − *E*[*X*(*t*_{1})]*E*[*X*(*t*_{2})]

= *λ*^{2}*t*_{1}*t*_{2} + *λt*_{1} − *λ*^{2}*t*_{1}*t*_{2} = *λt*_{1} if *t*_{2} ≥ *t*_{1}

∴ *C*(*t*_{1}, *t*_{2}) = *λ* min{*t*_{1}, *t*_{2}}
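The covariance formula C(t₁, t₂) = λ min{t₁, t₂} can be verified by simulating the process through its independent increments (λ, t₁, t₂ and the sample size below are illustrative choices):

```python
import random

random.seed(3)

lam, t1, t2, runs = 2.0, 1.0, 3.0, 200_000  # illustrative parameters

def poisson_sample(mean):
    """Sample a Poisson(mean) count by counting Exponential(1) gaps up to `mean`."""
    n, clock = 0, random.expovariate(1.0)
    while clock <= mean:
        n += 1
        clock += random.expovariate(1.0)
    return n

pairs = []
for _ in range(runs):
    x1 = poisson_sample(lam * t1)              # X(t1)
    x2 = x1 + poisson_sample(lam * (t2 - t1))  # add an independent increment
    pairs.append((x1, x2))

m1 = sum(p[0] for p in pairs) / runs
m2 = sum(p[1] for p in pairs) / runs
cov = sum(p[0] * p[1] for p in pairs) / runs - m1 * m2
print(cov)  # approximately lam * min(t1, t2) = 2
```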

**30. Define Time averages of Random process.**

Sol: The time averaged mean of a sample function *X*(*t*) of a random process {*X*(*t*)} is X̄_{T} = (1/2*T*) ∫_{−T}^{T} *X*(*t*) d*t*.

**Find the mean and variance of X(10) − X(6).**

Answer:

X(10) − X(6) is also a normal random variable with mean *µ*(10) − *µ*(6) = 0 and

*Var*[*X*(10) − *X*(6)] = var{*X*(10)} + var{*X*(6)} − 2 cov{*X*(10), *X*(6)}

= *C*(10, 10) + *C*(6, 6) − 2*C*(10, 6) = 16 + 16 − 2 × 16*e*^{−4} = 31.4139
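The arithmetic in the last step is easy to reproduce; a sketch assuming, as the worked answer does, the covariance kernel C(t₁, t₂) = 16 e^{−|t₁−t₂|}:

```python
import math

def cov(t1, t2):
    """Covariance kernel used in the worked answer: C(t1, t2) = 16*exp(-|t1-t2|)."""
    return 16 * math.exp(-abs(t1 - t2))

# Var[X(10) - X(6)] = C(10,10) + C(6,6) - 2*C(10,6)
var_diff = cov(10, 10) + cov(6, 6) - 2 * cov(10, 6)
print(var_diff)  # 32 - 32*e^{-4}, about 31.4139
```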

**33. Define a Birth process.**

Answer: A Markov process {X(t)} with state space S = {1, 2, ...} such that the only transitions possible are from state n to state n + 1, with P[X(t + ∆t) = n + 1 | X(t) = n] = *λ_{n}*∆t + o(∆t), where *λ_{n}* is the birth rate in state n.

**34. Define Ergodic Random Process.**

Sol: A random process {*X*(*t*)} is said to be an ergodic random process if its ensemble averages are equal to the appropriate time averages.

**35.Define
Ergodic state of a Markov chain**.

Sol:
A non null persistent and aperiodic state is called an ergodic state.

**36. Define Absorbing state of a Markov chain.**

Sol: A state i is called an absorbing state if and only if *P_{ii}* = 1 and *P_{ij}* = 0 for all j ≠ i.

**37. Define an irreducible Markov chain.**

Sol: A Markov chain is said to be irreducible if every state can be reached from every other state, i.e. for every pair of states i, j there exists an n such that *P_{ij}*^{(n)} > 0.

The process is stationary, as the first and second moments are independent of time.

**State any four properties of the Autocorrelation function.**

Answer:

i) *R_{XX}*(−*τ*) = *R_{XX}*(*τ*), i.e. it is an even function of *τ*

ii) |*R*(*τ*)| ≤ *R*(0)

iii) *R*(*τ*) is continuous for all *τ*

iv) If {X(t)} contains no periodic components, then *µ*_{X}^{2} = lim_{*τ*→∞} *R*(*τ*)

**38. What do you mean by absorbing Markov chain? Give an example.**

Sol: A state i of a Markov chain is said to be an absorbing state if *P_{ii}* = 1, i.e. it is impossible to leave it. A Markov chain is said to be absorbing if it has at least one absorbing state.

Eg: any t.p.m. with a diagonal entry equal to 1, such as P = [[1, 0], [1/2, 1/2]], in which state 1 is absorbing.

**39. Define Bernoulli Process.**

Sol: The Bernoulli random variable is defined as *X*(*t_{i}*) = 1 if the i-th trial results in a success (with probability p) and *X*(*t_{i}*) = 0 otherwise; the sequence of such i.i.d. variables {*X*(*t_{i}*)} is called a Bernoulli process.

**40. State the properties of Bernoulli Process.**

Sol: (i) It is a discrete process.

(ii) It is an SSS process.

(iii) *E*(*X_{i}*) = p and *Var*(*X_{i}*) = p(1 − p).

**41. Define Binomial Process.**

Sol: It is defined as a sequence of partial sums {*S_{n}*}, where *S_{n}* = *X*_{1} + *X*_{2} + ... + *X_{n}* and the *X_{i}* form a Bernoulli process.

**42. State the basic assumptions of Binomial process.**

Sol: (i) The time is assumed to be
divided into unit intervals. Hence it is a discrete time process.

(ii) At
most one arrival can occur in any interval

(iii) Arrivals
can occur randomly and independently in each interval with probability p.

**43. Prove that Binomial process is a Markov process.**

Sol: *S_{n}* = *S*_{n−1} + *X_{n}*

∴ *P*(*S_{n}* = m | *S*_{n−1} = m − 1) = p and *P*(*S_{n}* = m | *S*_{n−1} = m) = 1 − p,

i.e. the probability distribution of *S_{n}* depends only on the value of *S*_{n−1} and not on earlier values. Hence {*S_{n}*} is a Markov process.
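Since S_n counts successes in n Bernoulli trials, it is Binomial(n, p) with mean np and variance np(1 − p); a quick simulation check (n, p and the sample size are illustrative choices):

```python
import random

random.seed(5)

n, p, runs = 10, 0.3, 100_000  # illustrative parameters

totals = []
for _ in range(runs):
    s = 0
    for _ in range(n):
        s += 1 if random.random() < p else 0  # S_n = S_{n-1} + X_n
    totals.append(s)

mean = sum(totals) / runs
var = sum((s - mean) ** 2 for s in totals) / runs
print(mean, var)  # approximately n*p = 3 and n*p*(1-p) = 2.1
```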

**44. Define Sine wave process.**

Sol: A sine wave process is represented as *X*(*t*) = *A* sin(*ωt* + *θ*), where the amplitude A, the frequency *ω*, the phase *θ*, or any combination of these three may be random. It is also represented as *X*(*t*) = *A* cos(*ωt* + *θ*).

**45. Prove that the sum of two independent Poisson processes is also Poisson.**

Sol: Let *X*(*t*) = *X*_{1}(*t*) + *X*_{2}(*t*), where *X*_{1}(*t*) and *X*_{2}(*t*) are independent Poisson processes with parameters *λ*_{1} and *λ*_{2}. Then

*P*[*X*(*t*) = n] = Σ_{r=0}^{n} *P*[*X*_{1}(*t*) = r] *P*[*X*_{2}(*t*) = n − r] = Σ_{r=0}^{n} *e*^{−λ_{1}t}(*λ*_{1}*t*)^{r}/r! · *e*^{−λ_{2}t}(*λ*_{2}*t*)^{n−r}/(n − r)! = *e*^{−(λ_{1}+λ_{2})t}((*λ*_{1} + *λ*_{2})*t*)^{n}/n!

Therefore *X*(*t*) = *X*_{1}(*t*) + *X*_{2}(*t*) is a Poisson process with parameter (*λ*_{1} + *λ*_{2})*t*.
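The convolution step above is an exact identity, so it can be verified numerically term by term (the rates λ₁, λ₂ and time t below are illustrative choices):

```python
from math import exp, factorial

def pois(k, mean):
    """P(N = k) for N ~ Poisson(mean)."""
    return exp(-mean) * mean ** k / factorial(k)

lam1, lam2, t = 1.5, 2.5, 2.0  # illustrative rates and time

# P[X1(t) + X2(t) = n] as a convolution vs. the Poisson((lam1 + lam2) * t) pmf
for n in range(15):
    conv = sum(pois(r, lam1 * t) * pois(n - r, lam2 * t) for r in range(n + 1))
    direct = pois(n, (lam1 + lam2) * t)
    assert abs(conv - direct) < 1e-12
print("convolution matches Poisson((lam1 + lam2) * t)")
```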

**1. **The t.p.m. of a Markov chain with three states 0, 1, 2 is P and the initial state distribution is given. Find (i) P[X2=3] (ii) P[X3=2, X2=3, X1=3, X0=2].

**2. **Three boys A, B, C are throwing a ball to each other. A always throws the ball to B and B always throws the ball to C, but C is just as likely to throw the ball to B as to A. Show that the process is Markovian. Find the transition matrix and classify the states.

**3. **A housewife buys 3 kinds of cereals A, B, C. She never buys the same cereal in successive weeks. If she buys cereal A, the next week she buys cereal B. However, if she buys B or C, the next week she is 3 times as likely to buy A as the other cereal. How often does she buy each of the cereals?

**4. **A man either drives a car or catches a train to go to office each day. He never goes 2 days in a row by train, but if he drives one day, then the next day he is just as likely to drive again as he is to travel by train. Now suppose that on the first day of the week, the man tossed a fair die and drove to work if a 6 appeared. Find (1) the probability that he takes a train on the third day, and (2) the probability that he drives to work in the long run.
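Exercise 4 can be checked numerically with a two-state transition matrix; a sketch where the state ordering (train, car) and the day-1 distribution follow from the problem statement, and the printed answers are my own computation:

```python
# States: 0 = train, 1 = car.
# Never train two days in a row; after driving, train/car are equally likely.
P = [[0.0, 1.0],
     [0.5, 0.5]]

def step(dist, P):
    """One day of the chain: dist_{n+1} = dist_n * P."""
    return [sum(dist[i] * P[i][j] for i in range(2)) for j in range(2)]

# Day 1: he drives only if the die shows 6 -> P(car) = 1/6, P(train) = 5/6
dist = [5 / 6, 1 / 6]
for _ in range(2):          # advance to day 3
    dist = step(dist, P)
print(dist[0])              # P(train on day 3) = 11/24, about 0.4583

for _ in range(200):        # iterate until the distribution settles
    dist = step(dist, P)
print(dist[1])              # long-run P(drives) = 2/3
```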




Copyright © 2018-2020 BrainKart.com; All Rights Reserved. Developed by Therithal info, Chennai.