Maths : Probability Distributions: Summary

**SUMMARY**

• A random variable *X* is a function defined on a sample space *S* into the real numbers ℝ such that the inverse image of every point, subset, or interval of ℝ is an event in *S*, to which a probability is assigned.

• A random variable *X* defined on a sample space *S* into the real numbers ℝ is called a discrete random variable if the range of *X* is countable, that is, it can assume only a finite or countably infinite number of values, where every value in the range has positive probability with total one.

• If *X* is a discrete random variable taking the values *x*_{1}, *x*_{2}, *x*_{3}, …, then the function *f*(*x*) is a probability mass function if

(i) *f*(*x*_{k}) ≥ 0 for *k* = 1, 2, 3, …, *n*, … and (ii) ∑_{k} *f*(*x*_{k}) = 1
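As an illustration (not from the text), the two conditions above can be checked numerically for a fair six-sided die, an assumed example whose pmf assigns probability 1/6 to each face:

```python
from fractions import Fraction

# pmf of a fair six-sided die: f(x_k) = 1/6 for each face (illustrative example)
pmf = {k: Fraction(1, 6) for k in range(1, 7)}

# (i) every probability is non-negative
assert all(p >= 0 for p in pmf.values())

# (ii) the probabilities sum to one
assert sum(pmf.values()) == 1

print("valid probability mass function")
```

Using `Fraction` keeps the arithmetic exact, so condition (ii) holds with equality rather than up to floating-point error.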

• The **cumulative distribution function** *F*(*x*) of a discrete random variable *X* taking the values *x*_{1}, *x*_{2}, *x*_{3}, … is given by

*F*(*x*) = *P*(*X* ≤ *x*) = ∑_{x_i ≤ x} *f*(*x*_{i})
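A short sketch of building *F*(*x*) as a running sum of the pmf, again using the fair six-sided die as an assumed illustration:

```python
from fractions import Fraction

# pmf of a fair six-sided die (illustrative example)
pmf = {k: Fraction(1, 6) for k in range(1, 7)}

def cdf(x, pmf):
    """F(x) = P(X <= x) = sum of f(x_i) over all values x_i <= x."""
    return sum(p for xi, p in pmf.items() if xi <= x)

print(cdf(3, pmf))  # P(X <= 3) = 1/2
print(cdf(6, pmf))  # P(X <= 6) = 1
```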

• Suppose *X* is a discrete random variable taking the values *x*_{1} < *x*_{2} < *x*_{3} < … with cumulative distribution function *F*(*x*). Then the probability mass function is recovered as *f*(*x*_{k}) = *F*(*x*_{k}) – *F*(*x*_{k−1}).

• Let *S* be a sample space and let *X* : *S* → ℝ be a random variable that takes any value in a set *I* of ℝ. Then *X* is called a continuous random variable if *P*(*X* = *x*) = 0 for every *x* in *I*.

• A non-negative real valued function *f*(*x*) is said to be a probability density function if, for each possible outcome *x* ∈ ℝ,

∫_{−∞}^{∞} *f*(*x*) d*x* = 1 and *P*(*a* ≤ *X* ≤ *b*) = ∫_{a}^{b} *f*(*x*) d*x*
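A minimal numerical sketch, assuming the illustrative density *f*(*x*) = 3*x*² on [0, 1] (zero elsewhere), checks both conditions with a midpoint Riemann sum:

```python
# Illustrative density f(x) = 3x^2 on [0, 1], zero elsewhere (assumed example)
def f(x):
    return 3 * x**2 if 0 <= x <= 1 else 0.0

def integrate(f, a, b, n=100_000):
    """Midpoint Riemann sum approximating the integral of f over [a, b]."""
    h = (b - a) / n
    return sum(f(a + (i + 0.5) * h) for i in range(n)) * h

total = integrate(f, 0, 1)    # should be close to 1
prob = integrate(f, 0, 0.5)   # P(0 <= X <= 1/2) = (1/2)^3 = 0.125
print(round(total, 4), round(prob, 4))
```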

• Suppose *F*(*x*) is the distribution function of a continuous random variable *X*. Then the probability density function *f*(*x*) is given by

*f*(*x*) = d*F*(*x*) / d*x* = *F*′(*x*), wherever the derivative exists.

• Suppose *X* is a random variable with probability mass or density function *f*(*x*). The expected value or mean or mathematical expectation of *X*, denoted by *E*(*X*) or *μ*, is

*E*(*X*) = ∑_{x} *x f*(*x*) if *X* is discrete, and *E*(*X*) = ∫_{−∞}^{∞} *x f*(*x*) d*x* if *X* is continuous.
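The discrete case can be sketched with the fair six-sided die (an assumed illustration), where the mean works out to 7/2:

```python
from fractions import Fraction

# E(X) = sum over x of x * f(x) for a fair six-sided die (illustrative example)
pmf = {k: Fraction(1, 6) for k in range(1, 7)}
mean = sum(x * p for x, p in pmf.items())
print(mean)  # 7/2, i.e. 3.5
```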

• The variance of the random variable *X*, denoted by *V*(*X*) or *σ*^{2}, is

*V*(*X*) = *E*(*X* – *E*(*X*))^{2} = *E*(*X* – *μ*)^{2}

(i) *E*(*aX* + *b*) = *aE*(*X*) + *b*, where *a* and *b* are constants

Corollary 1: *E*(*aX*) = *aE*(*X*) (when *b* = 0)

Corollary 2: *E*(*b*) = *b* (when *a* = 0)

(ii) *Var*(*X*) = *E*(*X*^{2}) – (*E*(*X*))^{2}

(iii) *Var*(*aX* + *b*) = *a*^{2}*Var*(*X*), where *a* and *b* are constants

Corollary 3: *V*(*aX*) = *a*^{2}*V*(*X*) (when *b* = 0)

Corollary 4: *V*(*b*) = 0 (when *a* = 0)
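These identities can be checked exactly for the fair-die distribution (an assumed illustration), with arbitrary constants *a* = 3 and *b* = 5:

```python
from fractions import Fraction

# Fair six-sided die (illustrative example, not from the text)
pmf = {k: Fraction(1, 6) for k in range(1, 7)}

def E(g):
    """Expectation of g(X) under the pmf: sum of g(x) * f(x)."""
    return sum(g(x) * p for x, p in pmf.items())

mean = E(lambda x: x)                      # E(X)
var = E(lambda x: x**2) - mean**2          # (ii) Var(X) = E(X^2) - (E(X))^2

a, b = 3, 5
# (i) E(aX + b) = aE(X) + b
assert E(lambda x: a * x + b) == a * mean + b
# (iii) Var(aX + b) = a^2 Var(X)
var_lin = E(lambda x: (a * x + b)**2) - E(lambda x: a * x + b)**2
assert var_lin == a**2 * var
print("properties verified")
```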

• Let *X* be a random variable associated with a Bernoulli trial by defining it as *X*(success) = 1 and *X*(failure) = 0, such that

*f*(*x*) = *p*^{x}(1 – *p*)^{1−x}, *x* = 0, 1; 0 < *p* < 1

Then *X* is called a Bernoulli random variable and *f*(*x*) is called the Bernoulli distribution.

• If *X* is a Bernoulli random variable which follows the Bernoulli distribution with parameter *p*, the mean *μ* and variance *σ*^{2} are

*μ* = *p* and *σ*^{2} = *pq*, where *q* = 1 – *p*.
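A quick sketch (with an assumed illustrative parameter *p* = 3/10) confirming *μ* = *p* and *σ*² = *pq* directly from the pmf:

```python
from fractions import Fraction

p = Fraction(3, 10)          # assumed illustrative parameter
q = 1 - p
pmf = {0: q, 1: p}           # Bernoulli pmf: f(x) = p^x * q^(1-x), x in {0, 1}

mean = sum(x * prob for x, prob in pmf.items())
var = sum(x**2 * prob for x, prob in pmf.items()) - mean**2

assert mean == p and var == p * q
print(mean, var)  # 3/10 21/100
```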

• A discrete random variable *X* is called a binomial random variable if *X* is the number of successes in *n* repeated trials such that

(i) The *n* repeated trials are independent and *n* is finite

(ii) Each trial results in only two possible outcomes, labelled as ‘success’ or ‘failure’

(iii) The probability of a success in each trial, denoted as *p*, remains constant

• The binomial random variable *X* equals the number of successes, with probability *p* for a success and *q* = 1 – *p* for a failure, in *n* independent trials, and has a **binomial distribution** denoted by *X* ∼ B(*n*, *p*).

• If *X* is a binomial random variable which follows the binomial distribution with parameters *p* and *n*, the mean *μ* and variance *σ*^{2} are *μ* = *np* and *σ*^{2} = *np*(1 – *p*).

Tags : Probability Distributions | Mathematics , 12th Maths : UNIT 11 : Probability Distributions

