
Introduction to Stochastic Processes: Chapter 1

E[S_N] = E[X_1] E[N]
Var[S_N] = Var[X_1] E[N] + (E[X_1])^2 Var[N]
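These two identities can be checked by simulation. The sketch below is illustrative only: it assumes N is uniform on {0, …, 4} and the X_i are fair-die rolls, neither of which comes from the notes.

```python
import random
import statistics

random.seed(42)

def random_sum():
    """Draw N, then sum N i.i.d. copies of X (illustrative choices)."""
    n = random.randint(0, 4)  # N uniform on {0,...,4}: E[N] = 2, Var[N] = 2
    return sum(random.randint(1, 6) for _ in range(n))  # X uniform on {1,...,6}

samples = [random_sum() for _ in range(200_000)]
mean = statistics.fmean(samples)
var = statistics.pvariance(samples)

# Theory: E[S_N] = E[X_1]E[N] = 3.5 * 2 = 7
#         Var[S_N] = Var[X_1]E[N] + (E[X_1])^2 Var[N]
#                  = (35/12)*2 + 3.5^2 * 2 ≈ 30.33
print(mean, var)
```

With 200,000 samples the empirical mean and variance land close to the theoretical values above.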
1.1 Basic concepts
11. Jointly distributed random variables
For any two random variables X and Y, the joint cumulative distribution function of X and Y is defined by
F(a, b) = P{X ≤ a, Y ≤ b},  −∞ < a, b < ∞
E[X^n] = Σ_i x_i^n p(x_i)    if X is discrete
E[X^n] = ∫ x^n f(x) dx       if X is continuous
7. Variance of a random variable
Var(X) = E[(X − E[X])^2] = E[X^2] − (E[X])^2
a) If X is a discrete random variable having a probability mass function p(x), then the expected value of X is defined by
E[X] = Σ_i x_i p(x_i)
b) If X is a continuous random variable having a probability density function f(x), then the expected value of X is defined by

E[X] = ∫_{-∞}^{∞} x f(x) dx
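As a worked check (an illustrative sketch, not from the notes), the integral can be approximated numerically for an exponential density f(x) = λe^(−λx), whose mean is known to be 1/λ:

```python
import math

# Left Riemann sum for E[X] = ∫ x f(x) dx with f(x) = lam*exp(-lam*x), x >= 0.
# The exact mean of this exponential density is 1/lam.
lam = 2.0
dx = 1e-4
ex = sum(i * dx * lam * math.exp(-lam * i * dx) * dx for i in range(int(50 / dx)))
print(ex)  # close to 1/lam = 0.5
```

Truncating the integral at x = 50 is harmless here because the exponential tail is negligible by then.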
Compound random variable
Let {X_i} be a sequence of i.i.d. (independent and identically distributed), nonnegative, integer-valued random variables, and let N be a nonnegative, integer-valued random variable. The compound random variable S_N is defined as the sum X_1 + … + X_N; it is often called a random sum.
For each event E of the sample space S, we assume that a number P(E) is defined and satisfies the following three conditions:
(i) 0 ≤ P(E) ≤ 1,
(ii) P(S) = 1,
(iii) for any sequence of events E1, E2, …
P(E | F) = P(EF) / P(F),  or equivalently  P(EF) = P(F) P(E | F)
If E and F are independent, then
P(EF) = P(E) P(F)  and  P(E | F) = P(E)
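A quick enumeration over two fair dice (an illustrative example, not from the notes) shows both forms of the formula in action:

```python
from itertools import product

# Sample space of two fair dice; E = "the total is 8", F = "the first die is 3".
space = list(product(range(1, 7), repeat=2))
E = {w for w in space if w[0] + w[1] == 8}
F = {w for w in space if w[0] == 3}

def p(A):
    return len(A) / len(space)

p_e_given_f = p(E & F) / p(F)
print(p_e_given_f)                   # 1/6: given the first die is 3, the second must be 5
print(p(E & F), p(F) * p_e_given_f)  # multiplication rule: both equal 1/36
```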
4. Random variable The real-valued functions defined on the sample space are known as random variables.
If X and Y are independent, then Cov(X, Y) = 0
Law of total probability:
① Discrete case: let X1, …, Xk be mutually exclusive and collectively exhaustive events. For any event A, we have
P(A) = Σ_{i=1}^{k} P(A | Xi) P(Xi)
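A small numerical illustration (the urns and their contents below are invented for the example): condition on which urn the ball comes from, then sum.

```python
# Law of total probability: P(A) = sum_i P(A | X_i) P(X_i).
# Hypothetical setup: choose urn 1 or urn 2 with equal probability, then
# draw one ball; A = "the ball is red".
p_urn = {1: 0.5, 2: 0.5}
p_red_given_urn = {1: 3 / 5, 2: 1 / 5}  # urn 1: 3 red of 5; urn 2: 1 red of 5

p_red = sum(p_red_given_urn[u] * p_urn[u] for u in p_urn)
print(p_red)  # 0.5*0.6 + 0.5*0.2 = 0.4
```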
c) Expectation of g(X)
• If X is a discrete random variable with probability mass function p(x), then for any real-valued function g,
E[g(X)] = Σ_x g(x) p(x)
P(E ∪ F) = P(E) + P(F) − P(EF)
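The identity can be verified by brute-force enumeration; the two events below are arbitrary choices for illustration.

```python
from itertools import product

# Two fair dice; E = "first die is even", F = "total is at least 10".
space = list(product(range(1, 7), repeat=2))
E = {w for w in space if w[0] % 2 == 0}
F = {w for w in space if w[0] + w[1] >= 10}

def p(A):
    return len(A) / len(space)

lhs = p(E | F)                 # set union gives the event E ∪ F
rhs = p(E) + p(F) - p(E & F)
print(lhs, rhs)                # equal, by inclusion-exclusion
```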
3. Conditional probability
The conditional probability that E occurs given that F has occurred is denoted by P(E | F).
An extremely important property of conditional expectation is that for all random variables X and Y:
E[X] = E[E[X | Y]] = Σ_y E[X | Y = y] P{Y = y}            (discrete)
E[X] = E[E[X | Y]] = ∫_{-∞}^{∞} E[X | Y = y] f_Y(y) dy    (continuous)
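The discrete form of this identity can be checked exactly on a small sample space (two fair dice, chosen here for illustration): let X be the total and Y the first die.

```python
from itertools import product

space = list(product(range(1, 7), repeat=2))

def total(w):
    return w[0] + w[1]   # X = total of the two dice; Y = first die

e_x = sum(total(w) for w in space) / len(space)

# E[E[X|Y]] = sum over y of E[X | Y = y] * P{Y = y}, with P{Y = y} = 1/6
tower = 0.0
for y in range(1, 7):
    cond = [total(w) for w in space if w[0] == y]
    tower += (sum(cond) / len(cond)) * (1 / 6)

print(e_x, tower)  # both equal 7.0
```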
9. Conditional variance of a random variable
Var(X | Y = y) = E[(X − E[X | Y = y])^2 | Y = y] = E[X^2 | Y = y] − (E[X | Y = y])^2
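Both expressions give the same number; here is an exact check on two fair dice (an illustrative choice), with X the total and the condition Y = 3 on the first die.

```python
from itertools import product

space = list(product(range(1, 7), repeat=2))
y = 3
cond = [a + b for a, b in space if a == y]  # values of X = total, given first die = 3

e1 = sum(cond) / len(cond)                  # E[X | Y = 3]
e2 = sum(x * x for x in cond) / len(cond)   # E[X^2 | Y = 3]
var_direct = sum((x - e1) ** 2 for x in cond) / len(cond)
print(var_direct, e2 - e1 ** 2)  # both 35/12 ≈ 2.9167
```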
Computing expectations by conditioning:
• Expectation of random variables: for both discrete and continuous random variables, E[X + Y] = E[X] + E[Y]
• Covariance of random variables: Cov(X, Y) = E[XY] − E[X]E[Y]
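The covariance formula can be verified exactly by enumerating two independent fair dice (an illustrative choice): the covariance of the two dice is 0, while the covariance of the first die with the total equals the variance of one die, 35/12.

```python
from itertools import product

space = list(product(range(1, 7), repeat=2))
n = len(space)

def e(f):
    """Expectation of f(a, b) under the uniform joint distribution."""
    return sum(f(a, b) for a, b in space) / n

cov_xy = e(lambda a, b: a * b) - e(lambda a, b: a) * e(lambda a, b: b)
cov_xt = e(lambda a, b: a * (a + b)) - e(lambda a, b: a) * e(lambda a, b: a + b)
print(cov_xy)  # 0.0: independent dice are uncorrelated
print(cov_xt)  # 35/12 ≈ 2.9167, the variance of one die
```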
• For a continuous random variable X, the distribution function F can be expressed as
F(x_i) = ∫_{-∞}^{x_i} f(x) dx
where f(x) is called the probability density function of X.
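A numerical sketch (the exponential density is an illustrative choice): a midpoint-rule integral of f matches the known closed form F(x) = 1 − e^(−λx).

```python
import math

# F(x) = ∫_{-∞}^{x} f(t) dt for f(t) = lam*exp(-lam*t) on t >= 0.
lam, x = 1.5, 2.0
dt = 1e-5
riemann = sum(lam * math.exp(-lam * (i + 0.5) * dt) * dt
              for i in range(int(x / dt)))
print(riemann, 1 - math.exp(-lam * x))  # both ≈ 0.9502
```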
Chapter 1: Introduction to Probability Theory
1.1 Basic Concepts
1.2 Generating function for discrete random variables
1.3 Laplace transforms for continuous random variables
1.4 Some mathematical background
1.5 Classification of stochastic processes
5. (Cumulative) distribution function
The distribution function F(·) of the random variable X is defined for any real number b by
F(b) = P{X ≤ b}
② E ∩ F is referred to as the intersection of E and F.
The event EF will occur only if both E and F occur.
If EF = ∅, then E and F are said to be mutually exclusive.
• If X and Y are both discrete random variables:
F(a, b) = Σ_{x ≤ a} Σ_{y ≤ b} p(x, y)
• If X and Y are both continuous random variables:
P{X ∈ A, Y ∈ B} = ∫_B ∫_A f(x, y) dx dy
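For the discrete case, the double sum can be computed directly; the joint pmf of two independent fair dice is used here purely for illustration.

```python
from itertools import product

# Joint pmf of two independent fair dice: p(x, y) = 1/36 for each pair.
p = {(x, y): 1 / 36 for x, y in product(range(1, 7), repeat=2)}

def F(a, b):
    """Joint CDF: F(a, b) = sum over x <= a, y <= b of p(x, y)."""
    return sum(pxy for (x, y), pxy in p.items() if x <= a and y <= b)

print(F(2, 3))  # P{X <= 2, Y <= 3} = (2/6)*(3/6) = 1/6
```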
that are mutually exclusive, i.e., events for which E_n E_m = ∅ when n ≠ m, then
P(∪_{n=1}^{∞} E_n) = Σ_{n=1}^{∞} P(E_n)
We refer to P(E) as the probability of the event E.
A discrete random variable takes on either a finite or a countable number of possible values; a continuous random variable takes on a continuum of possible values.
• For a discrete random variable X, the distribution function F can be expressed as
F(x_i) = Σ_{x ≤ x_i} p(x)
where p(x_i) is the probability mass function of X, p(x_i) = P{X = x_i}.
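For instance (a fair six-sided die, chosen for illustration), the distribution function is just a partial sum of the pmf:

```python
# Discrete CDF as a partial sum of the pmf for a fair die.
pmf = {x: 1 / 6 for x in range(1, 7)}

def F(b):
    return sum(p for x, p in pmf.items() if x <= b)

print(F(4))  # P{X <= 4} = 4/6
```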
③ E^c is referred to as the complement of E.
The event E^c will occur only if E does not occur.
2. Probability defined on events