Page 1:

§ 1. Probability
§ 1.1. Properties of Probability

Def 1.1-1. Let $S$ be a sample space. A subset $A \subseteq S$ is called an event. A function $P\colon \{\text{events}\} \to \mathbb{R}$ is a probability if
(a) $P(A) \ge 0$ for all events $A$;
(b) $P(S) = 1$;
(c) if $A_1, A_2, \dots, A_k, \dots$ are disjoint events (i.e., $A_i \cap A_j = \emptyset$ if $i \neq j$), then $P(A_1 \cup \dots \cup A_k) = P(A_1) + \dots + P(A_k)$ for each $k \in \mathbb{N}$, and $P(A_1 \cup A_2 \cup \dots) = P(A_1) + P(A_2) + \dots$

Thm 1.1. Let $A$, $B$, and $C$ be events. Then
(i) $P(A) = 1 - P(A^c)$
(ii) $P(\emptyset) = 0$
(iii) if $A \subseteq B$, then $P(A) \le P(B)$
(iv) $P(A) \le 1$
(v) $P(A \cup B) = P(A) + P(B) - P(A \cap B)$
(vi) $P(A \cup B \cup C) = P(A) + P(B) + P(C) - P(A \cap B) - P(A \cap C) - P(B \cap C) + P(A \cap B \cap C)$
Page 2:

proof.
(i) Note that $S = A \cup A^c$ and $A \cap A^c = \emptyset$. Then $1 = P(S) = P(A \cup A^c) = P(A) + P(A^c)$, which implies $P(A) = 1 - P(A^c)$.
(ii) Note that $1 = P(S) = P(\emptyset \cup S) = P(\emptyset) + P(S) = P(\emptyset) + 1$, so $P(\emptyset) = 0$.
(iii) Since $A \subseteq B$, we may write $B = A \cup (B \cap A^c)$ with $A \cap (B \cap A^c) = \emptyset$. Hence, by (c) and (a), $P(B) = P(A) + P(B \cap A^c) \ge P(A)$.
(iv) Since $A \subseteq S$, by (iii) we have $P(A) \le P(S) = 1$.
(v) Note that $A = (A \cap B^c) \cup (A \cap B)$, disjoint; $B = (B \cap A^c) \cup (A \cap B)$, disjoint; and $A \cup B = (A \cap B^c) \cup (B \cap A^c) \cup (A \cap B)$, disjoint.
Page 3:

By (c),
$P(A) = P(A \cap B^c) + P(A \cap B)$  (*)
$P(B) = P(B \cap A^c) + P(A \cap B)$  (**)
Hence
$P(A \cup B) = P(A \cap B^c) + P(B \cap A^c) + P(A \cap B)$
$= [P(A) - P(A \cap B)] + [P(B) - P(A \cap B)] + P(A \cap B)$  (by (*) and (**))
$= P(A) + P(B) - P(A \cap B)$.
Example 1.1-3. A fair coin is flipped successively until the same face is observed on two successive flips. Let $A$ be the event that it will take three or more flips (i.e., $A = \{x \mid x = 3, 4, 5, 6, \dots\}$, where $x$ is the number of flips needed). Find $P(A)$.

Method 1.
$P(A) = P(\{x \mid x = 3\}) + P(\{x \mid x = 4\}) + \dots + P(\{x \mid x = k\}) + \dots$
Note that $P(\{x \mid x = 3\}) = P(\{HTH\}) + P(\{THT\}) = (\tfrac{1}{2})^3 + (\tfrac{1}{2})^3 = 2(\tfrac{1}{2})^3$.
In general, $P(\{x \mid x = k\}) = 2 \cdot (\tfrac{1}{2})^k$.
Thus, $P(A) = \sum_{k=3}^{\infty} P(\{x \mid x = k\}) = \sum_{k=3}^{\infty} 2 \cdot \left(\tfrac{1}{2}\right)^k = 2 \cdot \frac{(1/2)^3}{1 - 1/2} = 2 \cdot \left(\tfrac{1}{2}\right)^2 = \frac{1}{2}$.
Page 4:

Method 2.
Note that $P(A) = 1 - P(A^c)$, where $A^c = \{x \mid x = 2\} = \{HH, TT\}$.
$P(A^c) = P(\{HH, TT\}) = (\tfrac{1}{2})^2 + (\tfrac{1}{2})^2 = \tfrac{1}{2}$.
Hence, $P(A) = 1 - \tfrac{1}{2} = \tfrac{1}{2}$.
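Both methods give $P(A) = \tfrac{1}{2}$; a quick Monte Carlo simulation can sanity-check this (a minimal sketch — the helper name `flips_until_repeat` is ours, not from the notes):

```python
import random

def flips_until_repeat(rng):
    """Flip a fair coin until the same face appears on two
    successive flips; return the number of flips used."""
    prev = rng.random() < 0.5
    n = 1
    while True:
        cur = rng.random() < 0.5
        n += 1
        if cur == prev:
            return n
        prev = cur

rng = random.Random(0)
trials = 200_000
hits = sum(flips_until_repeat(rng) >= 3 for _ in range(trials))
estimate = hits / trials  # should be close to the exact value 1/2
```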
Example 1.1-5. A survey was taken of a group's viewing habits of sporting events on TV during the last year. Let
$A = \{\text{watched football}\}$
$B = \{\text{watched basketball}\}$
$C = \{\text{watched baseball}\}$
with $P(A) = 0.43$, $P(B) = 0.4$, $P(C) = 0.32$, $P(A \cap B) = 0.29$, $P(A \cap C) = 0.22$, $P(B \cap C) = 0.2$, and $P(A \cap B \cap C) = 0.15$.
Then
$P(A \cup B \cup C) = P(A) + P(B) + P(C) - P(A \cap B) - P(A \cap C) - P(B \cap C) + P(A \cap B \cap C) = 0.59$.
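The inclusion-exclusion arithmetic for the survey numbers is easy to check directly (a minimal sketch with the probabilities from the example):

```python
# Inclusion-exclusion for three events, Thm 1.1 (vi).
p_a, p_b, p_c = 0.43, 0.40, 0.32
p_ab, p_ac, p_bc = 0.29, 0.22, 0.20
p_abc = 0.15

p_union = p_a + p_b + p_c - p_ab - p_ac - p_bc + p_abc
```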
Page 5:

§ 1.3 Conditional Probability

Rolling a die. A fair six-sided die is rolled. Define the following events:
A: the outcome is even;
B: the outcome is greater than 3.
Given that the outcome of the die is greater than 3, what is the probability that it is even? This conditional probability is written $P(A \mid B)$.

Def 1.3-1. The conditional probability of an event $A$, given that event $B$ has occurred, is defined by
$P(A \mid B) = \dfrac{P(A \cap B)}{P(B)}$, provided that $P(B) > 0$.

Let us answer the above example. $A = \{2, 4, 6\}$, $B = \{4, 5, 6\}$, and $A \cap B = \{4, 6\}$. Then
$P(A \mid B) = \dfrac{P(A \cap B)}{P(B)} = \dfrac{2/6}{3/6} = \dfrac{2}{3}$.

Example 1.3-4. Two fair four-sided dice are rolled. Let $A$ be the event that a sum of 3 is rolled, and $B$ the event that a sum of 3 or a sum of 5 is rolled.
Page 6:

(Figure: the 16 equally likely outcomes $(x, y)$, $x, y = 1, 2, 3, 4$, on a $4 \times 4$ grid.)
$A = \{(1,2), (2,1)\}$
$B = \{(1,2), (2,1), (1,4), (2,3), (3,2), (4,1)\}$
$P(A) = \frac{2}{16} = \frac{1}{8}$, $P(B) = \frac{6}{16} = \frac{3}{8}$.
$P(A \mid B) = \dfrac{P(A \cap B)}{P(B)} = \dfrac{P(A)}{P(B)} = \dfrac{1/8}{3/8} = \dfrac{1}{3}$.
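The same answer falls out of brute-force enumeration of the 16 outcomes (a minimal sketch using exact rational arithmetic):

```python
from fractions import Fraction
from itertools import product

outcomes = list(product(range(1, 5), repeat=2))  # two fair four-sided dice
A = [o for o in outcomes if sum(o) == 3]          # sum of 3
B = [o for o in outcomes if sum(o) in (3, 5)]     # sum of 3 or 5
A_and_B = [o for o in A if o in B]

p_B = Fraction(len(B), len(outcomes))
p_A_given_B = Fraction(len(A_and_B), len(outcomes)) / p_B
```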
Properties. Given events $A$ and $B$ with $P(B) > 0$:
(i) $P(A \mid B) \ge 0$; $P(B \mid B) = 1$
(ii) $P(A^c \mid B) = 1 - P(A \mid B)$
(iii) if $A_1, A_2, \dots$ are disjoint events, then
$P(A_1 \cup \dots \cup A_k \mid B) = P(A_1 \mid B) + \dots + P(A_k \mid B)$ and
$P(A_1 \cup \dots \cup A_k \cup \dots \mid B) = P(A_1 \mid B) + P(A_2 \mid B) + P(A_3 \mid B) + \dots$

proof. (i) $P(A \mid B) = \dfrac{P(A \cap B)}{P(B)} \ge 0$, and $P(B \mid B) = \dfrac{P(B \cap B)}{P(B)} = \dfrac{P(B)}{P(B)} = 1$.
(ii) $P(A^c \mid B) = \dfrac{P(A^c \cap B)}{P(B)} = \dfrac{P(B) - P(A \cap B)}{P(B)} = 1 - \dfrac{P(A \cap B)}{P(B)} = 1 - P(A \mid B)$.
(Figure: Venn diagram of $B$ split into the disjoint pieces $A \cap B$ and $A^c \cap B$.)
Page 7:

(iii) $P(A_1 \cup \dots \cup A_k \mid B) = \dfrac{P([A_1 \cup \dots \cup A_k] \cap B)}{P(B)} = \dfrac{P([A_1 \cap B] \cup \dots \cup [A_k \cap B])}{P(B)} = \dfrac{P(A_1 \cap B) + \dots + P(A_k \cap B)}{P(B)} = P(A_1 \mid B) + \dots + P(A_k \mid B)$.

Property (Multiplication Rule).
$P(A \cap B) = P(A) \, P(B \mid A)$, provided $P(A) > 0$;
$P(A \cap B) = P(B) \, P(A \mid B)$, provided $P(B) > 0$.

Example 1.3-7. A barn contains 8 one-hump camels and 7 two-hump camels. The camels come out of the barn randomly, one at a time. Let
A: exactly 2 two-hump camels are among the first 6 camels to come out;
B: the seventh camel to come out has two humps.
How do we compute $P(A \cap B)$?
$P(A) = \dfrac{\binom{7}{2}\binom{8}{4}}{\binom{15}{6}} \approx 0.2937$ and $P(B \mid A) = \dfrac{5}{9} \approx 0.5556$.
Hence, $P(A \cap B) = P(A)\,P(B \mid A) \approx 0.1632$.
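A short check of Example 1.3-7 (assuming, as the arithmetic in the notes suggests, 8 one-hump and 7 two-hump camels, so 15 in total):

```python
from math import comb

# P(A): exactly 2 of the 7 two-hump camels among the first 6 out
p_A = comb(7, 2) * comb(8, 4) / comb(15, 6)
# P(B | A): 5 two-hump camels remain among the 9 still in the barn
p_B_given_A = 5 / 9
p_A_and_B = p_A * p_B_given_A  # multiplication rule
```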
Page 8:

Remark. $P(A \cap B \cap C) = P([A \cap B] \cap C) = P(A \cap B)\,P(C \mid A \cap B) = P(A)\,P(B \mid A)\,P(C \mid A \cap B)$.

Example 1.3-9. A schoolgirl has 5 blue and 4 white marbles in her left pocket, and 4 blue and 5 white marbles in her right pocket. She transfers one marble, selected at random, from her left pocket to her right pocket. What is the probability of her then drawing a blue marble from her right pocket? Let
BL: the marble drawn from the left pocket is blue;
WL: the marble drawn from the left pocket is white;
BR: the marble drawn from the right pocket is blue.
Then
$P(BR) = P(BR \cap BL) + P(BR \cap WL) = P(BL)\,P(BR \mid BL) + P(WL)\,P(BR \mid WL) = \dfrac{5}{9} \cdot \dfrac{5}{10} + \dfrac{4}{9} \cdot \dfrac{4}{10} = \dfrac{41}{90}$.
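The law-of-total-probability computation above, done in exact arithmetic (a minimal sketch of the two-stage experiment):

```python
from fractions import Fraction

# Left pocket: 5 blue, 4 white. Right pocket: 4 blue, 5 white.
p_BL = Fraction(5, 9)            # transferred marble is blue
p_WL = Fraction(4, 9)            # transferred marble is white
p_BR_given_BL = Fraction(5, 10)  # right pocket now holds 5 blue of 10
p_BR_given_WL = Fraction(4, 10)  # right pocket still holds 4 blue of 10

p_BR = p_BL * p_BR_given_BL + p_WL * p_BR_given_WL
```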
Page 9:

§ 2. Discrete Distributions
2.1–2.3 Random Variables of Discrete Type & Expectation

Def 2.1-1. Let $S$ be a sample space. A function $X\colon S \to \mathbb{R}$ is called a random variable.

Notation. Let $A \subseteq \mathbb{R}$. Then $\{X \in A\} = \{s \in S \mid X(s) \in A\}$.
Example 2.1-1. A rat is selected at random. The set of possible outcomes is female and male, so the sample space is $S = \{\text{female}, \text{male}\} = \{F, M\}$. Define the random variable $X$ by $X(F) = 0$ and $X(M) = 1$. Then $\{X = 1\} = \{M\}$ and $\{X = 0\} = \{F\}$.
Example 2.1-2. An experiment in which we roll a six-sided die: $S = \{1, 2, 3, 4, 5, 6\}$ and $X(s) = s$. For instance, $\{X = 3\}$, $\{X \le 2\} = \{X = 1\} \cup \{X = 2\}$, and $\{2 \le X \le 5\} = \{X = 2, 3, 4, 5\}$ are events.

Notation. If $S$ is finite or countable, a random variable $X\colon S \to \mathbb{R}$ is said to be of the discrete type.
Page 10:

The range of $X$, $R(X) = \{X(s) \mid s \in S\}$, is at most countable.

Def 2.1-2.
1. $f(x) = P(X = x)$ is the probability mass function (pmf). The pmf $f(x)$ of a discrete random variable $X$ is a function that satisfies the following conditions:
(a) $f(x) > 0$ for all $x \in R(X)$;
(b) $\sum_{x \in R(X)} f(x) = 1$;
(c) $P(X \in A) = \sum_{x \in A} f(x)$ for $A \subseteq R(X)$.
2. $F(x) = P(X \le x) = P(X \in (-\infty, x])$, $x \in \mathbb{R}$, is called the cumulative distribution function (cdf) of $X$ (or the distribution function of $X$).
Example 2.1-2 (continued from the above). $f(x) = P(X = x) = \frac{1}{6}$ for $x = 1, 2, 3, 4, 5, 6$, and
$F(x) = P(X \le x) = P(X \in \{1, 2, \dots, \lfloor x \rfloor\}) = \begin{cases} 0 & \text{if } x < 1 \\ \frac{k}{6} & \text{if } k \le x < k+1,\ k = 1, \dots, 5 \\ 1 & \text{if } x \ge 6, \end{cases}$
where $\lfloor x \rfloor$ is the largest integer less than or equal to $x$ (e.g., $k \le x < k+1 \Rightarrow \lfloor x \rfloor = k$).
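The pmf and the step-function cdf of the fair die can be written out directly (a minimal sketch):

```python
from fractions import Fraction
from math import floor

pmf = {x: Fraction(1, 6) for x in range(1, 7)}  # fair six-sided die

def cdf(x):
    """F(x) = P(X <= x) for the fair-die pmf above."""
    return sum(p for v, p in pmf.items() if v <= floor(x))
```

For example, `cdf(3.7)` uses $\lfloor 3.7 \rfloor = 3$ and returns $3/6$.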
Example 2.1-3. Roll a fair four-sided die twice. Let
$X = \begin{cases} \text{the larger of the two outcomes} & \text{if they are different} \\ \text{the common value} & \text{if they are the same.} \end{cases}$
Page 11:

The sample space is $S = \{(d_1, d_2) \mid d_1 = 1, 2, 3, 4 \text{ and } d_2 = 1, 2, 3, 4\}$.
$X((1,2)) = X((2,1)) = X((2,2)) = 2$
$\Rightarrow f(2) = P(X = 2) = P(\{(1,2), (2,1), (2,2)\}) = \frac{3}{16}$.
$X((1,3)) = X((3,1)) = X((2,3)) = X((3,2)) = X((3,3)) = 3$
$\Rightarrow f(3) = P(X = 3) = \frac{5}{16}$.
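Enumerating all 16 outcomes gives the whole pmf at once; note that `max` covers both cases of the definition (the larger value when the outcomes differ, the common value when they agree):

```python
from collections import Counter
from fractions import Fraction
from itertools import product

counts = Counter(max(d1, d2) for d1, d2 in product(range(1, 5), repeat=2))
pmf = {x: Fraction(c, 16) for x, c in sorted(counts.items())}
# pmf maps x -> P(X = x) for x = 1, 2, 3, 4
```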
Def 2.2-1. Let $X$ be a random variable with pmf $f$, and let $u(X)$ be a function of $X$. Then the expected value (the expectation) of $u(X)$ is
$E[u(X)] = \sum_{x} u(x) \, f(x)$.

Theorem 2.2-1. Let $c_1, c_2$ be constants and $u_1, u_2$ be functions. Then
(a) $E[c_1] = c_1$;
(b) $E[c_1 u_1(X)] = c_1 E[u_1(X)]$;
(c) $E[c_1 u_1(X) + c_2 u_2(X)] = c_1 E[u_1(X)] + c_2 E[u_2(X)]$.
Example 2.2-3. Let $X$ have the pmf $f(x) = \frac{x}{10}$, $x = 1, 2, 3, 4$. Then
$E[X] = \sum_{x=1}^{4} x f(x) = 1 \cdot \frac{1}{10} + 2 \cdot \frac{2}{10} + 3 \cdot \frac{3}{10} + 4 \cdot \frac{4}{10} = 3$,
$E[X^2] = \sum_{x=1}^{4} x^2 f(x) = 1^2 \cdot \frac{1}{10} + 2^2 \cdot \frac{2}{10} + 3^2 \cdot \frac{3}{10} + 4^2 \cdot \frac{4}{10} = 10$.
Page 12:

and $E[X(5 - X)] = 5E[X] - E[X^2] = 5 \cdot 3 - 10 = 5$.

Example 2.2-5. An experiment has probability of success $p \in (0, 1)$ and probability of failure $q = 1 - p$. The experiment is repeated independently until the first success occurs; say this happens on trial $X$. Then
$f(x) = P(X = x) = \underbrace{q \cdot q \cdots q}_{x-1\ \text{times}} \cdot p = q^{x-1} p$ for all $x = 1, 2, 3, \dots$
This is the geometric distribution, written $X \sim \text{geo}(p)$.
We observe $\sum_{x=1}^{\infty} f(x) = \sum_{x=1}^{\infty} q^{x-1} p = p(1 + q + q^2 + \cdots) = \dfrac{p}{1 - q} = 1$.
Also, $E[X] = \sum_{x=1}^{\infty} x f(x) = 1 \cdot p + 2qp + 3q^2 p + \cdots$, so $(1 - q)E[X] = p + pq + pq^2 + \cdots = 1$, which gives $E[X] = \dfrac{1}{1 - q} = \dfrac{1}{p}$.

Def. $\mu = E[X] = \sum_x x f(x)$: the mean of $X$.
$\sigma^2 = \text{Var}(X) = E[(X - \mu)^2] = \sum_x (x - \mu)^2 f(x)$: the variance of $X$.
$\sigma$: the standard deviation of $X$.
Remark. $\sigma^2 \ge 0$ and $\sigma^2 = E[X^2] - (E[X])^2$.
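A numeric sanity check of the two geometric series above, truncated at a large cutoff (the choice $p = 0.3$ is arbitrary, not from the notes):

```python
p = 0.3
q = 1 - p

# Partial sums: sum of q^(x-1) p converges to 1, and
# sum of x q^(x-1) p converges to E[X] = 1/p.
total = sum(q**(x - 1) * p for x in range(1, 500))
mean = sum(x * q**(x - 1) * p for x in range(1, 500))
```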
Page 13:

Example 2.3-2. Suppose $X$ has the pmf $f(x) = P(X = x) = \frac{1}{6}$, $x = 1, 2, \dots, 6$. Then
$\mu = E[X] = \sum_{x=1}^{6} x \cdot \frac{1}{6} = \frac{1 + 2 + \cdots + 6}{6} = \frac{7}{2}$,
$E[X^2] = \sum_{x=1}^{6} x^2 \cdot \frac{1}{6} = \frac{1^2 + 2^2 + \cdots + 6^2}{6} = \frac{91}{6}$,
so $\sigma^2 = E[X^2] - (E[X])^2 = \frac{91}{6} - \left(\frac{7}{2}\right)^2 = \frac{35}{12}$ and $\sigma = \sqrt{35/12} \approx 1.708$.
Example 2.3-4. Let $X$ have a uniform distribution on the first $m$ positive integers (that is, $P(X = x) = \frac{1}{m}$ for each $x = 1, 2, \dots, m$). Then
$\mu = E[X] = \sum_{x=1}^{m} x \cdot \frac{1}{m} = \frac{1}{m} \cdot \frac{m(m+1)}{2} = \frac{m+1}{2}$,
$E[X^2] = \sum_{x=1}^{m} x^2 \cdot \frac{1}{m} = \frac{1}{m} \cdot \frac{m(m+1)(2m+1)}{6} = \frac{(m+1)(2m+1)}{6}$,
$\sigma^2 = \frac{(m+1)(2m+1)}{6} - \left(\frac{m+1}{2}\right)^2 = \frac{m^2 - 1}{12}$.

Property. For $a, b \in \mathbb{R}$:
1° $E[aX + b] = aE[X] + b$;
2° $\text{Var}(aX + b) = a^2 \,\text{Var}(X)$.

proof of 2°. $\text{Var}(aX + b) = E[(aX + b)^2] - (E[aX + b])^2$
Page 14:

$= E[a^2 X^2 + 2abX + b^2] - (a^2 E[X]^2 + 2ab\,E[X] + b^2)$
$= a^2 E[X^2] + 2ab\,E[X] + b^2 - (a^2 E[X]^2 + 2ab\,E[X] + b^2)$
$= a^2 (E[X^2] - E[X]^2) = a^2 \,\text{Var}(X)$.

Def. $E[X^r] = \sum_x x^r f(x)$: the $r$-th moment of $X$.
$E[(X)_r] = E[X(X-1)\cdots(X-r+1)]$: the $r$-th factorial moment of $X$.

Def. Let $X$ be a random variable of the discrete type with pmf $f$. Then
$M(t) = E[e^{tX}] = \sum_x e^{tx} f(x)$, $-h < t < h$ for some $h > 0$,
is called the moment generating function (mgf) of $X$.

Remark. The mgf uniquely determines the distribution of the random variable. In other words, suppose that $M_1(t)$ is the mgf of $X_1$ with pmf $f_1$, and $M_2(t)$ is the mgf of $X_2$ with pmf $f_2$. If $M_1(t) = M_2(t)$ for $t \in (-h, h)$, then $X_1$ and $X_2$ have the same distribution; that is, $f_1(x) = f_2(x)$ for all $x$.

Example 2.3-7. If $X$ has an mgf of the form $M(t) = c_1 e^{t} + c_2 e^{2t} + c_3 e^{3t}$ with $c_1 + c_2 + c_3 = 1$, then, since the mgf determines the distribution, $P(X = 1) = c_1$, $P(X = 2) = c_2$, and $P(X = 3) = c_3$: each coefficient of $e^{tx}$ is $P(X = x)$.
Page 15:

Property. $M'(0) = E[X]$, $M''(0) = E[X^2]$, …, $M^{(r)}(0) = E[X^r]$.
proof. Note that $M'(t) = \sum_x x\, e^{tx} f(x)$, so $M'(0) = \sum_x x \cdot 1 \cdot f(x) = E[X]$. In general, $M^{(r)}(t) = \sum_x x^r e^{tx} f(x)$, so $M^{(r)}(0) = \sum_x x^r \cdot 1 \cdot f(x) = E[X^r]$.

Example 2.3-9. Suppose $X$ has the geometric distribution geo($p$); that is, the pmf of $X$ is $f(x) = q^{x-1} p$, $x = 1, 2, 3, \dots$, where $q = 1 - p$. Then
$M(t) = E[e^{tX}] = \sum_{x=1}^{\infty} e^{tx} q^{x-1} p = \frac{p}{q} \sum_{x=1}^{\infty} (qe^t)^x = \frac{p}{q}\left[(qe^t) + (qe^t)^2 + \cdots\right] = \frac{p}{q} \cdot \frac{qe^t}{1 - qe^t} = \frac{pe^t}{1 - qe^t}$,
valid for $qe^t < 1$, i.e., $t < -\ln q$. Then
$M'(t) = \frac{(1 - qe^t)(pe^t) - pe^t(-qe^t)}{(1 - qe^t)^2} = \frac{pe^t}{(1 - qe^t)^2}$
and
$M''(t) = \frac{(1 - qe^t)^2\, pe^t - pe^t \cdot 2(1 - qe^t)(-qe^t)}{(1 - qe^t)^4} = \frac{pe^t(1 + qe^t)}{(1 - qe^t)^3}$.
Hence, $\mu = E[X] = M'(0) = \frac{p}{p^2} = \frac{1}{p}$ and $\sigma^2 = E[X^2] - (E[X])^2 = M''(0) - (M'(0))^2 = \frac{p(1+q)}{(1-q)^3} - \frac{1}{p^2} = \frac{1+q}{p^2} - \frac{1}{p^2} = \frac{q}{p^2}$.
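Differentiating the geometric mgf numerically (central differences; the value $p = 0.25$ is an arbitrary choice for the check, not from the notes) reproduces $\mu = 1/p$ and $\sigma^2 = q/p^2$:

```python
from math import exp

p = 0.25
q = 1 - p

def M(t):
    """mgf of geo(p): p e^t / (1 - q e^t), valid for t < -ln(q)."""
    return p * exp(t) / (1 - q * exp(t))

h = 1e-5
M1 = (M(h) - M(-h)) / (2 * h)          # central difference ~ M'(0) = E[X]
M2 = (M(h) - 2 * M(0) + M(-h)) / h**2  # ~ M''(0) = E[X^2]
mean = M1                              # should be close to 1/p = 4
var = M2 - M1**2                       # should be close to q/p^2 = 12
```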
Page 16:

§ 2.4 The Binomial Distribution

A Bernoulli trial (experiment) is a random experiment whose outcome can be classified in one of two mutually exclusive ways: success vs. failure (female vs. male; life vs. death; nondefective vs. defective). Let $X$ be the random variable associated with a Bernoulli trial: $X(\text{success}) = 1$ and $X(\text{failure}) = 0$. Thus, the pmf of $X$ is
$f(x) = P(X = x) = p^x (1-p)^{1-x}$, $x = 0, 1$ (the Bernoulli distribution), with
$\mu = E[X] = \sum_{x=0}^{1} x f(x) = \sum_{x=0}^{1} x\, p^x (1-p)^{1-x} = p$,
$\sigma^2 = \text{Var}(X) = E[(X - \mu)^2] = \sum_{x=0}^{1} (x - p)^2 f(x) = p(1-p) = pq$.

In a sequence of independent Bernoulli trials, we are interested in the total number of successes. Let $X$ be the number of successes in $n$ Bernoulli trials; then
$P(X = x) = \binom{n}{x} p^x (1-p)^{n-x}$, $x = 0, 1, 2, \dots, n$.
We say $X$ follows the binomial distribution with parameters $(n, p)$, written $X \sim \text{Bin}(n, p)$ or $b(n, p)$,
Page 17:

if its pmf is $f(x) = \binom{n}{x} p^x (1-p)^{n-x}$, $x = 0, 1, 2, \dots, n$, where $\binom{n}{x} = \frac{n!}{x!(n-x)!}$.

Example 2.4-1 & 2.4-7. The probability of germination of a beet seed is 0.8. We plan to plant 10 seeds and assume that the germination of one seed is independent of the germination of another seed. Let $X$ be the number of seeds that germinate. Then
$f(x) = \binom{10}{x} (0.8)^x (0.2)^{10-x}$, $x = 0, 1, 2, \dots, 10$,
$P(X \le 2) = \sum_{x=0}^{2} \binom{10}{x} (0.8)^x (0.2)^{10-x} \approx 0.000078$,
and $P(X \le 8) = \sum_{x=0}^{8} f(x)$, or equivalently
$P(X \le 8) = 1 - P(X = 9) - P(X = 10) = 1 - \binom{10}{9}(0.8)^9(0.2) - \binom{10}{10}(0.8)^{10} \approx 0.6242$.

Example 2.4-3 & 2.4-5. An experienced archer can hit the center of the target, the bullseye, in 20% of the attempts. Let $X$ be the number of bullseyes among $n = 8$ subsequent shots. Then the probability of hitting the bullseye twice is
$f(2) = \binom{8}{2} (0.2)^2 (0.8)^6 \approx 0.2936$.
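The binomial probabilities in both examples can be checked directly (a minimal sketch; the helper name `binom_pmf` is ours):

```python
from math import comb

def binom_pmf(x, n, p):
    """Binomial pmf: C(n, x) p^x (1-p)^(n-x)."""
    return comb(n, x) * p**x * (1 - p)**(n - x)

# Example 2.4-1 & 2.4-7: 10 beet seeds, germination probability 0.8
p_le_2 = sum(binom_pmf(x, 10, 0.8) for x in range(3))
p_le_8 = 1 - binom_pmf(9, 10, 0.8) - binom_pmf(10, 10, 0.8)
# Example 2.4-3 & 2.4-5: 8 shots, 20% bullseye rate
p_two_hits = binom_pmf(2, 8, 0.2)
```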
Page 18:

Recall the binomial expansion: $(a + b)^n = \sum_{x=0}^{n} \binom{n}{x} a^x b^{n-x}$.

Remarks.
1° $X \sim \text{Bin}(1, p) \iff X \sim \text{Bernoulli}(p)$.
2° Let $X \sim \text{Bin}(n, p)$. Then
$M(t) = E[e^{tX}] = \sum_{x=0}^{n} e^{tx} f(x) = \sum_{x=0}^{n} e^{tx} \binom{n}{x} p^x (1-p)^{n-x} = \sum_{x=0}^{n} \binom{n}{x} (pe^t)^x (1-p)^{n-x} = [pe^t + 1 - p]^n$.
Thus, $M'(t) = n[pe^t + 1 - p]^{n-1}(pe^t)$ and
$M''(t) = n(n-1)[pe^t + 1 - p]^{n-2}(pe^t)^2 + n[pe^t + 1 - p]^{n-1}(pe^t)$,
so $\mu = E[X] = M'(0) = np$ and $\sigma^2 = E[X^2] - (E[X])^2 = M''(0) - (M'(0))^2 = np(1-p) = npq$.
Example 2.4-10. Suppose that observation over a long period of time has disclosed that, on the average, one out of ten items produced by a process is defective. Select 5 items independently from the production line and test them. Let $X$ be the number of defective items among the $n = 5$. Then $X \sim \text{Bin}(5, 0.1)$. Also, $E[X] = 5(0.1) = 0.5$ and $\text{Var}(X) = 5(0.1)(0.9) = 0.45$. In addition,
$P(X \le 1) = \binom{5}{0}(0.1)^0(0.9)^5 + \binom{5}{1}(0.1)^1(0.9)^4 \approx 0.9185$.
Page 19:

§ 2.5 The Hypergeometric Distribution

There are $N_1$ objects in the first class and $N_2$ objects in the second class, with $N_1 + N_2 = N$. A collection of $n$ objects is selected from those $N$ objects at random and without replacement. Let $X$ be the number of objects selected that belong to the first class. Then
$f(x) = P(X = x) = \dfrac{\binom{N_1}{x}\binom{N_2}{n-x}}{\binom{N}{n}}$, $\max(n - N_2, 0) \le x \le \min(n, N_1)$.
We say that $X$ has a hypergeometric distribution with parameters $N_1$, $N_2$, and $n$, denoted $X \sim \text{HG}(N_1, N_2, n)$.

Example 2.5-2. In a small pond there are 50 fish, ten of which have been tagged. A fisherman's catch consists of seven fish selected at random and without replacement. If $X$ denotes the number of tagged fish, the probability that exactly two tagged fish are caught is
$P(X = 2) = \dfrac{\binom{10}{2}\binom{40}{5}}{\binom{50}{7}} \approx 0.2964$.
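The hypergeometric pmf is straightforward to evaluate (a minimal sketch; `hypergeom_pmf` is our name):

```python
from math import comb

def hypergeom_pmf(x, n1, n2, n):
    """P(X = x) for X ~ HG(n1, n2, n): x first-class objects in a
    sample of n drawn without replacement from n1 + n2 objects."""
    return comb(n1, x) * comb(n2, n - x) / comb(n1 + n2, n)

# Example 2.5-2: 10 tagged fish among 50, catch of 7
p_two_tagged = hypergeom_pmf(2, 10, 40, 7)
```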
Page 20:

Example 2.5-3. A lot consisting of 100 fuses is inspected by the following procedure: 5 fuses are chosen at random and tested; if all five blow at the correct amperage, the lot is accepted. Suppose the lot contains 20 defective fuses. If $X$ is the number of defective fuses in the sample of five, the probability of accepting the lot is
$P(X = 0) = \dfrac{\binom{20}{0}\binom{80}{5}}{\binom{100}{5}} \approx 0.3193$.
More generally, $f(x) = \dfrac{\binom{20}{x}\binom{80}{5-x}}{\binom{100}{5}}$, $x = 0, 1, 2, 3, 4, 5$.

Remarks. Let $X \sim \text{HG}(N_1, N_2, n)$.
1° $E[X] = \sum_x x \dfrac{\binom{N_1}{x}\binom{N_2}{n-x}}{\binom{N}{n}} = \sum_x x \cdot \dfrac{N_1!}{x!(N_1-x)!} \cdot \dfrac{\binom{N_2}{n-x}}{\binom{N}{n}} = N_1 \sum_x \dfrac{(N_1-1)!}{(x-1)!(N_1-x)!} \cdot \dfrac{\binom{N_2}{n-x}}{\binom{N}{n}} = N_1 \sum_x \dfrac{\binom{N_1-1}{x-1}\binom{N_2}{(n-1)-(x-1)}}{\binom{N}{n}}$.
Since $\binom{N}{n} = \frac{N}{n}\binom{N-1}{n-1}$, this equals
$\dfrac{nN_1}{N} \sum_x \dfrac{\binom{N_1-1}{x-1}\binom{N_2}{(n-1)-(x-1)}}{\binom{N-1}{n-1}} = \dfrac{nN_1}{N} \sum_x P(Y = x-1)$, where $Y \sim \text{HG}(N_1 - 1, N_2, n-1)$,
$= \dfrac{nN_1}{N}$, since $\sum_x \dfrac{\binom{N_1-1}{x-1}\binom{N_2}{(n-1)-(x-1)}}{\binom{N-1}{n-1}} = 1$.
Page 21:

2° $E[X(X-1)] = \sum_{x=0}^{n} x(x-1) \dfrac{\binom{N_1}{x}\binom{N_2}{n-x}}{\binom{N}{n}} = \sum_{x=2}^{n} x(x-1) \cdot \dfrac{N_1!}{x!(N_1-x)!} \cdot \dfrac{N_2!}{(n-x)!(N_2-n+x)!} \cdot \dfrac{1}{\binom{N}{n}}$
$= N_1(N_1-1) \sum_{x=2}^{n} \dfrac{(N_1-2)!}{(x-2)!(N_1-x)!} \cdot \dfrac{N_2!}{(n-x)!(N_2-n+x)!} \cdot \dfrac{1}{\binom{N}{n}}$.
Now, we note that $\binom{N}{n} = \dfrac{N!}{n!(N-n)!} = \dfrac{N(N-1)}{n(n-1)}\binom{N-2}{n-2}$.
Hence, with $y = x - 2$,
$E[X(X-1)] = N_1(N_1-1) \cdot \dfrac{n(n-1)}{N(N-1)} \sum_{y=0}^{n-2} \dfrac{\binom{N_1-2}{y}\binom{N_2}{(n-2)-y}}{\binom{N-2}{n-2}} = \dfrac{N_1(N_1-1)\,n(n-1)}{N(N-1)}$.
Hence,
$\text{Var}(X) = E[X(X-1)] + E[X] - (E[X])^2 = \dfrac{N_1(N_1-1)\,n(n-1)}{N(N-1)} + \dfrac{nN_1}{N} - \left(\dfrac{nN_1}{N}\right)^2 = n\left(\dfrac{N_1}{N}\right)\left(\dfrac{N_2}{N}\right)\left(\dfrac{N-n}{N-1}\right)$.
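The closed forms $E[X] = nN_1/N$ and $\text{Var}(X) = n(N_1/N)(N_2/N)(N-n)/(N-1)$ can be verified exactly against the full pmf, here for the pond example ($N_1 = 10$, $N_2 = 40$, $n = 7$):

```python
from fractions import Fraction
from math import comb

n1, n2, n = 10, 40, 7
N = n1 + n2

pmf = {x: Fraction(comb(n1, x) * comb(n2, n - x), comb(N, n))
       for x in range(max(n - n2, 0), min(n, n1) + 1)}

mean = sum(x * p for x, p in pmf.items())          # should be n*n1/N
var = sum(x * x * p for x, p in pmf.items()) - mean**2
```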
Page 22:

§ 2.6 The Negative Binomial Distribution

Consider a sequence of independent Bernoulli trials until exactly $r$ successes occur, where $r$ is a fixed positive integer. Let $X$ be the number of trials needed to observe the $r$-th success; in other words, $X$ is the trial number on which the $r$-th success is observed. Then the pmf of $X$ is
$g(x) = \binom{x-1}{r-1} p^{r-1} (1-p)^{x-r} \cdot p = \binom{x-1}{r-1} p^r (1-p)^{x-r}$, $x = r, r+1, \dots$
Here $\binom{x-1}{r-1} p^{r-1}(1-p)^{x-r} = P(Y = r-1)$ for $Y \sim \text{Bin}(x-1, p)$ (exactly $r-1$ successes in the first $x-1$ trials), and the final factor $p$ accounts for the $r$-th success on trial $x$. We say that $X$ has a negative binomial distribution.

Remarks.
1° Let $h(w) = (1-w)^{-r}$, so that $h(w) = \sum_{k=0}^{\infty} \frac{h^{(k)}(0)}{k!} w^k$. We observe
$h'(w) = r(1-w)^{-r-1} \Rightarrow h'(0) = r$,
$h''(w) = r(r+1)(1-w)^{-r-2} \Rightarrow h''(0) = r(r+1)$,
and in general $h^{(k)}(w) = r(r+1)\cdots(r+k-1)(1-w)^{-r-k} \Rightarrow h^{(k)}(0) = r(r+1)\cdots(r+k-1)$.
Hence, we obtain $\frac{h^{(k)}(0)}{k!} = \frac{(r+k-1)!}{k!(r-1)!} = \binom{r+k-1}{r-1}$, so
$(1-w)^{-r} = \sum_{k=0}^{\infty} \binom{r+k-1}{r-1} w^k = \sum_{x=r}^{\infty} \binom{x-1}{r-1} w^{x-r}$ (substituting $x = k + r$),
Page 23:

where $q = 1 - p$. We note that
$\sum_{x=r}^{\infty} g(x) = \sum_{x=r}^{\infty} \binom{x-1}{r-1} p^r q^{x-r} = p^r \sum_{x=r}^{\infty} \binom{x-1}{r-1} q^{x-r} = p^r (1-q)^{-r} = p^r p^{-r} = 1$.

2° If $r = 1$, then $X$ has a geometric distribution; that is, $g(x) = (1-p)^{x-1} p$, $x = 1, 2, 3, \dots$
Hence, $P(X > k) = \sum_{x=k+1}^{\infty} (1-p)^{x-1} p = \dfrac{(1-p)^k p}{1 - (1-p)} = (1-p)^k = q^k$.
Thus, $P(X \le k) = \sum_{x=1}^{k} (1-p)^{x-1} p = 1 - P(X > k) = 1 - q^k$.

Example 2.6-1. Some biology students were checking eye color in a large number of fruit flies. For an individual fly, suppose the probability of white eyes is $\frac{1}{4}$ and the probability of red eyes is $\frac{3}{4}$. The probability that at least four flies have to be checked to observe a white-eyed fly is
$P(X \ge 4) = P(X > 3) = q^3 = \left(\frac{3}{4}\right)^3 = \frac{27}{64} \approx 0.4219$,
$P(X \le 4) = 1 - q^4 = 1 - \left(\frac{3}{4}\right)^4 \approx 0.6836$,
and $P(X = 4) = q^3 p = \left(\frac{3}{4}\right)^3\left(\frac{1}{4}\right) \approx 0.1055$.
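The geometric tail formulas make the fruit-fly numbers one-liners:

```python
p = 0.25      # probability a fruit fly has white eyes
q = 1 - p

p_at_least_4 = q**3        # P(X >= 4) = P(X > 3) = q^3
p_at_most_4 = 1 - q**4     # P(X <= 4) = 1 - q^4
p_exactly_4 = q**3 * p     # P(X = 4)  = q^3 p
```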
Page 24:

Example 2.6-1 (continued from above).
(a) The probability that three out of eight fruit flies have white eyes.
(b) The probability that the third white-eyed fruit fly is found on the eighth check.
Sol. (a) We compute $P(X = 3)$, where $X \sim \text{Bin}(8, \tfrac{1}{4})$:
$P(X = 3) = \binom{8}{3}\left(\tfrac{1}{4}\right)^3\left(\tfrac{3}{4}\right)^5 \approx 0.2077$.
(b) We compute $P(X = 8)$, where $g(x) = \binom{x-1}{2}\left(\tfrac{1}{4}\right)^3\left(\tfrac{3}{4}\right)^{x-3}$:
$P(X = 8) = g(8) = \binom{7}{2}\left(\tfrac{1}{4}\right)^3\left(\tfrac{3}{4}\right)^5 \approx 0.0779$.

Remark. For the negative binomial distribution,
$M(t) = E[e^{tX}] = \sum_{x=r}^{\infty} e^{tx}\binom{x-1}{r-1} p^r (1-p)^{x-r} = (pe^t)^r \sum_{x=r}^{\infty} \binom{x-1}{r-1}[(1-p)e^t]^{x-r} = \dfrac{(pe^t)^r}{[1-(1-p)e^t]^r}$,
valid for $(1-p)e^t < 1$. Then
$M'(t) = r(pe^t)^r[1-(1-p)e^t]^{-r-1}$
and
$M''(t) = r(pe^t)^r(-r-1)[1-(1-p)e^t]^{-r-2}[-(1-p)e^t] + r^2(pe^t)^r[1-(1-p)e^t]^{-r-1}$,
so $M'(0) = r/p$.
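Parts (a) and (b) differ only in the binomial coefficient, which the following sketch makes explicit:

```python
from math import comb

p = 0.25
# (a) binomial: exactly three white-eyed flies out of eight checks
p_three_of_eight = comb(8, 3) * p**3 * (1 - p)**5
# (b) negative binomial: the third white-eyed fly on the eighth check
p_third_on_eighth = comb(7, 2) * p**3 * (1 - p)**5
```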
Page 25:

and $M''(0) = \dfrac{r(r+1)(1-p)}{p^2} + \dfrac{r^2}{p}$. Hence, we conclude
$\mu = \dfrac{r}{p}$ and $\sigma^2 = M''(0) - (M'(0))^2 = \dfrac{r(1-p)}{p^2}$.

Example 2.6-2. Suppose that during practice a basketball player can make a free throw 80% of the time. Furthermore, assume that a sequence of free-throw shooting can be thought of as independent Bernoulli trials. Let $X$ be the minimum number of free throws that this player must attempt to make a total of ten shots. Then the pmf of $X$ is
$g(x) = \binom{x-1}{9}(0.8)^{10}(0.2)^{x-10}$, $x = 10, 11, 12, \dots$,
with $\mu = \dfrac{10}{0.8} = 12.5$, $\sigma^2 = \dfrac{10(0.2)}{(0.8)^2} = 3.125$, and $\sigma = \sqrt{3.125} \approx 1.768$.

Remark. Let $M$ be the mgf of a given random variable $X$, and suppose $M^{(k)}(0)$ exists for all $k$. Then
$M(t) = M(0) + M'(0)\,t + \dfrac{M''(0)}{2!}\,t^2 + \cdots + \dfrac{M^{(k)}(0)}{k!}\,t^k + \cdots$
Page 26:

Example 2.6-4. Let the moments of $X$ be defined by $E[X^r] = 0.8$, $r = 1, 2, 3, \dots$. Then
$M(t) = M(0) + \sum_{r=1}^{\infty} \frac{M^{(r)}(0)}{r!} t^r = 1 + \sum_{r=1}^{\infty} \frac{E[X^r]}{r!} t^r = 1 + 0.8 \sum_{r=1}^{\infty} \frac{t^r}{r!} = 1 + 0.8(e^t - 1) = 0.2 + 0.8e^t = (0.2)e^{0 \cdot t} + (0.8)e^{1 \cdot t}$.
Thus, $P(X = 0) = 0.2$ and $P(X = 1) = 0.8$.