
Probability - 2 (Expectation ~ Continuous R.V.)

Expectation

  • ex.
    • Suppose $X \sim \text{Poisson}(\lambda = 15, T = 1)$. What is the PMF of $X$? How do we define the expected value of $X$?
    • sol
      • $p_X(x) = \begin{cases} \frac{e^{-\lambda T}(\lambda T)^x}{x!}, & x = 0, 1, \dots \\ 0, & \text{otherwise} \end{cases}$

        $E[X] = \sum_{x=0}^{\infty} x \cdot \frac{e^{-\lambda T}(\lambda T)^x}{x!} = \sum_{x=1}^{\infty} \frac{e^{-\lambda T}(\lambda T)^x}{(x-1)!} = \lambda T \sum_{x=1}^{\infty} \frac{e^{-\lambda T}(\lambda T)^{x-1}}{(x-1)!} = \lambda T$

        $\sum_{x=1}^{\infty} \frac{e^{-\lambda T}(\lambda T)^{x-1}}{(x-1)!} = 1$
        since it looks like the sum of the PMF of another Poisson R.V., and the total probability is 1 (a numeric check is sketched right below)
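A minimal numeric sketch of the result above: summing $x \cdot p_X(x)$ over a truncated support should recover $E[X] = \lambda T$ (here $\lambda T = 15$, as in the example).

```python
import math

# Numeric check: sum x * p_X(x) for the Poisson PMF above; the result should be lambda*T.
lam, T = 15, 1
rate = lam * T  # lambda * T

# Truncate the infinite sum; the tail mass beyond x = 100 is negligible for rate = 15.
expectation = sum(x * math.exp(-rate) * rate**x / math.factorial(x) for x in range(100))
print(expectation)  # ~15.0 = lambda * T
```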
  • Def (Expected Value / Mean / Expectation)
    $E[X] := \sum_{x \in S} x \, p_X(x)$
    • $S$: the set of possible values
    • another notation: $\mu_X = E[X]$
  • Def (Expectation using CDF)
    $E[X] = \sum_{i=1}^{\infty} (x_i - x_{i-1}) \bigl(1 - F_X(x_{i-1})\bigr)$
    • Denote $F_X(x_0) = 0$
  • ex.
    • Suppose $X$ is a discrete R.V. and the set of possible values is $A = \{2, 4, \dots\}$. The CDF is $F_X(t) = 1 - \frac{1}{t^2}$. What is $E[X]$?
    • sol
      • $E[X] = \sum_{i=1}^{\infty} (x_i - x_{i-1}) \bigl(1 - F_X(x_{i-1})\bigr) = 2 \cdot 1 + \sum_{i=2}^{\infty} 2 \cdot \frac{1}{(i-1)^2} = 2 + \frac{\pi^2}{3}$
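To see the tail-sum (CDF) formula in action, here is a small check on a hypothetical non-negative discrete R.V. (a Geometric(1/2) on $\{1, 2, \dots\}$, truncated), comparing it against the usual definition $\sum_x x\,p_X(x)$:

```python
# Compare E[X] = sum_x x p_X(x) with the tail-sum formula
# E[X] = sum_i (x_i - x_{i-1}) (1 - F_X(x_{i-1})), taking x_0 = 0 so that F_X(x_0) = 0.
N = 60                                          # truncation point; remaining mass ~2^-60
pmf = {k: 0.5**k for k in range(1, N + 1)}      # hypothetical Geometric(1/2) PMF

mean_direct = sum(k * p for k, p in pmf.items())

def F(t):                                        # CDF built from the PMF
    return sum(p for k, p in pmf.items() if k <= t)

xs = [0] + sorted(pmf)                           # x_0 = 0, x_1 = 1, x_2 = 2, ...
mean_tail = sum((xs[i] - xs[i - 1]) * (1 - F(xs[i - 1])) for i in range(1, len(xs)))

print(mean_direct, mean_tail)                    # both ~2.0
```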
  • Thm (Expectation of a Function of a R.V.) Let $X$ be a discrete R.V. with the set of possible values $S$ and PMF $p_X(x)$, and let $g(\cdot)$ be a real-valued function. Then
    $E[g(X)] = \sum_{x \in S} g(x) \, p_X(x)$
    • Also called the Law of the Unconscious Statistician (LOTUS)

    pf
    Define a R.V. $Y := g(X)$

    $E[g(X)] = E[Y] = \sum_{y \in g(S)} y \, p_Y(y) = \sum_{y \in g(S)} y \sum_{x:\, g(x) = y} p_X(x)$

    $= \sum_{(x, y):\, g(x) = y,\, x \in S} y \, p_X(x) = \sum_{x \in S} g(x) \, p_X(x)$
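A small sketch verifying LOTUS numerically, on a hypothetical PMF ($X$ uniform on $\{-2, -1, 0, 1, 2\}$ with $g(x) = x^2$): the LOTUS sum should match $E[Y]$ computed from the explicitly constructed PMF of $Y = g(X)$.

```python
from collections import defaultdict

# Numeric check of LOTUS on a hypothetical PMF: X uniform on {-2, -1, 0, 1, 2}, g(x) = x^2.
p_X = {x: 1 / 5 for x in (-2, -1, 0, 1, 2)}
g = lambda x: x**2

lotus = sum(g(x) * p for x, p in p_X.items())   # sum_x g(x) p_X(x)

p_Y = defaultdict(float)                        # PMF of Y = g(X), built explicitly
for x, p in p_X.items():
    p_Y[g(x)] += p
direct = sum(y * q for y, q in p_Y.items())     # E[Y] = sum_y y p_Y(y)

print(lotus, direct)                            # both 2.0
```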

  • Prop
    $E[g(X) + h(X)] = E[g(X)] + E[h(X)]$

    pf

    $E[g(X) + h(X)] = \sum_{x} \bigl(g(x) + h(x)\bigr) p_X(x) = \sum_{x} g(x) \, p_X(x) + \sum_{x} h(x) \, p_X(x)$
    $= E[g(X)] + E[h(X)]$

Moments and Variance

  • moments

    | function | expectation |
    | --- | --- |
    | $g(X) = X^2$ | 2-nd moment |
    | $g(X) = X^n$ | $n$-th moment |
    | $g(X) = (X - \mu_X)^2$ | 2-nd central moment |
    | $g(X) = (X - \mu_X)^n$ | $n$-th central moment |
    | $g(X) = e^{tX}$ | moment generating func. |
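A quick sketch of a few of these quantities for a hypothetical $X \sim$ Bernoulli$(0.3)$, computed straight from the definitions in the table (the value 0.3 is an arbitrary choice):

```python
import math

# 2nd moment, 2nd central moment, and the MGF at a point, for a hypothetical Bernoulli(0.3).
p = 0.3
pmf = {0: 1 - p, 1: p}
mu = sum(x * q for x, q in pmf.items())

second_moment  = sum(x**2 * q for x, q in pmf.items())             # E[X^2]
second_central = sum((x - mu)**2 * q for x, q in pmf.items())      # E[(X - mu_X)^2]
mgf = lambda t: sum(math.exp(t * x) * q for x, q in pmf.items())   # E[e^{tX}]

print(second_moment, second_central, mgf(0.0))  # 0.3, 0.21, 1.0
```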
  • Def (Variance)
    $\mathrm{Var}[X] := E[(X - \mu_X)^2]$
    • also denoted as $\sigma_X^2$
  • ex
    • Suppose we are given a random variable $X$. We need to output a prediction of $X$ (denoted by $z$). The penalty of the prediction is $(X - z)^2$. What is the minimum expected penalty?
    • sol
      • $g(z) := E[(X - z)^2] = E[X^2 - 2zX + z^2] = E[X^2] - 2z E[X] + z^2$

        $= (z - E[X])^2 + E[X^2] - (E[X])^2 \ge E[X^2] - (E[X])^2 = \mathrm{Var}(X)$
    • Variance = minimum expected quadratic penalty (a numeric check is sketched below)
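A minimal sketch, scanning candidate predictions $z$ for a hypothetical PMF: the best $z$ should be $E[X]$ and the minimum expected penalty should equal $\mathrm{Var}[X]$.

```python
# Numeric check that the expected quadratic penalty is minimized at z = E[X],
# with minimum value Var[X]. The PMF below is a hypothetical example.
pmf = {0: 0.2, 1: 0.5, 4: 0.3}
mean = sum(x * p for x, p in pmf.items())                     # E[X] = 1.7
var = sum((x - mean)**2 * p for x, p in pmf.items())          # Var[X] = 2.41

def expected_penalty(z):
    return sum((x - z)**2 * p for x, p in pmf.items())        # g(z) = E[(X - z)^2]

zs = [i / 100 for i in range(0, 501)]                         # candidate predictions
best_z = min(zs, key=expected_penalty)

print(best_z, mean)                          # 1.7, 1.7
print(expected_penalty(best_z), var)         # both ~2.41
```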
  • Thm
    $\mathrm{Var}[X] = E[X^2] - (E[X])^2$

    pf

    $\mathrm{Var}[X] = E[(X - \mu_X)^2] = E[X^2] - 2\mu_X E[X] + \mu_X^2$
    $= E[X^2] - 2(E[X])^2 + (E[X])^2 = E[X^2] - (E[X])^2$

  • Prop
    • $\mathrm{Var}[X + c] = \mathrm{Var}[X]$
    • $\mathrm{Var}[aX] = a^2 \mathrm{Var}[X]$
    • $E[X^2] \ge (E[X])^2$

      pf

      $\mathrm{Var}[X] = E[X^2] - (E[X])^2 \ge 0$
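A quick numeric check of the first two properties on a hypothetical PMF (the values of $a$ and $c$ are arbitrary):

```python
# Check Var[X + c] = Var[X] and Var[aX] = a^2 Var[X] on a hypothetical PMF.
pmf = {1: 0.25, 2: 0.25, 5: 0.5}
a, c = 3, 7

def var_of(g):                                             # Var[g(X)] via LOTUS
    m = sum(g(x) * p for x, p in pmf.items())
    return sum((g(x) - m)**2 * p for x, p in pmf.items())

print(var_of(lambda x: x + c), var_of(lambda x: x))        # equal (~3.1875)
print(var_of(lambda x: a * x), a**2 * var_of(lambda x: x)) # equal (~28.6875)
```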

  • Def (Existence) Let $X$ be a random variable. Then, the $n$-th moment of $X$ (i.e. $E[X^n]$) is said to exist if $E[|X^n|] < \infty$
    • ex.
      • Let $z_n = (-1)^n n$, $n = 1, 2, \dots$. Let $Z$ be a random variable with the set of possible values $\{z_n : n = 1, 2, \dots\}$ and the PMF $p_Z(z)$ given by $p_Z(z_n) = \frac{6}{(\pi n)^2}$. What is $\mathrm{Var}[Z]$?
      • $\mathrm{Var}[Z] = E[Z^2] - (E[Z])^2$

        $E[|Z^2|] = \sum_{n=1}^{\infty} n^2 \cdot \frac{6}{(\pi n)^2} = \sum_{n=1}^{\infty} \frac{6}{\pi^2} = \infty$

        So $\mathrm{Var}[Z]$ DNE (see the partial sums sketched below).
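A sketch showing the divergence numerically: each term of $E[|Z|^2]$ equals the constant $6/\pi^2 \approx 0.608$, so the partial sums grow without bound.

```python
import math

# Partial sums of E[|Z|^2] = sum_n n^2 * 6/(pi n)^2; every term is 6/pi^2 ~ 0.608,
# so the partial sums diverge and the 2nd moment (hence Var[Z]) does not exist.
partial = 0.0
for n in range(1, 10_001):
    partial += n**2 * 6 / (math.pi * n)**2
    if n in (10, 100, 1_000, 10_000):
        print(n, partial)   # ~0.608 * n, diverging
```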
  • Thm If $E[|X|^{n+1}] < \infty$, then $E[|X|^n] < \infty$
    • pf
      • $E[|X|^n] = \sum_{x \in S} |x|^n p_X(x) = \sum_{x \in S:\, |x| \le 1} |x|^n p_X(x) + \sum_{x \in S:\, |x| > 1} |x|^n p_X(x)$

        $\le 1 + \sum_{x \in S:\, |x| > 1} |x|^{n+1} p_X(x) < \infty$

$E[X]$ and $\mathrm{Var}[X]$ of Special Discrete R.V.

  • Bernoulli R.V.: $X \sim$ Bernoulli$(p)$
    • $E[X] = p$
    • $\mathrm{Var}[X] = p(1 - p)$
  • Binomial R.V.: $X \sim$ Binomial$(n, p)$
    • $E[X] = np$
      • $E[X] = \sum_{k=0}^{n} k \binom{n}{k} p^k (1-p)^{n-k} = \sum_{k=0}^{n} k \frac{n!}{k!(n-k)!} p^k (1-p)^{n-k}$

        $= np \sum_{k=1}^{n} \frac{(n-1)!}{(k-1)!(n-k)!} p^{k-1} (1-p)^{n-k} = np$
      • $\frac{(n-1)!}{(k-1)!(n-k)!} p^{k-1} (1-p)^{n-k}$ is the PMF of a Binomial$(n-1, p)$ R.V. evaluated at $k-1$, so the sum is 1
    • $\mathrm{Var}[X] = np(1-p)$ (numeric check below)
      • $E[X^2] = \sum_{k=0}^{n} k^2 \frac{n!}{k!(n-k)!} p^k (1-p)^{n-k}$

        $= \sum_{k=1}^{n} (k-1) \frac{n!}{(k-1)!(n-k)!} p^k (1-p)^{n-k} + \sum_{k=1}^{n} \frac{n!}{(k-1)!(n-k)!} p^k (1-p)^{n-k}$

        $= n(n-1)p^2 \sum_{k=2}^{n} \binom{n-2}{k-2} p^{k-2} (1-p)^{n-k} + np = n(n-1)p^2 + np$

        $\mathrm{Var}[X] = E[X^2] - (E[X])^2 = n(n-1)p^2 + np - (np)^2 = np(1-p)$
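A short numeric check of both formulas directly from the Binomial PMF (with hypothetical values $n = 10$, $p = 0.3$; setting $n = 1$ recovers the Bernoulli case above):

```python
import math

# Check E[X] = np and Var[X] = np(1-p) directly from the Binomial PMF.
n, p = 10, 0.3
pmf = [math.comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)]

mean   = sum(k * pk for k, pk in enumerate(pmf))
second = sum(k**2 * pk for k, pk in enumerate(pmf))
var    = second - mean**2

print(mean, n * p)            # 3.0, 3.0
print(var, n * p * (1 - p))   # ~2.1, 2.1
```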

Continuous Random Variables

Probability Density Function (PDF)

  • Def (PDF) Let $X$ be a random variable. Then, $f_X(x)$ is the PDF of $X$ if for every subset $B$ of the real line, we have $P(X \in B) = \int_B f_X(x)\,dx$
    • $P(X \in \mathbb{R}) = 1$
    • $P(X \le t) = \int_{-\infty}^{t} f_X(x)\,dx$
    • $P(a \le X \le b) = \int_{a}^{b} f_X(x)\,dx$
    • $P(a \le X < b) = \int_{a}^{b} f_X(x)\,dx - P(X = b) = \int_{a}^{b} f_X(x)\,dx$ since $P(X = b) = 0$
  • Check valid (3 axioms of probability)
    • $P(X \in \mathbb{R}) = 1 \Leftrightarrow \int_{-\infty}^{\infty} f_X(x)\,dx = 1$
    • $P(X \in A) \ge 0 \Leftrightarrow \int_A f_X(x)\,dx \ge 0$
    • $P\bigl(X \in \bigcup_{i \ge 1} A_i\bigr) = \sum_{i \ge 1} P(X \in A_i) \Leftrightarrow \int_{\bigcup_{i \ge 1} A_i} f_X(x)\,dx = \sum_{i \ge 1} \int_{A_i} f_X(x)\,dx$ for disjoint $A_i$
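A minimal sketch checking the normalization and interval-probability properties numerically, assuming a hypothetical Exponential(2) density $f_X(x) = 2e^{-2x}$ for $x \ge 0$:

```python
import math

# Numeric check of the PDF properties above for a hypothetical Exponential(2) density.
lam = 2.0
f = lambda x: lam * math.exp(-lam * x) if x >= 0 else 0.0

def integrate(f, a, b, steps=100_000):       # simple midpoint rule
    h = (b - a) / steps
    return sum(f(a + (i + 0.5) * h) for i in range(steps)) * h

print(integrate(f, 0, 50))       # ~1.0   : total probability
print(integrate(f, 0.5, 1.0))    # ~0.233 : P(0.5 <= X <= 1)
```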
  • CDF & PDF
    • Let $X$ be a random variable with a CDF $F_X(\cdot)$ and a PDF $f_X(\cdot)$
    • PDF -> CDF: $F_X(t) = P(X \le t) = \int_{-\infty}^{t} f_X(x)\,dx$
    • CDF -> PDF: $f_X(x_0) = F_X'(x_0)$ when $f_X(\cdot)$ is continuous at $x_0$
    • like the Fundamental Thm. of Calculus
    • Given a CDF, the derived PDF is not unique
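As a quick illustration of the CDF -> PDF direction, a central-difference derivative of the CDF should match the PDF at a continuity point (same hypothetical Exponential(2) example as above):

```python
import math

# FTC-style check: numerical derivative of F_X at x0 should match f_X(x0).
lam = 2.0
F = lambda t: 1 - math.exp(-lam * t) if t >= 0 else 0.0
f = lambda x: lam * math.exp(-lam * x) if x >= 0 else 0.0

x0, h = 0.7, 1e-6
print((F(x0 + h) - F(x0 - h)) / (2 * h), f(x0))   # both ~0.493
```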

Uniform Random Variables

  • Def A random variable $X$ is uniform with parameters $a, b$ $(a < b)$ if its PDF is
    $f_X(x) = \begin{cases} \frac{1}{b - a}, & a < x < b \\ 0, & \text{otherwise} \end{cases}$
  • ex
    • Let $X$ be a random variable with CDF $F(t)$. Define another random variable $Y = F(X)$. What type of random variable is $Y$?
    • sol
      • $F_Y(t) = P(Y \le t) = P(F(X) \le t)$

        To simplify, let $t = 0.5$. Then we are finding the probability that $F(X) \le 0.5$, which is just 0.5.
        So
        $P(F(X) \le t) = \begin{cases} 1, & t \ge 1 \\ t, & 0 < t < 1 \\ 0, & t \le 0 \end{cases}$
        i.e. $Y \sim \text{Unif}(0, 1)$ (an empirical check is sketched below)
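A small simulation sketch of this fact, assuming a hypothetical Exponential(2) distribution for $X$: pushing samples of $X$ through their own CDF should give an (empirically) Uniform(0, 1) random variable.

```python
import math, random

# Empirical check that Y = F(X) ~ Unif(0, 1) for X ~ Exponential(2).
random.seed(0)
lam = 2.0
F = lambda x: 1 - math.exp(-lam * x)

ys = [F(random.expovariate(lam)) for _ in range(100_000)]
for t in (0.1, 0.5, 0.9):
    print(t, sum(y <= t for y in ys) / len(ys))   # empirical P(Y <= t) ~ t
```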
  • Inverse Transform Sampling (ITS)
    • Goal: given a distribution with a known CDF, how do we sample from it?
    • pf
      • The CDF of the generated sample $X = F^{-1}(U)$ is
        $P(X \le t) = P\bigl(F^{-1}(U) \le t\bigr) = P\bigl(F(F^{-1}(U)) \le F(t)\bigr)$

        $= P(U \le F(t)) = F(t)$
    • Suppose $X$ is a continuous random variable with CDF $F_X$. Then the random variable $Y = F_X(X) \sim \text{Unif}(0, 1)$. ITS simply runs this process in reverse: for the random variable $Y$, first draw a number $u$ uniformly at random from $(0, 1)$. Then, since the random variable $F_X^{-1}(Y)$ has the same distribution as $X$, $x = F_X^{-1}(u)$ can be viewed as a random sample generated from the distribution $F_X$. Ref
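A minimal sketch of ITS in code, assuming an Exponential($\lambda$) target (a standard textbook example, not taken from the note): its CDF $F(x) = 1 - e^{-\lambda x}$ inverts to $F^{-1}(u) = -\ln(1 - u)/\lambda$.

```python
import math, random

# Inverse transform sampling for a hypothetical Exponential(lam) target:
# draw u ~ Unif(0, 1), then return x = F^{-1}(u) = -ln(1 - u) / lam.
random.seed(0)
lam = 2.0

def its_sample():
    u = random.random()              # u ~ Unif(0, 1)
    return -math.log(1 - u) / lam    # x = F^{-1}(u)

xs = [its_sample() for _ in range(100_000)]
print(sum(xs) / len(xs))             # ~1/lam = 0.5, the Exponential(lam) mean
```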