Probability and Statistics

  • Law of Total Variance

    Conditional Expectation Let X and Y be two discrete random variables. The conditional probability function of X given Y=y is \[\Pr(X=x\mid Y=y)=\frac{\Pr(X=x,\,Y=y)}{\Pr(Y=y)}\tag{1}\] Thus the conditional expectation of X given that Y=y is \[E(X\mid Y=y):=\sum_x x\,\Pr(X=x\mid Y=y)\tag{2}\] Clearly the conditional expectation E(X|Y) is a function of Y, or, put another way, a random variable that depends on Y rather than on X.
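    Equations (1) and (2) can be checked on a small example; the joint pmf below is a hypothetical one chosen only for illustration:

```python
from fractions import Fraction

# Hypothetical joint pmf Pr(X=x, Y=y), chosen only to illustrate (1)-(2).
joint = {
    (0, 0): Fraction(1, 8), (1, 0): Fraction(3, 8),
    (0, 1): Fraction(1, 4), (1, 1): Fraction(1, 4),
}

def cond_expectation(joint, y):
    """E(X | Y=y) = sum_x x * Pr(X=x | Y=y), per equations (1) and (2)."""
    p_y = sum(p for (x, yy), p in joint.items() if yy == y)  # Pr(Y=y)
    return sum(x * p / p_y for (x, yy), p in joint.items() if yy == y)

# E(X|Y) is a function of Y: a (possibly) different number per value of Y.
print(cond_expectation(joint, 0))  # 3/4
print(cond_expectation(joint, 1))  # 1/2
```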

  • Gaussian Distribution

    Gaussian Distribution One-dimensional Suppose a 1-d random variable \(x\sim N(\mu,\sigma^2)\); then its density function is \[p(x)=\frac{1}{\sigma\sqrt{2\pi}}\,e^{-\frac{1}{2}\left(\frac{x-\mu}{\sigma}\right)^2}\tag{3}\] To verify that it integrates to 1,
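    The unit-integral property of density (3) can also be confirmed numerically; the choice of μ, σ, integration range, and step count below is arbitrary:

```python
import math

def gaussian_pdf(x, mu=0.0, sigma=1.0):
    """Density of N(mu, sigma^2), equation (3)."""
    z = (x - mu) / sigma
    return math.exp(-0.5 * z * z) / (sigma * math.sqrt(2.0 * math.pi))

# Midpoint-rule integration over [mu - 10s, mu + 10s]; the tails beyond
# ten standard deviations contribute a negligible amount.
mu, sigma, n = 1.0, 2.0, 200_000
lo, hi = mu - 10 * sigma, mu + 10 * sigma
h = (hi - lo) / n
total = sum(gaussian_pdf(lo + (i + 0.5) * h, mu, sigma) for i in range(n)) * h
print(total)  # ≈ 1.0
```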

  • Unconscious Statistics

    Law of the Unconscious Statistician In probability theory and statistics, the law of the unconscious statistician (LOTUS) is a theorem used to calculate the expected value of a function g(X) of a random variable X when one knows the probability distribution of X but not the distribution of g(X): in the discrete case, \(E[g(X)]=\sum_x g(x)\Pr(X=x)\).
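    A quick check of LOTUS on a made-up discrete pmf (the support and probabilities below are arbitrary), computing E[g(X)] both directly over the pmf of X and the long way via the distribution of g(X):

```python
from fractions import Fraction

# Hypothetical pmf of X, chosen only to illustrate LOTUS.
pmf_x = {-1: Fraction(1, 4), 0: Fraction(1, 4), 2: Fraction(1, 2)}

def g(x):
    return x * x

# LOTUS: E[g(X)] = sum_x g(x) Pr(X=x); no distribution of g(X) needed.
lotus = sum(g(x) * p for x, p in pmf_x.items())

# The long way: first derive the pmf of Y = g(X), then take its mean.
pmf_y = {}
for x, p in pmf_x.items():
    pmf_y[g(x)] = pmf_y.get(g(x), Fraction(0)) + p
direct = sum(y * p for y, p in pmf_y.items())

print(lotus, direct)  # 9/4 9/4
```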

  • Whitening

    Whitening Data whitening is the process of converting a random vector X whose components are linearly correlated into a new random vector Z whose covariance matrix is the identity matrix, i.e. whose components are uncorrelated with unit variance.
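    A minimal sketch of one standard way to do this (ZCA-style whitening via the eigendecomposition of the sample covariance); the dimension, sample size, and mixing matrix below are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(0)

# Correlated data: mix independent components with an arbitrary matrix A.
A = rng.standard_normal((5, 5))
X = rng.standard_normal((10_000, 5)) @ A.T

Xc = X - X.mean(axis=0)                   # center first
cov = Xc.T @ Xc / (len(Xc) - 1)           # sample covariance of X
w, V = np.linalg.eigh(cov)                # cov = V diag(w) V^T
W = V @ np.diag(w ** -0.5) @ V.T          # whitening matrix cov^{-1/2}
Z = Xc @ W.T

# The sample covariance of Z is the identity (up to floating point).
print(np.allclose(np.cov(Z.T), np.eye(5), atol=1e-6))  # True
```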

  • Convergence of Random Variables

    Convergence in probability
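    Convergence in probability says \(\Pr(|X_n - X| > \varepsilon)\to 0\) for every fixed ε > 0. A Monte-Carlo sketch for the sample mean of fair coin flips (p, ε, and the trial counts are arbitrary choices):

```python
import random

random.seed(0)
p, eps, trials = 0.5, 0.05, 2000

def deviation_prob(n):
    """Monte-Carlo estimate of Pr(|mean of n coin flips - p| > eps)."""
    bad = 0
    for _ in range(trials):
        mean = sum(random.random() < p for _ in range(n)) / n
        if abs(mean - p) > eps:
            bad += 1
    return bad / trials

# The deviation probability shrinks as n grows, as the definition demands.
for n in (10, 100, 1000):
    print(n, deviation_prob(n))
```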

  • Characteristic Functions

    Definition. Intuition: from the Taylor series we know that two functions \(f(x),
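    As a concrete instance of the definition \(\varphi_X(t)=E[e^{itX}]\): for a standard normal, \(\varphi_X(t)=e^{-t^2/2}\), which an empirical average reproduces up to Monte-Carlo error (the sample size and seed below are arbitrary choices):

```python
import cmath
import math
import random

random.seed(0)
n = 200_000
samples = [random.gauss(0.0, 1.0) for _ in range(n)]

def ecf(t):
    """Empirical characteristic function: average of e^{itX} over samples."""
    return sum(cmath.exp(1j * t * x) for x in samples) / n

# Compare against the closed form e^{-t^2/2}; error is on the order 1/sqrt(n).
for t in (0.5, 1.0, 2.0):
    print(t, abs(ecf(t) - math.exp(-0.5 * t * t)))
```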